
add_special option for server tokenize endpoint #7059

Merged — 1 commit merged into ggerganov:master on May 8, 2024

Conversation

@JohanAR (Contributor) commented May 3, 2024

No description provided.

@slaren slaren requested a review from phymbert May 3, 2024 13:05
@phymbert (Collaborator) left a comment

I do not use the tokenize endpoint on my end.

But is it worth the effort to add a test case?

github-actions bot commented May 3, 2024

📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3 for phi-2-q4_0: 548 iterations 🚀

Details (for performance-related PRs only):
  • Concurrent users: 8, duration: 10m
  • HTTP request : avg=8617.66ms p(95)=20828.13ms fails=, finish reason: stop=476 truncated=72
  • Prompt processing (pp): avg=97.13tk/s p(95)=406.17tk/s
  • Token generation (tg): avg=32.89tk/s p(95)=46.41tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=master commit=22fddbb625c7ed309a478cc6c956d067b3856093

[Benchmark charts omitted: mermaid xychart time series for llamacpp:prompt_tokens_seconds, llamacpp:predicted_tokens_seconds, llamacpp:kv_cache_usage_ratio, and llamacpp:requests_processing over the 10-minute run.]

@JohanAR (Contributor, Author) commented May 3, 2024

> I do not use the tokenize endpoint on my end.
>
> But is it worth the effort to add a test case?

I'm using the /completion endpoint with lists of tokens instead of strings, and I thought it would be nice if the client didn't need to know how the model represents the BOS token, now that there are a bunch of new models that don't use <s>.

It's a pretty small feature, but I can look into adding a test if you prefer.
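As a sketch of the client-side usage being described here (the /tokenize endpoint and the content/add_special fields are what this PR concerns; the server URL and helper names are assumptions for illustration):

```python
import json
import urllib.request

def build_payload(content, add_special=False):
    # Request body for the server's /tokenize endpoint; add_special
    # controls whether special tokens such as BOS are prepended.
    return {"content": content, "add_special": add_special}

def tokenize(content, add_special=False, base_url="http://localhost:8080"):
    data = json.dumps(build_payload(content, add_special)).encode()
    req = urllib.request.Request(
        f"{base_url}/tokenize",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["tokens"]
```

With this, a client can send plain text and let the server decide how BOS is represented, instead of hard-coding <s> or a token id.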

@teleprint-me (Contributor) commented May 4, 2024

As an FYI, train-text-from-scratch uses the BOS by default. I've been depending on these tokens because I use sentencepiece models most of the time and sentencepiece uses these tokens out of the box. Not a big deal, I just think it's worth mentioning.

@ngxson (Collaborator) commented May 4, 2024

Yeah, this seems like a useful thing, since sometimes we want to tokenize part of a text (BOS should not be added in that case).

To be extra safe, I'd suggest a test case like this: tokenize a text with add_special = true and then with add_special = false, and compare whether the outputs differ (with add_special = true we should see more tokens).
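A minimal sketch of that comparison, written as a plain Python check (the tokenize callable stands in for a request to the server's /tokenize endpoint; the suffix check assumes special tokens are only prepended):

```python
def check_add_special(tokenize):
    """tokenize(text, add_special) -> list of token ids.

    With add_special=True the tokenizer should prepend special tokens
    (e.g. BOS), so the output should be strictly longer than, and end
    with, the add_special=False output."""
    text = "Hello, world!"
    plain = tokenize(text, add_special=False)
    special = tokenize(text, add_special=True)
    assert len(special) > len(plain)
    assert special[-len(plain):] == plain
    return len(special) - len(plain)  # how many special tokens were added
```

This expresses exactly the "more tokens with add_special = true" expectation without depending on any particular BOS id.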

@JohanAR (Contributor, Author) commented May 4, 2024

@ngxson currently trying to understand the Python behave framework and how to formulate the tests :) I'm not sure how to express something like "when the following string is tokenized, the resulting length must be longer than when this other string is tokenized" in behave, but perhaps I can figure it out.

I was thinking I would look up the BOS token for the test model (I'm guessing it's <s>) and then verify that /tokenize with {content="<s>"}, {content="<s>", add_special=False} and {content="", add_special=True} all tokenize to the same sequence. Though perhaps it's not a good idea to hard-code a dependency on a particular tokenizer, i.e. to make a case which fails if the test model is substituted for one which has <|begin_of_text|> or <BOS_TOKEN> instead.

Update: The above did not work; the tokenizer in tinyllamas/stories260K.gguf behaves strangely. I'll hard-code the BOS token as [1] in the tests and check for that instead, which is not really worse than hard-coding it to <s> IMO.
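The hard-coded check being described might look like this (BOS id 1 is taken from the discussion of the stories260K test model; the helper name is hypothetical):

```python
def assert_bos_prepended(tokens_with_special, tokens_plain, bos_id=1):
    # With add_special the sequence should start with the BOS id and
    # otherwise match the tokenization done without special tokens.
    assert tokens_with_special[0] == bos_id
    assert tokens_with_special[1:] == tokens_plain
```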

@ngxson (Collaborator) commented May 4, 2024

> Update: The above did not work, the tokenizer in tinyllamas/stories260K.gguf behaves strange. I'll hard-code the BOS token as [1] in the tests and check for that instead, not really worse than hard-coding it to <s> IMO

Instead of hard-coding, another idea would be to input the BOS as a behave step, something like:

    Given a server listening on localhost:8080
    And   a model file tinyllamas/stories260K.gguf from HF repo ggml-org/models
    And   a model file test-model.gguf
    And   a model alias tinyllama-2
    ...
    And BOS token is 1

The handler for that is defined at the beginning of steps.py
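A step handler along those lines might look like the following sketch (the step wording and the context attribute name are assumptions; the real handlers live in examples/server/tests/features/steps/steps.py):

```python
try:
    from behave import step  # the framework used by the server tests
except ImportError:
    # Fallback so the snippet can be exercised without behave installed:
    # a no-op decorator that just returns the function unchanged.
    def step(pattern):
        def decorator(func):
            return func
        return decorator

@step('BOS token is {bos:d}')
def step_bos_token(context, bos):
    # Store the expected BOS id on the behave context so later steps
    # (e.g. ones that inspect /tokenize output) can compare against it.
    context.bos = bos
```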

@JohanAR (Contributor, Author) commented May 4, 2024

@ngxson yeah, perhaps that is better. I've changed it to do that.

@JohanAR JohanAR requested a review from phymbert May 4, 2024 20:15
@ggerganov ggerganov merged commit 911b390 into ggerganov:master May 8, 2024
64 checks passed

5 participants