server : improve "prompt" handling#7847

Merged
ggerganov merged 1 commit into master from gg/server-fix-prompt on Jun 10, 2024

Conversation

@ggerganov (Member) commented on Jun 10, 2024:

Fixes #7842.

Also simplifies the prompt state in server_slot so it is represented as a std::string instead of json.

@ggerganov ggerganov force-pushed the gg/server-fix-prompt branch from d0b0946 to 9e4d62e Compare June 10, 2024 06:31
@github-actions (Contributor) commented:

📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3 for phi-2-q4_0: 547 iterations 🚀

Expand details for performance related PR only
  • Concurrent users: 8, duration: 10m
  • HTTP requests: avg=8555.55ms p(95)=20941.76ms fails=, finish reasons: stop=492 truncated=55
  • Prompt processing (pp): avg=100.46tk/s p(95)=448.46tk/s
  • Token generation (tg): avg=34.63tk/s p(95)=46.87tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=gg/server-fix-prompt commit=9e4d62e6abb9096fa93f6d7756547ec495888eb8

[Chart: llamacpp:prompt_tokens_seconds (llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 547 iterations)]
[Chart: llamacpp:predicted_tokens_seconds (llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 547 iterations)]

Details

[Chart: llamacpp:kv_cache_usage_ratio (llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 547 iterations)]
[Chart: llamacpp:requests_processing (llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 547 iterations)]

ggerganov merged commit d9da0e4 into master on Jun 10, 2024
@man4j commented on Jun 11, 2024:

Oh, this broke my project because I'm passing an array of integers (tokens) as the prompt. ((((

@ggerganov (Member, Author) replied:

Sorry about that - I forgot about this use case. I'll revert the change related to it.



Development

Successfully merging this pull request may close these issues.

Bug: embeddings endpoint broken
