Windows support for AVX512_BF16 and associated bug fixes for BF16 model#7258

Merged
mofosyne merged 1 commit into ggml-org:master from Srihari-mcw:bf16_cmake_plus_windows_build
May 20, 2024
Conversation

@Srihari-mcw
Collaborator

The BF16 model code path does not go through the AVX512_BF16 code path by default on Windows. When the flag is enabled, the build fails with errors around explicit type casting.

This PR does the following:

  • Adds CMake support to enable AVX512_BF16 on Windows
  • Updates SIMD instructions so the code builds and runs on Windows
  • Adds macro-based explicit type casts for the BF16 vector types
  • Shows the status of the AVX512_BF16 flag
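A hedged sketch of how the new option might be enabled from the command line. The option name `LLAMA_AVX512_BF16` is assumed from the llama.cpp CMake build of this era; check the PR diff for the exact flag:

```shell
# Hypothetical configure/build invocation on Windows;
# -DLLAMA_AVX512_BF16=ON is the assumed name of the new CMake option.
cmake -B build -DLLAMA_AVX512_BF16=ON
cmake --build build --config Release
```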

Note: __m512bh is implemented internally as __m512i on Windows.

@mofosyne mofosyne added the bugfix (fixes an issue or bug), build (Compilation issues), and Review Complexity : High (Generally requires in-depth knowledge of LLMs or GPUs) labels May 13, 2024
@github-actions
Contributor

📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3 for phi-2-q4_0: 554 iterations 🚀

Expand details (performance-related PRs only)
  • Concurrent users: 8, duration: 10m
  • HTTP request : avg=8404.53ms p(95)=20107.54ms fails=, finish reason: stop=513 truncated=41
  • Prompt processing (pp): avg=96.9tk/s p(95)=436.52tk/s
  • Token generation (tg): avg=49.65tk/s p(95)=49.14tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=bf16_cmake_plus_windows_build commit=29d5012042e9e68b017747200db8871fc3d22b0e

prompt_tokens_seconds

[chart: llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 554 iterations; y-axis: llamacpp:prompt_tokens_seconds]
predicted_tokens_seconds

[chart: llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 554 iterations; y-axis: llamacpp:predicted_tokens_seconds]

Details

kv_cache_usage_ratio

[chart: llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 554 iterations; y-axis: llamacpp:kv_cache_usage_ratio]
requests_processing

[chart: llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 554 iterations; y-axis: llamacpp:requests_processing]

@mofosyne mofosyne added the merge ready A maintainer can use this label to indicate that they consider the changes final and ready to merge. label May 18, 2024
@mofosyne mofosyne merged commit 33c8d50 into ggml-org:master May 20, 2024
Seunghhon pushed a commit to Seunghhon/llama.cpp that referenced this pull request Apr 26, 2026
phuongncn pushed a commit to phuongncn/llama.cpp-gx10-dgx-sparks-deepseekv4 that referenced this pull request Apr 28, 2026

Labels

  • bugfix: fixes an issue or bug
  • build: Compilation issues
  • merge ready: A maintainer can use this label to indicate that they consider the changes final and ready to merge.
  • Review Complexity : High: Generally requires in-depth knowledge of LLMs or GPUs

Projects

None yet

Development

Successfully merging this pull request may close these issues.

3 participants