support w8a8 fp8 kernel with CUTLASS #3047

Merged 43 commits on Jan 26, 2025

@HandH1998 (Collaborator) commented on Jan 22, 2025

Support sm89 and sm90 fp8 GEMM implementations with CUTLASS for w8a8 fp8 quantization. Co-authors: @yych0745 @b0urnee
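In w8a8 fp8 quantization, both the activations (A) and the weights (W) are stored as 8-bit floats (e4m3) with scales, and the GEMM applies the scales in its epilogue while writing fp16/bf16 output rather than materializing dequantized operands. A minimal dequantize-and-matmul sketch of that computation in plain PyTorch, assuming per-tensor scales (an illustration of what the kernel computes, not the PR's CUTLASS implementation):

```python
import torch

FP8_E4M3_MAX = 448.0  # largest finite value of torch.float8_e4m3fn

def quantize_fp8_per_tensor(x: torch.Tensor):
    """Quantize a float tensor to float8_e4m3fn with a single per-tensor scale."""
    scale = x.abs().max().float() / FP8_E4M3_MAX
    x_q = (x.float() / scale).clamp(-FP8_E4M3_MAX, FP8_E4M3_MAX).to(torch.float8_e4m3fn)
    return x_q, scale

def fp8_scaled_mm_reference(a_q, w_q, scale_a, scale_w, out_dtype=torch.float16):
    """What a w8a8 fp8 scaled matmul computes: scale_a * scale_w * (A @ W^T)."""
    return ((a_q.float() @ w_q.float().t()) * (scale_a * scale_w)).to(out_dtype)

# Example: one shape from the benchmark below (batch_size=16, N=6144, K=4096).
a = torch.randn(16, 4096, dtype=torch.float16)
w = torch.randn(6144, 4096, dtype=torch.float16)
a_q, scale_a = quantize_fp8_per_tensor(a)
w_q, scale_w = quantize_fp8_per_tensor(w)
out = fp8_scaled_mm_reference(a_q, w_q, scale_a, scale_w)  # (16, 6144), fp16
```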

Benchmark

GPU: NVIDIA L40 (sm89)
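Each table below corresponds to one linear-layer weight shape of the named model (sharded by the listed TP degree) and reports a throughput-style metric for each kernel/output-dtype combination, so higher is better; the sglang columns consistently exceed the vllm ones. A sketch of how such a head-to-head timing could be run, assuming sgl-kernel exposes the new kernel as `fp8_scaled_mm` and vllm's baseline is `cutlass_scaled_mm` (import paths and signatures are assumptions; this is not the PR's actual benchmark script):

```python
import torch
import triton.testing

# Hypothetical head-to-head timing; import paths and signatures below
# are assumptions made for illustration.
from sgl_kernel import fp8_scaled_mm        # assumed: kernel added by this PR
from vllm import _custom_ops as vllm_ops    # assumed: vllm's CUTLASS entry point

M, N, K = 2048, 6144, 4096  # one (batch_size, N, K) point from the tables
a = torch.randn(M, K, device="cuda").to(torch.float8_e4m3fn)
# CUTLASS-style fp8 GEMMs generally expect B as (K, N) in column-major layout.
b = torch.randn(N, K, device="cuda").to(torch.float8_e4m3fn).t()
scale_a = torch.ones(1, device="cuda", dtype=torch.float32)
scale_b = torch.ones(1, device="cuda", dtype=torch.float32)

# triton.testing.do_bench returns a median runtime in milliseconds.
ms_sglang = triton.testing.do_bench(
    lambda: fp8_scaled_mm(a, b, scale_a, scale_b, torch.float16))
ms_vllm = triton.testing.do_bench(
    lambda: vllm_ops.cutlass_scaled_mm(a, b, scale_a, scale_b, torch.float16))
print(f"sglang: {ms_sglang:.3f} ms  vllm: {ms_vllm:.3f} ms")
```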

meta-llama/Llama-3.1-8B-Instruct, TP=1
meta-llama/Llama-3.1-8B-Instruct N=6144 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2335.643631    2320.141654     3.432746e+03     3.408572e+03
1        16.0   37453.714583   37047.131842     5.444867e+04     5.398155e+04
2        64.0  140461.229080  140706.620944     2.180305e+05     2.147746e+05
3       128.0  257811.991664  256989.356229     3.874000e+05     3.842580e+05
4       256.0  376313.351504  379773.505853     5.874578e+05     5.872437e+05
5       512.0  447136.519385  446516.782850     6.756751e+05     6.770952e+05
6      1024.0  482133.886369  481701.346705     1.032074e+06     1.035225e+06
7      2048.0  617316.171648  617997.076916     1.124229e+06     1.123886e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2263.797057    2268.081107     3.261910e+03     3.300406e+03
1        16.0   36133.003454   36181.700594     5.131274e+04     5.194819e+04
2        64.0  134368.475116  134167.021606     2.036936e+05     2.071514e+05
3       128.0  241536.871262  241754.375737     3.879599e+05     3.927845e+05
4       256.0  335690.192464  335952.751829     6.724313e+05     6.703327e+05
5       512.0  468837.743465  468071.471871     7.060308e+05     7.109387e+05
6      1024.0  569090.020058  568224.320731     9.510664e+05     9.468734e+05
7      2048.0  644460.657798  643422.938028     1.073605e+06     1.071930e+06
meta-llama/Llama-3.1-8B-Instruct N=28672 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3251.435417    3251.435417     3.960042e+03     3.967533e+03
1        16.0   51953.926665   51862.170313     6.197327e+04     6.201418e+04
2        64.0  202858.119242  202792.452706     2.340029e+05     2.344993e+05
3       128.0  371766.081189  372263.185166     4.448521e+05     4.450101e+05
4       256.0  511785.827174  510465.185362     8.086827e+05     8.097281e+05
5       512.0  683342.596298  683311.530627     1.171346e+06     1.170889e+06
6      1024.0  772193.475032  770314.082851     1.245122e+06     1.249883e+06
7      2048.0  807450.319843  806832.777481     1.284455e+06     1.289246e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=14336: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3346.381498    3357.094898     3.876571e+03     3.875036e+03
1        16.0   53432.488376   53529.904778     5.960898e+04     5.981391e+04
2        64.0  205502.381198  205524.842140     2.425915e+05     2.440407e+05
3       128.0  357042.333471  356974.489972     4.560402e+05     4.585441e+05
4       256.0  599685.250064  598348.574805     7.900415e+05     7.920395e+05
5       512.0  856283.287484  855990.852359     9.696149e+05     9.663737e+05
6      1024.0  947641.397961  945317.356212     1.182576e+06     1.182297e+06
7      2048.0  998997.195229  999130.019559     1.227226e+06     1.226976e+06
meta-llama/Llama-3.3-70B-Instruct, TP=1
meta-llama/Llama-3.3-70B-Instruct N=10240 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3253.614550    3256.898816     3.729824e+03     3.726510e+03
1        16.0   52009.425300   52082.074110     5.966657e+04     5.963476e+04
2        64.0  201117.648715  200666.648430     2.459927e+05     2.447592e+05
3       128.0  347600.470615  347758.067667     4.573674e+05     4.570171e+05
4       256.0  439077.266091  439562.536256     7.589281e+05     7.580173e+05
5       512.0  700872.914099  699412.085314     8.387809e+05     8.425984e+05
6      1024.0  875844.533804  875737.431067     1.166928e+06     1.166231e+06
7      2048.0  926634.594022  929482.026046     1.236749e+06     1.235859e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2935.825033    2932.233588     3.743472e+03     3.755201e+03
1        16.0   46793.068366   46858.410902     5.980215e+04     6.001606e+04
2        64.0  180714.800780  180714.800780     2.438255e+05     2.438808e+05
3       128.0  334571.529489  334832.357100     4.593828e+05     4.610594e+05
4       256.0  571896.601234  571402.065822     7.887668e+05     7.860961e+05
5       512.0  722160.379203  722676.783179     9.876362e+05     9.886591e+05
6      1024.0  803128.093374  803860.841401     1.152579e+06     1.150611e+06
7      2048.0  848440.380713  848356.572263     1.211502e+06     1.209881e+06
meta-llama/Llama-3.3-70B-Instruct N=57344 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.213974e+03   4.213218e+03     4.792805e+03     4.794957e+03
1        16.0   6.712015e+04   6.711057e+04     7.433545e+04     7.426200e+04
2        64.0   2.609314e+05   2.609767e+05     3.024201e+05     3.025784e+05
3       128.0   4.866249e+05   4.865619e+05     5.900504e+05     5.897263e+05
4       256.0   7.632087e+05   7.647909e+05     1.031091e+06     1.027479e+06
5       512.0   9.861622e+05   9.844428e+05     1.237742e+06     1.244170e+06
6      1024.0   1.011339e+06   1.011782e+06     1.242717e+06     1.233951e+06
7      2048.0   1.048891e+06   1.029872e+06     1.299468e+06     1.303567e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=28672: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.754198e+03   3.752878e+03     4.027523e+03     4.025314e+03
1        16.0   5.998088e+04   5.993497e+04     6.337541e+04     6.335405e+04
2        64.0   2.318064e+05   2.317064e+05     2.585417e+05     2.584350e+05
3       128.0   4.357289e+05   4.356026e+05     4.910704e+05     4.901737e+05
4       256.0   8.085111e+05   8.086199e+05     8.983833e+05     8.995660e+05
5       512.0   1.049326e+06   1.050114e+06     1.111677e+06     1.107255e+06
6      1024.0   1.114376e+06   1.114252e+06     1.220703e+06     1.220034e+06
7      2048.0   1.148550e+06   1.148539e+06     1.235977e+06     1.205723e+06
mistralai/Mistral-Large-Instruct-2407, TP=1
mistralai/Mistral-Large-Instruct-2407 N=14336 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.432467e+03   3.428459e+03     3.958830e+03     3.959186e+03
1        16.0   5.471054e+04   5.469780e+04     6.319643e+04     6.319359e+04
2        64.0   2.117403e+05   2.117085e+05     2.497508e+05     2.498172e+05
3       128.0   3.895232e+05   3.882756e+05     4.655908e+05     4.655524e+05
4       256.0   6.770807e+05   6.774062e+05     7.883888e+05     7.848763e+05
5       512.0   8.422005e+05   8.415875e+05     9.766549e+05     9.856194e+05
6      1024.0   9.956066e+05   9.949697e+05     1.209380e+06     1.197546e+06
7      2048.0   1.082159e+06   1.081965e+06     1.293513e+06     1.285329e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.388714e+03   3.391149e+03     3.893587e+03     3.899419e+03
1        16.0   5.398680e+04   5.411741e+04     6.210522e+04     6.221077e+04
2        64.0   2.073123e+05   2.076330e+05     2.502089e+05     2.505722e+05
3       128.0   3.708392e+05   3.701290e+05     4.818783e+05     4.817102e+05
4       256.0   5.846736e+05   5.857012e+05     8.793512e+05     8.779133e+05
5       512.0   9.354438e+05   9.346296e+05     1.064471e+06     1.068266e+06
6      1024.0   9.831200e+05   9.826202e+05     1.207028e+06     1.210885e+06
7      2048.0   1.090053e+06   1.089899e+06     1.328874e+06     1.281611e+06
mistralai/Mistral-Large-Instruct-2407 N=57344 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.270531e+03   4.269547e+03     5.045189e+03     5.047936e+03
1        16.0   6.807107e+04   6.809903e+04     7.888190e+04     7.877058e+04
2        64.0   2.626585e+05   2.626769e+05     3.167197e+05     3.166575e+05
3       128.0   4.815902e+05   4.817548e+05     6.178790e+05     6.175575e+05
4       256.0   7.904407e+05   7.898524e+05     1.064660e+06     1.057470e+06
5       512.0   1.069469e+06   1.055021e+06     1.239426e+06     1.246689e+06
6      1024.0   1.108008e+06   1.114632e+06     1.272585e+06     1.270215e+06
7      2048.0   1.117103e+06   1.124127e+06     1.319721e+06     1.301896e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=28672: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.009100e+03   4.007275e+03     4.418456e+03     4.418456e+03
1        16.0   6.407997e+04   6.403629e+04     7.060675e+04     7.062798e+04
2        64.0   2.485995e+05   2.485666e+05     2.811032e+05     2.811312e+05
3       128.0   4.643719e+05   4.637321e+05     5.510366e+05     5.516703e+05
4       256.0   6.990629e+05   6.989220e+05     9.447365e+05     9.443606e+05
5       512.0   1.101777e+06   1.103368e+06     1.134454e+06     1.142212e+06
6      1024.0   1.129842e+06   1.129708e+06     1.219828e+06     1.231848e+06
7      2048.0   1.201410e+06   1.186195e+06     1.309654e+06     1.293316e+06
Qwen/Qwen2.5-7B-Instruct, TP=1
Qwen/Qwen2.5-7B-Instruct N=4608 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2138.448433    2139.556446      3228.572423      3282.467323
1        16.0   34109.192158   34056.445583     50842.250107     51496.106383
2        64.0  127424.311042  127732.250351    180271.499555    180271.499555
3       128.0  238518.053152  239003.410239    308285.813300    311464.957039
4       256.0  290575.059248  293560.699783    488048.052880    492137.819162
5       512.0  346366.998398  346140.168249    514910.885075    514034.570012
6      1024.0  353460.536739  352062.638032    776289.409157    777431.163310
7      2048.0  473139.594508  471280.703422    988070.658014    988879.394834
Qwen/Qwen2.5-7B-Instruct N=3584 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2044.374321    2064.082297      3139.503453      3148.737357
1        16.0   32647.644334   32877.409539     49316.114308     49316.114308
2        64.0  122278.149137  123231.155011    196886.553109    196886.553109
3       128.0  217916.313835  218669.753638    370359.597407    369693.477933
4       256.0  304856.602749  304011.168386    639594.139164    644861.366535
5       512.0  402052.950251  401121.254825    607685.319171    606788.412022
6      1024.0  466363.162187  465505.036401    836417.337916    840263.946685
7      2048.0  531007.198118  531092.968791    933586.856043    934316.210884
Qwen/Qwen2.5-7B-Instruct N=37888 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3376.162970    3375.827516     4.005236e+03     4.009729e+03
1        16.0   53895.344977   53809.932473     6.295494e+04     6.310119e+04
2        64.0  211881.685970  211912.669158     2.298448e+05     2.295049e+05
3       128.0  393401.387368  393330.193088     4.409848e+05     4.419265e+05
4       256.0  524282.094814  522422.846451     8.217652e+05     8.203690e+05
5       512.0  672272.436430  672142.485601     1.190414e+06     1.166725e+06
6      1024.0  741693.247381  741463.849922     1.234807e+06     1.229741e+06
7      2048.0  775677.964751  775440.017196     1.259386e+06     1.264034e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=18944: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3528.954796    3529.688454     3.932871e+03     3.929230e+03
1        16.0   56299.410260   56299.410260     6.098997e+04     6.101738e+04
2        64.0  216146.714955  216664.022560     2.461712e+05     2.461154e+05
3       128.0  339804.017276  340603.048476     4.616887e+05     4.640553e+05
4       256.0  603278.294096  603110.831812     7.360741e+05     7.332794e+05
5       512.0  836058.438367  835174.667606     8.741089e+05     8.752532e+05
6      1024.0  888223.942173  887135.924183     1.052349e+06     1.047624e+06
7      2048.0  920455.665471  920309.423748     1.088906e+06     1.088224e+06
Qwen/Qwen2.5-32B-Instruct, TP=1
Qwen/Qwen2.5-32B-Instruct N=7168 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2458.059457    2465.324062     3.587153e+03     3.587854e+03
1        16.0   39392.266442   39328.951307     5.734960e+04     5.730483e+04
2        64.0  151864.465350  151943.050405     2.295778e+05     2.293984e+05
3       128.0  278983.332194  279647.583305     4.369493e+05     4.346853e+05
4       256.0  451825.296835  451434.544082     7.088230e+05     7.052055e+05
5       512.0  550900.489345  550965.051307     8.337319e+05     8.326975e+05
6      1024.0  601354.141696  600700.565553     9.857489e+05     9.819886e+05
7      2048.0  646494.995550  645362.719599     1.166500e+06     1.163791e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2505.922601    2505.443525     3.521892e+03     3.490011e+03
1        16.0   39964.878704   39972.497246     5.560331e+04     5.504874e+04
2        64.0  150132.908695  150025.516673     1.988959e+05     1.988959e+05
3       128.0  279042.962391  278718.502162     3.396529e+05     3.368571e+05
4       256.0  385366.422765  388040.123292     5.412534e+05     5.408172e+05
5       512.0  443240.102744  442889.111481     6.057348e+05     6.027970e+05
6      1024.0  457875.672901  456987.157383     9.009385e+05     9.000324e+05
7      2048.0  604752.354152  604371.160285     1.131222e+06     1.129984e+06
Qwen/Qwen2.5-32B-Instruct N=55296 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    4184.052057    4182.816017     4.278121e+03     4.277863e+03
1        16.0   66602.331929   66633.679973     6.741906e+04     6.734689e+04
2        64.0  260070.099631  260032.757547     2.712911e+05     2.707075e+05
3       128.0  477853.578323  478004.827114     5.330694e+05     5.323804e+05
4       256.0  671179.023050  672174.882139     9.266293e+05     9.254463e+05
5       512.0  821337.200160  821597.864509     1.226848e+06     1.188390e+06
6      1024.0  877325.774643  877623.115235     1.261458e+06     1.237700e+06
7      2048.0  907640.825041  898223.912447     1.231214e+06     1.257846e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=27648: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3719.789679    3720.963273     4.038350e+03     4.050367e+03
1        16.0   59317.122326   59354.431871     6.210829e+04     6.207425e+04
2        64.0  229642.607383  229479.744465     2.321375e+05     2.323279e+05
3       128.0  408028.295570  408543.503885     4.015184e+05     4.015539e+05
4       256.0  671697.828156  671897.087405     5.981685e+05     5.998716e+05
5       512.0  699630.126030  700062.617129     7.047459e+05     7.055691e+05
6      1024.0  709742.321457  709853.515137     1.036419e+06     1.036153e+06
7      2048.0  955360.264550  954913.403308     1.242332e+06     1.219978e+06
Qwen/Qwen2.5-72B-Instruct, TP=1
Qwen/Qwen2.5-72B-Instruct N=10240 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3257.404550    3257.657593     3.727835e+03     3.729824e+03
1        16.0   51977.201122   51977.201122     5.965066e+04     5.965596e+04
2        64.0  200756.681005  201283.536963     2.448038e+05     2.458801e+05
3       128.0  347848.190302  347240.772652     4.575623e+05     4.572506e+05
4       256.0  439400.649232  439544.572707     7.589818e+05     7.579103e+05
5       512.0  700781.412247  699525.977321     8.382571e+05     8.424662e+05
6      1024.0  876166.132415  875791.045684     1.160308e+06     1.162129e+06
7      2048.0  925536.398160  926035.261994     1.209958e+06     1.215849e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2927.117999    2932.746012     3.740968e+03     3.755201e+03
1        16.0   46793.068366   46850.235011     5.980215e+04     5.978883e+04
2        64.0  180441.501580  180653.994530     2.426683e+05     2.425039e+05
3       128.0  334989.029497  334806.251179     4.612575e+05     4.588921e+05
4       256.0  572239.476390  571934.686062     7.881155e+05     7.894191e+05
5       512.0  722190.742130  722828.840604     9.892283e+05     9.902547e+05
6      1024.0  803203.203467  803278.257655     1.151999e+06     1.153159e+06
7      2048.0  848649.896219  848754.770880     1.211801e+06     1.210435e+06
Qwen/Qwen2.5-72B-Instruct N=59136 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.226764e+03   4.226027e+03     4.824262e+03     4.824070e+03
1        16.0   6.723637e+04   6.729007e+04     7.477280e+04     7.484645e+04
2        64.0   2.614702e+05   2.615143e+05     3.025939e+05     3.027830e+05
3       128.0   4.911326e+05   4.911171e+05     5.875722e+05     5.883750e+05
4       256.0   7.851556e+05   7.878890e+05     9.980426e+05     9.927381e+05
5       512.0   9.643252e+05   9.625887e+05     1.208277e+06     1.208677e+06
6      1024.0   1.002881e+06   1.012059e+06     1.266845e+06     1.253130e+06
7      2048.0   1.022718e+06   1.028590e+06     1.277927e+06     1.299583e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=29568: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.641279e+03   3.640075e+03     4.011944e+03     4.009554e+03
1        16.0   5.811195e+04   5.813985e+04     6.379382e+04     6.379382e+04
2        64.0   2.279163e+05   2.281108e+05     2.547727e+05     2.551081e+05
3       128.0   4.385902e+05   4.387888e+05     4.932988e+05     4.938646e+05
4       256.0   8.137321e+05   8.147156e+05     9.060968e+05     9.037198e+05
5       512.0   1.056724e+06   1.057103e+06     1.105617e+06     1.096429e+06
6      1024.0   1.119510e+06   1.119439e+06     1.206366e+06     1.211351e+06
7      2048.0   1.151686e+06   1.152360e+06     1.197490e+06     1.175783e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, TP=1
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=3072 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1442.023776    1448.662978      2469.777006      2465.905852
1        16.0   23051.251702   23072.380415     38342.678243     38313.497900
2        64.0   86353.232700   86058.014304    156105.222606    155743.040797
3       128.0  152096.488730  153253.991599    291426.537365    291005.422040
4       256.0  201476.473434  200873.570095    471054.376815    480610.380100
5       512.0  240700.128236  244833.738304    440888.342442    442827.365134
6      1024.0  277186.157453  278407.667630    702882.181040    696500.652691
7      2048.0  398961.353775  398024.944805    956654.356069    951147.406916
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=4096 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1549.807229    1539.005116     2.785742e+03      2717.181349
1        16.0   24687.476443   24732.958508     4.364451e+04     42592.162175
2        64.0   90161.512344   90040.577462     1.734502e+05    168656.400412
3       128.0  166460.633372  166254.488598     3.268424e+05    320025.025710
4       256.0  211002.739732  211002.739732     6.102295e+05    594686.577640
5       512.0  283453.146894  284090.453355     6.302840e+05    632138.880851
6      1024.0  358598.989082  359378.935339     8.463388e+05    839229.572638
7      2048.0  423586.640300  423336.212437     1.001052e+06    994217.958648
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1354.205261    1332.696320      2127.448152      2093.477158
1        16.0   21445.765554   21459.477785     32086.637135     31250.114616
2        64.0   76714.566775   76019.535862    130976.093016    127493.350295
3       128.0  117970.553634  117970.553634    242767.615150    237611.499523
4       256.0  176529.245247  177228.376290    300001.106813    304768.430857
5       512.0  188487.882196  190493.783419    333024.474892    336678.351870
6      1024.0  292564.414099  291294.818690    751051.754401    759013.432903
7      2048.0  360706.625827  360464.498374    903283.424873    906715.000236
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=576 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     700.674577     702.342863       951.561281       881.865508
1        16.0   11079.210923   11105.279853     14046.857254     13963.739126
2        64.0   34832.059406   34767.911479     50210.041101     52587.677196
3       128.0   48099.303014   48657.156416     76900.103804     76124.904256
4       256.0   78417.345909   78580.546626    106360.429221    105616.652963
5       512.0   66154.977305   66828.234241    109641.964958    111134.517609
6      1024.0  122243.472830  123796.566323    220002.637283    223419.826016
7      2048.0  195131.531192  197459.471262    383329.459579    379238.683863
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=21888 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2457.661019    2459.278748     3.736464e+03     3.720960e+03
1        16.0   39108.214605   39129.542763     5.905508e+04     5.905508e+04
2        64.0  151286.613311  151286.613311     2.208067e+05     2.206709e+05
3       128.0  273843.353212  274131.109821     4.077880e+05     4.058846e+05
4       256.0  337679.966863  336452.615464     7.449648e+05     7.399702e+05
5       512.0  464224.603862  462708.798383     1.082665e+06     1.080118e+06
6      1024.0  513805.621065  512635.300660     1.115655e+06     1.113977e+06
7      2048.0  556340.543243  555170.030592     1.159670e+06     1.155322e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=10944: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2918.533407    2913.980278     3.478326e+03     3.480487e+03
1        16.0   46406.491724   46334.545917     5.252334e+04     5.255413e+04
2        64.0  174600.471726  174430.625621     2.174829e+05     2.167600e+05
3       128.0  203853.566138  204230.853880     3.837660e+05     3.863500e+05
4       256.0  373864.337180  374254.493377     4.294963e+05     4.285980e+05
5       512.0  447308.237081  448707.397724     5.028102e+05     5.048007e+05
6      1024.0  780583.610904  780902.280868     1.057124e+06     1.052857e+06
7      2048.0  873873.248957  873307.989002     1.154193e+06     1.153497e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2816 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1409.720444    1404.229768      2368.052621      2368.052621
1        16.0   22467.676285   22467.676285     37337.062036     37397.576376
2        64.0   82777.772803   83075.804868    150566.418161    150321.194122
3       128.0  144327.160098  143096.454055    274694.097419    274082.301903
4       256.0  194003.601342  195338.027839    451883.558438    437946.449485
5       512.0  228777.003404  231031.837833    418344.309531    416691.722746
6      1024.0  256336.658800  256961.092600    645717.309560    649123.250683
7      2048.0  368912.181708  367489.228010    896905.853582    888942.388580
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=1408: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1023.636608    1021.461726      1758.907283      1673.206523
1        16.0   16436.513048   16401.466912     26016.757552     26958.953310
2        64.0   58422.439023   58719.756502    106837.333153    109889.829444
3       128.0   94771.513051   95015.394251    191112.741101    196398.842954
4       256.0  138808.204601  139017.254696    261494.201954    260755.533619
5       512.0  140498.411630  142394.844035    299456.477104    298126.623923
6      1024.0  211290.309934  212048.708876    680607.972867    675317.427373
7      2048.0  269633.821280  270250.560504    798766.523329    787270.428038
meta-llama/Llama-3.1-8B-Instruct, TP=4
meta-llama/Llama-3.1-8B-Instruct N=1536 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1835.537909    1822.776375      2496.914190      2512.869035
1        16.0   29080.179810   28863.412706     33380.499614     33469.275628
2        64.0   99728.166081  101181.490499    156328.543115    156328.543115
3       128.0  111120.957074  110998.441302    242445.723110    242154.144549
4       256.0  183421.710692  183630.801631    233721.615913    236327.654622
5       512.0  232708.656973  232574.258807    320113.136283    319985.954891
6      1024.0  401497.860916  401698.110760    657204.944879    655333.313685
7      2048.0  435707.155387  436474.549322    729698.480623    726407.819810
meta-llama/Llama-3.1-8B-Instruct N=4096 K=1024: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     907.908303     905.948210      2081.523826      2085.662076
1        16.0   14482.664645   14507.699694     31403.944142     31084.090093
2        64.0   52331.747854   52495.413680    126682.319903    125733.395236
3       128.0   99764.683169  100062.046640    231522.871910    232323.995245
4       256.0  121248.991240  120867.028504    425620.505982    423606.513478
5       512.0  160962.856160  161301.219097    554316.887225    554316.887225
6      1024.0  206015.162020  205876.979852    675214.397837    673943.595717
7      2048.0  247441.239595  247071.320301    799602.622684    802589.581005
meta-llama/Llama-3.1-8B-Instruct N=7168 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2217.467937    2216.463719     3.462702e+03     3.467609e+03
1        16.0   35420.640627   35560.050809     5.516902e+04     5.522090e+04
2        64.0  136893.760008  137293.803267     2.225577e+05     2.222945e+05
3       128.0  256661.786445  255892.913587     4.249836e+05     4.240247e+05
4       256.0  401812.608950  404580.728153     7.257299e+05     7.244709e+05
5       512.0  491571.426855  491185.978475     7.817295e+05     7.838488e+05
6      1024.0  538436.363717  538108.771084     9.474552e+05     9.467393e+05
7      2048.0  580662.367892  575847.261293     1.137405e+06     1.132777e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2030.159258    2038.049999     3.208504e+03     3.205701e+03
1        16.0   32428.738977   32464.591918     5.071541e+04     5.075925e+04
2        64.0  120840.426875  121245.830111     2.007810e+05     1.987426e+05
3       128.0  226696.052725  226969.855071     3.822844e+05     3.791990e+05
4       256.0  315214.737334  315320.514893     7.001902e+05     6.914313e+05
5       512.0  432124.721431  433320.325462     6.843811e+05     6.851296e+05
6      1024.0  524727.176122  523411.877259     9.321976e+05     9.308124e+05
7      2048.0  594318.776581  594318.776581     1.057426e+06     1.056831e+06
meta-llama/Llama-3.3-70B-Instruct, TP=4
meta-llama/Llama-3.3-70B-Instruct N=2560 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2679.202861    2676.467564      3483.853702      3476.923175
1        16.0   42714.459117   42801.632145     52203.608791     52106.334247
2        64.0  160634.173328  160942.346744    216633.177538    216983.379323
3       128.0  220186.881993  219754.291500    388947.902794    392014.953844
4       256.0  387041.280963  386150.526137    477417.485839    476993.317972
5       512.0  492120.709205  491850.191081    595236.911304    597888.259005
6      1024.0  500562.810809  501779.119083    688073.403777    688161.616910
7      2048.0  699366.521926  699685.505443    960643.558528    960815.431680
meta-llama/Llama-3.3-70B-Instruct N=8192 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1504.240920    1501.548987     3.154382e+03     3.183102e+03
1        16.0   24067.854720   24059.228184     5.047011e+04     5.096830e+04
2        64.0   93132.495944   93132.495944     1.936538e+05     1.913763e+05
3       128.0  179960.456177  180747.888629     3.640691e+05     3.645635e+05
4       256.0  266965.929078  267764.636510     6.646064e+05     6.708332e+05
5       512.0  324080.850708  323714.550186     7.132685e+05     7.165998e+05
6      1024.0  376249.424007  376381.280427     9.723893e+05     9.687712e+05
7      2048.0  415736.771234  412582.547229     1.073736e+06     1.071793e+06
meta-llama/Llama-3.3-70B-Instruct N=14336 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3315.109070    3319.794069     3.924866e+03     3.931172e+03
1        16.0   52964.004392   53002.843467     6.271402e+04     6.282304e+04
2        64.0  205822.875339  205642.683229     2.461893e+05     2.465284e+05
3       128.0  364602.800304  364373.045209     4.641627e+05     4.642201e+05
4       256.0  602054.572403  606132.708886     7.823733e+05     7.826991e+05
5       512.0  752869.769632  753549.003061     9.141009e+05     9.150190e+05
6      1024.0  889307.793179  891734.346970     1.196490e+06     1.196229e+06
7      2048.0  986453.387055  986073.204687     1.308638e+06     1.300445e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=7168: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2803.339283    2801.734430     3.712032e+03     3.729004e+03
1        16.0   44853.428522   44785.014766     5.925767e+04     5.947523e+04
2        64.0  174401.790158  174742.346558     2.395690e+05     2.406736e+05
3       128.0  327497.235581  327640.019990     4.620554e+05     4.628520e+05
4       256.0  556299.349494  557578.578117     8.289278e+05     8.303930e+05
5       512.0  695412.831832  694705.799962     9.925024e+05     9.995634e+05
6      1024.0  762886.145572  762344.556703     1.157397e+06     1.157576e+06
7      2048.0  809119.129505  807554.467032     1.211446e+06     1.210129e+06
mistralai/Mistral-Large-Instruct-2407, TP=4
mistralai/Mistral-Large-Instruct-2407 N=3584 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3161.210470    3159.396275     3.764272e+03     3.775890e+03
1        16.0   50405.703140   50449.007338     5.760887e+04     5.755241e+04
2        64.0  191538.931725  191486.882707     2.344997e+05     2.345778e+05
3       128.0  322282.983514  322209.318581     4.329780e+05     4.324466e+05
4       256.0  553660.781961  553878.374813     7.163118e+05     7.139532e+05
5       512.0  743422.626912  745191.524336     8.789171e+05     8.794655e+05
6      1024.0  796351.724416  797140.000726     1.018220e+06     1.017485e+06
7      2048.0  845230.999451  846309.588519     1.066271e+06     1.065842e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=3072: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2492.729382    2502.311715     3.576627e+03     3.583417e+03
1        16.0   39952.254901   39931.127159     5.753125e+04     5.755317e+04
2        64.0  154772.762705  154574.740772     2.316250e+05     2.316250e+05
3       128.0  248591.791257  249130.045224     4.154595e+05     4.131861e+05
4       256.0  334460.829128  333952.361262     7.553337e+05     7.550976e+05
5       512.0  507335.522470  510254.941626     9.161374e+05     9.213774e+05
6      1024.0  576341.651075  577133.185377     1.054353e+06     1.054928e+06
7      2048.0  665421.636904  666476.985076     1.208383e+06     1.199758e+06
mistralai/Mistral-Large-Instruct-2407 N=14336 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.420338e+03   3.426725e+03     3.953144e+03     3.951371e+03
1        16.0   5.465961e+04   5.469143e+04     6.297897e+04     6.304095e+04
2        64.0   2.125546e+05   2.125707e+05     2.487369e+05     2.485176e+05
3       128.0   3.893349e+05   3.895366e+05     4.634664e+05     4.630667e+05
4       256.0   6.723967e+05   6.738837e+05     7.870129e+05     7.845759e+05
5       512.0   8.360801e+05   8.380687e+05     9.672913e+05     9.766549e+05
6      1024.0   9.947831e+05   9.960024e+05     1.222427e+06     1.223604e+06
7      2048.0   1.088283e+06   1.083889e+06     1.314667e+06     1.283756e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=7168: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3172.231737    3173.603120     3.872276e+03     3.869554e+03
1        16.0   50638.993569   50744.743829     6.176093e+04     6.177176e+04
2        64.0  196306.767096  196375.151311     2.461377e+05     2.464390e+05
3       128.0  337173.299721  336509.164667     4.825833e+05     4.832451e+05
4       256.0  500802.847831  500536.079936     8.821057e+05     8.803838e+05
5       512.0  790928.133378  790955.830504     9.968239e+05     1.004595e+06
6      1024.0  846032.551308  847973.204421     1.173937e+06     1.172563e+06
7      2048.0  949400.092387  949580.003682     1.314839e+06     1.317952e+06
Qwen/Qwen2.5-7B-Instruct, TP=4
Qwen/Qwen2.5-7B-Instruct N=1152 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1468.472266    1458.101741      2034.159515      2036.165605
1        16.0   23395.716027   23198.561530     26770.464257     26857.520803
2        64.0   80084.247707   79890.572117    126327.917659    126569.926999
3       128.0   96876.103782   97018.360523    166631.791362    165587.724599
4       256.0  147394.317487  147312.157266    215034.992942    211931.055944
5       512.0  171275.448804  172000.003905    233564.309104    230911.332266
6      1024.0  297275.602799  297442.894786    436282.310953    437184.473132
7      2048.0  305082.840206  305214.979161    540860.583839    541553.321926
Qwen/Qwen2.5-7B-Instruct N=3584 K=896: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     810.968218     809.741946      1761.543825      1773.209768
1        16.0   12916.807918   12962.404663     27520.821957     27639.191988
2        64.0   45696.798412   45859.852167    109613.852750    112246.499047
3       128.0   84623.693303   84693.400333    209832.226083    209832.226083
4       256.0  105076.946325  104862.611302    363956.789176    363956.789176
5       512.0  132753.763138  134139.321446    458496.275222    467885.288222
6      1024.0  162911.929363  163170.470400    583363.348015    576010.020309
7      2048.0  195215.941116  195808.446114    692084.440182    693835.827128
Qwen/Qwen2.5-7B-Instruct N=9472 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2595.748023    2594.161368     3.340455e+03     3.340455e+03
1        16.0   41468.560299   41531.968367     5.330044e+04     5.332137e+04
2        64.0  158101.900566  158170.945911     2.164295e+05     2.170348e+05
3       128.0  249420.629331  248906.369374     3.839817e+05     3.852753e+05
4       256.0  283934.747596  284865.313504     6.626876e+05     6.647148e+05
5       512.0  440225.412224  441477.544979     7.491003e+05     7.542355e+05
6      1024.0  570928.143283  570815.667778     9.872570e+05     9.856896e+05
7      2048.0  670250.639076  670263.539605     1.168254e+06     1.166529e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=4736: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2256.195636    2260.401625      3318.142402      3284.755564
1        16.0   36060.787740   35984.346698     52353.481118     51913.200839
2        64.0  135263.877691  135533.867542    207494.167457    205765.036280
3       128.0  241270.157984  239832.100590    395356.417726    389963.895804
4       256.0  358679.240328  356911.780984    650952.313615    650562.518849
5       512.0  481044.668345  482005.082522    667346.060431    667858.879105
6      1024.0  543016.069435  544103.877525    871592.034962    868283.228335
7      2048.0  607883.281948  608372.639495    976958.974523    975916.809486
Qwen/Qwen2.5-32B-Instruct, TP=4
Qwen/Qwen2.5-32B-Instruct N=1792 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2071.317286    2091.143173      2832.079020      2885.514466
1        16.0   33351.880120   33186.024124     41426.347894     41995.129503
2        64.0  122652.440945  123063.684447    179480.410662    177742.101018
3       128.0  133044.833029  133225.929914    297497.433960    293629.938053
4       256.0  229667.549511  229488.050250    278718.514482    277533.029758
5       512.0  295997.939488  295774.300744    343426.848006    342575.399765
6      1024.0  506149.434209  508010.317446    759592.459233    757999.210805
7      2048.0  565692.853954  567058.447586    889619.256221    887267.081654
Qwen/Qwen2.5-32B-Instruct N=5120 K=1280: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1106.711693    1102.246111      2453.652612      2449.985156
1        16.0   17588.625738   17506.435273     39435.548898     39435.548898
2        64.0   65479.751342   65479.751342    141182.447358    141372.721646
3       128.0  124730.748968  124140.311630    237327.064428    239768.129742
4       256.0  145414.744235  147562.589426    390683.639396    389234.009472
5       512.0  190033.618972  190898.200720    468820.377439    471719.201903
6      1024.0  198424.898072  198577.489452    673505.971850    669075.892443
7      2048.0  257330.980374  254859.450041    875522.650256    871431.444680
Qwen/Qwen2.5-32B-Instruct N=13824 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2964.725725    2960.262352     3.814712e+03     3.804870e+03
1        16.0   47245.646096   47285.098506     6.094343e+04     6.081254e+04
2        64.0  183353.199636  183086.430475     2.363465e+05     2.362479e+05
3       128.0  313189.805568  313905.951300     4.453687e+05     4.436242e+05
4       256.0  492315.888856  490742.623327     7.489321e+05     7.467102e+05
5       512.0  605937.380788  606282.027830     8.382442e+05     8.421398e+05
6      1024.0  734631.793878  735272.680063     1.110774e+06     1.103670e+06
7      2048.0  821597.864509  825509.076034     1.231057e+06     1.224527e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=6912: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2764.568203    2771.929745     3.638158e+03     3.654688e+03
1        16.0   44323.104348   44323.104348     5.726861e+04     5.741959e+04
2        64.0  169441.049758  169289.083538     2.105100e+05     2.101975e+05
3       128.0  306299.924227  306424.244857     3.678285e+05     3.674705e+05
4       256.0  457039.546283  456394.937201     5.650007e+05     5.669807e+05
5       512.0  508894.166029  508408.759050     6.485578e+05     6.486506e+05
6      1024.0  519694.361389  518386.074312     9.445738e+05     9.446723e+05
7      2048.0  681421.590823  678652.645210     1.173013e+06     1.166819e+06
Qwen/Qwen2.5-72B-Instruct, TP=4
Qwen/Qwen2.5-72B-Instruct N=2560 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2694.347333    2702.680430      3509.504762      3509.504762
1        16.0   42944.052279   42922.075110     52662.397241     52695.476888
2        64.0  160788.112371  161019.574944    218680.215456    218894.189894
3       128.0  219037.072305  220114.665343    392244.070116    392014.953844
4       256.0  386818.200723  385595.867317    479036.131524    477672.317195
5       512.0  492391.483007  491760.112417    595500.993585    598154.699641
6      1024.0  501544.783328  501872.949663    687985.254308    688338.069955
7      2048.0  699298.230388  700370.034491    960987.446441    960901.431362
Qwen/Qwen2.5-72B-Instruct N=8192 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1505.860718    1503.162989     3.154382e+03     3.185519e+03
1        16.0   24059.228184   24050.607829     5.062236e+04     5.092963e+04
2        64.0   93245.702080   93294.297897     1.924738e+05     1.938635e+05
3       128.0  180020.785320  180504.864413     3.688200e+05     3.678096e+05
4       256.0  267032.321452  267664.544586     6.729348e+05     6.637849e+05
5       512.0  324007.528852  324374.493093     7.160026e+05     7.151683e+05
6      1024.0  376628.756630  376315.325317     9.701932e+05     9.699742e+05
7      2048.0  416290.661816  412186.705077     1.074205e+06     1.072195e+06
Qwen/Qwen2.5-72B-Instruct N=14784 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3317.571913    3315.936960     3.939050e+03     3.933422e+03
1        16.0   53063.710048   52965.092696     6.299203e+04     6.285310e+04
2        64.0  206652.808452  206366.736061     2.510541e+05     2.506320e+05
3       128.0  366850.313924  367302.250432     4.724824e+05     4.722522e+05
4       256.0  619925.363536  618836.574356     8.053137e+05     8.014420e+05
5       512.0  773279.519678  771989.506205     9.569220e+05     9.608959e+05
6      1024.0  914825.693034  915284.739656     1.222690e+06     1.202210e+06
7      2048.0  958386.147660  960300.654482     1.238860e+06     1.238093e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=7392: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2747.203863    2742.973094     3.684556e+03     3.671154e+03
1        16.0   43748.860243   43713.336269     5.899597e+04     5.876696e+04
2        64.0  170110.559837  170349.817017     2.361564e+05     2.353534e+05
3       128.0  325314.677490  324987.341765     4.586745e+05     4.580240e+05
4       256.0  558472.466055  558191.002307     8.260441e+05     8.230621e+05
5       512.0  702137.491666  702328.387656     9.334776e+05     9.355054e+05
6      1024.0  769580.387620  769236.711147     1.158771e+06     1.157646e+06
7      2048.0  812536.486449  810359.783567     1.197620e+06     1.185441e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, TP=4
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=768 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     853.171387     851.324682      1167.097906      1202.789039
1        16.0   13389.344708   13389.344708     16962.242500     17383.956205
2        64.0   44316.843692   44473.441505     66592.508465     68588.468562
3       128.0   62694.814240   62229.835124     96259.911405     95893.214848
4       256.0   97850.212369   98232.069762    137176.937124    136248.815417
5       512.0   85546.195537   86353.232700    140136.217218    140233.805856
6      1024.0  154015.871961  154607.094441    296795.502092    298113.601251
7      2048.0  240520.454632  245430.517027    547961.201664    554754.116929
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=1024 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     987.600703     985.744405      1421.181623      1468.952333
1        16.0   15654.208756   15596.014230     21082.050052     20615.861694
2        64.0   53274.004311   53189.580680     85837.911139     89025.530358
3       128.0   74500.830837   74625.068858    122045.906116    122045.906116
4       256.0  115335.476107  116033.276026    177815.229697    176645.383789
5       512.0  110312.654985  111968.715989    181480.903572    183152.110768
6      1024.0  186329.622841  186361.962669    396311.439978    393985.309662
7      2048.0  288904.917304  292087.014093    748956.754441    757407.609430
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=512: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     457.142853     453.979257       937.142832       975.464644
1        16.0    7327.050642    7339.860269     15378.753881     14679.720539
2        64.0   25253.533851   25444.847725     53313.013833     55607.946951
3       128.0   46909.498487   47106.871325    101471.906890     96514.942142
4       256.0   59499.026634   59631.071644    165454.184552    158804.731734
5       512.0   59578.183371   60205.598604    232035.930014    230839.859428
6      1024.0   90562.048975   90776.217617    481536.917797    476414.195203
7      2048.0  113662.263848  112637.859027    528152.557745    526342.030739
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=576 K=512: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     228.482982     232.807572       318.103440       338.532108
1        16.0    3678.504678    3678.504678      5111.688352      5391.780967
2        64.0   12662.734991   12628.877152     20013.558717     19357.377423
3       128.0   21038.752139   20899.114698     35512.783532     33979.857062
4       256.0   34921.997162   35051.578850     54761.737915     54920.930226
5       512.0   22471.364878   22817.390798     76334.547663     75270.122405
6      1024.0   39014.559310   39421.595317    152054.728093    151445.285684
7      2048.0   63081.133026   63160.215699    270864.516261    268458.979535
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=5472 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1481.939743    1485.081042     2.990766e+03     2.987578e+03
1        16.0   23586.305663   23673.478523     4.729701e+04     4.724717e+04
2        64.0   90489.543797   90535.222866     1.727013e+05     1.722865e+05
3       128.0  169839.271401  170323.147660     2.867311e+05     2.858172e+05
4       256.0  214533.812772  216802.984895     4.792258e+05     4.814772e+05
5       512.0  280893.155442  281361.340925     5.339792e+05     5.333837e+05
6      1024.0  294062.021784  292861.575837     8.147656e+05     8.138413e+05
7      2048.0  365764.389214  363218.080740     1.048353e+06     1.045584e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=2736: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1525.408804    1529.572069      2390.935235      2395.022335
1        16.0   24287.549689   24300.713773     35164.560300     35027.200885
2        64.0   89134.825120   88913.864525    143013.766858    143701.338304
3       128.0  123895.867802  124454.726118    229334.092098    229187.551630
4       256.0  195999.199411  196428.542197    251175.431532    250124.500674
5       512.0  223336.567138  224174.085663    288558.743242    286484.450193
6      1024.0  359848.046912  360843.588880    743375.176918    734244.692158
7      2048.0  430392.728430  431168.769232    888367.894650    884261.386247
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=704 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     806.568235     806.568235      1082.690679      1119.676994
1        16.0   12636.530576   12678.188976     15718.190712     16272.429109
2        64.0   41351.799303   41650.366165     60721.850678     62701.911652
3       128.0   58121.673107   57757.959458     90134.000228     89783.281612
4       256.0   90134.000228   89739.633599    124222.361481    123062.953597
5       512.0   78550.823261   79549.418135    127438.340218    128190.575834
6      1024.0  141019.433349  141832.062911    266562.365863    261836.068889
7      2048.0  222771.966044  225941.779219    509929.382023    508524.607185
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=352: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     326.365269     326.660624       658.686153       675.955067
1        16.0    5240.798504    5231.304232     10313.142582     10774.925236
2        64.0   18218.801367   18363.624320     39693.195341     38121.187114
3       128.0   33049.269174   33096.618057     71521.489244     72875.205467
4       256.0   42194.412656   42271.617186    118165.930882    123207.678929
5       512.0   41624.217950   42127.087983    181543.741418    181187.770576
6      1024.0   63805.120484   63772.092244    385827.800391    393216.000803
7      2048.0   79523.029227   80022.307618    423879.631003    427804.443723
meta-llama/Llama-3.1-8B-Instruct, TP=8
meta-llama/Llama-3.1-8B-Instruct N=768 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1232.802456    1230.873285      1635.193363      1628.422324
1        16.0   19301.300110   19301.300110     21622.762595     21772.402236
2        64.0   64535.630594   64452.997916    101079.901885     99678.795326
3       128.0   80961.464290   80799.020315    114926.468848    113245.874075
4       256.0  126001.986866  125883.822174    155004.748754    155183.946362
5       512.0  126258.768334  126596.144317    152135.380174    153293.619753
6      1024.0  232977.919835  233382.984505    366926.965988    363285.842981
7      2048.0  400848.413928  401597.937563    585110.548257    584898.088199
meta-llama/Llama-3.1-8B-Instruct N=4096 K=512: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     530.637027     530.637027      1465.921828      1403.208572
1        16.0    8455.991993    8455.991993     23619.690557     22391.466472
2        64.0   29423.741703   29488.321584     90288.169732     87466.663573
3       128.0   56472.804789   56591.745894    168356.887427    161089.687119
4       256.0   66443.520291   66773.758997    298552.894017    288922.156706
5       512.0   88431.003013   88068.700266    405887.627562    405275.424352
6      1024.0  112921.870817  112791.518867    568973.228919    558042.783786
7      2048.0  138718.433103  137899.714045    620011.752078    611197.269593
meta-llama/Llama-3.1-8B-Instruct N=3584 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2147.726188    2145.215676      3194.485590      3184.784259
1        16.0   34363.619000   34323.450817     50108.725251     50151.516975
2        64.0  127391.378389  126943.904591    202508.361196    202683.089330
3       128.0  222505.026414  221456.208601    374955.621637    381037.637965
4       256.0  325585.154161  325923.955021    599450.600961    599641.831201
5       512.0  437855.901848  438366.608221    615248.816994    615148.144284
6      1024.0  504775.072941  505862.048360    826328.436783    824063.824761
7      2048.0  567972.065999  568186.730720    949609.716846    948890.435894
meta-llama/Llama-3.1-8B-Instruct N=4096 K=1792: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1348.655398    1352.132621      2652.485655      2652.485655
1        16.0   21626.156438   21610.243030     41894.894047     41017.207823
2        64.0   78894.074140   78894.074140    164528.397961    160482.626048
3       128.0  149457.101214  149933.983007    310775.875483    305125.408864
4       256.0  192737.133721  192618.631780    559396.575869    568877.867084
5       512.0  256212.170799  259466.099553    626942.119146    633706.163180
6      1024.0  321596.801620  323729.329846    851641.343343    851255.633285
7      2048.0  380249.326711  381078.112331    965616.462575    965864.567354
meta-llama/Llama-3.3-70B-Instruct, TP=8
meta-llama/Llama-3.3-70B-Instruct N=1280 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2191.973196    2186.488651      2868.271414      2899.999952
1        16.0   34838.539044   34896.507107     33132.386055     33158.577865
2        64.0  119844.575702  119418.079299    180605.387472    178872.497939
3       128.0  119716.307248  119716.307248    217616.604130    218466.658939
4       256.0  211446.001803  211379.404129    276982.907308    276355.612535
5       512.0  266057.324185  266110.059592    307047.733437    308352.686862
6      1024.0  492933.967653  494204.425716    626345.864513    628840.082032
7      2048.0  501661.923840  502601.152944    690018.877416    691084.688950
meta-llama/Llama-3.3-70B-Instruct N=8192 K=1024: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     922.680704     923.086638      2629.293158      2666.043211
1        16.0   14685.396682   14743.440666     42068.690529     42874.603997
2        64.0   56899.687117   57044.717161    155600.538770    158727.266012
3       128.0  110521.199954  110657.815575    283896.968392    287852.655027
4       256.0  153444.656756  153906.318365    505299.232101    504824.323171
5       512.0  180336.766771  180306.491490    618816.903538    623485.858341
6      1024.0  215079.054423  215175.988640    762161.117356    758528.641458
7      2048.0  246313.617635  246292.448960    876773.024793    875522.482434
meta-llama/Llama-3.3-70B-Instruct N=7168 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2939.719756    2918.679853     3.754721e+03     3.767246e+03
1        16.0   46745.346708   46647.871718     5.952746e+04     5.975461e+04
2        64.0  181071.769213  180897.467405     2.396280e+05     2.399953e+05
3       128.0  332742.428478  332978.271455     4.559968e+05     4.535754e+05
4       256.0  555965.366748  556541.652513     7.231721e+05     7.224771e+05
5       512.0  679072.335172  680209.179362     9.076985e+05     9.086861e+05
6      1024.0  730232.849514  728816.766846     1.033288e+06     1.032011e+06
7      2048.0  764314.555627  761777.752578     1.205703e+06     1.204013e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2075.503579    2076.677847     3.489095e+03     3.490754e+03
1        16.0   33292.770141   33217.448748     5.529985e+04     5.537807e+04
2        64.0  129073.511737  129038.061450     2.213036e+05     2.207836e+05
3       128.0  246466.941594  246499.262087     3.926683e+05     3.903844e+05
4       256.0  392504.236086  392258.477308     7.749733e+05     7.676921e+05
5       512.0  476770.536596  477436.760689     8.692463e+05     8.719686e+05
6      1024.0  541548.985995  541392.975014     1.054164e+06     1.052982e+06
7      2048.0  589010.096752  585489.120310     1.127403e+06     1.125841e+06
mistralai/Mistral-Large-Instruct-2407, TP=8
mistralai/Mistral-Large-Instruct-2407 N=1792 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2848.033035    2835.564256     3.363010e+03     3.363010e+03
1        16.0   44986.704617   45032.703643     4.915400e+04     4.885411e+04
2        64.0  169107.696771  169432.984569     2.091635e+05     2.087298e+05
3       128.0  183579.977213  183843.393903     3.566153e+05     3.539285e+05
4       256.0  347642.683989  346958.005871     3.897521e+05     3.875016e+05
5       512.0  408032.279049  407619.241771     4.661298e+05     4.673664e+05
6      1024.0  744011.325555  745980.405844     9.436515e+05     9.427047e+05
7      2048.0  797083.632279  797309.101400     1.023024e+06     1.022931e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=1536: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1803.640876    1805.365461     3.253017e+03     3.208789e+03
1        16.0   28814.212094   28836.216242     5.197663e+04     5.123612e+04
2        64.0  112133.700546  111884.512653     1.987422e+05     2.009233e+05
3       128.0  164760.400015  166142.281348     3.797463e+05     3.792695e+05
4       256.0  208516.435336  210056.973003     5.563318e+05     5.571013e+05
5       512.0  312235.865546  313613.487511     7.341147e+05     7.374750e+05
6      1024.0  367336.300707  366973.724315     9.049637e+05     9.048790e+05
7      2048.0  435030.942710  435177.803903     1.062989e+06     1.059959e+06
mistralai/Mistral-Large-Instruct-2407 N=7168 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    3282.795539    3282.061449     3.830404e+03     3.831070e+03
1        16.0   52267.598071   52333.585402     5.953128e+04     5.956650e+04
2        64.0  203530.007558  203809.620885     2.424260e+05     2.422385e+05
3       128.0  366825.486205  366110.781767     4.575048e+05     4.566894e+05
4       256.0  641594.977821  642545.607555     7.741519e+05     7.721901e+05
5       512.0  779827.596442  779611.954904     9.184382e+05     9.228737e+05
6      1024.0  839144.687339  838660.969512     1.068570e+06     1.066826e+06
7      2048.0  861934.384223  859028.438553     1.227250e+06     1.232482e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2628.063149    2626.809292     3.672977e+03     3.671752e+03
1        16.0   41968.878257   42059.046080     5.845566e+04     5.845566e+04
2        64.0  163285.775787  163210.144928     2.351097e+05     2.339779e+05
3       128.0  266417.688639  266442.875950     4.342214e+05     4.336870e+05
4       256.0  363409.236309  366957.246759     8.038111e+05     8.042698e+05
5       512.0  559762.790531  561211.546705     9.535613e+05     9.571227e+05
6      1024.0  628777.231473  628181.719463     1.084948e+06     1.086935e+06
7      2048.0  721909.299168  722163.565284     1.231796e+06     1.229882e+06
Qwen/Qwen2.5-7B-Instruct, TP=8
Qwen/Qwen2.5-7B-Instruct N=576 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1002.267919    1004.217872      1347.697107      1347.697107
1        16.0   15582.429620   15700.928088     18812.501318     18769.745308
2        64.0   47058.051934   47192.504540     69840.914385     72444.630172
3       128.0   60200.005016   60172.591325     87858.380438     89525.076403
4       256.0  101802.010162  102116.701935    124659.436958    124307.633885
5       512.0   89661.753415   90258.886988    112939.328388    114903.482668
6      1024.0  167000.332109  167822.197941    232026.341090    231315.556505
7      2048.0  293397.745057  294337.184590    439182.423258    437999.593966
Qwen/Qwen2.5-7B-Instruct N=3584 K=448: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     452.540536     452.031487      1128.808976      1196.000009
1        16.0    7232.503797    7248.811709     17810.791599     18855.413141
2        64.0   25744.527501   25744.527501     68039.111651     71341.978749
3       128.0   47804.429267   47804.429267    128273.237704    132571.056387
4       256.0   56774.359369   56743.043984    237312.895094    225603.364041
5       512.0   73613.690523   74170.972177    302573.930580    303466.485868
6      1024.0   89524.761996   88973.089448    421187.869730    423354.465253
7      2048.0  109064.546526  108589.664357    509440.487633    512135.084880
Qwen/Qwen2.5-7B-Instruct N=4736 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2044.339138    2044.339138     3.252144e+03     3.302761e+03
1        16.0   32552.620483   32630.834967     5.128759e+04     5.183570e+04
2        64.0  123127.410627  123015.881328     1.808383e+05     1.809587e+05
3       128.0  230772.363974  231756.895241     3.127451e+05     3.086580e+05
4       256.0  291632.353483  292890.209567     4.909518e+05     4.880846e+05
5       512.0  357070.500822  357158.529790     5.225956e+05     5.218426e+05
6      1024.0  363157.457168  361285.666879     7.950796e+05     7.918924e+05
7      2048.0  471356.288936  468233.048363     1.008155e+06     1.006521e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=2368: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1546.775544    1547.903763      2766.852602      2699.969501
1        16.0   24784.538050   24640.649942     43420.479855     42496.642281
2        64.0   91893.956359   91893.956359    172141.016296    167885.371644
3       128.0  166956.680922  167059.361087    322228.378159    316226.450724
4       256.0  229521.349417  227980.301940    584168.889892    570669.192632
5       512.0  294459.114930  298832.265715    412198.062239    412511.047202
6      1024.0  350218.892803  350360.034527    788214.796421    792815.866525
7      2048.0  406416.352434  408018.814637    888796.822909    890618.141778
Qwen/Qwen2.5-32B-Instruct, TP=8
Qwen/Qwen2.5-32B-Instruct N=896 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1466.741655    1481.901781      2033.673805      2037.286042
1        16.0   23026.187768   23259.659963     24534.588521     25139.549691
2        64.0   76307.160799   76386.565318    127002.579361    126783.229535
3       128.0   91473.505430   91416.547645    135188.743535    134569.175397
4       256.0  144183.639655  144716.579591    178390.011560    178173.517176
5       512.0  160760.987269  162405.944208    204548.905986    205695.231623
6      1024.0  293409.896045  293190.183582    397334.180734    398547.598527
7      2048.0  508065.268511  509332.072305    745963.676921    745253.680710
Qwen/Qwen2.5-32B-Instruct N=5120 K=640: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     655.872015     653.779903      1755.546019      1763.096812
1        16.0   10468.826858   10477.188605     27673.924028     27385.051657
2        64.0   38021.564374   37802.421427    100903.384341    100324.584257
3       128.0   72123.380470   72172.984273    173167.520187    171750.443067
4       256.0   80259.674551   80784.850138    289488.334716    285938.746164
5       512.0  110143.817728  110608.188965    388305.359193    386873.807793
6      1024.0  111637.787462  111608.104229    545849.275407    544080.469721
7      2048.0  144656.876653  143329.412243    686720.787808    683645.060628
Qwen/Qwen2.5-32B-Instruct N=6912 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2453.750427    2454.431084     3.575040e+03     3.576485e+03
1        16.0   39211.072905   39178.521366     5.687890e+04     5.690176e+04
2        64.0  151494.469574  151494.469574     2.299173e+05     2.298240e+05
3       128.0  283072.402392  283568.510536     4.331062e+05     4.324447e+05
4       256.0  451314.093290  451809.153337     7.021530e+05     7.001995e+05
5       512.0  540156.256821  540285.126504     7.963246e+05     8.014667e+05
6      1024.0  585157.685063  584176.738070     9.480570e+05     9.483548e+05
7      2048.0  624221.917267  620991.838802     1.128974e+06     1.129854e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=3456: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2033.235258    2035.105774     3.248399e+03     3.301731e+03
1        16.0   32397.765008   32486.974767     5.155799e+04     5.243638e+04
2        64.0  123352.848430  123325.991926     1.856764e+05     1.875209e+05
3       128.0  227161.245408  226797.347959     3.206755e+05     3.165528e+05
4       256.0  301711.757086  302922.145021     5.122686e+05     5.108823e+05
5       512.0  371322.324247  371170.209536     5.583564e+05     5.575318e+05
6      1024.0  382724.698926  381516.104156     8.493633e+05     8.472982e+05
7      2048.0  498926.657055  496112.991196     1.078435e+06     1.075235e+06
Qwen/Qwen2.5-72B-Instruct, TP=8
Qwen/Qwen2.5-72B-Instruct N=1280 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2181.031694    2179.218686      2906.430156      2852.665902
1        16.0   34637.158676   34580.048336     33054.058305     33556.480750
2        64.0  119121.333501  119588.313070    178872.497939    181582.685424
3       128.0  120554.979371  120446.804315    217194.044401    218182.581420
4       256.0  211379.404129  211312.848394    276185.008120    275787.819495
5       512.0  266321.259652  266215.617734    307399.314511    308069.597541
6      1024.0  494477.498006  493840.769944    625616.056205    627663.846651
7      2048.0  501872.949663  502201.545725    688691.206517    689885.851120
Qwen/Qwen2.5-72B-Instruct N=8192 K=1024: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     928.807449     928.807449      2626.002595      2669.435150
1        16.0   14801.946828   14841.209585     41858.873078     42819.919767
2        64.0   57288.083856   57202.666824    155061.502786    158352.906400
3       128.0  110248.988731  110294.265870    285102.467309    284498.432077
4       256.0  153620.198946  153422.747288    510097.876016    508167.500416
5       512.0  179763.409551  180276.240461    620246.011144    619530.612394
6      1024.0  214810.254820  215046.755740    758662.532318    752814.353502
7      2048.0  247135.266134  245968.221493    877668.387199    870379.654619
Qwen/Qwen2.5-72B-Instruct N=7392 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2918.222865    2924.988503     3.762829e+03     3.747460e+03
1        16.0   46754.648758   46610.704166     5.970812e+04     5.944438e+04
2        64.0  181246.413157  181127.832698     2.398078e+05     2.393044e+05
3       128.0  331489.357288  331234.378727     4.534129e+05     4.530948e+05
4       256.0  556864.007082  556783.990146     7.389463e+05     7.383832e+05
5       512.0  691454.145522  691546.698882     9.265535e+05     9.278289e+05
6      1024.0  743571.494302  742947.917947     1.056933e+06     1.056285e+06
7      2048.0  773240.941884  772066.416077     1.245679e+06     1.235651e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=3696: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    2066.730030    2066.730030     3.455241e+03     3.456029e+03
1        16.0   33031.609646   33022.604231     5.541030e+04     5.541030e+04
2        64.0  128312.402900  128346.393199     2.183451e+05     2.178542e+05
3       128.0  247008.752583  247008.752583     3.874511e+05     3.842249e+05
4       256.0  393428.882349  394229.153999     6.642778e+05     6.545190e+05
5       512.0  482757.657387  483118.718464     7.941121e+05     7.970514e+05
6      1024.0  545866.466857  545386.415748     1.031553e+06     1.025549e+06
7      2048.0  590120.814315  587883.265142     1.098696e+06     1.089393e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, TP=8
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=384 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     508.155029     505.542425       678.124147       627.291865
1        16.0    7996.177919    7945.697200      9771.726491     10384.474974
2        64.0   23927.726741   23882.322739     36270.847912     38666.616954
3       128.0   33121.009460   33296.254233     61395.042730     61097.010605
4       256.0   61999.920193   61809.618198     72962.224468     74805.253833
5       512.0   44990.111704   45090.851637     75876.317859     75676.716856
6      1024.0   85256.451778   86205.366456    155984.303624    152557.377263
7      2048.0  148507.189752  147961.604446    268859.471688    269579.315128
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=512 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     653.885291     652.258698       819.400021       882.855244
1        16.0   10182.835101   10158.179138     12339.200048     12412.212556
2        64.0   31308.417511   31425.678397     48222.160726     45976.195443
3       128.0   43815.435349   43987.712293     73764.008590     73926.485620
4       256.0   73441.192403   73200.924949     91826.607382     94277.032621
5       512.0   60175.032059   60040.471483    103748.454693    106210.838615
6      1024.0  109525.188789  110722.055291    197572.475748    200075.258705
7      2048.0  183967.784916  184695.428637    363084.513223    364811.122340
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=256: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     264.241448     265.577361       544.929457       602.422015
1        16.0    4194.108009    4189.926414      8718.871313      9507.909727
2        64.0   14642.843074   14694.042217     32203.033456     34588.444127
3       128.0   27069.216791   26895.975001     63195.431378     58571.372297
4       256.0   32672.465346   32640.744265     98882.259205     98017.401581
5       512.0   31612.571334   31479.371282    161634.463360    160861.086853
6      1024.0   47654.099495   47637.218708    304943.012767    303566.304418
7      2048.0   59968.728763   58975.934405    381503.175883    366930.076930
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=576 K=256: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     121.900987     122.710966       192.877282       175.052128
1        16.0    1944.000086    1956.874170      3143.489271      2800.834044
2        64.0    6952.658850    6993.798557     11999.512617     10696.398443
3       128.0   11790.044961   11849.142519     21588.164965     21106.285425
4       256.0   19990.731806   20033.084140     34013.008143     35952.913307
5       512.0   12037.703591   12188.998594     55621.270803     54815.164001
6      1024.0   20907.939864   20942.670925    104194.115933    104194.115933
7      2048.0   33338.442665   33397.318949    185177.303953    184050.925824
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2736 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0    1373.700045    1368.333988      2391.081992      2382.948999
1        16.0   21936.188249   21786.961348     37178.746692     37240.505879
2        64.0   81153.966872   81007.346129    149458.552901    149708.068236
3       128.0  138494.422226  138708.645710    277202.902383    276348.649638
4       256.0  189688.287109  190191.164391    411118.098249    422746.663600
5       512.0  217988.789680  219388.712831    406920.631373    409241.930512
6      1024.0  241386.634242  242939.760259    591183.405320    595353.610546
7      2048.0  349482.936309  352012.324153    851009.577977    855321.730962
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=352 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     467.015544     465.808777       583.391594       622.687408
1        16.0    7395.610236    7376.695488      9244.513140      9156.469491
2        64.0   21871.378186   21768.210526     35718.738428     33833.289697
3       128.0   30521.566380   30602.526061     52204.308925     51969.152592
4       256.0   57830.334190   57902.894758     69188.316306     70888.797749
5       512.0   41231.726068   41333.280667     70402.149542     70348.488958
6      1024.0   78118.674107   79123.200387    142433.978267    143765.134167
7      2048.0  135956.123830  135831.077077    253912.559112    253912.559112
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=176: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     191.457621     189.848745       396.350869       358.603177
1        16.0    3018.555265    3012.266743      6232.275707      5760.510181
2        64.0   10218.289709   10218.289709     23227.116068     21341.520519
3       128.0   18777.766974   18869.663607     38815.785363     41018.100221
4       256.0   22893.822610   22848.599901     69891.869526     70316.741163
5       512.0   22006.382778   21871.147488    129603.401908    128523.375829
6      1024.0   33238.804464   33167.322622    246108.588627    237883.890897
7      2048.0   41412.770677   40801.071587    271369.003783    275817.686396
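For reference, the operation timed in all of these tables is a per-tensor-scaled fp8 GEMM with fp16/bf16 output. Below is a minimal standalone sketch of that operation using PyTorch's built-in `torch._scaled_mm` (PyTorch >= 2.4 signature) purely as a stand-in; this snippet is not the benchmark script and does not call this PR's CUTLASS kernel:

```python
# Minimal sketch of a per-tensor-scaled fp8 GEMM (w8a8 fp8), using PyTorch's
# torch._scaled_mm as a stand-in for the vLLM/sgl-kernel kernels compared above.
import torch

M, N, K = 16, 6144, 4096
a = torch.randn(M, K, device="cuda").to(torch.float8_e4m3fn)      # fp8 activations, row-major
b = torch.randn(N, K, device="cuda").to(torch.float8_e4m3fn).t()  # fp8 weights, column-major
scale_a = torch.ones((), device="cuda")  # per-tensor activation scale (fp32)
scale_b = torch.ones((), device="cuda")  # per-tensor weight scale (fp32)

out = torch._scaled_mm(a, b, scale_a=scale_a, scale_b=scale_b,
                       out_dtype=torch.bfloat16)
print(out.shape)  # torch.Size([16, 6144])
```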

GPU: sm90-H100

meta-llama/Llama-3.1-8B-Instruct, TP=1
meta-llama/Llama-3.1-8B-Instruct N=6144 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.000354e+04   9.971829e+03     1.023126e+04     1.023126e+04
1        16.0   1.582949e+05   1.590452e+05     1.554835e+05     1.553636e+05
2        64.0   6.402263e+05   6.311949e+05     6.341769e+05     6.341769e+05
3       128.0   1.230565e+06   1.229625e+06     1.150578e+06     1.147300e+06
4       256.0   2.156371e+06   2.156371e+06     2.021091e+06     2.023630e+06
5       512.0   3.079941e+06   3.077000e+06     3.177139e+06     3.174008e+06
6      1024.0   4.794075e+06   4.801220e+06     4.975473e+06     4.960152e+06
7      2048.0   5.238404e+06   5.242667e+06     5.212975e+06     5.226719e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.225130e+03   8.201009e+03     8.423325e+03     8.423325e+03
1        16.0   1.326424e+05   1.331688e+05     1.310880e+05     1.305779e+05
2        64.0   5.264083e+05   5.284807e+05     5.253781e+05     5.264083e+05
3       128.0   9.780263e+05   9.798111e+05     1.028614e+06     1.024688e+06
4       256.0   1.639501e+06   1.629549e+06     1.769148e+06     1.760447e+06
5       512.0   3.020740e+06   3.016497e+06     3.163101e+06     3.163101e+06
6      1024.0   4.396614e+06   4.401119e+06     4.432912e+06     4.423781e+06
7      2048.0   5.053519e+06   5.038700e+06     5.023967e+06     5.035746e+06
meta-llama/Llama-3.1-8B-Instruct N=28672 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.697324e+04   1.701258e+04     1.564396e+04     1.563563e+04
1        16.0   2.642404e+05   2.646870e+05     2.539564e+05     2.543689e+05
2        64.0   1.054589e+06   1.055181e+06     1.068378e+06     1.065954e+06
3       128.0   2.054978e+06   2.058354e+06     2.084612e+06     2.084612e+06
4       256.0   3.650855e+06   3.654404e+06     2.853876e+06     2.868031e+06
5       512.0   5.220215e+06   5.200353e+06     5.463016e+06     5.468978e+06
6      1024.0   5.299338e+06   5.304479e+06     5.150028e+06     5.097642e+06
7      2048.0   5.222029e+06   5.222936e+06     5.216141e+06     5.220215e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=14336: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.322574e+04   1.321384e+04     1.335812e+04     1.330968e+04
1        16.0   2.119939e+05   2.118027e+05     2.106630e+05     2.108521e+05
2        64.0   8.426519e+05   8.411431e+05     8.449253e+05     8.449253e+05
3       128.0   1.525255e+06   1.520319e+06     1.661462e+06     1.662933e+06
4       256.0   2.423100e+06   2.426228e+06     2.629970e+06     2.628131e+06
5       512.0   4.677321e+06   4.677321e+06     4.709558e+06     4.706609e+06
6      1024.0   5.245258e+06   5.232478e+06     5.213425e+06     5.210714e+06
7      2048.0   5.427043e+06   5.441285e+06     5.428513e+06     5.426064e+06
meta-llama/Llama-3.3-70B-Instruct, TP=1
meta-llama/Llama-3.3-70B-Instruct N=10240 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.472809e+04   1.475918e+04     1.442421e+04     1.443414e+04
1        16.0   2.344967e+05   2.341694e+05     2.229666e+05     2.231149e+05
2        64.0   9.425977e+05   9.373319e+05     9.392996e+05     9.399574e+05
3       128.0   1.714252e+06   1.718092e+06     1.747733e+06     1.747733e+06
4       256.0   2.858912e+06   2.855871e+06     2.817653e+06     2.824322e+06
5       512.0   4.500450e+06   4.494798e+06     4.563567e+06     4.563567e+06
6      1024.0   5.421900e+06   5.412336e+06     5.358320e+06     5.367695e+06
7      2048.0   5.350977e+06   5.386543e+06     5.367024e+06     5.383168e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.270117e+04   1.274942e+04     1.249311e+04     1.249311e+04
1        16.0   2.046127e+05   2.046127e+05     1.943895e+05     1.939681e+05
2        64.0   8.134904e+05   8.110328e+05     7.977766e+05     8.001545e+05
3       128.0   1.767584e+06   1.769040e+06     1.748872e+06     1.746028e+06
4       256.0   3.324481e+06   3.319343e+06     2.776490e+06     2.763983e+06
5       512.0   4.925722e+06   4.922899e+06     4.951273e+06     4.948421e+06
6      1024.0   5.357318e+06   5.358989e+06     5.402804e+06     5.382493e+06
7      2048.0   5.389246e+06   5.387557e+06     5.373655e+06     5.375757e+06
meta-llama/Llama-3.3-70B-Instruct N=57344 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.264270e+04   2.265580e+04     2.194463e+04     2.193029e+04
1        16.0   3.566451e+05   3.566451e+05     3.300831e+05     3.303152e+05
2        64.0   1.336531e+06   1.336056e+06     1.354596e+06     1.353133e+06
3       128.0   2.695106e+06   2.694623e+06     2.618130e+06     2.622240e+06
4       256.0   4.892857e+06   4.891265e+06     2.377653e+06     2.364842e+06
5       512.0   5.579255e+06   5.581844e+06     6.094376e+06     6.137921e+06
6      1024.0   5.318581e+06   5.320582e+06     4.756057e+06     4.770395e+06
7      2048.0   5.298781e+06   5.283998e+06     5.286901e+06     5.287017e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=28672: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.723548e+04   1.726589e+04     1.740405e+04     1.741438e+04
1        16.0   2.775600e+05   2.776420e+05     2.606215e+05     2.608386e+05
2        64.0   1.084920e+06   1.083825e+06     1.064937e+06     1.065541e+06
3       128.0   2.556573e+06   2.553967e+06     2.449910e+06     2.453909e+06
4       256.0   4.697703e+06   4.699171e+06     2.691129e+06     2.684881e+06
5       512.0   5.496397e+06   5.506464e+06     5.962970e+06     5.887652e+06
6      1024.0   5.447352e+06   5.447105e+06     5.011718e+06     5.005251e+06
7      2048.0   5.397961e+06   5.399052e+06     5.409008e+06     5.399173e+06
mistralai/Mistral-Large-Instruct-2407, TP=1
mistralai/Mistral-Large-Instruct-2407 N=14336 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.889403e+04   1.893464e+04     1.878924e+04     1.878123e+04
1        16.0   3.013994e+05   3.015283e+05     2.870353e+05     2.868017e+05
2        64.0   1.152366e+06   1.150484e+06     1.164266e+06     1.165228e+06
3       128.0   2.307090e+06   2.309453e+06     2.119314e+06     2.121307e+06
4       256.0   4.113370e+06   4.111870e+06     2.703128e+06     2.695051e+06
5       512.0   4.978363e+06   4.946150e+06     5.137730e+06     5.151815e+06
6      1024.0   5.360631e+06   5.368928e+06     5.032246e+06     5.051976e+06
7      2048.0   5.291756e+06   5.267343e+06     5.273502e+06     5.263962e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.802783e+04   1.801922e+04     1.695118e+04     1.697404e+04
1        16.0   2.843711e+05   2.846392e+05     2.760532e+05     2.764322e+05
2        64.0   1.126874e+06   1.125824e+06     1.131094e+06     1.127663e+06
3       128.0   2.145664e+06   2.142809e+06     2.197378e+06     2.196379e+06
4       256.0   3.687169e+06   3.675949e+06     2.708919e+06     2.703993e+06
5       512.0   5.346650e+06   5.358508e+06     5.648200e+06     5.649851e+06
6      1024.0   5.378639e+06   5.374152e+06     5.066354e+06     5.069011e+06
7      2048.0   5.297195e+06   5.294293e+06     5.282355e+06     5.295018e+06
mistralai/Mistral-Large-Instruct-2407 N=57344 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.335361e+04   2.336135e+04     2.293704e+04     2.293853e+04
1        16.0   3.498432e+05   3.497564e+05     3.398056e+05     3.392738e+05
2        64.0   1.386296e+06   1.384169e+06     1.393837e+06     1.393493e+06
3       128.0   2.779428e+06   2.779770e+06     2.616558e+06     2.615041e+06
4       256.0   5.042374e+06   5.033650e+06     2.391314e+06     2.397544e+06
5       512.0   5.340161e+06   5.339686e+06     5.528023e+06     5.579487e+06
6      1024.0   5.195288e+06   5.200379e+06     4.766517e+06     4.773959e+06
7      2048.0   5.228035e+06   5.202479e+06     5.225309e+06     5.232357e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=28672: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.168437e+04   2.167904e+04     1.999136e+04     1.998682e+04
1        16.0   3.455040e+05   3.453346e+05     3.257769e+05     3.255511e+05
2        64.0   1.348945e+06   1.349591e+06     1.352830e+06     1.352181e+06
3       128.0   2.531885e+06   2.534732e+06     2.564423e+06     2.568220e+06
4       256.0   4.229783e+06   4.220283e+06     2.627167e+06     2.632842e+06
5       512.0   5.535188e+06   5.531454e+06     5.538587e+06     5.553250e+06
6      1024.0   5.463106e+06   5.459965e+06     5.003933e+06     5.003377e+06
7      2048.0   5.126368e+06   5.125130e+06     5.140685e+06     5.102154e+06
Qwen/Qwen2.5-7B-Instruct, TP=1
Qwen/Qwen2.5-7B-Instruct N=4608 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.112661e+03   8.096753e+03     8.308539e+03     8.325290e+03
1        16.0   1.303146e+05   1.303146e+05     1.285399e+05     1.287905e+05
2        64.0   5.192103e+05   5.212584e+05     5.171781e+05     5.192103e+05
3       128.0   9.842756e+05   9.842756e+05     8.913256e+05     8.913256e+05
4       256.0   1.662126e+06   1.662126e+06     1.547748e+06     1.547748e+06
5       512.0   2.320773e+06   2.313156e+06     2.386257e+06     2.383567e+06
6      1024.0   3.686529e+06   3.680112e+06     3.709165e+06     3.712421e+06
7      2048.0   4.646646e+06   4.636456e+06     4.641546e+06     4.653038e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   6.997194e+03   6.997194e+03     7.217330e+03     7.217330e+03
1        16.0   1.131881e+05   1.131881e+05     1.124451e+05     1.124451e+05
2        64.0   4.497802e+05   4.497802e+05     4.487982e+05     4.487982e+05
3       128.0   8.238459e+05   8.288289e+05     8.728219e+05     8.709727e+05
4       256.0   1.403069e+06   1.398296e+06     1.500362e+06     1.500362e+06
5       512.0   2.618466e+06   2.626832e+06     2.647981e+06     2.650115e+06
6      1024.0   3.754330e+06   3.754330e+06     3.720354e+06     3.741517e+06
7      2048.0   4.402668e+06   4.420421e+06     4.414487e+06     4.408570e+06
Qwen/Qwen2.5-7B-Instruct N=37888 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.726978e+04   1.725223e+04     1.582862e+04     1.585819e+04
1        16.0   2.767387e+05   2.768798e+05     2.605459e+05     2.601715e+05
2        64.0   1.081071e+06   1.079996e+06     1.070684e+06     1.070420e+06
3       128.0   2.151438e+06   2.149310e+06     2.165374e+06     2.171867e+06
4       256.0   3.844233e+06   3.844233e+06     2.785837e+06     2.797493e+06
5       512.0   5.398640e+06   5.410402e+06     5.613051e+06     5.609429e+06
6      1024.0   5.445144e+06   5.437905e+06     5.282571e+06     5.296655e+06
7      2048.0   5.360765e+06   5.374438e+06     5.358286e+06     5.360765e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=18944: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.345030e+04   1.346096e+04     1.362301e+04     1.363395e+04
1        16.0   2.155463e+05   2.155463e+05     2.167505e+05     2.167505e+05
2        64.0   8.567456e+05   8.547234e+05     8.621852e+05     8.642429e+05
3       128.0   1.524707e+06   1.522570e+06     1.709447e+06     1.708103e+06
4       256.0   2.374543e+06   2.374543e+06     2.473905e+06     2.468284e+06
5       512.0   4.535922e+06   4.530011e+06     4.615415e+06     4.622780e+06
6      1024.0   4.851145e+06   4.843035e+06     4.605632e+06     4.627088e+06
7      2048.0   4.847763e+06   4.869493e+06     4.852500e+06     4.857246e+06
Qwen/Qwen2.5-32B-Instruct, TP=1
Qwen/Qwen2.5-32B-Instruct N=7168 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.157117e+04   1.155659e+04     1.180944e+04     1.180944e+04
1        16.0   1.863134e+05   1.858417e+05     1.794804e+05     1.795902e+05
2        64.0   7.368380e+05   7.377637e+05     7.340749e+05     7.349936e+05
3       128.0   1.418502e+06   1.423660e+06     1.383416e+06     1.385047e+06
4       256.0   2.584202e+06   2.584202e+06     2.155082e+06     2.163020e+06
5       512.0   3.776591e+06   3.773557e+06     3.838300e+06     3.841438e+06
6      1024.0   4.684027e+06   4.686364e+06     4.806219e+06     4.811141e+06
7      2048.0   5.331154e+06   5.329642e+06     5.322095e+06     5.307065e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.003712e+04   1.006796e+04     1.025703e+04     1.030541e+04
1        16.0   1.628383e+05   1.628383e+05     1.560533e+05     1.560533e+05
2        64.0   6.443492e+05   6.443492e+05     6.394381e+05     6.413935e+05
3       128.0   1.222949e+06   1.219393e+06     1.092373e+06     1.092373e+06
4       256.0   2.048700e+06   2.053715e+06     1.915394e+06     1.917583e+06
5       512.0   2.787185e+06   2.789502e+06     2.863286e+06     2.868180e+06
6      1024.0   4.324447e+06   4.324447e+06     4.435917e+06     4.430061e+06
7      2048.0   5.303470e+06   5.309764e+06     5.300329e+06     5.297192e+06
Qwen/Qwen2.5-32B-Instruct N=55296 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.086230e+04   2.092706e+04     1.895709e+04     1.898251e+04
1        16.0   3.173361e+05   3.171584e+05     3.023419e+05     3.022612e+05
2        64.0   1.261919e+06   1.260866e+06     1.263679e+06     1.263327e+06
3       128.0   2.509161e+06   2.503615e+06     2.449468e+06     2.441218e+06
4       256.0   4.461143e+06   4.461143e+06     2.717020e+06     2.732383e+06
5       512.0   5.419818e+06   5.413342e+06     5.959438e+06     5.937958e+06
6      1024.0   5.354558e+06   5.353372e+06     5.085929e+06     5.079157e+06
7      2048.0   5.291526e+06   5.293652e+06     5.287377e+06     5.297811e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=27648: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.765972e+04   1.770389e+04     1.785574e+04     1.783774e+04
1        16.0   2.845434e+05   2.844004e+05     2.735465e+05     2.735465e+05
2        64.0   1.127971e+06   1.126288e+06     1.123495e+06     1.126848e+06
3       128.0   2.154033e+06   2.156083e+06     1.667868e+06     1.665722e+06
4       256.0   3.405963e+06   3.398297e+06     2.432830e+06     2.441353e+06
5       512.0   3.862656e+06   3.864304e+06     4.032429e+06     4.029291e+06
6      1024.0   4.850682e+06   4.848086e+06     4.597164e+06     4.598331e+06
7      2048.0   5.487499e+06   5.477753e+06     5.456720e+06     5.470930e+06
Qwen/Qwen2.5-72B-Instruct, TP=1
Qwen/Qwen2.5-72B-Instruct N=10240 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.472809e+04   1.471775e+04     1.443414e+04     1.444408e+04
1        16.0   2.340061e+05   2.340061e+05     2.231149e+05     2.228186e+05
2        64.0   9.425977e+05   9.412757e+05     9.379869e+05     9.399574e+05
3       128.0   1.710429e+06   1.713158e+06     1.748872e+06     1.747733e+06
4       256.0   2.852836e+06   2.854352e+06     2.816175e+06     2.808074e+06
5       512.0   4.559692e+06   4.550031e+06     4.594811e+06     4.608615e+06
6      1024.0   5.380470e+06   5.379796e+06     5.230430e+06     5.243200e+06
7      2048.0   5.452183e+06   5.445961e+06     5.432186e+06     5.444581e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.269156e+04   1.273008e+04     1.249311e+04     1.249311e+04
1        16.0   2.044568e+05   2.044568e+05     1.938280e+05     1.938280e+05
2        64.0   8.141072e+05   8.141072e+05     7.960025e+05     7.954128e+05
3       128.0   1.767584e+06   1.770498e+06     1.744610e+06     1.746028e+06
4       256.0   3.324481e+06   3.321910e+06     2.763983e+06     2.769329e+06
5       512.0   4.911640e+06   4.920079e+06     4.939884e+06     4.942727e+06
6      1024.0   5.340664e+06   5.332377e+06     5.250892e+06     5.275075e+06
7      2048.0   5.450799e+06   5.444753e+06     5.438720e+06     5.446479e+06
Qwen/Qwen2.5-72B-Instruct N=59136 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.215842e+04   2.217871e+04     2.200144e+04     2.200544e+04
1        16.0   3.331992e+05   3.333138e+05     3.284275e+05     3.281773e+05
2        64.0   1.319188e+06   1.319188e+06     1.357777e+06     1.358490e+06
3       128.0   2.709857e+06   2.708910e+06     2.656004e+06     2.652368e+06
4       256.0   4.941225e+06   4.934933e+06     2.427289e+06     2.429761e+06
5       512.0   5.615537e+06   5.629556e+06     6.242438e+06     6.147142e+06
6      1024.0   5.392498e+06   5.398601e+06     4.825678e+06     4.805391e+06
7      2048.0   5.349815e+06   5.345664e+06     5.351661e+06     5.349007e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=29568: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.725742e+04   1.725250e+04     1.718396e+04     1.722796e+04
1        16.0   2.774629e+05   2.777015e+05     2.600377e+05     2.602473e+05
2        64.0   1.082268e+06   1.081362e+06     1.060937e+06     1.060066e+06
3       128.0   2.538881e+06   2.539713e+06     2.433659e+06     2.433659e+06
4       256.0   4.711979e+06   4.706257e+06     2.674212e+06     2.677446e+06
5       512.0   5.527691e+06   5.525721e+06     5.967055e+06     6.071044e+06
6      1024.0   5.539048e+06   5.541523e+06     5.049851e+06     5.038157e+06
7      2048.0   5.400361e+06   5.398950e+06     5.406364e+06     5.399890e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, TP=1
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=3072 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.206546e+03   4.217823e+03     4.370133e+03     4.370133e+03
1        16.0   6.821672e+04   6.858847e+04     6.730473e+04     6.730473e+04
2        64.0   2.721294e+05   2.706663e+05     2.685010e+05     2.692189e+05
3       128.0   5.009347e+05   5.009347e+05     5.230539e+05     5.271616e+05
4       256.0   8.605801e+05   8.605801e+05     9.174294e+05     9.132687e+05
5       512.0   1.724846e+06   1.724846e+06     1.674642e+06     1.681635e+06
6      1024.0   2.585884e+06   2.590042e+06     2.549060e+06     2.545033e+06
7      2048.0   4.022487e+06   4.027515e+06     4.017471e+06     4.027515e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=4096 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   5.392453e+03   5.378626e+03     5.564096e+03     5.534733e+03
1        16.0   8.740266e+04   8.763087e+04     8.432820e+04     8.454062e+04
2        64.0   3.477992e+05   3.460064e+05     3.424758e+05     3.407373e+05
3       128.0   6.454351e+05   6.454351e+05     6.679129e+05     6.662555e+05
4       256.0   1.125790e+06   1.128155e+06     1.206746e+06     1.201347e+06
5       512.0   2.174097e+06   2.169705e+06     2.156634e+06     2.139450e+06
6      1024.0   3.356262e+06   3.372069e+06     3.356262e+06     3.356262e+06
7      2048.0   4.410694e+06   4.410694e+06     4.406170e+06     4.397150e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.946157e+03   2.913422e+03     3.031306e+03     3.031306e+03
1        16.0   4.822216e+04   4.822216e+04     4.727130e+04     4.740484e+04
2        64.0   1.901565e+05   1.901565e+05     1.880259e+05     1.880259e+05
3       128.0   3.477992e+05   3.514411e+05     3.678096e+05     3.678096e+05
4       256.0   5.966689e+05   5.914119e+05     7.336093e+05     7.356191e+05
5       512.0   1.177636e+06   1.177636e+06     1.159831e+06     1.159831e+06
6      1024.0   2.334791e+06   2.350118e+06     2.256311e+06     2.261061e+06
7      2048.0   3.458950e+06   3.458950e+06     3.487026e+06     3.498384e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=576 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   9.048589e+02   9.076431e+02     9.454616e+02     9.485016e+02
1        16.0   1.465759e+04   1.456711e+04     1.443347e+04     1.443347e+04
2        64.0   5.791097e+04   5.791097e+04     5.755785e+04     5.755785e+04
3       128.0   1.045927e+05   1.043037e+05     1.132172e+05     1.130478e+05
4       256.0   1.739998e+05   1.739998e+05     2.221056e+05     2.221056e+05
5       512.0   3.456105e+05   3.424757e+05     3.363737e+05     3.363737e+05
6      1024.0   6.757575e+05   6.757575e+05     6.653384e+05     6.653384e+05
7      2048.0   1.339528e+06   1.342505e+06     1.327752e+06     1.327752e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=21888 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.231801e+04   1.230449e+04     1.064520e+04     1.065532e+04
1        16.0   1.957972e+05   1.957972e+05     1.920238e+05     1.922297e+05
2        64.0   7.738954e+05   7.722294e+05     7.979990e+05     7.979990e+05
3       128.0   1.537837e+06   1.541141e+06     1.505564e+06     1.497706e+06
4       256.0   2.732957e+06   2.730356e+06     2.308612e+06     2.308612e+06
5       512.0   3.930965e+06   3.928274e+06     3.936357e+06     3.941764e+06
6      1024.0   5.063263e+06   5.061031e+06     5.326412e+06     5.336317e+06
7      2048.0   5.412433e+06   5.413071e+06     5.414986e+06     5.407334e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=10944: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   7.859164e+03   7.859164e+03     8.074329e+03     8.051126e+03
1        16.0   1.261891e+05   1.268138e+05     1.273542e+05     1.275353e+05
2        64.0   4.994838e+05   5.001804e+05     5.022820e+05     5.001804e+05
3       128.0   8.579650e+05   8.589925e+05     9.825462e+05     9.812022e+05
4       256.0   1.297032e+06   1.297032e+06     1.760144e+06     1.762306e+06
5       512.0   2.319349e+06   2.315605e+06     2.456365e+06     2.456365e+06
6      1024.0   4.307860e+06   4.301402e+06     4.277354e+06     4.278949e+06
7      2048.0   4.723663e+06   4.703336e+06     4.716868e+06     4.716868e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2816 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.866338e+03   3.866338e+03     4.017114e+03     4.017114e+03
1        16.0   6.287276e+04   6.287276e+04     6.253199e+04     6.270191e+04
2        64.0   2.514911e+05   2.514911e+05     2.494519e+05     2.508076e+05
3       128.0   4.591901e+05   4.569169e+05     4.857748e+05     4.870565e+05
4       256.0   7.948092e+05   7.888651e+05     8.555941e+05     8.546039e+05
5       512.0   1.591331e+06   1.577730e+06     1.544723e+06     1.547962e+06
6      1024.0   2.381864e+06   2.385712e+06     2.336638e+06     2.336638e+06
7      2048.0   3.705785e+06   3.710441e+06     3.710441e+06     3.701141e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=1408: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.127292e+03   2.152693e+03     2.212123e+03     2.212123e+03
1        16.0   3.464995e+04   3.464995e+04     3.454620e+04     3.403667e+04
2        64.0   1.381848e+05   1.373623e+05     1.361467e+05     1.361467e+05
3       128.0   2.600210e+05   2.571238e+05     2.667846e+05     2.675578e+05
4       256.0   4.535993e+05   4.547165e+05     5.320315e+05     5.320315e+05
5       512.0   8.961889e+05   8.983694e+05     8.897104e+05     8.875717e+05
6      1024.0   1.823357e+06   1.823357e+06     1.749904e+06     1.741650e+06
7      2048.0   2.786640e+06   2.797196e+06     2.791908e+06     2.797196e+06
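These results are split by architecture (sm89-L40 vs sm90-H100), since the PR provides separate CUTLASS fp8 GEMM implementations for each. A hedged sketch of how a caller might select between the two paths by compute capability (illustrative only; not this PR's actual dispatch code):

```python
# Illustrative dispatch sketch (not this PR's actual code): choose an fp8 GEMM
# path from the device's compute capability, mirroring the sm89/sm90 split.
import torch

def pick_fp8_gemm_arch() -> str:
    major, minor = torch.cuda.get_device_capability()
    if (major, minor) >= (9, 0):
        return "sm90"  # Hopper-class CUTLASS fp8 kernels
    if (major, minor) >= (8, 9):
        return "sm89"  # Ada-class CUTLASS fp8 kernels
    raise RuntimeError("w8a8 fp8 GEMM requires compute capability >= 8.9")
```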
meta-llama/Llama-3.1-8B-Instruct, TP=4
meta-llama/Llama-3.1-8B-Instruct N=1536 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.799652e+03   3.790496e+03     3.922833e+03     3.922833e+03
1        16.0   6.094164e+04   6.108955e+04     6.094164e+04     6.079443e+04
2        64.0   2.431777e+05   2.431777e+05     2.431777e+05     2.425918e+05
3       128.0   4.405934e+05   4.415596e+05     4.828565e+05     4.805517e+05
4       256.0   6.825463e+05   6.813914e+05     9.321813e+05     9.321813e+05
5       512.0   1.346831e+06   1.342341e+06     1.331247e+06     1.331247e+06
6      1024.0   2.636349e+06   2.653722e+06     2.645007e+06     2.640671e+06
7      2048.0   3.287366e+06   3.300839e+06     3.328119e+06     3.324684e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=1024: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.014621e+03   3.014621e+03     3.122286e+03     3.140982e+03
1        16.0   4.936885e+04   4.893705e+04     4.837293e+04     4.809572e+04
2        64.0   1.946134e+05   1.963206e+05     1.929357e+05     1.934917e+05
3       128.0   3.619495e+05   3.619495e+05     3.678993e+05     3.668942e+05
4       256.0   6.582513e+05   6.614939e+05     6.886321e+05     6.904024e+05
5       512.0   1.316503e+06   1.313284e+06     1.294297e+06     1.285007e+06
6      1024.0   2.174628e+06   2.183468e+06     2.201365e+06     2.201365e+06
7      2048.0   3.131971e+06   3.131971e+06     3.127412e+06     3.136543e+06
meta-llama/Llama-3.1-8B-Instruct N=7168 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.063903e+04   1.065447e+04     1.087545e+04     1.089158e+04
1        16.0   1.717176e+05   1.717176e+05     1.651967e+05     1.651967e+05
2        64.0   6.799123e+05   6.750279e+05     6.799123e+05     6.828770e+05
3       128.0   1.301439e+06   1.303244e+06     1.287176e+06     1.288942e+06
4       256.0   2.375825e+06   2.384870e+06     2.001361e+06     2.007775e+06
5       512.0   3.454554e+06   3.470503e+06     3.565992e+06     3.565992e+06
6      1024.0   4.490508e+06   4.490508e+06     4.724771e+06     4.724771e+06
7      2048.0   5.138148e+06   5.171730e+06     5.173510e+06     5.187792e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   7.615203e+03   7.615203e+03     7.809634e+03     7.843008e+03
1        16.0   1.233791e+05   1.231204e+05     1.210896e+05     1.213398e+05
2        64.0   4.863639e+05   4.883863e+05     4.823692e+05     4.843583e+05
3       128.0   9.122865e+05   9.087575e+05     9.396551e+05     9.434289e+05
4       256.0   1.530383e+06   1.521709e+06     1.657240e+06     1.657240e+06
5       512.0   2.830287e+06   2.834556e+06     2.918184e+06     2.931841e+06
6      1024.0   4.204274e+06   4.208982e+06     4.204274e+06     4.223169e+06
7      2048.0   4.929339e+06   4.922882e+06     4.945553e+06     4.932573e+06
meta-llama/Llama-3.3-70B-Instruct, TP=4
meta-llama/Llama-3.3-70B-Instruct N=2560 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.322539e+03   8.335771e+03     8.539413e+03     8.539413e+03
1        16.0   1.335847e+05   1.333723e+05     1.346568e+05     1.346568e+05
2        64.0   5.334893e+05   5.351911e+05     5.369037e+05     5.360460e+05
3       128.0   9.386428e+05   9.360246e+05     1.061914e+06     1.065285e+06
4       256.0   1.481522e+06   1.481522e+06     1.560767e+06     1.558049e+06
5       512.0   2.706168e+06   2.714377e+06     2.770401e+06     2.767545e+06
6      1024.0   3.349368e+06   3.347280e+06     3.385269e+06     3.385269e+06
7      2048.0   4.557756e+06   4.540412e+06     4.532745e+06     4.528922e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   7.885955e+03   7.885955e+03     7.050972e+03     7.027350e+03
1        16.0   1.266514e+05   1.266514e+05     1.218244e+05     1.224913e+05
2        64.0   5.009347e+05   4.981466e+05     4.953893e+05     4.981466e+05
3       128.0   1.022861e+06   1.026772e+06     1.020916e+06     1.020916e+06
4       256.0   1.916153e+06   1.832771e+06     1.699373e+06     1.704768e+06
5       512.0   2.950560e+06   2.942477e+06     3.008414e+06     3.008414e+06
6      1024.0   4.326300e+06   4.308943e+06     4.315435e+06     4.321948e+06
7      2048.0   5.075034e+06   5.051165e+06     5.081036e+06     5.060089e+06
meta-llama/Llama-3.3-70B-Instruct N=14336 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.693306e+04   1.692330e+04     1.661682e+04     1.664508e+04
1        16.0   2.627465e+05   2.628935e+05     2.542158e+05     2.543534e+05
2        64.0   1.054525e+06   1.050398e+06     1.069834e+06     1.070138e+06
3       128.0   2.086799e+06   2.084485e+06     1.949339e+06     1.949339e+06
4       256.0   3.740558e+06   3.735910e+06     2.834333e+06     2.832197e+06
5       512.0   4.927336e+06   4.906430e+06     5.036283e+06     5.041349e+06
6      1024.0   5.493624e+06   5.490615e+06     5.312121e+06     5.316818e+06
7      2048.0   5.353979e+06   5.360900e+06     5.358989e+06     5.364247e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=7168: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.235782e+04   1.235782e+04     1.203368e+04     1.201398e+04
1        16.0   1.973928e+05   1.973928e+05     1.867229e+05     1.874680e+05
2        64.0   7.836444e+05   7.816886e+05     7.739618e+05     7.726889e+05
3       128.0   1.683852e+06   1.683852e+06     1.674848e+06     1.674848e+06
4       256.0   3.137194e+06   3.129358e+06     2.667394e+06     2.682626e+06
5       512.0   4.654314e+06   4.658641e+06     4.706773e+06     4.718592e+06
6      1024.0   5.289738e+06   5.291600e+06     5.329115e+06     5.346171e+06
7      2048.0   5.339526e+06   5.344271e+06     5.342372e+06     5.348073e+06
mistralai/Mistral-Large-Instruct-2407, TP=4
mistralai/Mistral-Large-Instruct-2407 N=3584 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.167603e+04   1.170085e+04     1.191612e+04     1.190324e+04
1        16.0   1.870148e+05   1.870148e+05     1.866186e+05     1.868165e+05
2        64.0   7.417597e+05   7.417597e+05     7.456844e+05     7.472659e+05
3       128.0   1.313461e+06   1.311017e+06     1.448452e+06     1.448452e+06
4       256.0   2.094121e+06   2.091014e+06     2.276807e+06     2.273135e+06
5       512.0   4.041128e+06   4.041128e+06     4.064437e+06     4.067369e+06
6      1024.0   4.717468e+06   4.725376e+06     4.642680e+06     4.649380e+06
7      2048.0   4.857711e+06   4.850397e+06     4.849354e+06     4.838948e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=3072: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.176898e+04   1.173970e+04     1.070150e+04     1.067729e+04
1        16.0   1.883036e+05   1.883036e+05     1.797851e+05     1.799994e+05
2        64.0   7.457754e+05   7.466973e+05     7.466973e+05     7.457754e+05
3       128.0   1.371346e+06   1.372905e+06     1.457366e+06     1.459126e+06
4       256.0   2.413898e+06   2.411489e+06     2.294694e+06     2.292516e+06
5       512.0   4.044037e+06   4.040656e+06     4.071293e+06     4.074726e+06
6      1024.0   5.130175e+06   5.141090e+06     5.369583e+06     5.357677e+06
7      2048.0   5.249999e+06   5.260000e+06     5.245726e+06     5.245726e+06
mistralai/Mistral-Large-Instruct-2407 N=14336 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.890214e+04   1.893464e+04     1.865395e+04     1.866186e+04
1        16.0   3.007562e+05   3.010131e+05     2.864519e+05     2.863355e+05
2        64.0   1.155673e+06   1.151424e+06     1.166192e+06     1.166675e+06
3       128.0   2.304732e+06   2.305674e+06     2.120908e+06     2.116926e+06
4       256.0   4.110371e+06   4.111870e+06     2.703776e+06     2.708323e+06
5       512.0   4.941814e+06   4.946150e+06     5.164795e+06     5.161249e+06
6      1024.0   5.343482e+06   5.347284e+06     5.017131e+06     5.024958e+06
7      2048.0   5.266421e+06   5.244221e+06     5.257825e+06     5.245747e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=7168: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.525572e+04   1.529280e+04     1.468598e+04     1.469088e+04
1        16.0   2.470870e+05   2.470870e+05     2.338063e+05     2.342727e+05
2        64.0   9.787392e+05   9.767044e+05     9.794193e+05     9.787392e+05
3       128.0   1.858121e+06   1.861802e+06     1.920142e+06     1.918835e+06
4       256.0   3.225136e+06   3.223292e+06     2.854450e+06     2.840069e+06
5       512.0   5.090327e+06   5.092627e+06     5.204282e+06     5.203081e+06
6      1024.0   5.448212e+06   5.442952e+06     5.365251e+06     5.338577e+06
7      2048.0   5.314045e+06   5.309666e+06     5.326598e+06     5.298749e+06
Qwen/Qwen2.5-7B-Instruct, TP=4
Qwen/Qwen2.5-7B-Instruct N=1152 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.674446e+03   2.702450e+03     2.805261e+03     2.805261e+03
1        16.0   4.335269e+04   4.335269e+04     4.323920e+04     4.301400e+04
2        64.0   1.729568e+05   1.734108e+05     1.707222e+05     1.716091e+05
3       128.0   3.116486e+05   3.080163e+05     3.396890e+05     3.396890e+05
4       256.0   4.858052e+05   4.813807e+05     6.673687e+05     6.673687e+05
5       512.0   9.592668e+05   9.592668e+05     9.388207e+05     9.305564e+05
6      1024.0   1.857842e+06   1.857842e+06     1.870995e+06     1.874312e+06
7      2048.0   2.402527e+06   2.395721e+06     2.405261e+06     2.405261e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=896: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.419470e+03   2.397803e+03     2.525987e+03     2.494609e+03
1        16.0   3.894613e+04   3.906451e+04     3.882847e+04     3.847971e+04
2        64.0   1.553139e+05   1.553139e+05     1.530027e+05     1.534594e+05
3       128.0   2.816926e+05   2.816926e+05     2.937651e+05     2.937651e+05
4       256.0   5.299886e+05   5.299886e+05     5.633851e+05     5.649329e+05
5       512.0   1.065469e+06   1.062716e+06     1.020524e+06     1.012983e+06
6      1024.0   1.780395e+06   1.772721e+06     1.735321e+06     1.728030e+06
7      2048.0   2.530899e+06   2.530899e+06     2.534799e+06     2.530899e+06
Qwen/Qwen2.5-7B-Instruct N=9472 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.075804e+04   1.074443e+04     9.801496e+03     9.835569e+03
1        16.0   1.699744e+05   1.687075e+05     1.630367e+05     1.626461e+05
2        64.0   6.756693e+05   6.781999e+05     6.781999e+05     6.765108e+05
3       128.0   1.253144e+06   1.250260e+06     1.305861e+06     1.302729e+06
4       256.0   2.105574e+06   2.111713e+06     2.147186e+06     2.147186e+06
5       512.0   3.400552e+06   3.403215e+06     3.438216e+06     3.432785e+06
6      1024.0   4.823424e+06   4.820749e+06     5.029983e+06     5.041653e+06
7      2048.0   5.340590e+06   5.336491e+06     5.350453e+06     5.340590e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=4736: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.037697e+03   8.083627e+03     8.305096e+03     8.272717e+03
1        16.0   1.298326e+05   1.300813e+05     1.293380e+05     1.298326e+05
2        64.0   5.163686e+05   5.144126e+05     5.163686e+05     5.144126e+05
3       128.0   9.398265e+05   9.414553e+05     9.985656e+05     1.000405e+06
4       256.0   1.566603e+06   1.567734e+06     1.676604e+06     1.692273e+06
5       512.0   2.862818e+06   2.874178e+06     2.972475e+06     2.960325e+06
6      1024.0   4.023850e+06   4.016412e+06     4.061456e+06     4.046329e+06
7      2048.0   4.576891e+06   4.572075e+06     4.570873e+06     4.560082e+06
Qwen/Qwen2.5-32B-Instruct, TP=4
Qwen/Qwen2.5-32B-Instruct N=1792 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   5.052828e+03   5.075186e+03     5.201778e+03     5.213600e+03
1        16.0   8.031454e+04   8.174553e+04     8.084525e+04     8.084525e+04
2        64.0   3.233810e+05   3.233810e+05     3.205567e+05     3.219627e+05
3       128.0   5.780117e+05   5.791518e+05     6.411134e+05     6.383260e+05
4       256.0   8.817716e+05   8.817716e+05     1.249489e+06     1.244195e+06
5       512.0   1.727235e+06   1.727235e+06     1.709636e+06     1.709636e+06
6      1024.0   3.331971e+06   3.331971e+06     3.308507e+06     3.327251e+06
7      2048.0   4.179786e+06   4.161275e+06     4.183508e+06     4.179786e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=1280: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.313263e+03   4.301942e+03     4.502857e+03     4.453913e+03
1        16.0   7.011936e+04   7.011936e+04     6.901221e+04     6.883108e+04
2        64.0   2.775094e+05   2.782455e+05     2.760488e+05     2.775094e+05
3       128.0   5.311319e+05   5.311319e+05     5.092163e+05     5.092163e+05
4       256.0   9.690398e+05   9.735365e+05     9.514608e+05     9.536233e+05
5       512.0   1.545467e+06   1.539795e+06     1.536975e+06     1.548318e+06
6      1024.0   2.622464e+06   2.622464e+06     2.602135e+06     2.598107e+06
7      2048.0   3.876159e+06   3.880640e+06     3.858338e+06     3.867228e+06
Qwen/Qwen2.5-32B-Instruct N=13824 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.430594e+04   1.436400e+04     1.355011e+04     1.355011e+04
1        16.0   2.257920e+05   2.259722e+05     2.149910e+05     2.148279e+05
2        64.0   9.126291e+05   9.111606e+05     9.067836e+05     9.082379e+05
3       128.0   1.771028e+06   1.773802e+06     1.679129e+06     1.675403e+06
4       256.0   3.161403e+06   3.163611e+06     2.757328e+06     2.750632e+06
5       512.0   4.335206e+06   4.327959e+06     4.445820e+06     4.456754e+06
6      1024.0   5.240359e+06   5.226756e+06     5.301686e+06     5.292396e+06
7      2048.0   5.216975e+06   5.228264e+06     5.203492e+06     5.216224e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=6912: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.140206e+04   1.135815e+04     1.158874e+04     1.158115e+04
1        16.0   1.836161e+05   1.836161e+05     1.762989e+05     1.762989e+05
2        64.0   7.325640e+05   7.325640e+05     7.306735e+05     7.306735e+05
3       128.0   1.376117e+06   1.381151e+06     1.211277e+06     1.209983e+06
4       256.0   2.244884e+06   2.242661e+06     2.087639e+06     2.093427e+06
5       512.0   3.002105e+06   3.004096e+06     3.054738e+06     3.050624e+06
6      1024.0   4.415376e+06   4.419684e+06     4.532442e+06     4.532442e+06
7      2048.0   5.245536e+06   5.232661e+06     5.234172e+06     5.237198e+06
Qwen/Qwen2.5-72B-Instruct, TP=4
Qwen/Qwen2.5-72B-Instruct N=2560 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.322539e+03   8.296203e+03     8.567320e+03     8.581342e+03
1        16.0   1.344410e+05   1.335847e+05     1.344410e+05     1.346568e+05
2        64.0   5.351911e+05   5.351911e+05     5.377641e+05     5.360460e+05
3       128.0   9.386428e+05   9.360246e+05     1.065285e+06     1.061914e+06
4       256.0   1.479889e+06   1.481522e+06     1.560767e+06     1.558954e+06
5       512.0   2.706168e+06   2.703443e+06     2.770401e+06     2.779004e+06
6      1024.0   3.345194e+06   3.349368e+06     3.391685e+06     3.385269e+06
7      2048.0   4.540412e+06   4.550031e+06     4.525105e+06     4.527013e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   7.930677e+03   7.915713e+03     7.027350e+03     7.050972e+03
1        16.0   1.266514e+05   1.268908e+05     1.222682e+05     1.224913e+05
2        64.0   4.990725e+05   5.009347e+05     4.981466e+05     4.981466e+05
3       128.0   1.017049e+06   1.022861e+06     1.022861e+06     1.018979e+06
4       256.0   1.858138e+06   1.894187e+06     1.707478e+06     1.699373e+06
5       512.0   2.946513e+06   2.938451e+06     3.008414e+06     3.012634e+06
6      1024.0   4.321948e+06   4.313269e+06     4.321948e+06     4.317604e+06
7      2048.0   5.063071e+06   5.072038e+06     5.054136e+06     5.051165e+06
Qwen/Qwen2.5-72B-Instruct N=14784 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.578701e+04   1.577878e+04     1.671976e+04     1.670131e+04
1        16.0   2.592144e+05   2.469275e+05     2.519354e+05     2.519354e+05
2        64.0   1.028602e+06   9.841984e+05     1.067118e+06     1.065944e+06
3       128.0   2.105254e+06   2.110988e+06     2.006094e+06     2.004019e+06
4       256.0   3.831709e+06   3.825091e+06     2.755616e+06     2.744882e+06
5       512.0   5.056456e+06   5.054807e+06     5.148819e+06     5.157383e+06
6      1024.0   5.113158e+06   5.115266e+06     5.080483e+06     5.032655e+06
7      2048.0   5.254395e+06   5.262868e+06     5.248614e+06     5.246172e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=7392: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.203485e+04   1.207324e+04     1.176367e+04     1.175453e+04
1        16.0   1.934804e+05   1.930179e+05     1.809092e+05     1.809092e+05
2        64.0   7.690077e+05   7.647591e+05     7.534601e+05     7.558111e+05
3       128.0   1.616263e+06   1.617612e+06     1.484980e+06     1.483843e+06
4       256.0   3.011499e+06   3.011499e+06     1.894330e+06     1.893405e+06
5       512.0   4.224304e+06   4.219705e+06     4.118809e+06     4.114436e+06
6      1024.0   4.758501e+06   4.752666e+06     4.183269e+06     4.191186e+06
7      2048.0   4.933396e+06   4.914629e+06     4.917747e+06     4.918527e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, TP=4
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=768 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.174066e+03   1.177581e+03     1.236830e+03     1.225271e+03
1        16.0   1.912764e+04   1.912764e+04     1.901206e+04     1.901206e+04
2        64.0   7.627869e+04   7.627869e+04     7.491657e+04     7.491657e+04
3       128.0   1.364334e+05   1.360647e+05     1.467753e+05     1.472045e+05
4       256.0   2.304070e+05   2.304070e+05     2.910054e+05     2.918489e+05
5       512.0   4.425841e+05   4.435589e+05     4.358782e+05     4.368238e+05
6      1024.0   8.969966e+05   8.969966e+05     8.774542e+05     8.774542e+05
7      2048.0   1.758740e+06   1.758740e+06     1.751093e+06     1.758740e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=1024 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.560762e+03   1.560762e+03     1.628621e+03     1.603719e+03
1        16.0   2.542623e+04   2.542623e+04     2.497219e+04     2.497219e+04
2        64.0   9.988876e+04   9.988876e+04     9.988876e+04     9.988876e+04
3       128.0   1.834023e+05   1.839048e+05     1.948483e+05     1.951315e+05
4       256.0   3.065080e+05   3.037342e+05     3.880072e+05     3.868890e+05
5       512.0   6.060971e+05   6.060971e+05     5.914119e+05     5.927174e+05
6      1024.0   1.195995e+06   1.195995e+06     1.177636e+06     1.177636e+06
7      2048.0   2.205347e+06   2.205347e+06     2.178507e+06     2.187381e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=512: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.864864e+02   8.864864e+02     9.304965e+02     9.304965e+02
1        16.0   1.432901e+04   1.452734e+04     1.418378e+04     1.418378e+04
2        64.0   5.692746e+04   5.741401e+04     5.673513e+04     5.673513e+04
3       128.0   1.020888e+05   1.020888e+05     1.076513e+05     1.086964e+05
4       256.0   2.011210e+05   1.993306e+05     2.195242e+05     2.216977e+05
5       512.0   3.916875e+05   3.974816e+05     3.882913e+05     3.882913e+05
6      1024.0   7.879695e+05   7.902871e+05     7.633454e+05     7.590328e+05
7      2048.0   1.301199e+06   1.301199e+06     1.298056e+06     1.294928e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=576 K=512: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     257.142855     256.249991       276.404499       276.404499
1        16.0    4202.135441    4202.135441      4099.999855      4099.999855
2        64.0   16748.936550   16748.936550     16399.999420     16399.999420
3       128.0   31176.236802   31279.470160     32573.793518     32461.855232
4       256.0   58311.112322   58311.112322     65147.587035     64261.225000
5       512.0  116622.224645  116622.224645    113470.269046    112123.440491
6      1024.0  231813.490889  229003.629526    223583.423643    223583.423643
7      2048.0  434317.239638  433072.768190    431835.443969    431835.443969
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=5472 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   6.471935e+03   6.471935e+03     6.672257e+03     6.720259e+03
1        16.0   1.050060e+05   1.050060e+05     1.019036e+05     1.026031e+05
2        64.0   4.142039e+05   4.142039e+05     4.113538e+05     4.076143e+05
3       128.0   8.024621e+05   8.060686e+05     7.664541e+05     7.631926e+05
4       256.0   1.434802e+06   1.434802e+06     1.379617e+06     1.374332e+06
5       512.0   2.173943e+06   2.180550e+06     2.190538e+06     2.187198e+06
6      1024.0   3.503790e+06   3.499517e+06     3.499517e+06     3.503790e+06
7      2048.0   4.401234e+06   4.397861e+06     4.381075e+06     4.387774e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=2736: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.583345e+03   3.611052e+03     3.736235e+03     3.687074e+03
1        16.0   5.822703e+04   5.837866e+04     5.792612e+04     5.792612e+04
2        64.0   2.241741e+05   2.253006e+05     2.236150e+05     2.230588e+05
3       128.0   4.094504e+05   4.113286e+05     4.021060e+05     4.003109e+05
4       256.0   6.793154e+05   6.818984e+05     7.231422e+05     7.260699e+05
5       512.0   1.260733e+06   1.247577e+06     1.306661e+06     1.304286e+06
6      1024.0   2.321544e+06   2.317793e+06     2.009403e+06     2.006593e+06
7      2048.0   2.927988e+06   2.925003e+06     2.930979e+06     2.925003e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=704 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.089233e+03   1.089233e+03     1.130207e+03     1.126675e+03
1        16.0   1.758712e+04   1.753367e+04     1.753367e+04     1.748053e+04
2        64.0   7.013466e+04   7.013466e+04     6.908474e+04     6.908474e+04
3       128.0   1.264345e+05   1.260891e+05     1.345440e+05     1.345440e+05
4       256.0   2.107242e+05   2.112064e+05     2.683059e+05     2.683059e+05
5       512.0   4.129629e+05   4.129629e+05     4.030446e+05     4.030446e+05
6      1024.0   8.222469e+05   8.186006e+05     8.043330e+05     8.096247e+05
7      2048.0   1.598220e+06   1.598220e+06     1.598220e+06     1.594768e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=352: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     626.666645     613.877556       656.290913       656.290913
1        16.0   10132.210976   10132.210976      9991.972709     10026.666312
2        64.0   40815.264842   40815.264842     39967.890836     39967.890836
3       128.0   73571.463929   73571.463929     76494.834342     76494.834342
4       256.0  142601.484443  142601.484443    156090.804011    156090.804011
5       512.0  280868.569517  280868.569517    275838.098018    275838.098018
6      1024.0  553327.885624  556661.211562    537242.790634    537242.790634
7      2048.0  908164.685999  905938.839953    903723.813699    903723.813699
meta-llama/Llama-3.1-8B-Instruct, TP=8
meta-llama/Llama-3.1-8B-Instruct N=768 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.011581e+03   2.006449e+03     2.120022e+03     2.114323e+03
1        16.0   3.260220e+04   3.260220e+04     3.251795e+04     3.251795e+04
2        64.0   1.294031e+05   1.290713e+05     1.297366e+05     1.297366e+05
3       128.0   2.293294e+05   2.303789e+05     2.588061e+05     2.588061e+05
4       256.0   3.477568e+05   3.489621e+05     5.136510e+05     5.123439e+05
5       512.0   6.860347e+05   6.848679e+05     6.756751e+05     6.768107e+05
6      1024.0   1.374411e+06   1.374411e+06     1.353621e+06     1.358187e+06
7      2048.0   2.627748e+06   2.632041e+06     2.627748e+06     2.632041e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=512: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.687460e+03   1.687460e+03     1.755184e+03     1.755184e+03
1        16.0   2.753049e+04   2.744052e+04     2.699936e+04     2.691282e+04
2        64.0   1.059533e+05   1.059533e+05     1.049600e+05     1.056201e+05
3       128.0   2.066905e+05   2.066905e+05     2.112402e+05     2.112402e+05
4       256.0   3.763272e+05   3.773843e+05     3.742306e+05     3.731911e+05
5       512.0   7.611830e+05   7.590328e+05     7.341465e+05     7.341465e+05
6      1024.0   1.273448e+06   1.279512e+06     1.246857e+06     1.252669e+06
7      2048.0   1.912439e+06   1.869201e+06     1.892237e+06     1.905657e+06
meta-llama/Llama-3.1-8B-Instruct N=3584 K=4096: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   7.583603e+03   7.567967e+03     7.842872e+03     7.826149e+03
1        16.0   1.226042e+05   1.223488e+05     1.220944e+05     1.218411e+05
2        64.0   4.813723e+05   4.823608e+05     4.803879e+05     4.823608e+05
3       128.0   8.914979e+05   8.914979e+05     9.491301e+05     9.472165e+05
4       256.0   1.484421e+06   1.486770e+06     1.606220e+06     1.584551e+06
5       512.0   2.747482e+06   2.735484e+06     2.821738e+06     2.813290e+06
6      1024.0   3.878798e+06   3.878798e+06     3.911088e+06     3.907022e+06
7      2048.0   4.440112e+06   4.437491e+06     4.437491e+06     4.440112e+06
meta-llama/Llama-3.1-8B-Instruct N=4096 K=1792: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.817638e+03   4.830316e+03     4.960865e+03     4.974309e+03
1        16.0   7.728505e+04   7.769397e+04     7.608373e+04     7.608373e+04
2        64.0   3.099559e+05   3.091402e+05     3.051254e+05     3.051254e+05
3       128.0   5.859016e+05   5.661363e+05     5.888385e+05     5.932994e+05
4       256.0   1.037292e+06   1.039587e+06     1.108238e+06     1.110858e+06
5       512.0   2.034169e+06   2.034169e+06     1.978497e+06     1.982671e+06
6      1024.0   3.137850e+06   3.153645e+06     3.096495e+06     3.096495e+06
7      2048.0   4.233271e+06   4.233271e+06     4.233271e+06     4.233271e+06
meta-llama/Llama-3.3-70B-Instruct, TP=8
meta-llama/Llama-3.3-70B-Instruct N=1280 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   5.080620e+03   5.080620e+03     5.232735e+03     5.243200e+03
1        16.0   8.160622e+04   8.192500e+04     8.128993e+04     8.128993e+04
2        64.0   3.232802e+05   3.245308e+05     3.226585e+05     3.226585e+05
3       128.0   5.630282e+05   5.639745e+05     6.490615e+05     6.515821e+05
4       256.0   7.812917e+05   7.812917e+05     1.256797e+06     1.256797e+06
5       512.0   1.549953e+06   1.553541e+06     1.537525e+06     1.537525e+06
6      1024.0   2.784770e+06   2.787662e+06     2.905323e+06     2.905323e+06
7      2048.0   3.228525e+06   3.232412e+06     3.242172e+06     3.240215e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=1024: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   4.983791e+03   4.983791e+03     4.454726e+03     4.464204e+03
1        16.0   8.012128e+04   7.955169e+04     7.917645e+04     7.917645e+04
2        64.0   3.174545e+05   3.174545e+05     3.189626e+05     3.189626e+05
3       128.0   6.566419e+05   6.550403e+05     6.518605e+05     6.487114e+05
4       256.0   1.223538e+06   1.223538e+06     1.119027e+06     1.123709e+06
5       512.0   2.170235e+06   2.157161e+06     2.152838e+06     2.148532e+06
6      1024.0   3.168927e+06   3.173608e+06     3.104815e+06     3.100335e+06
7      2048.0   4.008456e+06   4.034802e+06     4.034802e+06     4.031017e+06
meta-llama/Llama-3.3-70B-Instruct N=7168 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.354332e+04   1.359348e+04     1.375910e+04     1.374622e+04
1        16.0   2.127675e+05   2.139302e+05     2.055078e+05     2.060486e+05
2        64.0   8.683747e+05   8.683747e+05     8.707891e+05     8.707891e+05
3       128.0   1.692940e+06   1.691416e+06     1.636901e+06     1.638329e+06
4       256.0   3.090729e+06   3.085653e+06     2.488958e+06     2.492259e+06
5       512.0   4.396988e+06   4.400850e+06     4.431988e+06     4.434603e+06
6      1024.0   4.906430e+06   4.893653e+06     4.950050e+06     4.959031e+06
7      2048.0   5.368078e+06   5.371435e+06     5.357556e+06     5.349930e+06
meta-llama/Llama-3.3-70B-Instruct N=8192 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   9.749079e+03   9.788075e+03     9.399559e+03     9.387539e+03
1        16.0   1.566092e+05   1.570279e+05     1.492464e+05     1.492464e+05
2        64.0   6.281118e+05   6.331908e+05     6.297957e+05     6.297957e+05
3       128.0   1.332844e+06   1.330956e+06     1.332844e+06     1.338540e+06
4       256.0   2.469527e+06   2.476035e+06     2.140445e+06     2.142885e+06
5       512.0   3.666947e+06   3.674116e+06     3.747379e+06     3.747379e+06
6      1024.0   5.028255e+06   5.034991e+06     5.195053e+06     5.195053e+06
7      2048.0   5.249470e+06   5.254975e+06     5.264174e+06     5.255893e+06
mistralai/Mistral-Large-Instruct-2407, TP=8
mistralai/Mistral-Large-Instruct-2407 N=1792 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   7.710431e+03   7.721245e+03     7.819954e+03     7.819954e+03
1        16.0   1.235399e+05   1.233669e+05     1.238874e+05     1.242369e+05
2        64.0   4.879998e+05   4.846436e+05     4.879998e+05     4.900360e+05
3       128.0   8.339311e+05   8.359095e+05     9.719610e+05     9.719610e+05
4       256.0   1.218102e+06   1.218102e+06     1.859292e+06     1.859292e+06
5       512.0   2.323732e+06   2.325649e+06     2.333350e+06     2.337220e+06
6      1024.0   4.329780e+06   4.333108e+06     4.353185e+06     4.346472e+06
7      2048.0   4.666700e+06   4.680261e+06     4.699770e+06     4.692924e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=1536: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.692685e+03   8.660785e+03     7.776158e+03     7.737915e+03
1        16.0   1.390830e+05   1.393396e+05     1.336673e+05     1.339043e+05
2        64.0   5.522636e+05   5.512558e+05     5.502517e+05     5.502517e+05
3       128.0   1.017132e+06   1.017132e+06     1.069339e+06     1.065567e+06
4       256.0   1.817072e+06   1.814343e+06     1.792808e+06     1.795472e+06
5       512.0   2.987275e+06   2.983587e+06     3.039881e+06     3.032253e+06
6      1024.0   4.402014e+06   4.414074e+06     4.394010e+06     4.402014e+06
7      2048.0   5.063815e+06   5.054547e+06     5.077113e+06     5.075780e+06
mistralai/Mistral-Large-Instruct-2407 N=7168 K=12288: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.505194e+04   1.507255e+04     1.520787e+04     1.519737e+04
1        16.0   2.451885e+05   2.450180e+05     2.331806e+05     2.325649e+05
2        64.0   9.773533e+05   9.780315e+05     9.800720e+05     9.821209e+05
3       128.0   1.918779e+06   1.916171e+06     1.841076e+06     1.843484e+06
4       256.0   3.529977e+06   3.523359e+06     2.518934e+06     2.520060e+06
5       512.0   4.775412e+06   4.759286e+06     4.793685e+06     4.807995e+06
6      1024.0   4.881372e+06   4.882957e+06     4.722902e+06     4.708602e+06
7      2048.0   5.339054e+06   5.331480e+06     5.358721e+06     5.330535e+06
mistralai/Mistral-Large-Instruct-2407 N=12288 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.214067e+04   1.218096e+04     1.157895e+04     1.157895e+04
1        16.0   1.968551e+05   1.970753e+05     1.876308e+05     1.876308e+05
2        64.0   7.990265e+05   7.972188e+05     7.954192e+05     7.954192e+05
3       128.0   1.465159e+06   1.463637e+06     1.560889e+06     1.559162e+06
4       256.0   2.555726e+06   2.544193e+06     2.430143e+06     2.428050e+06
5       512.0   4.271160e+06   4.274398e+06     4.333536e+06     4.326885e+06
6      1024.0   5.215478e+06   5.227567e+06     5.513869e+06     5.487037e+06
7      2048.0   5.295075e+06   5.289486e+06     5.295075e+06     5.295697e+06
Qwen/Qwen2.5-7B-Instruct, TP=8
Qwen/Qwen2.5-7B-Instruct N=576 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.421950e+03   1.421950e+03     1.504863e+03     1.500488e+03
1        16.0   2.287725e+04   2.294080e+04     2.262654e+04     2.256472e+04
2        64.0   9.025889e+04   9.025889e+04     9.001295e+04     8.976835e+04
3       128.0   1.615391e+05   1.615391e+05     1.785662e+05     1.785662e+05
4       256.0   2.469888e+05   2.479156e+05     3.533129e+05     3.542601e+05
5       512.0   4.903117e+05   4.903117e+05     4.805055e+05     4.813807e+05
6      1024.0   9.557975e+05   9.557975e+05     9.438500e+05     9.421676e+05
7      2048.0   1.851335e+06   1.838456e+06     1.848098e+06     1.841659e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=448: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.313255e+03   1.311113e+03     1.380948e+03     1.380948e+03
1        16.0   2.150400e+04   2.157616e+04     2.108097e+04     2.108097e+04
2        64.0   8.488047e+04   8.488047e+04     8.432388e+04     8.404831e+04
3       128.0   1.612463e+05   1.607424e+05     1.659276e+05     1.648640e+05
4       256.0   2.939290e+05   2.990556e+05     3.061760e+05     3.061760e+05
5       512.0   5.929403e+05   5.895423e+05     5.763313e+05     5.763313e+05
6      1024.0   9.915676e+05   9.915676e+05     9.774360e+05     9.705202e+05
7      2048.0   1.446399e+06   1.448946e+06     1.454066e+06     1.443862e+06
Qwen/Qwen2.5-7B-Instruct N=4736 K=3584: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.193143e+03   8.193143e+03     8.454279e+03     8.471154e+03
1        16.0   1.336708e+05   1.336708e+05     1.303354e+05     1.305861e+05
2        64.0   5.315442e+05   5.315442e+05     5.284418e+05     5.274157e+05
3       128.0   1.011617e+06   1.011617e+06     9.160846e+05     9.114734e+05
4       256.0   1.705614e+06   1.710986e+06     1.602472e+06     1.600112e+06
5       512.0   2.372219e+06   2.367051e+06     2.438780e+06     2.441520e+06
6      1024.0   3.759434e+06   3.752941e+06     3.812198e+06     3.808857e+06
7      2048.0   4.754820e+06   4.757422e+06     4.762636e+06     4.754820e+06
Qwen/Qwen2.5-7B-Instruct N=3584 K=2368: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   5.188694e+03   5.227034e+03     5.413714e+03     5.399939e+03
1        16.0   8.383905e+04   8.467535e+04     8.363255e+04     8.363255e+04
2        64.0   3.361863e+05   3.378589e+05     3.345302e+05     3.345302e+05
3       128.0   6.244564e+05   6.273407e+05     6.406569e+05     6.421715e+05
4       256.0   1.088732e+06   1.084385e+06     1.110996e+06     1.113273e+06
5       512.0   2.109814e+06   2.103687e+06     2.077541e+06     2.089527e+06
6      1024.0   3.035067e+06   3.030834e+06     2.976860e+06     2.968727e+06
7      2048.0   3.756453e+06   3.759703e+06     3.756453e+06     3.753209e+06
Qwen/Qwen2.5-32B-Instruct, TP=8
Qwen/Qwen2.5-32B-Instruct N=896 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.744000e+03   2.737451e+03     2.881889e+03     2.881889e+03
1        16.0   4.443553e+04   4.454338e+04     4.422138e+04     4.400929e+04
2        64.0   1.760371e+05   1.764603e+05     1.751969e+05     1.751969e+05
3       128.0   3.117091e+05   3.110487e+05     3.512320e+05     3.495595e+05
4       256.0   4.552402e+05   4.552402e+05     6.941607e+05     6.974583e+05
5       512.0   9.034768e+05   9.048689e+05     8.938507e+05     8.924922e+05
6      1024.0   1.793160e+06   1.790427e+06     1.774199e+06     1.771523e+06
7      2048.0   3.209071e+06   3.204693e+06     3.195972e+06     3.195972e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=640: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   2.484364e+03   2.476858e+03     2.594430e+03     2.602667e+03
1        16.0   4.023754e+04   4.023754e+04     3.962973e+04     3.974982e+04
2        64.0   1.609502e+05   1.609502e+05     1.594826e+05     1.594826e+05
3       128.0   3.015503e+05   3.006863e+05     2.939482e+05     2.939482e+05
4       256.0   5.611739e+05   5.626784e+05     5.423231e+05     5.409254e+05
5       512.0   9.539956e+05   9.561688e+05     9.348732e+05     9.411616e+05
6      1024.0   1.652591e+06   1.652591e+06     1.639680e+06     1.642889e+06
7      2048.0   2.461924e+06   2.465539e+06     2.458320e+06     2.461924e+06
Qwen/Qwen2.5-32B-Instruct N=6912 K=5120: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.106028e+04   1.107412e+04     1.122871e+04     1.122871e+04
1        16.0   1.815020e+05   1.817350e+05     1.763033e+05     1.765232e+05
2        64.0   7.195506e+05   7.204661e+05     7.213839e+05     7.232265e+05
3       128.0   1.398238e+06   1.398238e+06     1.356374e+06     1.358001e+06
4       256.0   2.522433e+06   2.519628e+06     2.099301e+06     2.113009e+06
5       512.0   3.659362e+06   3.647577e+06     3.737864e+06     3.734782e+06
6      1024.0   4.548485e+06   4.548485e+06     4.753715e+06     4.728905e+06
7      2048.0   5.183399e+06   5.187851e+06     5.180435e+06     5.175996e+06
Qwen/Qwen2.5-32B-Instruct N=5120 K=3456: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   8.379394e+03   8.379394e+03     8.675138e+03     8.709291e+03
1        16.0   1.369229e+05   1.363952e+05     1.348364e+05     1.348364e+05
2        64.0   5.424453e+05   5.424453e+05     5.393457e+05     5.414082e+05
3       128.0   1.044858e+06   1.042934e+06     9.314358e+05     9.345098e+05
4       256.0   1.769728e+06   1.778063e+06     1.682951e+06     1.680454e+06
5       512.0   2.511366e+06   2.508585e+06     2.577078e+06     2.582955e+06
6      1024.0   4.034286e+06   4.030697e+06     4.052329e+06     4.048707e+06
7      2048.0   5.087595e+06   5.093315e+06     5.096180e+06     5.090453e+06
Qwen/Qwen2.5-72B-Instruct, TP=8
Qwen/Qwen2.5-72B-Instruct N=1280 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   5.031862e+03   5.070793e+03     5.243200e+03     5.253707e+03
1        16.0   8.160622e+04   8.160622e+04     8.128993e+04     8.128993e+04
2        64.0   3.239043e+05   3.226585e+05     3.232802e+05     3.232802e+05
3       128.0   5.620851e+05   5.639745e+05     6.503194e+05     6.503194e+05
4       256.0   7.812917e+05   7.812917e+05     1.261522e+06     1.256797e+06
5       512.0   1.551745e+06   1.551745e+06     1.535766e+06     1.535766e+06
6      1024.0   2.790560e+06   2.793464e+06     2.902182e+06     2.902182e+06
7      2048.0   3.293888e+06   3.291868e+06     3.301991e+06     3.301991e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=1024: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   5.007580e+03   4.995657e+03     4.464204e+03     4.483282e+03
1        16.0   8.012128e+04   7.974065e+04     7.880473e+04     7.899015e+04
2        64.0   3.174545e+05   3.167058e+05     3.212518e+05     3.197221e+05
3       128.0   6.471483e+05   6.558401e+05     6.518605e+05     6.487114e+05
4       256.0   1.226331e+06   1.220757e+06     1.126065e+06     1.126065e+06
5       512.0   2.139972e+06   2.135718e+06     2.135718e+06     2.139972e+06
6      1024.0   3.168927e+06   3.173608e+06     3.109308e+06     3.109308e+06
7      2048.0   4.023469e+06   4.031017e+06     4.023469e+06     4.012198e+06
Qwen/Qwen2.5-72B-Instruct N=7392 K=8192: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   1.180027e+04   1.182792e+04     1.370112e+04     1.370112e+04
1        16.0   1.873440e+05   1.910377e+05     2.061582e+05     2.056331e+05
2        64.0   7.546288e+05   7.540415e+05     8.705691e+05     8.721363e+05
3       128.0   1.698411e+06   1.698411e+06     1.666283e+06     1.667717e+06
4       256.0   3.140821e+06   3.123105e+06     2.539825e+06     2.543158e+06
5       512.0   4.519852e+06   4.519852e+06     4.554375e+06     4.562417e+06
6      1024.0   5.010696e+06   4.999385e+06     5.059756e+06     5.061408e+06
7      2048.0   5.018807e+06   5.009887e+06     5.039198e+06     5.003015e+06
Qwen/Qwen2.5-72B-Instruct N=8192 K=3696: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   9.023161e+03   9.012419e+03     9.198581e+03     9.165172e+03
1        16.0   1.464654e+05   1.462885e+05     1.365580e+05     1.365580e+05
2        64.0   5.944879e+05   5.952182e+05     5.937594e+05     5.930326e+05
3       128.0   1.225051e+06   1.223504e+06     9.969293e+05     9.959047e+05
4       256.0   2.165397e+06   2.160569e+06     1.262561e+06     1.262149e+06
5       512.0   3.161551e+06   3.148709e+06     2.543347e+06     2.546689e+06
6      1024.0   3.632672e+06   3.636080e+06     2.789537e+06     2.790541e+06
7      2048.0   3.926119e+06   3.925125e+06     3.922146e+06     3.923139e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, TP=8
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=384 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     610.732906     610.732906       638.493532       638.493532
1        16.0    9802.168239    9802.168239      9771.726491      9771.726491
2        64.0   39086.905965   39086.905965     39086.905965     39086.905965
3       128.0   70509.711971   70509.711971     76510.540513     76510.540513
4       256.0  117901.491737  117901.491737    152557.377263    152557.377263
5       512.0  232535.496674  232535.496674    228836.070940    228836.070940
6      1024.0  459237.736983  458714.683181    447501.661020    447501.661020
7      2048.0  879370.070777  871756.474453    866132.250750    866132.250750
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=512 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   7.969848e+02   7.969848e+02     8.513247e+02     8.513247e+02
1        16.0   1.302897e+04   1.302897e+04     1.302897e+04     1.302897e+04
2        64.0   5.211587e+04   5.211587e+04     5.195453e+04     5.195453e+04
3       128.0   9.401295e+04   9.401295e+04     1.020141e+05     1.017049e+05
4       256.0   1.572020e+05   1.572020e+05     1.997775e+05     2.003739e+05
5       512.0   3.100473e+05   3.093329e+05     3.044229e+05     3.044229e+05
6      1024.0   6.102295e+05   6.102295e+05     6.006734e+05     6.006734e+05
7      2048.0   1.190692e+06   1.190692e+06     1.180224e+06     1.185435e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=256: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     467.359454     467.359454       491.865176       490.029845
1        16.0    7668.788576    7668.788576      7451.234212      7504.456942
2        64.0   30563.607438   30675.154304     29911.005050     29911.005050
3       128.0   58166.036877   58367.997935     59609.873692     59399.236493
4       256.0  110592.004911  110592.004911    117964.805239    117552.337737
5       512.0  219738.351700  219738.351700    214824.078858    214824.078858
6      1024.0  426920.212238  426920.212238    413784.220412    413784.220412
7      2048.0  682638.939977  682638.939977    680910.727013    680910.727013
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=576 K=256: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     131.914282     131.914282       142.061538       142.610042
1        16.0    2188.800085    2205.134300      2156.846787      2156.846787
2        64.0    8658.988618    8658.988618      8627.387148      8627.387148
3       128.0   16415.999419   16415.999419     16885.028120     16885.028120
4       256.0   31414.007400   31414.007400     33649.880681     33649.880681
5       512.0   62828.014800   62828.014800     61201.398956     61201.398956
6      1024.0  121225.850347  121225.850347    118565.712886    118565.712886
7      2048.0  248016.162163  248016.162163    244805.595823    244805.595823
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2736 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0   3.828344e+03   3.828344e+03     3.969331e+03     3.980608e+03
1        16.0   6.210189e+04   6.108660e+04     6.125351e+04     6.142132e+04
2        64.0   2.477214e+05   2.477214e+05     2.450140e+05     2.450140e+05
3       128.0   4.483757e+05   4.483757e+05     4.821244e+05     4.808318e+05
4       256.0   7.764081e+05   7.747312e+05     8.361318e+05     8.361318e+05
5       512.0   1.542798e+06   1.542798e+06     1.507145e+06     1.507145e+06
6      1024.0   2.303053e+06   2.303053e+06     2.266670e+06     2.273855e+06
7      2048.0   3.587006e+06   3.595995e+06     3.618669e+06     3.618669e+06
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=352 K=2048: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     559.838497     559.838497       585.285737       585.285737
1        16.0    9070.087885    9013.400228      8957.415950      8957.415950
2        64.0   36053.600911   35941.283542     35829.663801     35829.663801
3       128.0   64815.459927   64815.459927     70134.662137     70134.662137
4       256.0  108330.062359  108330.062359    139844.262491    139844.262491
5       512.0  213157.538618  213157.538618    209766.398362    209766.398362
6      1024.0  421448.486233  421448.486233    411123.443972    410209.855935
7      2048.0  800843.542376  809624.686036    799110.101582    797384.148710
deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct N=2048 K=176: 
fp8 scaled matmul:
   batch_size  vllm-fp8-fp16  vllm-fp8-bf16  sglang-fp8-fp16  sglang-fp8-bf16
0         1.0     323.899642     322.742849       338.456934       338.456934
1        16.0    5276.963679    5276.963679      5219.812015      5238.724521
2        64.0   21107.854716   21107.854716     20655.542305     20655.542305
3       128.0   40093.950107   40024.582880     40303.497897     40303.497897
4       256.0   74867.988153   74867.988153     77114.023004     77371.931394
5       512.0  143690.729731  143690.729731    139783.739053    136485.006544
6      1024.0  287381.459462  287381.459462    267447.502023    267447.502023
7      2048.0  455846.455781  455846.455781    453611.929792    453611.929792
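
For context on what each table entry measures: the benchmarked op multiplies an fp8 (e4m3) activation matrix of shape M x K (M is the batch_size column) by an fp8 weight matrix of shape K x N, applying the quantization scales while producing an fp16 or bf16 result. The snippet below is a minimal PyTorch reference of that semantics, assuming per-tensor scales for simplicity; it only defines the computation and is not the CUTLASS kernel or the exact API added by this PR.

import torch

def quantize_fp8(x: torch.Tensor):
    # Per-tensor symmetric quantization to float8_e4m3fn.
    finfo = torch.finfo(torch.float8_e4m3fn)
    scale = x.abs().max().clamp(min=1e-12) / finfo.max
    x_fp8 = (x / scale).clamp(finfo.min, finfo.max).to(torch.float8_e4m3fn)
    return x_fp8, scale

def ref_fp8_scaled_mm(a_fp8, b_fp8, scale_a, scale_b, out_dtype=torch.float16):
    # Dequantize, accumulate in fp32, then cast to the requested output dtype.
    # A real w8a8 kernel fuses the scaling into the GEMM epilogue instead.
    a = a_fp8.to(torch.float32) * scale_a
    b = b_fp8.to(torch.float32) * scale_b
    return (a @ b).to(out_dtype)

# One of the benchmarked shapes above: batch_size=16, N=2048, K=176.
a_fp8, sa = quantize_fp8(torch.randn(16, 176))
b_fp8, sb = quantize_fp8(torch.randn(176, 2048))
out_fp16 = ref_fp8_scaled_mm(a_fp8, b_fp8, sa, sb)
out_bf16 = ref_fp8_scaled_mm(a_fp8, b_fp8, sa, sb, torch.bfloat16)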
cc @zhyncs @ispobock

sgl-kernel/src/sgl-kernel/csrc/fp8_gemm_kernel.cu (outdated review thread, resolved)
sgl-kernel/src/sgl-kernel/csrc/utils.h (outdated review thread, resolved)
sgl-kernel/tests/test_fp8_gemm.py (outdated review thread, resolved)
@zhyncs zhyncs requested review from ispobock, BBuf and zhyncs January 24, 2025 10:31
@HandH1998
Copy link
Collaborator Author

We have fixed the review issues and resolved the conflicts. We also tried to optimize performance on sm90, but our kernel still cannot beat vllm's in all cases. The final results show that our kernel and vllm's each have advantages in different cases.
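
Since neither kernel wins everywhere, one way a serving stack can exploit this is to pick the backend per GEMM shape from an offline profile. The sketch below is purely hypothetical and is not the dispatch logic of this PR; the table would be filled from benchmarks like the ones above.

# Hypothetical per-shape dispatch, illustrative only.
BEST_BACKEND = {}  # (M, N, K) -> "sglang" or "vllm", filled from offline profiling

def pick_fp8_backend(m: int, n: int, k: int) -> str:
    # Fall back to one backend for shapes that were never profiled.
    return BEST_BACKEND.get((m, n, k), "sglang")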

@ll2088
Copy link

ll2088 commented Jan 26, 2025

Why does it fail when I run 'pip install .' in the sgl-kernel dir?

@zhyncs
Copy link
Member

zhyncs commented Jan 26, 2025

Why does it fail when I run 'pip install .' in the sgl-kernel dir?

@ll2088 Please run git pull to update to the latest main, delete the build cache, and then run pip3 install -e . again. It works well for me:

sgl-kernel git:(main) pip3 install -e .
Obtaining file:///sgl-workspace/sglang/sgl-kernel
  Installing build dependencies ... done
  Checking if build backend supports build_editable ... done
  Getting requirements to build editable ... done
  Preparing editable metadata (pyproject.toml) ... done
Building wheels for collected packages: sgl-kernel
  Building editable for sgl-kernel (pyproject.toml) ... done
  Created wheel for sgl-kernel: filename=sgl_kernel-0.0.2.post17-0.editable-cp39-abi3-linux_x86_64.whl size=11340 sha256=3c4ce8397126b4eceebe897e134be305914f3c58b911e902a517d1c830c4beb4
  Stored in directory: /tmp/pip-ephem-wheel-cache-ni1yn_73/wheels/37/60/b6/3c7398dfa86ca77510e7421dffec09170b60be817b325e847b
Successfully built sgl-kernel
Installing collected packages: sgl-kernel
  Attempting uninstall: sgl-kernel
    Found existing installation: sgl-kernel 0.0.2.post17
    Uninstalling sgl-kernel-0.0.2.post17:
      Successfully uninstalled sgl-kernel-0.0.2.post17
Successfully installed sgl-kernel-0.0.2.post17
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable.It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. Use the --root-user-action option if you know what you are doing and want to suppress this warning.

@ll2088
Copy link

ll2088 commented Jan 26, 2025

(quoting the rebuild instructions from the previous reply)

[screenshot attached]

@zhyncs
Copy link
Member

zhyncs commented Jan 26, 2025

@ll2088 build-wheels CI works well, so I think the issue is caused by your local environment. ref https://github.com/sgl-project/sglang/actions/runs/12971636303/job/36178255456 https://github.com/sgl-project/sglang/actions/runs/12971803255/job/36178590808?pr=3047

@ll2088
Copy link

ll2088 commented Jan 26, 2025

[screenshot attached]

Which version of flashinfer are you using?
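
(A quick way to check that, assuming flashinfer exposes the conventional __version__ attribute:)

import flashinfer
print(flashinfer.__version__)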

@HandH1998
Copy link
Collaborator Author

@zhyncs @ispobock We have resolved the conflicts and passed all the checks.

@zhyncs
Copy link
Member

zhyncs commented Jan 26, 2025

@HandH1998 Please paste the latest benchmark results. Thanks!

@zhyncs zhyncs merged commit 82392da into sgl-project:main Jan 26, 2025
5 checks passed