
Actions: NVIDIA/TensorRT-LLM

Showing runs from all workflows
581 workflow runs

Close inactive issues #392: Scheduled
December 12, 2024 06:04 · 14s · main
Close inactive issues #391: Scheduled
December 12, 2024 05:03 · 15s · main
[bug] Medusa example fails with vicuna 33B
Blossom-CI #145: Issue comment #2478 (comment) created by SoundProvider
December 12, 2024 04:51 · 5s
Close inactive issues #390: Scheduled
December 12, 2024 04:04 · 14s · main
Close inactive issues #389: Scheduled
December 12, 2024 03:16 · 15s · main
Close inactive issues #388: Scheduled
December 12, 2024 02:30 · 16s · main
Close inactive issues #387: Scheduled
December 12, 2024 01:30 · 22s · main
How to use greedy search correctly
Blossom-CI #144: Issue comment #2557 (comment) created by nv-guomingz
December 12, 2024 01:08 · 5s
[bug] Medusa example fails with vicuna 33B
Blossom-CI #143: Issue comment #2478 (comment) created by rakib-hasan
December 12, 2024 00:19 · 4s
Build Qwen2-72B-Instruct model by INT4-AWQ quantization failed
Blossom-CI #142: Issue comment #2445 (comment) created by basujindal
December 12, 2024 00:15 · 5s
Close inactive issues #386: Scheduled
December 12, 2024 00:14 · 14s · main
What does "weights_scaling_factor_2" mean in safetensor results of awq_w4a8
auto-assign #28: Issue #2561 labeled by nv-guomingz
December 12, 2024 00:14 · 40s
What does "weights_scaling_factor_2" mean in safetensor results of awq_w4a8
auto-assign #27: Issue #2561 labeled by nv-guomingz
December 12, 2024 00:08 · 42s
What does "weights_scaling_factor_2" mean in safetensor results of awq_w4a8
auto-assign #26: Issue #2561 labeled by nv-guomingz
December 12, 2024 00:07 · 50s
Close inactive issues #385: Scheduled
December 11, 2024 23:03 · 15s · main
Blossom-CI #141: created by michaelfeil
December 11, 2024 22:11 · 5s
Close inactive issues #384: Scheduled
December 11, 2024 22:03 · 21s · main
Close inactive issues #383: Scheduled
December 11, 2024 21:03 · 15s · main
Close inactive issues #382: Scheduled
December 11, 2024 20:04 · 15s · main
[feature request] Can we add H200 in infer_cluster_key() method?
Blossom-CI #140: Issue comment #2552 (comment) created by renjie0
December 11, 2024 19:28 · 5s
Close inactive issues #381: Scheduled
December 11, 2024 19:02 · 17s · main
Close inactive issues #380: Scheduled
December 11, 2024 18:04 · 17s · main
Upgrade transformers to 4.45.2
Blossom-CI #139: Issue comment #2465 (comment) created by VALLIS-NERIA
December 11, 2024 15:42 · 6s
Issue with converting custom encoder model
Blossom-CI #138: Issue comment #2535 (comment) created by AvivSham
December 11, 2024 15:33 · 5s
Upgrade transformers to 4.45.2
Blossom-CI #137: Issue comment #2465 (comment) created by Xarbirus
December 11, 2024 15:33 · 5s