hotfix - Revert vllm/attention/layer.py changes from 0f8cafe - fix torch.compile recompilations #298

Annotations: 1 warning

update-description — succeeded Jan 24, 2025 in 6s