@VALLIS-NERIA it's not possible to do correctly with Poetry; we cannot get a consistent lock file. I think there won't be problems with upgrading transformers by one minor version for future TRT-LLM releases. We already tested it with transformers 4.45.2, and it's totally fine.
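To make the clash concrete, here is a minimal sketch of a pyproject.toml that Poetry cannot lock (the dependency specs are illustrative; only the transformers bounds come from this thread):

```toml
# Hypothetical pyproject.toml excerpt. Poetry refuses to write a lock file
# because no single transformers version satisfies both dependencies:
[tool.poetry.dependencies]
python = "^3.10"
vllm = "*"          # current releases require transformers >= 4.45.2
tensorrt-llm = "*"  # current releases pin transformers <= 4.45.1
```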
Hi @cupertank @Xarbirus, you can edit the requirements.txt if needed. Upgrading transformers officially is not that trivial because TRT-LLM has to support many models. You may have already noticed that some models in the examples directory pin a different transformers version in their own requirements.txt.
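For reference, a sketch of what that local edit could look like (the exact contents of TRT-LLM's requirements.txt are an assumption; the version bounds come from this thread):

```
# requirements.txt (hypothetical excerpt)
# transformers<=4.45.1          <- assumed original upper bound
transformers>=4.45.2,<4.46      # 4.45.2 was reported working above
```

After editing, reinstall with `pip install -r requirements.txt` and re-test the models you rely on, since the pin exists because individual examples are only validated against specific transformers releases.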
I'd like to use vLLM together with TensorRT-LLM in my project, but right now it's not possible because of conflicting `transformers` versions. vLLM requires `transformers >= 4.45.2`, while TensorRT-LLM has the upper bound `<= 4.45.1`. Is it possible to upgrade `transformers` by a minor version in TensorRT-LLM?