You need to mount your local model files via a volume mount so that Docker can see them.
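For example, one way to do this (a minimal sketch; the host path `./models` and the container path `/build/models` are assumptions based on the usual LocalAI setup, so check the models path your image actually uses):

```bash
# Run the LocalAI image with the local models directory mounted into the container.
# ./models on the host is assumed to contain m3e-base/ and m3e-base.yaml.
docker run --gpus all -p 8080:8080 \
  -v $PWD/models:/build/models \
  localai/localai:latest-aio-gpu-nvidia-cuda-12
```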
I pre-downloaded an embedding model named "moka-ai/m3e-base" from https://huggingface.co/moka-ai/m3e-base. The directory tree looks like this:
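(For reference only, not the exact listing: a locally downloaded sentence-transformers model folder typically contains something like the following.)

```
m3e-base/
├── config.json
├── pytorch_model.bin
├── tokenizer_config.json
├── vocab.txt
├── modules.json
├── sentence_bert_config.json
└── 1_Pooling/
    └── config.json
```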
The m3e-base.yaml content is:
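For reference, a LocalAI model config for a sentencetransformers embedding model usually looks roughly like this (an illustrative sketch, not the exact file; the model value is an assumption):

```yaml
# m3e-base.yaml (illustrative sketch)
name: m3e-base
backend: sentencetransformers
embeddings: true
parameters:
  # Either a Hugging Face model ID or, for offline use,
  # a path to the downloaded files inside the models directory.
  model: moka-ai/m3e-base
```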
I used the latest-aio-gpu-nvidia-cuda-12 Docker image to start the service and tried to test this embedding model, but it failed. The debug output is as follows:
How do I configure sentencetransformers to use these offline model files?
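One way to test the endpoint once the model files are mounted and the YAML points at them (a sketch; the model name m3e-base and port 8080 are assumptions based on the config above and LocalAI's default port):

```bash
# Query the OpenAI-compatible embeddings endpoint exposed by LocalAI.
curl http://localhost:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{"model": "m3e-base", "input": "hello world"}'
```

If the backend still tries to reach Hugging Face at startup, setting `TRANSFORMERS_OFFLINE=1` or `HF_HUB_OFFLINE=1` in the container is a common way to force it to use the local files, though this depends on the backend honoring those variables.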