
Ollama model names in example are wrong and auto-pulling broken #1642

Open
morganhein opened this issue Jan 12, 2025 · 2 comments
Labels
bug Something isn't working

Comments

morganhein commented Jan 12, 2025

Bug description

The configuration here: https://github.com/huggingface/chat-ui/blob/main/docs/source/configuration/models/providers/ollama.md says the model name should be "mistral". It should be "mistral:latest" instead. Furthermore, the auto-pulling of Ollama models originally added in #1227 does not work because of this model-name inconsistency.

Steps to reproduce

Vanilla chat-ui setup with Ollama installed. Use the config from the page above. Download mistral with "ollama pull mistral". chat-ui is then unable to use ollama/mistral.

Logs

Here's a snippet from Ollama's tags endpoint:

$> curl http://localhost:11434/api/tags
{
  "models": [
    {
      "name": "mistral:latest",
      "model": "mistral:latest",
      "modified_at": "2025-01-11T16:17:32.785621658-08:00",
      "size": 4113301824,
      "digest": "f974a74358d62a017b37c6f424fcdf2744ca02926c4f952513ddf474b2fa5091",
      "details": {
        "parent_model": "",
        "format": "gguf",
        "family": "llama",
        "families": ["llama"],
        "parameter_size": "7.2B",
        "quantization_level": "Q4_0"
      }
    },
    {
      "name": "tinyllama:latest",
      "model": "tinyllama:latest",
      "modified_at": "2025-01-11T16:03:16.107607114-08:00",
      "size": 637700138,
      "digest": "2644915ede352ea7bdfaff0bfac0be74c719d5d5202acb63a6fb095b52f394a4",
      "details": {
        "parent_model": "",
        "format": "gguf",
        "family": "llama",
        "families": ["llama"],
        "parameter_size": "1B",
        "quantization_level": "Q4_0"
      }
    }
  ]
}
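For reference, a minimal sketch (my own illustration, not chat-ui code) of pulling the model names out of that response shape; note that every name Ollama reports carries an explicit tag such as ":latest":

```typescript
// Shape of the relevant fields in Ollama's GET /api/tags response.
interface OllamaTagsResponse {
  models: { name: string; model: string }[];
}

// Extract the tag names Ollama reports for locally available models.
function listModelNames(resp: OllamaTagsResponse): string[] {
  return resp.models.map((m) => m.name);
}
```

Running this over the snippet above yields ["mistral:latest", "tinyllama:latest"], neither of which is the bare "mistral" the docs suggest.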

Specs

  • OS: Mac
  • Browser: Firefox
  • chat-ui commit: f82af0b

Notes

If you change the config to:

MONGODB_URL=mongodb://localhost:27017/
MODELS=`[
  {
    "name": "Ollama Mistral",
    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}",
    "parameters": {
      "temperature": 0.1,
      "top_p": 0.95,
      "repetition_penalty": 1.2,
      "top_k": 50,
      "truncate": 3072,
      "max_new_tokens": 1024,
      "stop": ["</s>"]
    },
    "endpoints": [
      {
        "type": "ollama",
        "url" : "http://127.0.0.1:11434",
        "ollamaName" : "mistral:latest"
      }
    ]
  }
]`

Notice the ollamaName: "mistral:latest" entry; with that value, it works.

@morganhein morganhein added the bug Something isn't working label Jan 12, 2025
@nsarrazin (Collaborator) commented:

Thanks for the report! So if I understand correctly, this is just a documentation change, right?


morganhein commented Jan 13, 2025

It's also a code change. Automatic pulling of models from Ollama is broken. I think the pull itself works, but chat-ui never sees that the model is available, since chat-ui doesn't check for the model name suffixed with ":latest".
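One possible shape for a fix (a sketch with hypothetical helper names, not chat-ui's actual code): treat a bare name and its ":latest"-tagged form as the same model when checking availability after a pull.

```typescript
// Hypothetical sketch; type and function names are assumptions,
// not identifiers from the chat-ui codebase.
type OllamaModel = { name: string };

// Ollama implicitly tags untagged names as ":latest", so "mistral"
// and "mistral:latest" refer to the same model.
function normalizeTag(name: string): string {
  return name.includes(":") ? name : `${name}:latest`;
}

// Compare the requested model against /api/tags results with the
// implicit ":latest" applied to both sides.
function isModelAvailable(models: OllamaModel[], requested: string): boolean {
  const want = normalizeTag(requested);
  return models.some((m) => normalizeTag(m.name) === want);
}
```

With this, a config that asks for "mistral" would match the "mistral:latest" entry Ollama reports after an auto-pull, instead of concluding the model is missing.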
