Maybe I missed this in the docs somewhere, but I don't believe it is explicitly called out that you need to install the various providers' deps yourself.
This could be solved with updated docs and/or installation extras that would let me do `uv add chatlas[all]` or `uv add chatlas[openai,shiny]`.
Regardless, I think the import logic for these providers may be broken, since I would not expect my simple code to break over openai-related deps when trying to chat with Ollama. I see that you are using the OpenAI chat completion object, which is why this gets triggered. You could look into using ollama's client instead (https://github.com/ollama/ollama-python), or provide better error messaging that tells the user what went wrong during import.
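For illustration, a rough sketch of what a provider-aware import guard could look like; the helper name and message wording here are hypothetical, not chatlas's actual code:

```python
# Hypothetical sketch of a provider-aware import guard.
# The point: surface the provider the user asked for (e.g. ChatOllama),
# not just the transport library that happens to be missing.
def _require_openai(provider_name: str):
    try:
        from openai import AsyncOpenAI, OpenAI
    except ImportError as err:
        raise ImportError(
            f"`{provider_name}()` talks to its API via the `openai` package. "
            "Install it with `pip install openai`."
        ) from err
    return OpenAI, AsyncOpenAI
```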
The env var issue could be fixed with a quick check for whether that key has been set yet.
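Something like this, maybe (assuming a local Ollama server that ignores the key; `_resolve_api_key` is a made-up helper, not chatlas's code):

```python
import os

def _resolve_api_key(api_key: str | None = None) -> str:
    # A local Ollama server does not authenticate requests, so any
    # non-empty placeholder keeps the openai client happy.
    return api_key or os.environ.get("OPENAI_API_KEY") or "ollama"
```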
>>> %run "/Users/mconflitti/Documents/repos/test-shiny-app/app.py"
ImportError: `ChatOpenAI()` requires the `openai` package. Install it with `pip install openai`.
File ~/Documents/repos/test-shiny-app/.venv/lib/python3.12/site-packages/chatlas/_openai.py:195, in OpenAIProvider.__init__(self, api_key, model, base_url, seed, kwargs)
194 try:
--> 195 from openai import AsyncOpenAI, OpenAI
196 except ImportError:
>>> %run "/Users/mconflitti/Documents/repos/test-shiny-app/app.py"
OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
File ~/Documents/repos/test-shiny-app/app.py:3
1 from chatlas import ChatOllama
----> 3 chat = ChatOllama(model="llama3.2")
5 if __name__ == "__main__":
6 chat.app()
>>> %run "/Users/mconflitti/Documents/repos/test-shiny-app/app.py"
ImportError: The `shiny` package is required for the `browser` method. Install it with `pip install shiny`.
File ~/Documents/repos/test-shiny-app/.venv/lib/python3.12/site-packages/chatlas/_chat.py:220, in Chat.app(self, stream, port, launch_browser, bg_thread, kwargs)
219 try:
--> 220 from shiny import App, run_app, ui
221 except ImportError:
ModuleNotFoundError: No module named 'shiny'
During handling of the above exception, another exception occurred:
ImportError Traceback (most recent call last)
File ~/Documents/repos/test-shiny-app/app.py:9
6 chat = ChatOllama(model="llama3.2")
8 if __name__ == "__main__":
----> 9 chat.app()
File ~/Documents/repos/test-shiny-app/.venv/lib/python3.12/site-packages/chatlas/_chat.py:222, in Chat.app(self, stream, port, launch_browser, bg_thread, kwargs)
220 from shiny import App, run_app, ui
221 except ImportError:
--> 222 raise ImportError(
223 "The `shiny` package is required for the `browser` method. "
224 "Install it with `pip install shiny`."
225 )
227 app_ui = ui.page_fillable(
228 ui.chat_ui("chat"),
229 fillable_mobile=True,
230 )
232 def server(input): # noqa: A002
>>> %run "/Users/mconflitti/Documents/repos/test-shiny-app/app.py"
INFO: Started server process [82420]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:47139 (Press CTRL+C to quit)
mconflitti-pbc changed the title from "Bug: Unable to use Ollama and chat app without also installing openai package" to "Bug: Unable to use Ollama and chat app without also installing/configuring openai package" on Dec 17, 2024
It was an intentional decision to build on Ollama's OpenAI compatibility rather than the ollama Python package, so that there'd be fewer provider implementations to maintain. That said, I'd be open to switching if there's an advantage (other than a different Python dependency).
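For context, a minimal sketch of that compatibility route against a local Ollama server (this assumes Ollama's documented OpenAI-compatible `/v1` endpoint on the default port; it's not chatlas's actual internals):

```python
from openai import OpenAI

# Point the regular openai client at Ollama's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # Ollama ignores the key, but the client requires a non-empty string
)
resp = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```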
Also, you're right that ChatOllama is missing a callout to install openai, and it shouldn't require OPENAI_API_KEY to be set. Both those things are now fixed (by f8564d3)
As for extras, I haven't given that much thought yet, but FWIW I've been using `uv add chatlas[dev]` as basically the equivalent of `uv add chatlas[all]`.
Nice! I think your comment says to install the `ollama` package instead of the `openai` package, though.
I'm wondering if it would be useful to abstract this away a bit from the user.
`uv add chatlas[ollama,interactive]` could be defined to install `openai` and `shiny`; then, if those underlying implementations do change at some point, the developer doesn't care from either a code or a dependency perspective.
Just a thought! I didn't see the dev extra documented, but it makes sense that it has been useful internally!