I think either the documentation needs to be tweaked or there's an underlying bug.
In https://docs.letta.com/guides/server/providers/openai#enabling-openai-models it says:

```shell
export OPENAI_API_KEY=...
```
But it does not cover how to configure the `OPENAI_API_BASE` environment variable, or even whether one exists; that's not clear from `OPENAI_API_BASE` in `letta/letta/settings.py` (line 61 in a1a2dd4).
My use case is that I want to use https://docs.lambdalabs.com/public-cloud/lambda-inference-api/ which has a different API base:
```python
from openai import OpenAI

openai_api_key = "<API-KEY>"
openai_api_base = "https://api.lambdalabs.com/v1"

client = OpenAI(
    api_key=openai_api_key,
    base_url=openai_api_base,
)
```
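For illustration, if Letta does honor an `OPENAI_API_BASE` environment variable (an assumption, not confirmed by the docs, which is the point of this issue), the environment-variable-driven equivalent of the snippet above would look something like this sketch:

```python
import os

# Hypothetical setup: assumes the server reads both variables at startup.
os.environ["OPENAI_API_KEY"] = "<API-KEY>"
os.environ["OPENAI_API_BASE"] = "https://api.lambdalabs.com/v1"

# Mirrors how an OpenAI-compatible client could pick them up,
# falling back to the default OpenAI endpoint when unset.
api_key = os.environ["OPENAI_API_KEY"]
base_url = os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
```

Without a documented override like this, there is no obvious way to point the server at an OpenAI-compatible provider such as Lambda's.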