Feature/keywordsai llm #16860
base: main
Conversation
@logan-markewich just a heads up. Looking to get a review on this by the founders of KeywordsAI to ensure they are happy with the API.
```python
response.message.additional_kwargs["tool_calls"] = [tool_calls[0]]
```

```python
class KeywordsAI(OpenAI):
```
High-level -- users should use OpenAILike to access OpenAI-like APIs. This would probably avoid some of the checks/errors that you noted in the PR description.

That being said, happy to have a specific integration for KeywordsAI. The implementation likely can be a lot smaller though; most methods are copies from the OpenAI class, right?
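The review's point is that subclassing lets an OpenAI-compatible proxy override only what differs (the endpoint and model validation) instead of copying every method. A stdlib-only sketch of that pattern, with stub class names standing in for the real llama-index classes:

```python
class OpenAIStub:
    """Stand-in for the real OpenAI LLM class (hypothetical names)."""

    api_base = "https://api.openai.com/v1"

    def complete(self, prompt: str) -> str:
        # In the real class this would issue an HTTP request.
        return f"POST {self.api_base}/chat/completions -> {prompt}"


class KeywordsAIStub(OpenAIStub):
    """Only the endpoint (and any model validation) differs;
    everything else is inherited rather than copied."""

    api_base = "https://api.keywordsai.co/api"


print(KeywordsAIStub().complete("hello"))
```

Inheritance keeps the integration small: if the base class gains features or fixes, the proxy subclass picks them up for free, which is exactly what copied methods would miss.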
@jordanparker6 is this good to merge? (let me know if so, it seeeeeems ok?)
@logan-markewich let me quickly ping the KeywordsAI guys. There is an opportunity to add a better dev-x here for some of their features (e.g. adding a Customer ID / Thread ID to the LLM logs for observability in their UI). Shouldn't be a big add; just moving the params up to the model params rather than into the metadata param.
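"Moving the params up" means promoting observability fields like Customer ID and Thread ID from a free-form metadata dict to first-class constructor parameters. A hedged sketch of what that could look like; the class and field names (`KeywordsAIParams`, `customer_identifier`, `thread_identifier`) are hypothetical, not the actual API:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class KeywordsAIParams:
    """Hypothetical params object: observability IDs are first-class
    fields instead of being buried in a metadata dict."""

    model: str
    customer_id: Optional[str] = None  # surfaces in KeywordsAI's log UI
    thread_id: Optional[str] = None
    metadata: dict = field(default_factory=dict)

    def to_request_body(self) -> dict:
        body = {"model": self.model, "metadata": dict(self.metadata)}
        if self.customer_id is not None:
            body["customer_identifier"] = self.customer_id
        if self.thread_id is not None:
            body["thread_identifier"] = self.thread_id
        return body
```

First-class fields give users type hints and autocomplete, and let the provider validate the IDs up front rather than discovering a typo'd metadata key only in the logs.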
@logan-markewich this is good to go
Description
Due to the model validation built into the LLM classes, LLM proxies that expose an OpenAI-like API (such as KeywordsAI) required hacky workarounds to suppress the validation errors when wrapped in the OpenAI LLM.
Currently in discussions with the founders of KeywordsAI to get their review of this so they are happy with the API.
Fixes # (issue)
This provides a KeywordsAI LLM provider that dynamically fetches the model configurations and allows model params to be passed through.
New Package?
Did I fill in the `tool.llamahub` section in the `pyproject.toml` and provide a detailed README.md for my new integration or package?

Version Bump?
Did I bump the version in the `pyproject.toml` file of the package I am updating? (Except for the `llama-index-core` package)

Type of Change
Please delete options that are not relevant.

How Has This Been Tested?
Your pull request will likely not be merged unless it is covered by some form of impactful unit testing.

Suggested Checklist:
- I ran `make format; make lint` to appease the lint gods