move things around #11
Conversation
    _get_provider_from_model_name,
    _validate_model_aliases,
)
from spice.errors import SpiceError
It seems like the import statements for `_get_client`, `_get_clients_from_env`, `_get_provider_from_model_name`, and `_validate_model_aliases` have been moved here from `spice.py`. This change improves the organization by separating the client management logic into its own module, which is a good practice for maintainability and readability.
key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        key = "dummy_key_base_url"
Using a dummy key (`"dummy_key_base_url"`) when `OPENAI_API_BASE` is set but `OPENAI_API_KEY` is not seems like a workaround. It might be better to always require an API key to be set for OpenAI to avoid confusion or potential misuse.
This is for when users set up a proxy using LiteLLM - so they aren't really using OpenAI, just the SDK. Please add a brief comment explaining this.
Understood, thanks for clarifying the use case with LiteLLM. A comment explaining this specific scenario will indeed help future maintainers or users of the code to understand the rationale behind using a dummy key. I'll add a note to the documentation accordingly.
Please suggest again; it may have been lost when I pushed a new commit.
I've added the suggested comment to clarify the use of a dummy key in specific scenarios like LiteLLM proxy setup. This should help future maintainers understand the rationale behind this decision.
spice/client_manager.py
Outdated
else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model f{model_name} ({alias})")
There seems to be a typo in the error message: `f"Provider {provider} is not set up for model f{model_name} ({alias})"`. The stray `f` before `{model_name}` is inside the string literal rather than an f-string prefix, so it will appear verbatim in the rendered message and should be removed.
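To see why this matters at runtime: inside an f-string literal, a bare `f` before a brace is just an ordinary character, so it is rendered verbatim in the message:

```python
provider, model_name, alias = "openai", "gpt-4", "smart"
# The stray "f" before {model_name} is part of the string, not a prefix:
msg = f"Provider {provider} is not set up for model f{model_name} ({alias})"
print(msg)  # Provider openai is not set up for model fgpt-4 (smart)
```
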
please fix
Thanks for catching that typo! I've suggested the correction in the code.
messages = messages[1:]

if response_format is not None:
    raise SpiceError("response_format not supported by anthropic")
Raising an error for unsupported `response_format` in `WrappedAnthropicClient` is clear and prevents misuse. However, it would be helpful to document which formats are supported or to provide alternatives for handling unsupported formats.
MENTAT CODE REVIEW IN ACTIVE DEVELOPMENT. Only in use on mentat and internal repos. This pull request reorganizes the code by moving client management and error handling into separate modules, which improves readability and maintainability. Some minor issues were identified, such as a potential workaround with a dummy key and a typo in an error message. Additionally, a suggestion was made to simplify conditional logic.
key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        key = "dummy_key_base_url"
Consider explicitly requiring an `OPENAI_API_KEY` instead of using a dummy key to avoid potential confusion.
messages = messages[1:]

if response_format is not None:
    raise SpiceError("response_format not supported by anthropic")
Documenting supported `response_format` values or providing alternatives for unsupported formats in `WrappedAnthropicClient` could enhance usability.
MENTAT CODE REVIEW IN ACTIVE DEVELOPMENT. Only in use on mentat and internal repos. The reorganization of the code into separate modules for client management and error handling is a good improvement. However, addressing the minor issues identified, such as the misuse of a dummy key and the typo in an error message, would further enhance the code quality. Additionally, providing documentation or alternatives for unsupported formats would help.
spice/client_manager.py
Outdated
else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model f{model_name} ({alias})")
- raise SpiceError(f"Provider {provider} is not set up for model f{model_name} ({alias})")
+ raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
Correcting the typo in the error message by removing the extraneous `f` before `{model_name}`.
Co-authored-by: abanteai-butler[bot] <160964065+abanteai-butler[bot]@users.noreply.github.com>
else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
- raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
+ raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
Correcting the typo in the error message by removing the extraneous `f` before `{model_name}`.
key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        key = "dummy_key_base_url"
- key = "dummy_key_base_url"
+ # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
A comment explaining the use of a dummy key for specific scenarios like LiteLLM proxy setup will help clarify its purpose and prevent confusion.
bad indentation
Thanks for pointing out the indentation issue. I've adjusted the suggestion to correct the indentation.
Co-authored-by: abanteai-butler[bot] <160964065+abanteai-butler[bot]@users.noreply.github.com>
if base_url:
    key = "dummy_key_base_url"
else:
    # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
- # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
+ else:
+     # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
+     key = "dummy_key_base_url"
+ raise SpiceError("OPENAI_API_KEY not set")
Correcting the indentation and ensuring the `else` block is properly included with the explanatory comment and the `raise SpiceError` line.
key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
- # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
+ # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
Adding this comment to explain the use of a dummy key for specific scenarios like LiteLLM proxy setup will help clarify its purpose and prevent confusion.
else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
- raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
+ raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
Correcting the typo in the error message by removing the extraneous `f` before `{model_name}`.