move things around #11

Merged
5 commits merged into main on Mar 23, 2024
Conversation

biobootloader (Member)

No description provided.

    _get_provider_from_model_name,
    _validate_model_aliases,
)
from spice.errors import SpiceError
Contributor

It seems like the import statements for _get_client, _get_clients_from_env, _get_provider_from_model_name, and _validate_model_aliases have been moved here from spice.py. This change improves the organization by separating the client management logic into its own module, which is a good practice for maintainability and readability.

key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        key = "dummy_key_base_url"
Contributor

Using a dummy key ("dummy_key_base_url") when OPENAI_API_BASE is set but OPENAI_API_KEY is not seems like a workaround. It might be better to always require an API key to be set for OpenAI to avoid confusion or potential misuse.

Member Author

This is for when users set up a proxy using LiteLLM, so they aren't really using OpenAI, just the SDK. Please add a brief comment explaining this.

Contributor

Understood, thanks for clarifying the use case with LiteLLM. A comment explaining this specific scenario will indeed help future maintainers or users of the code to understand the rationale behind using a dummy key. I'll add a note to the documentation accordingly.

Member Author

Please suggest again; it may have been lost when I pushed a new commit.

Contributor

I've added the suggested comment to clarify the use of a dummy key in specific scenarios like LiteLLM proxy setup. This should help future maintainers understand the rationale behind this decision.

else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model f{model_name} ({alias})")
Contributor

There is a typo in the error message: f"Provider {provider} is not set up for model f{model_name} ({alias})". The stray f before {model_name} sits inside the f-string literal, so it is rendered verbatim in the message; it should be removed.
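A quick demonstration of the effect of the stray f (the variable values below are made up for illustration):

```python
provider, model_name, alias = "openai", "gpt-4", "smart"
# The stray 'f' before {model_name} is a literal character inside the
# f-string, so it shows up verbatim in the rendered message.
msg = f"Provider {provider} is not set up for model f{model_name} ({alias})"
# msg is now "Provider openai is not set up for model fgpt-4 (smart)"
```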

Member Author

Please fix.

Contributor

Thanks for catching that typo! I've suggested the correction in the code.

messages = messages[1:]

if response_format is not None:
    raise SpiceError("response_format not supported by anthropic")
Contributor

Raising an error for unsupported response_format in WrappedAnthropicClient is clear and prevents misuse. However, it would be helpful to document which formats are supported or to provide alternatives for handling unsupported formats.

Contributor

mentatbot bot commented Mar 23, 2024

MENTAT CODE REVIEW IN ACTIVE DEVELOPMENT. Only in use on mentat and internal repos.
Please Reply with feedback.

This pull request reorganizes the code by moving client management and error handling into separate modules, which improves readability and maintainability. Some minor issues were identified, such as a potential workaround with a dummy key and a typo in an error message. Additionally, a suggestion was made to simplify conditional logic in WrappedAnthropicClient. Overall, the changes are positive, but addressing the identified issues would further improve the code quality.

key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        key = "dummy_key_base_url"
Contributor

Consider explicitly requiring an OPENAI_API_KEY instead of using a dummy key to avoid potential confusion.

spice/client_manager.py (comment outdated; resolved)
messages = messages[1:]

if response_format is not None:
    raise SpiceError("response_format not supported by anthropic")
Contributor

Documenting supported response_format values or providing alternatives for unsupported formats in WrappedAnthropicClient could enhance usability.

Contributor

mentatbot bot commented Mar 23, 2024


The reorganization of the code into separate modules for client management and error handling is a good improvement. However, addressing the minor issues identified, such as the misuse of a dummy key and the typo in an error message, would further enhance the code quality. Additionally, providing documentation or alternatives for unsupported formats in WrappedAnthropicClient could improve usability.

else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model f{model_name} ({alias})")
Contributor

Suggested change:
-        raise SpiceError(f"Provider {provider} is not set up for model f{model_name} ({alias})")
+        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")

Correcting the typo in the error message by removing the extraneous f before {model_name}.

Co-authored-by: abanteai-butler[bot] <160964065+abanteai-butler[bot]@users.noreply.github.com>
else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
Contributor

Suggested change:
-        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
+        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")

Correcting the typo in the error message by removing the extraneous f before {model_name}.


key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        key = "dummy_key_base_url"
Contributor

Suggested change:
-        key = "dummy_key_base_url"
+        # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.

A comment explaining the use of a dummy key for specific scenarios like LiteLLM proxy setup will help clarify its purpose and prevent confusion.

Member Author

bad indentation

Contributor

Thanks for pointing out the indentation issue. I've adjusted the suggestion to correct the indentation.

Co-authored-by: abanteai-butler[bot] <160964065+abanteai-butler[bot]@users.noreply.github.com>
if base_url:
    key = "dummy_key_base_url"
else:
    # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
Contributor

Suggested change:
-    # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
+else:
+    # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
+    key = "dummy_key_base_url"
+raise SpiceError("OPENAI_API_KEY not set")

Correcting the indentation and ensuring the else block is properly included with the explanatory comment and the raise SpiceError line.

key = os.getenv("OPENAI_API_KEY")
if key is None:
    if base_url:
        # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
Contributor

Suggested change:
-        # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.
+        # Using a dummy key for LiteLLM proxy setup where OPENAI_API_BASE is set but OPENAI_API_KEY is not required.

Adding this comment to explain the use of a dummy key for specific scenarios like LiteLLM proxy setup will help clarify its purpose and prevent confusion.

else:
    provider = _get_provider_from_model_name(model_name)
    if provider not in clients:
        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
Contributor

Suggested change:
-        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")
+        raise SpiceError(f"Provider {provider} is not set up for model {model_name} ({alias})")

Correcting the typo in the error message by removing the extraneous f before {model_name}.


@biobootloader biobootloader merged commit eae5510 into main Mar 23, 2024
2 checks passed