--verbose
I'm trying to configure rawdog to run with Mistral inside of Ollama (like #3), and I encountered an error message.
It would be nice if rawdog accepted a --verbose flag that would be propagated to litellm options.
WDYT? Would you accept a PR adding it?
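For illustration, here's a minimal sketch of the wiring I have in mind. `litellm.set_verbose` is litellm's global debug switch; the argparse setup is hypothetical, not rawdog's actual CLI code:

```python
# Hypothetical sketch of the proposed flag; the argparse wiring is
# illustrative, not rawdog's actual CLI.
import argparse

import litellm

parser = argparse.ArgumentParser(prog="rawdog")
parser.add_argument(
    "--verbose",
    action="store_true",
    help="print litellm's full request/response debug output",
)
args = parser.parse_args()

if args.verbose:
    # litellm's global debug switch: echoes raw requests and responses.
    litellm.set_verbose = True
```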
hmm.. we already use --verbose in the rawdog function to print/review scripts, and it looks like this would also print the full response..
Is there a specific field you want to see? Could we add that to the log in LLMClient.get_response instead?
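Roughly something like this, assuming get_response wraps litellm.completion (a paraphrased sketch, not rawdog's actual code):

```python
import logging

import litellm

logger = logging.getLogger(__name__)


class LLMClient:
    """Paraphrased sketch; rawdog's real LLMClient differs."""

    def __init__(self, model: str):
        self.model = model

    def get_response(self, messages: list[dict]) -> str:
        response = litellm.completion(model=self.model, messages=messages)
        # Log the full raw response at debug level so provider-specific
        # fields (e.g. from Ollama) are inspectable without a global flag.
        logger.debug("litellm raw response: %s", response)
        return response.choices[0].message.content
```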
@hasparus the best way to do this would be as a config setting (since it only applies to Ollama).
Would love it if you added a PR for that!
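For example, a hypothetical config-driven toggle (the `litellm_verbose` key and the config path are illustrative, not rawdog's actual schema):

```python
# Hypothetical config-driven toggle; the key name and config path are
# illustrative, not rawdog's actual config schema.
from pathlib import Path

import litellm
import yaml

config_path = Path.home() / ".rawdog" / "config.yaml"
config = {}
if config_path.exists():
    config = yaml.safe_load(config_path.read_text()) or {}

if config.get("litellm_verbose", False):
    litellm.set_verbose = True
```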