
--verbose flag #7

Open
hasparus opened this issue Feb 3, 2024 · 2 comments

Comments


hasparus commented Feb 3, 2024

I'm trying to configure rawdog to run with Mistral inside Ollama (like #3), and I ran into the error message below.

(screenshot of the error message)

It would be nice if rawdog accepted a --verbose flag that would be propagated to litellm options.

WDYT? Would you accept a PR adding it?

@granawkins
Member

Hmm, we already use --verbose in the rawdog function to print/review scripts, and it looks like this also prints the full response.

Is there a specific field you want to see? Could we add that to the log in LLMClient.get_response instead?

@biobootloader
Member

@hasparus the best way to do this would be as a config setting (since it only applies to Ollama).

Would love it if you added a PR for that!
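A minimal sketch of what that config setting could look like. The key name `litellm_verbose` and the helper are assumptions for illustration, not part of rawdog; litellm reads the `LITELLM_LOG` environment variable to control its logging level, which is one way such a setting could be passed through.

```python
import os

def apply_verbosity(config: dict) -> bool:
    """Enable verbose litellm logging when the config asks for it.

    `litellm_verbose` is a hypothetical config key, not an existing
    rawdog setting.
    """
    if config.get("litellm_verbose", False):
        # Setting LITELLM_LOG to DEBUG turns on litellm's detailed
        # request/response logging. (Assumption: this env var is the
        # mechanism used; the real PR might set it differently.)
        os.environ["LITELLM_LOG"] = "DEBUG"
        return True
    return False

print(apply_verbosity({"llm_model": "ollama/mistral", "litellm_verbose": True}))
```

This keeps the behavior out of the CLI flags entirely, so the existing --verbose flag's meaning (print/review scripts) stays unchanged.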
