About raw prompt display option in OntoGPT command line #181
Hi @yy20716 - you're right: the output generally truncates the text of the prompt, or omits the text input entirely, on the assumption that the output contains the input text in its own field. The recursion in the extraction approach also means that while this is the initial prompt in the above spaghetti example:
This prompt is then also created and queried:
then this:
and this:
and so on. That being said, it would certainly be useful to see all those prompts, so I'll add the `--show-prompt` option.
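To make the recursive behavior concrete, here is a minimal sketch of why logging only the initial prompt misses most of the traffic: every level of the recursion builds and sends its own prompt, so each extracted span can spawn further queries. The names here (`build_prompt`, `query_llm`, `extract`) are hypothetical stand-ins for illustration, not OntoGPT's actual internals:

```python
def build_prompt(text: str, target: str) -> str:
    # Hypothetical template; the real prompts are schema-driven.
    return f"From the text below, extract the {target} field:\n\n{text}"


def query_llm(prompt: str) -> list[str]:
    # Stand-in for the model call; returns extracted sub-spans
    # that themselves need further extraction.
    return []


def extract(text: str, fields: list[str], prompt_log: list[str]) -> None:
    """Each recursion level builds and sends its own prompt, so only
    logging at every level captures the full set of prompts."""
    if not fields:
        return
    prompt = build_prompt(text, fields[0])
    prompt_log.append(prompt)  # record the raw prompt before querying
    for span in query_llm(prompt):
        # Every sub-span triggers another prompt one level down.
        extract(span, fields[1:], prompt_log)
```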
OK, the `--show-prompt` option is now implemented.
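Assuming the option is exposed on `extract` as discussed above, an invocation might look like the following (the template and input names are just placeholders):

```bash
ontogpt extract -t gocam.GoCamAnnotations -i abstract.txt --show-prompt
```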
Hello @caufieldjh and @cmungall,
I'm reaching out to ask whether OntoGPT could offer an additional command-line option to display the raw prompt that is passed to the language models. It appears that certain portions of the prompts are stored in the output YAML file, as seen in this example, but it's uncertain whether this represents the complete prompt.
I also noticed the existence of the `--show-prompt` option in the cli.py file, though it seems to be specific to SPINDOCTOR, if I've interpreted it correctly. Could you clarify whether such an option already exists? Thank you for your support.