Command chaining/piping #6256
The ability to chain multiple commands or actions together could significantly increase the efficiency of AutoGPT.
Related:
Notes & ideas for implementation
The simplest way?
The simplest way is probably to amend the current command execution interface to support calling multiple functions that are executed in order, where the output of one command can be used as an argument for the next:
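A minimal sketch of what such a chained interface could look like. The `$prev` placeholder, the `run_chain` helper, and the toy command registry are all assumptions for illustration, not part of AutoGPT's actual interface:

```python
# Hypothetical sketch: commands are executed in order, and a "$prev"
# placeholder argument is replaced with the previous command's output.

def run_chain(commands, registry):
    """Execute (name, args) pairs in order; "$prev" arguments are
    substituted with the output of the preceding command."""
    prev = None
    for name, args in commands:
        resolved = [prev if a == "$prev" else a for a in args]
        prev = registry[name](*resolved)
    return prev

# Toy stand-ins for real AutoGPT commands.
registry = {
    "get_text_from_webpage": lambda url: f"<text of {url}>",
    "write_file": lambda path, content: f"wrote {len(content)} chars to {path}",
}

result = run_chain(
    [
        ("get_text_from_webpage", ["https://en.wikipedia.org/wiki/Otters"]),
        ("write_file", ["otters.txt", "$prev"]),
    ],
    registry,
)
print(result)
```

This keeps the existing one-command-per-step model intact; the only change is that the argument resolver understands a reference to the previous step's output.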
Execution tree
More complex, harder to implement, and probably more error-prone, but worth considering:
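One way to picture an execution tree: each node is a command whose child commands are evaluated first, so a command can consume the outputs of several sub-commands. This is a hypothetical sketch; the `Node` structure and command names are illustrative, not an existing AutoGPT API:

```python
# Hypothetical execution tree: children are evaluated depth-first and
# their outputs are appended to the parent command's arguments.

from dataclasses import dataclass, field

@dataclass
class Node:
    command: str                                   # command to run
    args: list = field(default_factory=list)       # literal arguments
    children: list = field(default_factory=list)   # sub-commands feeding this one

def evaluate(node, registry):
    # Resolve all sub-commands first, then run this command on
    # its literal arguments plus the children's outputs.
    child_outputs = [evaluate(child, registry) for child in node.children]
    return registry[node.command](*node.args, *child_outputs)

# Toy stand-ins for real commands.
registry = {
    "read_file": lambda path: f"<contents of {path}>",
    "summarize": lambda *texts: f"summary of {len(texts)} documents",
    "write_file": lambda path, content: f"wrote {path}",
}

# write_file("summary.txt", summarize(read_file("a.txt"), read_file("b.txt")))
tree = Node("write_file", ["summary.txt"], [
    Node("summarize", [], [
        Node("read_file", ["a.txt"]),
        Node("read_file", ["b.txt"]),
    ]),
])
print(evaluate(tree, registry))  # → wrote summary.txt
```

The added power over a flat chain is fan-in: a command can depend on more than one upstream output. The cost is that the LLM now has to emit a nested structure correctly, which is likely where the extra error-proneness comes from.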
Command interface -> script-like execution
We could benefit from completely rethinking the command execution interface. For example, if we implement #6253 + #5132 and expose all available commands as functions that can be imported or executed in that environment, it could be as simple as this:
> **User:** Please scrape the content of https://en.wikipedia.org/wiki/Otters and write it to a text file
>
> **AutoGPT:** I have written the content of the webpage to the file `otters.txt`!

Currently, it would be more like:
> **User:** Please scrape the content of https://en.wikipedia.org/wiki/Otters and write it to a text file
>
> **AutoGPT:** I will now get the content from the webpage.
>
> Executing `get_text_from_webpage("https://en.wikipedia.org/wiki/Otters")`
>
> **AutoGPT:** Now that I have scraped the webpage, I will save the content to a file called `otters.txt`.
>
> Executing `write_file("otters.txt", "[lots of text/tokens here, very expensive and possibly slow!]")`
This is both slower and significantly more expensive, because we're copy-pasting data using an LLM.
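To make the cost difference concrete, here is a sketch of what the script-like approach might look like if commands were exposed as plain Python functions in an execution environment. The function bodies are stand-ins; the real commands and the environment itself are the proposal, not existing behavior:

```python
# Stand-ins for commands exposed as functions in a hypothetical
# script execution environment (per #6253 + #5132).

def get_text_from_webpage(url: str) -> str:
    # Stand-in for the real scraper command.
    return f"<text of {url}>"

def write_file(path: str, content: str) -> str:
    # Stand-in for the real file-writing command.
    return f"wrote {len(content)} characters to {path}"

# The agent only has to emit this one short expression; the
# (potentially huge) page text flows between the two commands
# inside the interpreter and never passes through the LLM's context.
result = write_file(
    "otters.txt",
    get_text_from_webpage("https://en.wikipedia.org/wiki/Otters"),
)
print(result)
```

With the current interface, the scraped text would have to be echoed back into the model as an argument to `write_file`; here it is only ever held by the interpreter.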