New pipeline deployment system #58
Conversation
Let's assume we want to deploy a pipeline. First, we need to launch the server. We're also showing debug logs in this example using `LOG=debug hayhooks server`.

Deploy the pipeline

We have 3 ways to deploy the pipeline:
Deploy using the
pass

@abstractmethod
def run_chat(self, user_message: str, model_id: str, messages: List[dict], body: dict) -> dict:
`user_message` and `messages` sound confusing to me... `user_message` and `history` would be better, for example. But maybe there is some reason I'm overlooking...

Let's also discuss support/conversion of Haystack `ChatMessage`.
The incoming request looks like this one, but I agree: `user_message` (which is the last message) and `history` will be less confusing.
As a dev, can I still return a generator here? Or somewhere else?
…es pipeline deployment
Great work man, looks almost there. A few things were not immediately clear to me, so I left a few comments, and the code could use a few comments here and there as well. What about metadata for `PipelineWrapper`, e.g. third-party lib dependencies, so we can load and import them automatically? :-)
""" | ||
Run the pipeline in chat mode. | ||
|
||
Args: |
Watch out for Google pydoc
You're correct about `run_api`. I am about to rewrite the `base_pipeline_wrapper` docstrings because they're not very precise.
About the generator: I have this covered in the next PR!
Of course! I was planning to support additional requirements in a separate PR (this is already quite big!) ;)
…ed checks ; Update tests
I think it should be enough for this PR. The next one will update `run_chat`.
Given that we will take care of `run_chat` in a future PR, this PR looks good to me! I would be happy if @vblagoje could also take a look.
Solid foundation, looking forward to the upcoming PRs to complete this idea/effort.
TODO

- `save_pipeline_files` to save pipeline files in a folder
- `load_pipeline_module` to load the `PipelineWrapper` from the new saved module
- support `PipelineWrapper` as well as `Pipeline` (one route will run `run_api` while the other one will run `run_chat`)
- add a new `/deploy_files` route to support the new deployment system (rather than updating the current `/deploy` route)
- `deploy-files` CLI command

TODO after reviews

- `BasePipelineWrapper`: update `run_api`'s signature
- `run_api` and `run_chat`: at least one must be implemented (not mandatory to have both)

Goals
This is to add a new pipeline deployment system based on pipeline files (heavily inspired by open-webui pipelines).

This has multiple goals:

- A `Pipeline` can be provided both as YAML (and loaded in the wrapper class) and directly as code.
- A `run_api` method needs to be implemented, which will contain the code needed to run the pipeline. The method's input args will be converted into a JSON-compatible Pydantic model and used as the API route input. This way, it will be easier to run pipelines, since users expose exactly the input fields they need (rather than trying to model all pipeline components' inputs/outputs).
- A `run_chat` method can be implemented, which will receive an OpenAI-compatible user message as input. This makes it easy to use `hayhooks` as a custom backend for open-webui. Note: `open-webui` support will be added in a different PR.
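As a rough illustration of that `run_api` goal (this is not the actual hayhooks implementation; the helper name and the example signature below are made up for this sketch), the typed arguments of a `run_api` method can be turned into a JSON-compatible request model with Pydantic:

```python
import inspect

from pydantic import create_model


def request_model_from_run_api(wrapper) -> type:
    """Build a Pydantic request model from run_api's signature (illustration only).

    Assumes every run_api argument is type-annotated.
    """
    sig = inspect.signature(wrapper.run_api)
    fields = {
        name: (
            param.annotation,
            ... if param.default is inspect.Parameter.empty else param.default,
        )
        for name, param in sig.parameters.items()
        if name != "self"
    }
    return create_model(f"{type(wrapper).__name__}RunRequest", **fields)


# For `def run_api(self, question: str, top_k: int = 3)` this builds a model
# equivalent to:
#   class PipelineWrapperRunRequest(BaseModel):
#       question: str
#       top_k: int = 3
# which an API framework like FastAPI can then use as the request body of a
# dynamically added route.
```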
Description

The main idea is to let users provide a pipeline wrapper class when deploying a pipeline, like the following:
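Here is a rough sketch of such a wrapper (the class name, the YAML file name, the `run_api` arguments, the import path, and the component/input names are illustrative assumptions; only the `run_chat` signature comes from the diff quoted above):

```python
from pathlib import Path
from typing import List

from haystack import Pipeline

# The import path of the base class is an assumption for this sketch.
from hayhooks import BasePipelineWrapper


class PipelineWrapper(BasePipelineWrapper):
    def __init__(self):
        # The pipeline can be loaded from its YAML definition...
        self.pipeline = Pipeline.loads(Path("chat_pipeline.yml").read_text())
        # ...or built directly in code instead.

    def run_api(self, question: str) -> dict:
        # run_api's arguments ("question" here) become the API route's input fields.
        return self.pipeline.run({"prompt_builder": {"query": question}})

    def run_chat(self, user_message: str, model_id: str, messages: List[dict], body: dict) -> dict:
        # user_message is the last (OpenAI-compatible) message sent by the client.
        return self.pipeline.run({"prompt_builder": {"query": user_message}})
```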
Where `BasePipelineWrapper` is:
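Roughly, an abstract base class along these lines (the `run_api` signature and the docstring wording are assumptions; `run_chat`'s signature is the one quoted in the review):

```python
from abc import ABC, abstractmethod
from typing import List


class BasePipelineWrapper(ABC):
    @abstractmethod
    def run_api(self, *args, **kwargs) -> dict:
        """Run the pipeline in API mode.

        The concrete signature is defined by the subclass; its arguments are
        converted into the JSON-compatible Pydantic model of the API route.
        """
        ...

    @abstractmethod
    def run_chat(self, user_message: str, model_id: str, messages: List[dict], body: dict) -> dict:
        """Run the pipeline in chat mode.

        Args:
            user_message: the last message sent by the client.
            model_id: the model (pipeline) requested by the client.
            messages: the full OpenAI-compatible message history.
            body: the raw request body.
        """
        ...
```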
In the example above, the user will call the `/deploy_files` route and provide both the pipeline's YAML file and the `pipeline_wrapper.py` file. They will be saved in a folder, `pipeline_wrapper.py` will be loaded as a module, and two API routes will be added dynamically (one for chat and one for API runs).

A full end-to-end example will be provided below.
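In the meantime, a hypothetical sketch of such a deployment call (the payload field names, the pipeline name, and the port are assumptions, not taken from this PR):

```python
from pathlib import Path

import requests

# Hypothetical payload: the PR only states that the route receives the pipeline's
# YAML file and the pipeline_wrapper.py file; the exact request shape may differ.
payload = {
    "name": "chat_pipeline",
    "files": {
        "chat_pipeline.yml": Path("chat_pipeline.yml").read_text(),
        "pipeline_wrapper.py": Path("pipeline_wrapper.py").read_text(),
    },
}

resp = requests.post("http://localhost:1416/deploy_files", json=payload)
print(resp.status_code, resp.json())
```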