-
I want to use the LLM right after document upload, so I've created a custom component. The problem is that I receive an error each time I try to invoke it.

This is my component:

```python
class ExtractMetadataPipeline(BaseComponent):
    """Rewrite user question

    Args:
        llm: the language model used to rewrite the question
        rewrite_template: the prompt template the llm uses to paraphrase a text input
    """

    llm: ChatLLM  # = Node(default_callback=lambda _: llms.get_default())
    rewrite_template: str = PROMPT

    def run(self, context: str) -> str:  # type: ignore
        print(context)
        prompt_template = PromptTemplate(self.rewrite_template)
        prompt = prompt_template.populate(context=context)
        messages = [
            SystemMessage(content="You are a helpful assistant"),
            HumanMessage(content=prompt),
        ]
        return self.llm(messages)

    @classmethod
    def get_info(cls):
        return {"id": "ExtractMetadataPipeline"}
```

This is how I invoke the component:

```python
self.extract_metadata_pipeline: ExtractMetadataPipeline = (
    ExtractMetadataPipeline.withx()
)
[...]
metadata = self.extract_metadata_pipeline(metadata_context)
```

Am I missing something? Is there a way to access the LLM at any point (if such a thing is possible)?
-
I figured out how to do it. Here is the code snippet to get access to the default selected LLM:

```python
llm = llms.get_default()
response = llm("hello")
```

Correct me if I am wrong.