Missing key error - Using PromptTemplate and GraphCypherQAChain. #24260
The problem is that `GraphCypherQAChain` defines `input_key: str = "query"`. This field is then used to retrieve the question; see the code in the `_call` method:

```python
question = inputs[self.input_key]
intermediate_steps: List = []
generated_cypher = self.cypher_generation_chain.run(
    {"question": question, "schema": self.graph_schema}, callbacks=callbacks
)
```

Your code works if you change the call to:

```python
res = chain.invoke({"schema": graph.schema, "examples": examples, "query": question})
```

---
Thanks @RafaelXokito. The problem persists: with the same code pattern I get an error similar to the one above.

---
At the moment the cypher generation call only passes `question` and `schema`:

```python
generated_cypher = self.cypher_generation_chain.run(
    {"question": question, "schema": self.graph_schema}, callbacks=callbacks
)
```

My suggestion is to override `_call` so that extra inputs such as `examples` are forwarded:

```python
generated_cypher = self.cypher_generation_chain.run(
    {"question": question, "schema": self.graph_schema, "examples": inputs["examples"]},
    callbacks=callbacks,
)
```

---
Thanks @RafaelXokito. Yes, it would be highly appreciated if you could provide a patch.

---
Here's a simple example:

```python
# Import paths are a best guess and may vary with the langchain version.
from typing import Any, Dict, List, Optional

from langchain.chains import GraphCypherQAChain
from langchain.chains.graph_qa.cypher import INTERMEDIATE_STEPS_KEY, extract_cypher
from langchain_core.callbacks import CallbackManagerForChainRun
from langchain_core.prompts import PromptTemplate


class GraphCypherQAChainAux(GraphCypherQAChain):
    def _call(
        self,
        inputs: Dict[str, Any],
        run_manager: Optional[CallbackManagerForChainRun] = None,
    ) -> Dict[str, Any]:
        """Generate Cypher statement, use it to look up in the database and answer the question."""
        _run_manager = run_manager or CallbackManagerForChainRun.get_noop_manager()
        callbacks = _run_manager.get_child()
        question = inputs[self.input_key]
        intermediate_steps: List = []
        # Forward the user-supplied "examples" input to the generation chain
        generated_cypher = self.cypher_generation_chain.run(
            {"question": question, "schema": self.graph_schema, "examples": inputs["examples"]},
            callbacks=callbacks,
        )
        # Extract Cypher code if it is wrapped in backticks
        generated_cypher = extract_cypher(generated_cypher)
        # Correct Cypher query if enabled
        if self.cypher_query_corrector:
            generated_cypher = self.cypher_query_corrector(generated_cypher)
        _run_manager.on_text("Generated Cypher:", end="\n", verbose=self.verbose)
        _run_manager.on_text(
            generated_cypher, color="green", end="\n", verbose=self.verbose
        )
        intermediate_steps.append({"query": generated_cypher})
        # Retrieve and limit the number of results
        # Generated Cypher can be null if the query corrector identifies an invalid schema
        if generated_cypher:
            context = self.graph.query(generated_cypher)[: self.top_k]
        else:
            context = []
        if self.return_direct:
            final_result = context
        else:
            _run_manager.on_text("Full Context:", end="\n", verbose=self.verbose)
            _run_manager.on_text(
                str(context), color="green", end="\n", verbose=self.verbose
            )
            intermediate_steps.append({"context": context})
            result = self.qa_chain(
                {"question": question, "context": context},
                callbacks=callbacks,
            )
            final_result = result[self.qa_chain.output_key]
        chain_result: Dict[str, Any] = {self.output_key: final_result}
        if self.return_intermediate_steps:
            chain_result[INTERMEDIATE_STEPS_KEY] = intermediate_steps
        return chain_result
```

And here is how you can use it:

```python
CYPHER_QA_TEMPLATE = """
You're an AI cook formulating Cypher statements to navigate through a recipe database.
Schema: {schema}
Examples: {examples}
Question: {question}
"""

CYPHER_GENERATION_PROMPT = PromptTemplate(
    input_variables=["schema", "examples", "question"],
    template=CYPHER_QA_TEMPLATE,
)

chain = GraphCypherQAChainAux.from_llm(
    graph=graph, llm=model, verbose=True, validate_cypher=True,
    cypher_prompt=CYPHER_GENERATION_PROMPT, input_key="question",
)

res = chain.invoke({"examples": examples, "question": question})
```

Take into account that I didn't include the schema in the `invoke` call, because the original `_call` already passes `self.graph_schema`. Please give me your feedback on this.

---
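The override pattern at work here can be sketched without LangChain at all (a stand-in sketch; `BaseChain`, `ChainAux`, and the key names below are illustrative, not the real chain classes):

```python
# Stand-in sketch of the subclass-override pattern: a base "chain"
# that only forwards fixed keys, and a subclass that forwards an
# extra user-supplied input as well. Names are illustrative.
class BaseChain:
    input_key = "question"

    def _call(self, inputs):
        # Base behavior: only question and schema reach the prompt.
        return {"question": inputs[self.input_key], "schema": "<schema>"}


class ChainAux(BaseChain):
    def _call(self, inputs):
        args = super()._call(inputs)
        args["examples"] = inputs["examples"]  # forward the extra input
        return args


res = ChainAux()._call({"question": "What uses basil?", "examples": "MATCH ..."})
print(sorted(res))  # → ['examples', 'question', 'schema']
```

The design choice is the same as in the patched class above: the subclass reuses all of the base logic and only widens the set of keys handed to the prompt.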
Many thanks @RafaelXokito! Your patched version of the class is working. Is any change in sight in the package?

---
Thank you, @pierreoberholzer, for your feedback! I am considering changing how inputs are used in `_call`, so that everything the user passes is forwarded to the cypher generation chain:

```python
question = inputs[self.input_key]
args = {
    "question": question,
    "schema": self.graph_schema,
}
args.update(inputs)
intermediate_steps: List = []
generated_cypher = self.cypher_generation_chain.run(
    args, callbacks=callbacks
)
```

However, I am concerned about the potential impact this change might have on existing users of this method. @ccurme, could you please provide your insights on this proposed modification? Thank you!

---
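The effect of the proposed `args.update(inputs)` can be sketched with plain dictionaries (values below are placeholders):

```python
# Sketch of the proposed merge: fixed args built by _call, then
# overlaid with everything the user passed to invoke().
base_args = {
    "question": "What uses basil?",
    "schema": "(:Recipe)-[:HAS]->(:Ingredient)",
}
user_inputs = {
    "question": "What uses basil?",
    "examples": "MATCH (r:Recipe) RETURN r",
}

args = dict(base_args)
args.update(user_inputs)  # user keys are added; duplicate keys are overwritten

print(sorted(args))  # → ['examples', 'question', 'schema']
```

Note that `dict.update` lets user-supplied values win on key collisions, which is presumably the backward-compatibility concern: a user who happens to pass a `schema` key would now silently override `self.graph_schema`.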
I ended up doing something similar to get SPARQL generation working. I am also working with a TTL file, so in that case you would also need to override the `RdfGraph` class.

---
…rovided by the user for cypher generation (#24300)

**Description:** This PR changes `cypher_generation_chain` to dynamically merge its inputs: the arguments dictionary is updated with all elements from the `inputs` dictionary, so every necessary input is appended automatically. As a result, a custom cypher generation template no longer requires patching the `_call` method.

**Issue:** This PR fixes issue #24260.

---
Description

I'm getting a missing key error when passing custom arguments in `PromptTemplate` and `GraphCypherQAChain`. This seems similar to #19560, now closed.