Problem Statement
Currently, when applying machine learning model predictions during search, users have no general way to pass additional input fields that are not part of the search query itself. This limitation reduces the flexibility and effectiveness of model predictions in certain scenarios, such as question answering over search results.
Proposed Solution
We propose introducing an ml_inference search extension that can be used alongside any search query. This extension would allow users to pass a flexible object containing various model input fields, making it adaptable to different models (see the sketch after the feature list).
Key Features
Flexible input format to accommodate various model requirements
Integration with existing search queries
Compatibility with search pipelines and request processors
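For illustration, the extension body would be an arbitrary JSON object, so callers can attach whatever inputs their model expects. A hypothetical fragment of a search request (all field names below are purely illustrative, not part of the proposal):

```json
"ext": {
  "ml_inference": {
    "question": "what is opensearch",
    "system_prompt": "You are a helpful assistant",
    "generation_params": { "temperature": 0.2 }
  }
}
```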
Sample Usage
Basic usage with a match_all query, feeding the search content as context to a Claude model so it can answer the question passed through the extension:
```json
{
  "query": { "match_all": {} },
  "ext": {
    "ml_inference": {
      "question": "what is opensearch"
    }
  },
  "search_pipeline": {
    "request_processors": [
      {
        "ml_inference": {
          "model_id": "<model_id>",
          "input_map": [
            {
              "inputs": "ext.ml_inference.question",
              "context": "text_docs"
            }
          ],
          "output_map": [
            {
              "ext.ml_inference.llm_response": "response"
            }
          ],
          "model_config": {
            "prompt": "\n\nHuman: You are a professional data analyst. You will always answer questions based on the given context first. If the answer is not directly shown in the context, you will analyze the data and find the answer. If you don't know the answer, just say I don't know. Context: ${parameters.context.toString()}. \n\nHuman: please answer the question: ${parameters.inputs} \n\nAssistant:"
          },
          "ignore_missing": false,
          "ignore_failure": false
        }
      }
    ]
  }
}
```
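The intent of this example: the ml_inference processor takes the question from ext.ml_inference.question together with the search content, substitutes them into the prompt template (${parameters.inputs} and ${parameters.context}), invokes the model, and maps the model's response field back to ext.ml_inference.llm_response so it is returned alongside the search results.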
Combined with a template query and search pipelines:
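No request body is included for this case in the issue; the following is a minimal sketch of how it could look, assuming a text-embedding model, a k-NN index with a text_embedding field, and the template query proposed in opensearch-project/OpenSearch#16823 (all field names are illustrative):

```json
{
  "query": {
    "template": {
      "knn": {
        "text_embedding": {
          "vector": "${ext.ml_inference.vector}",
          "k": 10
        }
      }
    }
  },
  "ext": {
    "ml_inference": {
      "text": "what is opensearch"
    }
  },
  "search_pipeline": {
    "request_processors": [
      {
        "ml_inference": {
          "model_id": "<embedding_model_id>",
          "input_map": [
            {
              "inputs": "ext.ml_inference.text"
            }
          ],
          "output_map": [
            {
              "ext.ml_inference.vector": "embedding"
            }
          ]
        }
      }
    ]
  }
}
```

Here the processor would embed the text carried in the extension and write the resulting vector to ext.ml_inference.vector, which the template query placeholder resolves before the query executes.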
Related issues:
opensearch-project/OpenSearch#16823
#3054