feat: allow passing API_BASE as optional parameter for openai provider (#6820)

* allow passing API_BASE as optional parameter for openai provider
* Add release note
* fix linter whitespace issue
* Update examples/getting_started.py (Co-authored-by: Madeesh Kannan <[email protected]>)
* Update examples/getting_started.py (Co-authored-by: Madeesh Kannan <[email protected]>)
* revert optional api_base based on shadeMe comment
* Update haystack/utils/getting_started.py (Co-authored-by: Madeesh Kannan <[email protected]>)
* Recommendations from shadeMe comments
* fix param ordering to build_pipeline
* Update haystack/nodes/prompt/invocation_layer/open_ai.py (Co-authored-by: Madeesh Kannan <[email protected]>)

Co-authored-by: Madeesh Kannan <[email protected]>
Showing 5 changed files with 28 additions and 5 deletions.
examples/getting_started.py

@@ -1,34 +1,41 @@
 import logging
 
+from typing import Optional
+
 from haystack.document_stores import InMemoryDocumentStore
 from haystack.utils import build_pipeline, add_example_data, print_answers
 
 logger = logging.getLogger(__name__)
 
 
-def getting_started(provider, API_KEY):
+def getting_started(provider, API_KEY, API_BASE: Optional[str] = None):
     """
     This getting_started example shows you how to use LLMs with your data with a technique called Retrieval Augmented Generation - RAG.
     :param provider: We are model agnostic :) Here, you can choose from: "anthropic", "cohere", "huggingface", and "openai".
     :param API_KEY: The API key matching the provider.
+    :param API_BASE: The URL to use for a custom endpoint, e.g., if using LM Studio. Only openai provider supported. /v1 at the end is needed (e.g., http://localhost:1234/v1)
     """
 
     # We support many different databases. Here we load a simple and lightweight in-memory database.
     document_store = InMemoryDocumentStore(use_bm25=True)
 
     # Pipelines are the main abstraction in Haystack, they connect components like LLMs and databases.
-    pipeline = build_pipeline(provider, API_KEY, document_store)
+    pipeline = build_pipeline(provider, API_KEY, API_BASE, document_store)
 
     # Download and add Game of Thrones TXT articles to Haystack's database.
     # You can also provide a folder with your local documents.
     # You might need to install additional dependencies - look inside the function for more information.
     add_example_data(document_store, "data/GoT_getting_started")
 
     # Ask a question on the data you just added.
-    result = pipeline.run(query="Who is the father of Arya Stark?")
+    result = pipeline.run(query="Who is the father of Arya Stark?", debug=True)
 
     # For details such as which documents were used to generate the answer, look into the <result> object.
     print_answers(result, details="medium")
     return result
 
 
 if __name__ == "__main__":
+    # getting_started(provider="openai", API_KEY="NOT NEEDED", API_BASE="http://192.168.1.100:1234/v1")
     getting_started(provider="openai", API_KEY="ADD KEY HERE")
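For readers who want to try the new parameter against a local model server, here is a minimal sketch based on the commented-out call in the diff above. It assumes an OpenAI-compatible server such as LM Studio is already listening on http://localhost:1234/v1; the key value is a placeholder that local servers typically ignore, and the import path is illustrative rather than taken from the repository.

    # Sketch: run the getting_started example against a local OpenAI-compatible endpoint.
    # Assumes examples/getting_started.py is importable, e.g., by running from the examples/ folder.
    from getting_started import getting_started

    result = getting_started(
        provider="openai",                    # only the openai provider supports API_BASE in this change
        API_KEY="not-needed",                 # placeholder; local servers typically ignore the key
        API_BASE="http://localhost:1234/v1",  # the /v1 suffix is required
    )
    # Answers are printed inside getting_started() via print_answers();
    # the raw pipeline output is also returned for further inspection.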
Release note (new file):

@@ -0,0 +1,8 @@
+---
+enhancements:
+  - |
+    API_BASE can now be passed as an optional parameter in the getting_started sample. Only openai provider is supported in this set of changes.
+    PromptNode and PromptModel were enhanced to allow passing of this parameter.
+    This allows RAG against a local endpoint (e.g., http://localhost:1234/v1), so long as it is OpenAI compatible (such as LM Studio).
+    Logging in the getting started sample was made more verbose, to make it easier for people to see what was happening under the covers.
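The note says PromptNode and PromptModel were extended to accept the new parameter. The sketch below shows how that could look from user code; the exact way api_base reaches the OpenAI invocation layer (a dedicated keyword versus model_kwargs) is an assumption based on this changeset, not documented API.

    # Sketch: point a PromptNode at a local OpenAI-compatible server.
    # Assumption: the OpenAI invocation layer picks up api_base from model_kwargs
    # and sends requests there instead of to api.openai.com.
    from haystack.nodes import PromptNode

    prompt_node = PromptNode(
        model_name_or_path="gpt-3.5-turbo",
        api_key="not-needed",  # local servers typically ignore the key
        model_kwargs={"api_base": "http://localhost:1234/v1"},
    )

    print(prompt_node("Who is the father of Arya Stark?"))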