Precommit CI changes
Signed-off-by: Yogesh <[email protected]>
yogeshmpandey committed Jan 9, 2025
1 parent 12b2415 commit 2f34b3d
Showing 12 changed files with 289 additions and 390 deletions.
14 changes: 8 additions & 6 deletions comps/integrations/langchain/README.md
@@ -1,6 +1,6 @@
# langchain-opea

This package contains the LangChain integrations for [OPEA](https://opea.dev/) Microservices.
This package contains the LangChain integrations for OpenAI-compatible [OPEA](https://opea.dev/) Microservices.

## Installation

@@ -11,7 +11,7 @@ You can install LangChain OPEA package in several ways:
To install the package from the source, run:

```bash
pip install -e .
pip install poetry && poetry install --with test
```

### Install from Wheel Package
@@ -24,15 +24,15 @@ pip install dist/langchain_opea-0.1.0-py3-none-any.whl

### Install from PyPI

Once the package is available on PyPi, you can install it using:
> **Note:** Once the package is available on PyPI, you can install it using:
```bash
pip install -U langchain-opea
```

## Chat Models

`ChatOPEA` class exposes chat models from OPEA.
The `ChatOPEA` class exposes OpenAI-compatible chat models from OPEA.

```python
from langchain_opea import ChatOPEA
@@ -45,7 +45,7 @@ llm.invoke("Sing a ballad of LangChain.")

## Embeddings

`OPEAEmbeddings` class exposes embeddings from OPEA.
The `OPEAEmbeddings` class exposes OpenAI-compatible embeddings from OPEA.

```python
from langchain_opea import OPEAEmbeddings
@@ -60,7 +60,7 @@ embeddings.embed_query("What is the meaning of life?")

## LLMs

`OPEALLM` class exposes LLMs from OPEA.
The `OPEALLM` class exposes OpenAI-compatible LLMs from OPEA.

```python
from langchain_opea import OPEALLM
@@ -70,3 +70,5 @@ llm = OPEALLM(
)
llm.invoke("The meaning of life is")
```

Check out [Samples](/comps/integrations/langchain/samples/README.md) for more examples using the OPEA LangChain package.
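The constructor calls for `ChatOPEA`, `OPEAEmbeddings`, and `OPEALLM` are collapsed in the hunks above. A minimal sketch of how the visible pieces fit together; the model parameter name, the model id, the API key value, and the port (taken from the samples README curl example) are assumptions rather than confirmed values:

```python
# Hedged sketch only: the import, the invoke() prompt, and the opea_api_base /
# opea_api_key fields appear in this diff; the model parameter name, model id,
# and port 9009 are assumptions.
from langchain_opea import ChatOPEA

llm = ChatOPEA(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # assumed parameter name and value
    opea_api_base="http://localhost:9009/v1",     # OpenAI-compatible OPEA endpoint (assumed port)
    opea_api_key="my-api-key",                    # placeholder key
)
print(llm.invoke("Sing a ballad of LangChain.").content)
```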
Binary file modified comps/integrations/langchain/dist/langchain_opea-0.1.0.tar.gz
16 changes: 12 additions & 4 deletions comps/integrations/langchain/langchain_opea/chat_models.py
@@ -86,7 +86,9 @@ def _llm_type(self) -> str:
"""Return type of chat model."""
return "opea-chat"

def _get_ls_params(self, stop: Optional[List[str]] = None, **kwargs: Any) -> LangSmithParams:
def _get_ls_params(
self, stop: Optional[List[str]] = None, **kwargs: Any
) -> LangSmithParams:
"""Get the parameters used to invoke the model."""
params = super()._get_ls_params(stop=stop, **kwargs)
params["ls_provider"] = "opea"
@@ -101,7 +103,9 @@ def validate_environment(self) -> Self:
raise ValueError("n must be 1 when streaming.")

client_params: dict = {
"api_key": (self.opea_api_key.get_secret_value() if self.opea_api_key else None),
"api_key": (
self.opea_api_key.get_secret_value() if self.opea_api_key else None
),
"base_url": self.opea_api_base,
}

@@ -113,10 +117,14 @@ def validate_environment(self) -> Self:

if not (self.client or None):
sync_specific: dict = {"http_client": self.http_client}
self.client = openai.OpenAI(**client_params, **sync_specific).chat.completions
self.client = openai.OpenAI(
**client_params, **sync_specific
).chat.completions
if not (self.async_client or None):
async_specific: dict = {"http_client": self.http_async_client}
self.async_client = openai.AsyncOpenAI(**client_params, **async_specific).chat.completions
self.async_client = openai.AsyncOpenAI(
**client_params, **async_specific
).chat.completions
return self

@property
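The reformatted `validate_environment` above simply wraps `openai.OpenAI(...).chat.completions` with the configured `opea_api_base` and `opea_api_key`, which is what makes the service OpenAI-compatible. As an illustration only (model id and port are assumptions), the same endpoint could be exercised with the plain `openai` client:

```python
# Illustrative sketch mirroring the client construction in validate_environment.
import openai

client = openai.OpenAI(
    api_key="my-api-key",                 # what opea_api_key is forwarded as
    base_url="http://localhost:9009/v1",  # what opea_api_base is forwarded as (assumed port)
)
resp = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # assumed model id
    messages=[{"role": "user", "content": "Sing a ballad of LangChain."}],
)
print(resp.choices[0].message.content)
```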
4 changes: 3 additions & 1 deletion comps/integrations/langchain/langchain_opea/llms.py
@@ -76,7 +76,9 @@ def validate_environment(self) -> Self:
if self.streaming and self.best_of > 1:
raise ValueError("Cannot stream results when best_of > 1.")
client_params: dict = {
"api_key": self.opea_api_key.get_secret_value() if self.opea_api_key else None,
"api_key": self.opea_api_key.get_secret_value()
if self.opea_api_key
else None,
"base_url": self.opea_api_base,
}

26 changes: 18 additions & 8 deletions comps/integrations/langchain/samples/README.md
@@ -1,6 +1,6 @@
# Running LangChain OPEA SDK with OPEA microservices

## 1. Starting the compose microservices
## 1. Starting the microservices using Docker Compose

Set up the environment variables:

@@ -35,20 +35,30 @@ curl http://${host_ip}:9009/v1/chat/completions \

## 3. Install Langchain OPEA package

You can install the LangChain OPEA package in several ways:

### Install from Source

To install the package from the source, run:

```bash
pip install langchain-opea
pip install poetry && poetry install --with test
```

or build from source
### Install from Wheel Package

To install the package from a pre-built wheel, run:

```bash
pip install -e .
pip install dist/langchain_opea-0.1.0-py3-none-any.whl
```

or install from wheel package
### Install from PyPI

> **Note:** Once the package is available on PyPI, you can install it using:
```bash
pip install dist/langchain_opea-0.1.0-py3-none-any.whl
pip install -U langchain-opea
```

## 4. Install Jupyter Notebook
@@ -65,6 +75,6 @@ Start Jupyter Notebook:
jupyter notebook
```

Open the `summarize.ipynb` notebook and run the cells to execute the summarization example.
Open the [`summarize.ipynb`](/comps/integrations/langchain/samples/summarize.ipynb) notebook and run the cells to execute the summarization example.

Open the `qa_streaming.ipynb` notebook and run the cells to execute the QA chatbot with retrieval example.
Open the [`qa_streaming.ipynb`](/comps/integrations/langchain/samples/qa_streaming.ipynb) notebook and run the cells to execute the QA chatbot with retrieval example.
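
The notebooks themselves are not part of this diff. As a rough idea of what the `qa_streaming` sample name suggests, here is a hedged sketch using LangChain's standard `.stream()` interface on a `ChatOPEA` instance; all argument names and values below are placeholders, not taken from the notebook:

```python
# Not taken from the notebook; standard LangChain streaming against ChatOPEA.
from langchain_opea import ChatOPEA

llm = ChatOPEA(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # assumed parameter name and value
    opea_api_base="http://localhost:9009/v1",     # assumed port from section 2
    opea_api_key="my-api-key",                    # placeholder key
)
for chunk in llm.stream("What does the OPEA LangChain integration provide?"):
    print(chunk.content, end="", flush=True)
```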