Feature/keywordsai llm (#16860)
jordanparker6 authored Dec 19, 2024
1 parent 9bcd6ed commit fc561a3
Showing 12 changed files with 815 additions and 0 deletions.
153 changes: 153 additions & 0 deletions llama-index-integrations/llms/llama-index-llms-keywordsai/.gitignore
@@ -0,0 +1,153 @@
llama_index/_static
.DS_Store
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
bin/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
etc/
include/
lib/
lib64/
parts/
sdist/
share/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
.ruff_cache

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints
notebooks/

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
pyvenv.cfg

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# Jetbrains
.idea
modules/
*.swp

# VsCode
.vscode

# pipenv
Pipfile
Pipfile.lock

# pyright
pyrightconfig.json
@@ -0,0 +1,3 @@
poetry_requirements(
name="poetry",
)
17 changes: 17 additions & 0 deletions llama-index-integrations/llms/llama-index-llms-keywordsai/Makefile
@@ -0,0 +1,17 @@
GIT_ROOT ?= $(shell git rev-parse --show-toplevel)

help: ## Show all Makefile targets.
@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[33m%-30s\033[0m %s\n", $$1, $$2}'

format: ## Run code autoformatters (black).
pre-commit install
git ls-files | xargs pre-commit run black --files

lint: ## Run linters: pre-commit (black, ruff, codespell) and mypy
pre-commit install && git ls-files | xargs pre-commit run --show-diff-on-failure --files

test: ## Run tests via pytest.
pytest tests

watch-docs: ## Build and watch documentation.
sphinx-autobuild docs/ docs/_build/html --open-browser --watch $(GIT_ROOT)/llama_index/
131 changes: 131 additions & 0 deletions llama-index-integrations/llms/llama-index-llms-keywordsai/README.md
@@ -0,0 +1,131 @@
# LlamaIndex Llms Integration: KeywordsAI

## Installation

To install the required package, run:

```bash
pip install llama-index-llms-keywordsai
```

## Setup

1. Set your KeywordsAI API key as an environment variable. You can replace `"sk-..."` with your actual API key:

```python
import os

os.environ["OPENAI_API_KEY"] = "sk-..."
```

## Basic Usage

### Generate Completions

To generate a completion for a prompt, use the `complete` method:

```python
from llama_index.llms.keywordsai import KeywordsAI

resp = KeywordsAI().complete("Paul Graham is ")
print(resp)
```

### Chat Responses

To send a chat message and receive a response, create a list of `ChatMessage` instances and use the `chat` method:

```python
from llama_index.core.llms import ChatMessage

messages = [
ChatMessage(
role="system", content="You are a pirate with a colorful personality."
),
ChatMessage(role="user", content="What is your name?"),
]
resp = KeywordsAI().chat(messages)
print(resp)
```

## Streaming Responses

### Stream Complete

To stream responses for a prompt, use the `stream_complete` method:

```python
from llama_index.llms.keywordsai import KeywordsAI

llm = KeywordsAI()
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
print(r.delta, end="")
```

### Stream Chat

To stream chat responses, use the `stream_chat` method:

```python
from llama_index.llms.keywordsai import KeywordsAI
from llama_index.core.llms import ChatMessage

llm = KeywordsAI()
messages = [
ChatMessage(
role="system", content="You are a pirate with a colorful personality."
),
ChatMessage(role="user", content="What is your name?"),
]
resp = llm.stream_chat(messages)
for r in resp:
print(r.delta, end="")
```

## Configure Model

You can specify a particular model when creating the `KeywordsAI` instance:

```python
llm = KeywordsAI(model="gpt-3.5-turbo")
resp = llm.complete("Paul Graham is ")
print(resp)

messages = [
ChatMessage(
role="system", content="You are a pirate with a colorful personality."
),
ChatMessage(role="user", content="What is your name?"),
]
resp = llm.chat(messages)
print(resp)
```

## Asynchronous Usage

You can also use asynchronous methods for completion:

```python
from llama_index.llms.keywordsai import KeywordsAI

llm = KeywordsAI(model="gpt-3.5-turbo")
resp = await llm.acomplete("Paul Graham is ")
print(resp)
```
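
The other asynchronous entry points follow the same pattern. Below is a minimal sketch, assuming `KeywordsAI` exposes the standard LlamaIndex async LLM methods (`achat`, `astream_complete`), which are not shown elsewhere in this README:

```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.keywordsai import KeywordsAI

llm = KeywordsAI(model="gpt-3.5-turbo")

# Async chat (assumed `achat` method, mirroring the sync `chat` example above).
messages = [ChatMessage(role="user", content="What is your name?")]
resp = await llm.achat(messages)
print(resp)

# Async streaming completion (assumed `astream_complete` method):
# awaiting it yields an async generator, which is consumed with `async for`.
gen = await llm.astream_complete("Paul Graham is ")
async for r in gen:
    print(r.delta, end="")
```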

## Set API Key at a Per-Instance Level

If desired, you can have separate LLM instances use different API keys:

```python
from llama_index.llms.keywordsai import KeywordsAI

llm = KeywordsAI(model="gpt-3.5-turbo", api_key="sk-...")
resp = llm.complete("Paul Graham is ")
print(resp)
```

### LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/keywordsai/
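
The linked page walks through end-to-end usage. As a further hedged sketch (assuming the standard `llama_index.core.Settings` API; the model name is only illustrative), the instance can also be registered as the application-wide default LLM:

```python
from llama_index.core import Settings
from llama_index.llms.keywordsai import KeywordsAI

# Make KeywordsAI the default LLM used by query engines, chat engines, etc.
Settings.llm = KeywordsAI(model="gpt-3.5-turbo")
```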
@@ -0,0 +1 @@
python_sources()
@@ -0,0 +1,3 @@
from llama_index.llms.keywordsai.base import KeywordsAI

__all__ = ["KeywordsAI"]