
LLM Interface in Python

Demo of using the CLI

Installation

git clone https://github.com/garland3/llminterface.git
cd llminterface
pip install -e .

OR

pip install git+https://github.com/garland3/llminterface.git

Setup

  • Make an OpenAI account.
  • Create an OpenAI API key.

Put the key in one of the following files:

  • src/chatconfig/.secrets.toml
  • ~/.llminterface/.secrets.toml
  • ~/.secrets.toml

with the contents:

openaikey = "blahblahblah"

I'm using Dynaconf to load the secrets, so you can also put the key in an environment variable:

export DYNACONF_openaikey="blahblahblah"
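To illustrate the lookup order above, here is a hand-rolled sketch: the DYNACONF_openaikey environment variable wins, otherwise the settings files are checked in turn. This only mimics what Dynaconf does for you; the real library handles parsing and precedence itself, and the exact precedence here is an assumption for illustration.

```python
import os
from pathlib import Path

# Candidate secrets files, in the order listed in the README.
SECRET_FILES = [
    Path("src/chatconfig/.secrets.toml"),
    Path.home() / ".llminterface" / ".secrets.toml",
    Path.home() / ".secrets.toml",
]

def find_openai_key():
    """Return the API key from the environment or the first secrets file that has it."""
    env_value = os.environ.get("DYNACONF_openaikey")
    if env_value:
        return env_value
    for path in SECRET_FILES:
        if path.is_file():
            for line in path.read_text().splitlines():
                if line.strip().startswith("openaikey"):
                    # Naive parse of a line like: openaikey = "blahblahblah"
                    return line.split("=", 1)[1].strip().strip('"')
    return None  # not configured anywhere
```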

Usage

See the examples folder for examples of how to use the interface.

from llminterface.interface_openai import OpenAIChat

# Start a chat session and send a first message.
c = OpenAIChat()
c.start_dialog()
c("Hello")

# %%
c("Given a csv file with 'temp' as a column and some other columns, use flaml/automl to predict the temp column. Write the code.")

# %%
c("Can you write some fake data to test the code?")

Or from the command line:

chat Given a csv file with 'temp' as a column and some other columns. Use flaml, automl to predict the temp column. Write the code.

Dev

  • pre-commit hooks with Black and Flake8
  • testing with pytest:
pytest tests -vv -s -x
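When writing tests for a chat wrapper like this, it helps to keep the suite offline. Below is a minimal sketch using a stand-in class instead of OpenAIChat so no API call is made; FakeChat and its echo behavior are hypothetical, not part of this repo's test suite.

```python
# A fake chat object with the same call style as the usage example
# above: construct it, then call it with a prompt string.
class FakeChat:
    def __init__(self):
        self.history = []

    def __call__(self, prompt):
        # Record the prompt and return a canned reply instead of
        # contacting the OpenAI API.
        self.history.append(prompt)
        return f"echo: {prompt}"

def test_chat_records_history():
    chat = FakeChat()
    reply = chat("Hello")
    assert reply == "echo: Hello"
    assert chat.history == ["Hello"]
```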
