
option to use local models vs. openai key? #27

Open
james-see opened this issue May 1, 2023 · 4 comments

Comments

@james-see

I want to be able to pass in a "local-mode" env var (or similar) that uses a local model placed in a particular folder, etc.

@logan-markewich
Contributor

The llama-agi package supports passing in an LLM from LangChain, and you can also use the service context to set the LLM for llama-index.

I can create an example for this if you need more guidance :)
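For readers looking for a starting point, here is a minimal sketch of that wiring, assuming the llama-index `LLMPredictor`/`ServiceContext` API from around this time (mid-2023) and LangChain's `LlamaCpp` wrapper (requires `llama-cpp-python`). The model path is a placeholder, and the llama-agi runner line at the end is an assumption about its constructor, not a confirmed signature:

```python
# Sketch: using a local LangChain LLM with llama-index instead of an OpenAI key.
# Assumes llama-index ~0.6.x and langchain with llama-cpp-python installed;
# the model path below is a placeholder for local GGML weights you supply.
from langchain.llms import LlamaCpp
from llama_index import LLMPredictor, ServiceContext

# Load a local model via LangChain's llama.cpp wrapper.
local_llm = LlamaCpp(
    model_path="./models/your-local-model.bin",  # placeholder path
    temperature=0.0,
    max_tokens=256,
)

# llama-index picks up its LLM from the service context.
llm_predictor = LLMPredictor(llm=local_llm)
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)

# Per the comment above, llama-agi also accepts a LangChain LLM directly;
# the exact runner class and keyword here are hypothetical.
# agent = LlamaAGIRunner(llm=local_llm, service_context=service_context)
```

This is setup/config rather than runnable logic on its own: it needs local model weights on disk before any query will execute.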

@james-see
Author

I am not as familiar with those packages; an example would be very helpful, thank you!

@nathanjzhao

An example for passing an LLM from a local model would be great!

@jabogithub

I would also like an example of passing in an LLM from a local model, especially Llama 2!

Development

No branches or pull requests

4 participants