
add support for llama cpp and rag #31

Open
Developer282 opened this issue Dec 4, 2024 · 2 comments

Comments

@Developer282

Is your feature request related to a problem? Please describe.
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

Describe the solution you'd like
A clear and concise description of what you want to happen.

Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.

Additional context
Add any other context or screenshots about the feature request here.

@paulrobello
Owner

I will add LlamaCpp to the roadmap. RAG is already on the roadmap and should be available in the next release, which I hope to have out within the next month.

@paulrobello
Owner

Just released parllama v0.3.11 with support for llama.cpp running in OpenAI compatibility mode.
Start llama.cpp in OpenAI compatibility mode with your desired model loaded.
Go to the Options area in parllama to change the base URL if needed, then go to the Chat tab, select LlamaCpp as the provider, and leave the model as "default".
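For reference, a minimal sketch for confirming the llama.cpp endpoint is responding before switching the provider in parllama. It uses the openai Python client; the base URL, port, and placeholder API key are assumptions, so adjust them to match however you started llama-server and whatever base URL you set in parllama's Options area.

```python
# Sanity-check a llama.cpp server running in OpenAI compatibility mode.
# Assumes the server is listening locally on port 8080 (adjust base_url to match
# the base URL configured in parllama's Options area).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed llama.cpp server address
    api_key="not-needed",                 # llama.cpp ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="default",  # leave as "default"; llama.cpp serves whichever model it loaded
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

If this prints a reply, parllama should work against the same base URL with the LlamaCpp provider selected.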
