
Make it possible to use local LLM (llama.cpp, oobabooga) APIs to run evo.ninja #717

Open
Tempaccnt opened this issue Apr 3, 2024 · 6 comments

@Tempaccnt

Is your feature request related to a problem? Please describe.
Not everyone has access to the OpenAI API, and even those who do are always limited by their budget. Having an agent that can run completely free, and offline too, would be great.

Describe the solution you'd like
Try to include llama.cpp or another local LLM project, so the user can use custom models placed inside a folder.
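
A minimal sketch of what this could look like, assuming llama.cpp's OpenAI-compatible HTTP server is used as the backend; the model path, port, and model name below are illustrative, not actual evo.ninja configuration:

```python
# Sketch: point the OpenAI Python SDK at a local llama.cpp server instead of api.openai.com.
# Assumes the llama.cpp server is already running, e.g.:
#   ./llama-server -m ./models/mistral-7b-instruct.Q4_K_M.gguf --port 8080
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # local endpoint instead of OpenAI
    api_key="sk-no-key-required",         # llama.cpp ignores the key, but the SDK requires one
)

response = client.chat.completions.create(
    model="local-model",  # llama.cpp serves whatever model it was launched with
    messages=[{"role": "user", "content": "Hello from a fully local agent!"}],
)
print(response.choices[0].message.content)
```

oobabooga's text-generation-webui exposes a similar OpenAI-compatible endpoint, so the same client code should work with either backend by changing the base URL.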

@haliliceylan

What you're suggesting seems feasible. Is there an open-source project available that handles requests using OpenAI API routes? The only concern is whether this project relies on function calling; if it does, we might need to include an additional parameter for function calling in the prompt, like here.

I am very eager to integrate it with a PR.
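
One reading of "an additional parameter for function calling in the prompt": for backends without native function calling, the function schemas could be serialized into the system prompt and the reply parsed as JSON. A rough sketch of that idea, with all names hypothetical:

```python
# Sketch (hypothetical): emulate OpenAI-style function calling for a local model
# with no native support, by describing the functions in the prompt and asking
# for a JSON-only reply.
import json

functions = [{
    "name": "search_web",
    "description": "Search the web for a query",
    "parameters": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}]

system_prompt = (
    "You can call the following functions. To call one, reply with JSON only, "
    'shaped like {"name": ..., "arguments": {...}}.\n'
    "Functions:\n" + json.dumps(functions, indent=2)
)

def parse_function_call(reply: str):
    """Best-effort parse of the model's reply into a (name, arguments) pair."""
    try:
        call = json.loads(reply)
        return call["name"], call.get("arguments", {})
    except (json.JSONDecodeError, KeyError, TypeError):
        return None  # the model answered in plain text instead
```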

@rihp
Contributor

rihp commented Apr 15, 2024

This thread on Polywrap's Discord covers previous research around this feature:

https://discord.com/channels/796821176743362611/1211733650375184466/1211753664218275840

To join the Discord and read the messages: https://discord.gg/k7UCsH3ps9

Let me know if there is any way I can provide guidance, either through a comment here or through the Discord!

@haliliceylan @Tempaccnt Happy Monday!

@haliliceylan

haliliceylan commented Apr 17, 2024

@rihp I could not pass the CAPTCHA check on Discord, so I am writing here again.

LiteLLM actually supports function calling, and it even supports non-OpenAI models; see:

https://docs.litellm.ai/docs/completion/function_call#using-function_to_dict-with-function-calling
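
For reference, a minimal sketch adapted from those docs: `function_to_dict` builds an OpenAI-style function schema from a plain Python function's numpy-style docstring, and the result can be passed to a non-OpenAI model. The model name here is illustrative:

```python
# Sketch adapted from the LiteLLM docs linked above; needs `pip install litellm numpydoc`.
import litellm
from litellm.utils import function_to_dict

def get_current_weather(location: str, unit: str):
    """Get the current weather in a given location.

    Parameters
    ----------
    location : str
        The city and state, e.g. San Francisco, CA
    unit : {'celsius', 'fahrenheit'}
        Temperature unit

    Returns
    -------
    str
        a sentence describing the weather
    """
    return f"It is sunny in {location}."

response = litellm.completion(
    model="ollama/mistral",  # illustrative non-OpenAI model
    messages=[{"role": "user", "content": "What's the weather in Boston?"}],
    functions=[function_to_dict(get_current_weather)],
)
print(response.choices[0].message)
```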

@akramer-zibra

Hi all, I found this discussion by coincidence. I like this project, but for my use case it is too tightly coupled to the OpenAI service, which is why I started writing additional adapters for other LLM services. Unfortunately, the code doesn't work yet, so I can't show anything. But I wanted to briefly comment and show that there is interest in such a feature here as well.

@Tempaccnt
Author

Tempaccnt commented Apr 27, 2024

I found a project that seems to implement this somehow (focused on coding) using Ollama. I still haven't taken a deep look into it, but I have seen some videos of it running.

Here it is:
https://github.com/stitionai/devika

@ArthurMynl

I would be very interested in such a feature too!
