This repository has been archived by the owner on May 13, 2024. It is now read-only.

Implemented support for more LLM models #104

Draft
wants to merge 12 commits into develop

Conversation

Sponge-bink
Collaborator

This PR implements support for more LLM models, such as Claude 2 from Anthropic, PaLM from Google, Llama v2 70B from Meta, and GPT-4 from OpenAI (more models are being added), by routing requests through OpenRouter, with code changes and compatibility impact kept to a minimum.

  1. Requests to OpenRouter use the same openai module as those sent to OpenAI, so no additional dependency is required. The only difference is that OpenRouter expects a headers argument for tracking usage from different apps. Users can fill in whatever headers they like, which could also support other routing services (see the sketch after this list).

  2. The option to use OpenRouter (or another supported service) is opt-in, meaning that users who don't know about it or don't want to use it won't notice anything after they upgrade the workflow. If they do enable it, it only applies to the chat completion part; the InstructGPT and image generation parts will still use models from OpenAI.

  3. Unsupported parameters are ignored for models that do not support them, such as stream for meta-llama/llama-2-70b-chat, in which case the window only appears after the full reply is returned. No per-model, per-parameter code changes are needed.
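
For reference, here is a minimal sketch of what the routing looks like with the legacy openai 0.x Python client (the version current when this PR was written). The HTTP-Referer and X-Title attribution headers are the ones OpenRouter documents; the environment variable name, example URL, and app title below are placeholders, not values taken from this PR.

```python
import os
import openai

# Point the existing openai client at OpenRouter instead of api.openai.com.
# No extra dependency is needed; only the base URL and API key change.
openai.api_base = "https://openrouter.ai/api/v1"
openai.api_key = os.environ["OPENROUTER_API_KEY"]  # placeholder env var name

response = openai.ChatCompletion.create(
    model="meta-llama/llama-2-70b-chat",  # any model ID exposed by OpenRouter
    messages=[{"role": "user", "content": "Hello!"}],
    # Optional attribution headers OpenRouter uses to track per-app usage.
    headers={
        "HTTP-Referer": "https://example.com/chatfred",  # placeholder URL
        "X-Title": "ChatFred",
    },
)
print(response["choices"][0]["message"]["content"])
```

Streaming works the same way for models that support it; for models that ignore stream (point 3 above), the full reply is simply returned in one piece.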

Chris Lemke and others added 8 commits August 3, 2023 11:00
* Should fix history_length 0 issue

* ci: auto fixes from pre-commit.com hooks

* Re-implemented the logic for replacing content within prompts with their aliases

* ci: auto fixes from pre-commit.com hooks

---------

Co-authored-by: stayinalive <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
updates:
- [github.com/astral-sh/ruff-pre-commit: v0.0.281 → v0.0.282](astral-sh/ruff-pre-commit@v0.0.281...v0.0.282)

Co-authored-by: Chris Lemke <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
* bump version

* ChatGPT Aliases Logic Change (chrislemke#87)

* Should fix history_length 0 issue

* ci: auto fixes from pre-commit.com hooks

* Re-implemented the logic for replacing content within prompts with their aliases

* ci: auto fixes from pre-commit.com hooks

---------

Co-authored-by: stayinalive <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

* version bump

* Implemented streaming using Flet
Add the current model to the subtext
Dependency for Flet
Custom font

* storyboard and Workflow config file updated, an option $stream_reply added

* Change version to 1.5.4

* ci: auto fixes from pre-commit.com hooks

* Updated Readme

* Updated Readme and description

* Readme update

* combined text_chat_flet and text_chat into one file

* Renamed it back to the original text_chat.py

* ci: auto fixes from pre-commit.com hooks

* fix: remove all .so files

* add: Readme to reflect what Stream reply does

---------

Co-authored-by: Chris Lemke <[email protected]>
Co-authored-by: stayinalive <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
@Sponge-bink
Collaborator Author

I'll add the missing documentation once it seems reasonable to you to include this in ChatFred. I'm curious what you think! @chrislemke

You can get your own key at https://openrouter.ai; $1 of credit is granted with each new account.

@chrislemke
Owner

Hey @Sponge-bink!
Thanks for this. A really cool idea! I will check out https://openrouter.ai, looks interesting.

@Sponge-bink
Collaborator Author

I'm now discouraging this PR since OpenRouter's service has been, to say the least, unstable recently.
