Releases: KoboldAI/KoboldAI-Client
Looking for our latest KoboldAI product?
At the time of writing, the official KoboldAI branch is outdated and behind in model support.
Want to run the latest models? Want to avoid large downloads and installations?
Check out KoboldCpp, our GGUF-based solution.
Need a model for KoboldCpp?
- Go to Hugging Face and look for GGUF models. If you want the GGUF for a specific model, search for part of its name followed by "GGUF" to find GGUF releases.
- Go to the Files tab and pick the file size that best fits your hardware; Q4_K_S is a good balance.
- Click the small download icon to the right of the filename to download your GGUF.
- Load the GGUF in KoboldCpp, and you can start using the AI.
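The first step above can be sketched as a tiny helper that builds the Hugging Face search URL for GGUF releases of a model. This is only an illustration (not part of KoboldCpp), and the model name used is just an example:

```python
from urllib.parse import quote_plus

def gguf_search_url(model_name: str) -> str:
    """Build a Hugging Face model-search URL for GGUF releases.

    Illustrative helper: it simply appends "GGUF" to the search term,
    as the steps above describe.
    """
    query = quote_plus(f"{model_name} GGUF")
    return f"https://huggingface.co/models?search={query}"

print(gguf_search_url("Mistral 7B"))
# → https://huggingface.co/models?search=Mistral+7B+GGUF
```

Opening that URL in a browser lists matching GGUF repositories; from there, follow the Files-tab step to pick a quantization.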
KoboldCpp is available for Windows, Linux, and ARM macOS.
(Files attached below are automatically posted by GitHub and do not work; please use the link above to obtain the release.)
1.19.2
- Sampling Order loading is fixed
- More models
- Flask_session cleaned on launch (helps against bugs caused by switching between versions)
- First groundwork for GPU softprompt tuning (interface not yet added)
- Compatibility improvements when other versions of conda are installed
1.19.1
This is a small release adding a few improvements, including a patch for a PyTorch vulnerability that is not fixed upstream.
- Malicious PyTorch models now give an error instead of executing malicious code
- API can now influence the seed
- Lua errors are now correctly shown as errors instead of debug messages