v1.0: Hello LlamaChat 🙌

@alexrozanski alexrozanski released this 11 Apr 12:05
· 68 commits to main since this release

This is the v1.0 release of LlamaChat, which allows you to run LLaMA-compatible model files in a native macOS chat-style app.

LlamaChat currently supports LLaMA and Alpaca models.
LlamaChat supports models in both the raw PyTorch checkpoint format (.pth) and the .ggml format, as it is powered by the ggml, llama.cpp, and llama.swift libraries.

Features

  • Import .pth and .ggml models, with support for pre-converting .pth files directly within the app. Note that some manual intervention may be necessary for outdated .ggml model files; please see the llama.cpp repository for more details.
  • Chat with LLaMA-compatible models in a native macOS chat-style interface. Messages are stored between sessions.
  • Fun Llama and Alpaca avatars to liven up the chat experience ✌️
  • Inspect and debug model context, including raw tokens.
  • Clear model context and chat history on demand.
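For reference, .pth checkpoints can also be pre-converted outside the app using llama.cpp's tooling. This is a sketch only: the script name, arguments, and output filenames follow the llama.cpp repository around the time of this release, and `models/7B/` is a placeholder path for wherever your model weights live.

```shell
# Convert a raw LLaMA PyTorch checkpoint (.pth) to ggml format.
# The trailing "1" selects f16 output; "0" would produce f32.
python3 convert-pth-to-ggml.py models/7B/ 1

# Optionally quantize the f16 ggml file to 4-bit (q4_0) to cut memory use;
# "2" is the q4_0 type id in llama.cpp builds of this era.
./quantize models/7B/ggml-model-f16.bin models/7B/ggml-model-q4_0.bin 2
```

The resulting .ggml file can then be imported into LlamaChat directly, skipping the in-app conversion step.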