Jan - Local AI Assistant


Getting Started - Docs - Changelog - Bug reports - Discord

⚠️ Jan is currently in Development: Expect breaking changes and bugs!

Jan is a ChatGPT alternative that runs 100% offline on your device. Our goal is to make it easy for a layperson to download and run LLMs and use AI with full control and privacy.

Jan is powered by Cortex, our embeddable local AI engine that runs on any hardware. From PCs to multi-GPU clusters, Jan & Cortex support a range of architectures:

  • NVIDIA GPUs (fast)
  • Apple M-series (fast)
  • Apple Intel
  • Linux Debian
  • Windows x64

Features:

  • Model Library with popular LLMs like Llama, Gemma, Mistral, or Qwen
  • Connect to Remote AI APIs like Groq and OpenRouter
  • Local API Server with OpenAI-equivalent API (see the example after this list)
  • Extensions for customizing Jan
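
For instance, once the Local API Server is enabled in Jan's settings, any OpenAI-compatible client can talk to it. A minimal sketch with curl; the address, port, and model name below are assumptions, so replace them with the values shown in your own Local API Server settings:

    # Hypothetical request to Jan's Local API Server (OpenAI-compatible chat completions).
    # The host/port and model id are placeholders; check your Jan settings for the real values.
    curl http://localhost:1337/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{
            "model": "llama3.2-3b-instruct",
            "messages": [{"role": "user", "content": "Hello from the local API!"}]
          }'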

Download

Version Type                   Windows   MacOS                 Linux
Stable (Recommended)           jan.exe   Intel, M1/M2/M3/M4    jan.deb, jan.AppImage
Beta (Preview)                 jan.exe   Intel, M1/M2/M3/M4    jan.deb, jan.AppImage
Nightly Build (Experimental)   jan.exe   Intel, M1/M2/M3/M4    jan.deb, jan.AppImage

Download the latest version of Jan at https://jan.ai/ or visit the GitHub Releases to download any previous release.

Demo

Demo video: Jan v0.5.7 running in real time on a Mac M2 (16GB, macOS Sonoma 14.2).

Quicklinks

  • Jan
  • Cortex.cpp

Jan is powered by Cortex.cpp. It is a C++ command-line interface (CLI) designed as an alternative to Ollama. By default, it runs on the llama.cpp engine but also supports other engines, including ONNX and TensorRT-LLM, making it a multi-engine platform.
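
As a rough sketch of what working with Cortex directly can look like (the commands and model name below are assumptions and may differ between Cortex versions; run cortex --help for the current CLI):

    # Assumed Cortex CLI usage; verify against your installed version with `cortex --help`
    cortex pull llama3.2    # download a model (the model id is a placeholder)
    cortex run llama3.2     # start an interactive chat with that model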

Requirements for running Jan

  • MacOS: 13 or higher
  • Windows:
    • Windows 10 or higher
    • To enable GPU support:
      • Nvidia GPU with CUDA Toolkit 11.7 or higher
      • Nvidia driver 470.63.01 or higher
  • Linux:
    • glibc 2.27 or higher (check with ldd --version)
    • gcc 11, g++ 11, cpp 11 or higher (see the verification commands after this list)
    • To enable GPU support:
      • Nvidia GPU with CUDA Toolkit 11.7 or higher
      • Nvidia driver 470.63.01 or higher
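
On Linux, the requirements above can be checked quickly from a terminal; the GPU commands only apply if you plan to use an NVIDIA GPU:

    # Verify the Linux requirements listed above
    ldd --version | head -n 1   # glibc (needs 2.27 or higher)
    gcc --version | head -n 1   # gcc (needs 11 or higher)
    g++ --version | head -n 1   # g++ (needs 11 or higher)
    nvidia-smi                  # NVIDIA driver (needs 470.63.01 or higher), GPU only
    nvcc --version              # CUDA Toolkit (needs 11.7 or higher), GPU only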

Troubleshooting

As Jan is still in development, you might run into some common issues.

If you can't find what you need in our troubleshooting guide, feel free to reach out to us for extra help:

  1. Copy your error logs & device specifications.
  2. Go to our Discord & send it to #🆘|get-help channel for further support.

Check the logs to ensure the information is what you intend to send. Note that we retain your logs for only 24 hours, so report any issues promptly.
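
One possible way to gather basic device specifications for step 1 (standard command-line utilities; the GPU command only applies to NVIDIA systems):

    # Collect basic device specs to attach alongside your error logs
    uname -a        # OS and kernel details
    nvidia-smi      # GPU and driver details (NVIDIA only)
    free -h         # available memory (Linux)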

Contributing

Contributions are welcome! Please read the CONTRIBUTING.md file.

Pre-requisites

  • node >= 20.0.0
  • yarn >= 1.22.0
  • make >= 3.81
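
You can confirm that your installed versions meet these minimums with:

    # Check prerequisite versions against the minimums above
    node --version   # needs >= 20.0.0
    yarn --version   # needs >= 1.22.0
    make --version   # needs >= 3.81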

Instructions

  1. Clone the repository and prepare:

    git clone https://github.com/janhq/jan
    cd jan
    git checkout -b DESIRED_BRANCH
  2. Run the app in development mode and use Jan Desktop:

    make dev

This will start the development server and open the desktop app.

For a production build:

# Do steps 1 and 2 in the previous section
# Build the app
make build

This will build the app for MacOS (M1/M2) for production (with code signing already done) and put the result in the dist folder.

Acknowledgements

Jan builds on top of other open-source projects such as llama.cpp and TensorRT-LLM.

Contact

Trust & Safety

Beware of scams!

  • We will never request your personal information.
  • Our product is completely free; no paid version exists.
  • We do not have a token or ICO.
  • We are a bootstrapped company and don't have any external investors (yet). We're open to exploring opportunities with strategic partners who want to tackle our mission together.

License

Jan is free and open source, under the AGPLv3 license.