
Local GPT plugin for Obsidian

Demo (not sped up): MacBook Pro 13, M1, 16GB, Ollama, orca-mini.

Local GPT assistance for maximum privacy and offline access.
The plugin lets you open a context menu on selected text and pick an AI assistant action.

Default actions:

  • Continue writing
  • Summarize text
  • Fix spelling and grammar
  • Find action items in text
  • General help (just use selected text as a prompt for any purpose)

You can also add your own actions, share the best ones, or get new ones from the community.

Supported AI Providers:

  • Ollama
  • OpenAI compatible server

Limitations:

  • No mobile support.

Installation

Obsidian plugin store

This plugin is available in the Obsidian community plugin store: https://obsidian.md/plugins?id=local-gpt

BRAT

You can install this plugin via BRAT: pfrankov/obsidian-local-gpt

Install LLM

Ollama (recommended)

  1. Install Ollama (no Windows support yet).
  2. Install orca-mini (the default model) with ollama pull orca-mini, or pull any preferred model from the library.

Additionally, if you want to enable streaming completions with Ollama, you should run it in API mode: OLLAMA_ORIGINS='*' ollama serve.
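Put together, a minimal terminal session for the two steps above might look like this (a sketch assuming a macOS or Linux shell and the default orca-mini model):

```sh
# Fetch the default model used by the plugin.
ollama pull orca-mini

# Serve the Ollama API with permissive CORS origins so the plugin can stream completions.
OLLAMA_ORIGINS='*' ollama serve
```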

OpenAI compatible server

There are several options for running a local OpenAI-compatible server. Here is an example for llama.cpp:

  1. Install llama.cpp and follow the build instructions for your OS.
  2. Download a model trained on the ChatML dialog format, for example Orca 2.
  3. Run the server by calling ./server -c 4096 --host 0.0.0.0 -t 16 --mlock -m models/orca-2-7b.Q4_K_M.gguf, or as described in the documentation (see the sketch after this list).
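A rough end-to-end sketch of those three steps is shown below; the clone URL is the upstream llama.cpp repository, and the model path, context size, and thread count are only examples to adapt to your machine:

```sh
# Get and build llama.cpp (follow the build instructions in its README for your OS).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make    # or use cmake; see the llama.cpp build docs

# Start the local server with an example ChatML-style model (Orca 2, 4-bit quantized).
./server -c 4096 --host 0.0.0.0 -t 16 --mlock -m models/orca-2-7b.Q4_K_M.gguf
```

You can then point the plugin's OpenAI-compatible server settings at that address (llama.cpp's server typically listens on port 8080).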

Configure Obsidian hotkey (optional)

  1. Open Obsidian Settings
  2. Go to Hotkeys
  3. Filter for "Local" and you should see "Local GPT: Show context menu"
  4. Click on the + icon and press a hotkey (e.g. ⌘ + M)

Roadmap

  • Ability to select models from the list instead of typing their names
  • Ability to share and apply presets (system prompt + prompt + model)
  • Additional AI providers (OpenAI, etc.)
  • Streaming completions
  • Optional settings for prompts (top_p, top_k, temperature, repeat_penalty)
  • Fallback for actions if the first URL is unavailable (e.g. a remote GPU)
  • Changing order of the actions
  • Taking your local documents into account in results, as described here: https://ollama.ai/blog/llms-in-obsidian

Other AI providers

If you would like to use other providers, please let me know in the discussions.

My other Obsidian plugins

  • Colored Tags, which colorizes tags in distinguishable colors.

Inspired by
