An extension for using Ollama inside VS Code.
The Easy-Ollama extension provides a simple interface for interacting with the Ollama service inside your usual Visual Studio Code workspace. It lets you send a prompt to one of the predefined models and displays the generated text. If the response contains any code, it is shown as code snippets that can be copied to the clipboard. Code snippets are highlighted for the following languages:
- HTML
- CSS
- JavaScript
- JSON
- Python
- Go
- Rust
Here's an example of usage:
Tip: There is currently no chat capability, so each prompt produces a single answer without any previous context. Please keep that in mind.
To use the Easy-Ollama extension, you need to install Ollama (https://ollama.com), download the models, and run the Ollama service on your local device; please refer to the documentation on the Ollama website. Currently supported models: DeepSeek-R1, Llama 3.2. Minimum VS Code version: 1.97.0.
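Assuming a default Ollama installation, the setup might look like the following (the exact model tags come from the Ollama library and may change over time):

```shell
# Pull the supported models from the Ollama library
ollama pull deepseek-r1
ollama pull llama3.2

# Start the Ollama service (by default it listens on http://localhost:11434)
ollama serve
```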
This extension contributes the following settings:
- `easy-ollama.model`: Choose which of the supported models to use.
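For example, the setting can be configured in your `settings.json` (the value `"llama3.2"` here is illustrative; the accepted values depend on the models the extension supports and you have pulled):

```json
{
  "easy-ollama.model": "llama3.2"
}
```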
There are currently no known issues.
- Runs in the background
- Language badge on code snippets
- Additional language support for code snippets: JSON, Python, Go, Rust
- Clear Snippets button to remove collected snippets
- Webview state is preserved when switching tabs (the view is destroyed only when the Easy-Ollama panel is closed)
Initial working version.
Enjoy!