Commit c86b47c
docs(providers): simplify provider docs and update API key config
- Simplify provider documentation by consolidating API key instructions
- Update config example with new provider API keys (XAI, Groq, Deepseek)
- Improve local model section clarity

Co-authored-by: Bob <[email protected]>
ErikBjare and TimeToBuildBob committed Dec 11, 2024
1 parent 84c8316 commit c86b47c
Showing 2 changed files with 15 additions and 65 deletions.
6 changes: 4 additions & 2 deletions docs/config.rst
@@ -33,15 +33,17 @@ Here is an example:
 OPENAI_API_KEY = ""
 ANTHROPIC_API_KEY = ""
 OPENROUTER_API_KEY = ""
-AZURE_OPENAI_API_KEY = ""
+XAI_API_KEY = ""
+GROQ_API_KEY = ""
+DEEPSEEK_API_KEY = ""
 # Uncomment to use with Ollama
 #MODEL = "local/<model-name>"
 #OPENAI_BASE_URL = "http://localhost:11434/v1"
 
 The ``prompt`` section contains options for the prompt.
 
-The ``env`` section contains environment variables that gptme will fall back to if they are not set in the shell environment. This is useful for setting defaults for API keys and models.
+The ``env`` section contains environment variables that gptme will fall back to if they are not set in the shell environment. This is useful for setting the default model and API keys for :doc:`providers`.
 
 
 Project config
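For reference, the ``[env]`` section of ``gptme.toml`` after this change would look roughly as follows. This is a sketch assembled from the diff above; all key values are placeholders, and you only need to set the keys for providers you actually use:

```toml
# Sketch of the [env] section in gptme.toml after this commit.
# Values are placeholders; set only the keys for providers you use.
[env]
OPENAI_API_KEY = ""
ANTHROPIC_API_KEY = ""
OPENROUTER_API_KEY = ""
XAI_API_KEY = ""
GROQ_API_KEY = ""
DEEPSEEK_API_KEY = ""
# Uncomment to use with Ollama
#MODEL = "local/<model-name>"
#OPENAI_BASE_URL = "http://localhost:11434/v1"
```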
74 changes: 11 additions & 63 deletions docs/providers.rst
@@ -15,70 +15,20 @@ To select a provider and model, run ``gptme`` with the ``--model`` flag set to `
 On first startup, if ``--model`` is not set, and no API keys are set in the config or environment it will be prompted for. It will then auto-detect the provider, and save the key in the configuration file.
 
-You can persist the below environment variables in the :doc:`config` file.
+Use the ``[env]`` section in the ``gptme.toml`` :doc:`config` file to store API keys using the same format as the environment variables:
 
-OpenAI
-------
+- ``OPENAI_API_KEY="your-api-key"``
+- ``ANTHROPIC_API_KEY="your-api-key"``
+- ``OPENROUTER_API_KEY="your-api-key"``
+- ``GEMINI_API_KEY="your-api-key"``
+- ``GROQ_API_KEY="your-api-key"``
+- ``XAI_API_KEY="your-api-key"``
 
-To use OpenAI, set your API key:
+.. rubric:: Local
 
-.. code-block:: sh
-
-    export OPENAI_API_KEY="your-api-key"
-
-Anthropic
----------
-
-To use Anthropic, set your API key:
-
-.. code-block:: sh
-
-    export ANTHROPIC_API_KEY="your-api-key"
-
-OpenRouter
-----------
+You can use local LLM models using any OpenAI API-compatible server.
 
-To use OpenRouter, set your API key:
-
-.. code-block:: sh
-
-    export OPENROUTER_API_KEY="your-api-key"
-
-Gemini
-----------
-
-To use Gemini, set your API key:
-
-.. code-block:: sh
-
-    export GEMINI_API_KEY="your-api-key"
-
-Groq
-----
-
-To use Groq, set your API key:
-
-.. code-block:: sh
-
-    export GROQ_API_KEY="your-api-key"
-
-xAI
----
-
-To use xAI, set your API key:
-
-.. code-block:: sh
-
-    export XAI_API_KEY="your-api-key"
-
-Local
------
-
-There are several ways to run local LLM models in a way that exposes a OpenAI API-compatible server.
-
-Here's we will cover how to achieve that with ``ollama``.
-
-You first need to install ``ollama``, then you can run it with:
+To achieve that with ``ollama``, install it then run:
 
 .. code-block:: sh
@@ -88,6 +38,4 @@ You first need to install ``ollama``, then you can run it with:
 .. note::
 
-    Small models will not reliably follow the system prompt, and will thus fail to use tools, severely limiting the usefulness of gptme.
-
-    The smallest model which performs somewhat adequately is Llama 3.1 70B. You can find an overview of how different models perform on the :doc:`evals` page.
+    Small models won't work well with tools, severely limiting the usefulness of gptme. You can find an overview of how different models perform on the :doc:`evals` page.
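Putting the local-model docs into practice, a session would look roughly like the sketch below. This is illustrative, not part of the commit: the model name ``llama3.2`` and the prompt are example placeholders, while the ``OPENAI_BASE_URL`` value and the ``local/<model-name>`` model format come from the docs above:

```sh
# Sketch: serve a local model with ollama and point gptme at it.
# Model name and prompt are illustrative placeholders.
ollama pull llama3.2    # download a model
ollama serve            # start the OpenAI-compatible server on port 11434

# In another shell, configure gptme to use the local server:
export OPENAI_BASE_URL="http://localhost:11434/v1"
gptme --model local/llama3.2 "hello"
```

Equivalently, ``MODEL`` and ``OPENAI_BASE_URL`` can be set in the ``[env]`` section of ``gptme.toml``, as shown in the config diff above.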
