docs: Improve Local LLM information and add WSL Troubleshooting #752

Merged · 7 commits · Jan 2, 2024
Changes from 5 commits
10 changes: 5 additions & 5 deletions docs/local_llm.md
@@ -6,15 +6,15 @@ category: 6580da9a40bb410016b8b0c3

> 📘 Need help?
>
> If you need help visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel.
> Visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel. Make sure to check the [local LLM troubleshooting page](local_llm_faq) to see common issues before raising a new issue or posting on Discord.

> 📘 Using Windows?
>
> You can also check the [GitHub discussion page](https://github.com/cpacker/MemGPT/discussions/67), but the Discord server is the official support channel and is monitored more actively.
> If you're using Windows and are trying to get MemGPT set up with local LLMs, we recommend using Anaconda Shell, or WSL (for more advanced users). See more Windows installation tips [here](local_llm_faq).

> ⚠️ MemGPT + open LLM failure cases
>
> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (e.g. `edit_core_memory(...)`) and interact with the user (`send_message(...)`), so your LLM needs to generate outputs that can be parsed into MemGPT function calls.
>
> Make sure to check the [local LLM troubleshooting page](local_llm_faq) to see common issues before raising a new issue or posting on Discord.
> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (e.g. `edit_core_memory(...)`) and interact with the user (`send_message(...)`), so your LLM needs to generate outputs that can be parsed into MemGPT function calls. See [LINK TO FAQ] for more information.
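For illustration, a "parseable" output is a single JSON object naming the function to call and its arguments. The sketch below is only a rough example of the general shape; field names such as `function` and `params` and the exact schema depend on your MemGPT version and model wrapper:

```json
{
  "function": "send_message",
  "params": {
    "message": "Hello! I'm MemGPT, how can I help you today?"
  }
}
```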

### Installing dependencies
To install dependencies required for running local models, run:
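Based on the quickstart section further down, this presumably refers to installing MemGPT with the local extras:

```sh
pip install pymemgpt[local]
```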
16 changes: 16 additions & 0 deletions docs/local_llm_faq.md
@@ -71,3 +71,19 @@ This string is not correct JSON - it is missing closing brackets and has a stray
### "Got back an empty response string from ..."

MemGPT asked the server to run the LLM, but got back an empty response. Double-check that your server is running properly and has context length set correctly (it should be set to 8k if using Mistral 7B models).
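For example, if you are serving the model with llama.cpp's bundled server (shown only as an illustrative setup; the model path and port are placeholders), the context length is set with the `-c` flag:

```sh
# start a llama.cpp server with an 8k context window
./server -m models/mistral-7b-instruct.Q4_K_M.gguf -c 8192 --host 0.0.0.0 --port 8080
```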

### "Unable to connect to endpoint" using Windows + WSL

> ⚠️ We recommend using Anaconda Shell, as WSL has been known to have issues passing network traffic between WSL and the Windows host.

If you would still like to try WSL, you must be on WSL version 2.0.5 or above, installed from the Microsoft Store app.
You will also need to verify that your WSL networking mode is set to "mirrored".

You can do this by checking the `.wslconfig` file in `%USERPROFILE%`.

Add the following if the file does not already contain it:
```
[wsl2]
networkingMode=mirrored # if a [wsl2] section already exists, just add this line to it
```
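After editing `.wslconfig`, WSL needs to be restarted for the change to take effect. A rough sketch, run from PowerShell on the Windows host (mirrored networking also assumes a sufficiently recent Windows 11 build):

```sh
# confirm you are on WSL 2.0.5 or above (requires the Microsoft Store version of WSL)
wsl --version

# shut down all running WSL instances so the new .wslconfig is read on next launch
wsl --shutdown
```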

8 changes: 3 additions & 5 deletions docs/quickstart.md
@@ -5,17 +5,15 @@ category: 6580d34ee5e4d00068bf2a1d
---

### Installation
> 📘 Using Local LLMs?
>
> If you're using local LLMs, refer to the MemGPT + open models page [here](local_llm) for additional installation requirements.

To install MemGPT, make sure you have Python installed on your computer, then run:
```sh
pip install pymemgpt
```

If you are running LLMs locally, you will want to install MemGPT with the local dependencies by running:
```sh
pip install pymemgpt[local]
```

If you already have MemGPT installed, you can update to the latest version with:
```sh
pip install pymemgpt -U