diff --git a/docs/local_llm.md b/docs/local_llm.md
index e72287cfa3..655fba204c 100644
--- a/docs/local_llm.md
+++ b/docs/local_llm.md
@@ -6,15 +6,15 @@ category: 6580da9a40bb410016b8b0c3
 > 📘 Need help?
 >
-> If you need help visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel.
+> Visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel. Make sure to check the [local LLM troubleshooting page](local_llm_faq) for common issues before raising a new issue or posting on Discord.
+
+> 📘 Using Windows?
 >
-> You can also check the [GitHub discussion page](https://github.com/cpacker/MemGPT/discussions/67), but the Discord server is the official support channel and is monitored more actively.
+> If you're using Windows and are trying to set up MemGPT with local LLMs, we recommend using Anaconda Shell or WSL (for more advanced users). See more Windows installation tips [here](local_llm_faq).
 
 > ⚠️ MemGPT + open LLM failure cases
 >
-> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (eg `edit_core_memory(...)` and interact with the user (`send_message(...)`), so your LLM needs generate outputs that can be parsed into MemGPT function calls.
->
-> Make sure to check the [local LLM troubleshooting page](local_llm_faq) to see common issues before raising a new issue or posting on Discord.
+> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (e.g. `edit_core_memory(...)`) and to interact with the user (e.g. `send_message(...)`), so your LLM needs to generate outputs that can be parsed into MemGPT function calls. See [the local LLM troubleshooting page](local_llm_faq) for more information.
 ### Installing dependencies
 
 To install dependencies required for running local models, run:
 
diff --git a/docs/local_llm_faq.md b/docs/local_llm_faq.md
index d77c1cd32a..7a474b0725 100644
--- a/docs/local_llm_faq.md
+++ b/docs/local_llm_faq.md
@@ -71,3 +71,20 @@ This string is not correct JSON - it is missing closing brackets and has a stray
 ### "Got back an empty response string from ..."
 
 MemGPT asked the server to run the LLM, but got back an empty response. Double-check that your server is running properly and has context length set correctly (it should be set to 8k if using Mistral 7B models).
+
+### "Unable to connect to endpoint" using Windows + WSL
+
+> ⚠️ We recommend using Anaconda Shell, as WSL has been known to have issues passing network traffic between WSL and the Windows host.
+> Check the [WSL issue thread](https://github.com/microsoft/WSL/issues/5211) for more info.
+
+If you would still like to try WSL, you must be on WSL version 2.0.5 or above, installed from the Microsoft Store app.
+You will also need to verify that your WSL network mode is set to "mirrored".
+
+You can do this by checking the `.wslconfig` file in `%USERPROFILE%`.
+
+Add the following if the file does not already contain it:
+```
+[wsl2]
+networkingMode=mirrored # add this line if the [wsl2] section already exists
+```
+
diff --git a/docs/quickstart.md b/docs/quickstart.md
index f64377cc37..9cc10d4051 100644
--- a/docs/quickstart.md
+++ b/docs/quickstart.md
@@ -5,17 +5,15 @@ category: 6580d34ee5e4d00068bf2a1d
 ---
 
 ### Installation
+> 📘 Using Local LLMs?
+>
+> If you're using local LLMs, refer to the MemGPT + open models page [here](local_llm) for additional installation requirements.
 
 To install MemGPT, make sure you have Python installed on your computer, then run:
 ```sh
 pip install pymemgpt
 ```
 
-If you are running LLMs locally, you will want to install MemGPT with the local dependencies by running:
-```sh
-pip install pymemgpt[local]
-```
-
 If you already have MemGPT installed, you can update to the latest version with:
 ```sh
 pip install pymemgpt -U