From 922f79e7572350191ee88917365cd3a4752ceb86 Mon Sep 17 00:00:00 2001
From: writinwaters <93570324+writinwaters@users.noreply.github.com>
Date: Mon, 2 Sep 2024 14:31:31 +0800
Subject: [PATCH] Fixed a broken link (#2190)

To fix a broken link

### Type of change

- [x] Documentation Update
---
 docs/references/faq.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/references/faq.md b/docs/references/faq.md
index 3d0acc12a6..2be118aea2 100644
--- a/docs/references/faq.md
+++ b/docs/references/faq.md
@@ -357,7 +357,7 @@ This exception occurs when starting up the RAGFlow server. Try the following:
 
 1. Right click the desired dialog to display the **Chat Configuration** window.
 2. Switch to the **Model Setting** tab and adjust the **Max Tokens** slider to get the desired length.
-3. Click **OK** to confirm your change. 
+3. Click **OK** to confirm your change.
 
 ### 2. What does Empty response mean? How to set it?
 
@@ -370,7 +370,7 @@ You limit what the system responds to what you specify in **Empty response** if
 
 ### 4. How to run RAGFlow with a locally deployed LLM?
 
-You can use Ollama to deploy local LLM. See [here](https://github.com/infiniflow/ragflow/blob/main/docs/guides/deploy_local_llm.md) for more information.
+You can use Ollama to deploy local LLM. See [here](../guides/deploy_local_llm.mdx) for more information.
 
 ### 5. How to link up ragflow and ollama servers?