searxng.engines.wikidata: Fail to initialize #108
Comments
@t-dsai, I'm having the same issue. Please tell me how you resolved it.
Same issue here.
Me neither.
Same issue.
For me it was a proxy issue. I solved it by setting the outgoing proxy in my settings.yml, as described here: https://docs.searxng.org/admin/settings/settings_outgoing.html
Cool, excuse my ignorance, but what proxy?
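For reference, the `outgoing:` section mentioned above lives in searxng's settings.yml. The following is only a minimal sketch based on the linked docs; the proxy URLs are placeholders and should be replaced with whatever proxy your network actually requires:

```yaml
# settings.yml -- route searxng's outgoing HTTP requests through a proxy
outgoing:
  request_timeout: 6.0        # seconds to wait for upstream engines
  proxies:
    all://:                   # apply to every outgoing request
      - http://127.0.0.1:8080                          # placeholder HTTP proxy
      # - socks5h://user:pass@proxy.example.org:1080   # SOCKS is also supported
```

If your host reaches the internet without a proxy, this section is not needed and the wikidata initialization failure likely has a different cause.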
Hi,
Thank you for sharing your work.
On a Debian 12 machine with Docker running, I already have Ollama, with the all-minilm, nomic-embed-text, and spooknik/hermes-2-pro-mistral-7b models available. My `.env` file has only one line: `OLLAMA_HOST=http://localhost:11434`.
I can see from `docker ps -a` that the following containers are present (in the exited state after `ctrl+c`):
0e363e666648 nilsherzig/llocalsearch-frontend:latest
2d415fe8d659 chromadb/chroma
40dfd8260b28 searxng/searxng:latest
a49ae53eaf6f redis:alpine
7bb5fd2c08d1 nilsherzig/llocalsearch-backend:latest
I cloned the LLocalSearch repo. When I run `docker compose up`, I get the following errors (and many more similar ones after them). Can anyone help me get LLocalSearch running successfully?
Please note that my system does not have anyio, httpcore, or httpx in `/usr/lib/python3.11`. I do have a few Python environments where these packages are installed. If these packages are necessary to run LLocalSearch, can LLocalSearch use a custom Python environment to find them?
Thanks in advance.
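A side note on the anyio/httpcore/httpx question above: searxng runs inside its own container and uses the Python packages bundled in that image, so the host's /usr/lib/python3.11 (or any host virtualenv) is not consulted. As a rough, hypothetical check, assuming the searxng/searxng image puts python3 on its PATH with its dependencies installed for that interpreter, you could try:

```sh
# Start a throwaway container from the searxng image, bypass its normal entrypoint,
# and try importing the three packages with the image's own Python.
docker run --rm --entrypoint python3 searxng/searxng:latest \
  -c "import anyio, httpcore, httpx; print('packages are present in the image')"
```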