error pops on local search while embedding #39
I ran into a similar problem as well:
The error pops up during local search while embedding; global search works fine.
creating embedding llm client with {'api_key': 'REDACTED,len=32', 'type': "openai_embedding", 'model': 'nomic-embed-text', 'max_tokens': 4000, 'temperature': 0, 'top_p': 1, 'request_timeout': 180.0, 'api_base': 'http://localhost:11434/api', 'api_version': None, 'organization': None, 'proxy': None, 'cognitive_services_endpoint': None, 'deployment_name': None, 'model_supports_json': None, 'tokens_per_minute': 0, 'requests_per_minute': 0, 'max_retries': 10, 'max_retry_wait': 10.0, 'sleep_on_rate_limit_recommendation': True, 'concurrent_requests': 50}
Error embedding chunk {'OpenAIEmbedding': "'NoneType' object is not iterable"}
Traceback (most recent call last):
  File "/home/ec2-user/anaconda3/envs/pytorch_p310/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/ec2-user/anaconda3/envs/pytorch_p310/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/query/__main__.py", line 76, in <module>
    run_local_search(
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/query/cli.py", line 154, in run_local_search
    result = search_engine.search(query=query)
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/query/structured_search/local_search/search.py", line 118, in search
    context_text, context_records = self.context_builder.build_context(
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/query/structured_search/local_search/mixed_context.py", line 139, in build_context
    selected_entities = map_query_to_entities(
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/query/context_builder/entity_extraction.py", line 55, in map_query_to_entities
    search_results = text_embedding_vectorstore.similarity_search_by_text(
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/vector_stores/lancedb.py", line 118, in similarity_search_by_text
    query_embedding = text_embedder(text)
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/query/context_builder/entity_extraction.py", line 57, in <lambda>
    text_embedder=lambda t: text_embedder.embed(t),
  File "/home/ec2-user/SageMaker/DataScienceSharedRepo/Kbot/supplierKbot/graphrag-local-ollama/graphrag/query/llm/oai/embedding.py", line 96, in embed
    chunk_embeddings = np.average(chunk_embeddings, axis=0, weights=chunk_lens)
  File "/home/ec2-user/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/numpy/lib/function_base.py", line 550, in average
    raise ZeroDivisionError(
ZeroDivisionError: Weights sum to zero, can't be normalized
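The ZeroDivisionError at the end is only a symptom: every chunk failed to embed (that is what the "'NoneType' object is not iterable" line means), so chunk_embeddings and chunk_lens stay empty and np.average is handed weights that sum to zero. A minimal standalone sketch of that failure mode (not the project's code):

```python
import numpy as np

# If the embedding client returns nothing for every chunk, the embed()
# loop collects no vectors and no chunk lengths...
chunk_embeddings = []
chunk_lens = []

# ...and np.average then refuses to normalize weights that sum to zero,
# which is exactly the error at the bottom of the traceback.
try:
    np.average(chunk_embeddings, axis=0, weights=chunk_lens)
except ZeroDivisionError as exc:
    print(exc)  # Weights sum to zero, can't be normalized
```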
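Since the config above points the OpenAI-style embedding client at http://localhost:11434/api, a first thing worth checking is whether Ollama can return an embedding for that model at all. A quick probe against Ollama's native embeddings endpoint (this assumes the requests package is installed and nomic-embed-text has been pulled; it is independent of graphrag):

```python
import requests

# Ask Ollama directly for one embedding of a short string.
resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "hello world"},
    timeout=30,
)
resp.raise_for_status()
vec = resp.json().get("embedding")
print("embedding length:", len(vec) if vec else vec)
```

If this returns a non-empty vector, the model side is fine and the problem is more likely the api_base or payload format the OpenAI-compatible client is sending; if it fails, the embedding model itself is not being served.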