
Error occurs when using DeepSeek and LLMGraphTransformer #29952

Closed
M0rtzz opened this issue Feb 24, 2025 · 4 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


M0rtzz commented Feb 24, 2025

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

An error occurs when using DeepSeek V3 with LLMGraphTransformer. The code is as follows:

from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_community.graphs.graph_document import GraphDocument
from langchain.docstore.document import Document
from langchain_deepseek import ChatDeepSeek
from dotenv import load_dotenv
from typing import List
import os


env_path = "private/.env"
load_dotenv(env_path)


class TextToGraph:
    def __init__(
        self,
        model: str = None,
        api_base: str = None,
        api_key: str = None,
    ):
        self.llm = ChatDeepSeek(
            model=model,
            api_base=api_base,
            api_key=api_key,
            model_kwargs={"response_format": {"type": "json_object"}},
        )
        self.llm_transformer = LLMGraphTransformer(llm=self.llm)

    def processText(self, text: str) -> List[GraphDocument]:
        print(self.llm)
        doc = Document(page_content=text)
        return self.llm_transformer.convert_to_graph_documents([doc])

    def processFile(self, file_path: str) -> List[GraphDocument]:
        if not os.path.exists(file_path):
            raise FileNotFoundError(f"File {file_path} not found")

        with open(file_path, "r", encoding="utf-8") as file:
            text = file.read()
        return self.processText(text)


if __name__ == "__main__":
    processor = TextToGraph(
        model="deepseek-chat",
        api_base="https://api.deepseek.com",
        api_key=os.getenv("DEEPSEEK_API_KEY"),
    )  # DeepSeek-V3
    file_path = "files/plaintxt/test.txt"
    graph_documents = processor.processFile(file_path)
    for graph_doc in graph_documents:
        print(graph_doc)
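
For completeness, the snippet above expects private/.env to define DEEPSEEK_API_KEY. A minimal sketch (using the same python-dotenv dependency already imported above) to confirm the key actually loads before constructing the model; a missing key would mean ChatDeepSeek is built with api_key=None, which is worth ruling out separately:

from dotenv import load_dotenv
import os

load_dotenv("private/.env")
# Prints False if the key is missing or the path is wrong.
print("DEEPSEEK_API_KEY set:", os.getenv("DEEPSEEK_API_KEY") is not None)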

Error Message and Stack Trace (if applicable)

The error message is as follows:

Traceback (most recent call last):
  File "rag/_graph.py", line 56, in <module>
    graph_documents = processor.processFile(file_path)
  File "rag/_graph.py", line 43, in processFile
    return self.processText(text)
  File "rag/_graph.py", line 35, in processText
    return self.llm_transformer.convert_to_graph_documents([doc])
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py", line 932, in convert_to_graph_documents
    return [self.process_response(document, config) for document in documents]
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py", line 932, in <listcomp>
    return [self.process_response(document, config) for document in documents]
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_experimental/graph_transformers/llm.py", line 839, in process_response
    raw_schema = self.chain.invoke({"input": text}, config=config)
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3024, in invoke
    input = context.run(step.invoke, input, config)
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3729, in invoke
    output = {key: future.result() for key, future in zip(steps, futures)}
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3729, in <dictcomp>
    output = {key: future.result() for key, future in zip(steps, futures)}
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 3713, in _invoke_step
    return context.run(
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 5360, in invoke
    return self.bound.invoke(
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 284, in invoke
    self.generate_prompt(
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 860, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 690, in generate
    self._generate_with_cache(
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 925, in _generate_with_cache
    result = self._generate(
  File "~/Programs/miniconda3/envs/py310/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 775, in _generate
    response = self.root_client.beta.chat.completions.parse(**payload)
AttributeError: 'NoneType' object has no attribute 'beta'

Description

The model's root_client is None: the traceback shows langchain_openai's _generate calling self.root_client.beta.chat.completions.parse(**payload), and accessing .beta on None raises the AttributeError above.
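
A minimal sketch of the failing attribute access in isolation, assuming ChatDeepSeek exposes the same root_client attribute as the langchain_openai base class that appears in the traceback (placeholder key; no request is sent):

from langchain_deepseek import ChatDeepSeek

# Construct the model exactly as in the repro above, with a placeholder key.
llm = ChatDeepSeek(
    model="deepseek-chat",
    api_base="https://api.deepseek.com",
    api_key="sk-placeholder",
    model_kwargs={"response_format": {"type": "json_object"}},
)

# The traceback ends in langchain_openai's _generate calling
# self.root_client.beta.chat.completions.parse(**payload). If root_client is
# None here, accessing .beta on it reproduces the AttributeError.
print(getattr(llm, "root_client", None))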

System Info

❯ python -m langchain_core.sys_info

System Information

OS: Linux
OS Version: #1 SMP Tue Nov 5 00:21:55 UTC 2024
Python Version: 3.10.16 (main, Dec 11 2024, 16:24:50) [GCC 11.2.0]

Package Information

langchain_core: 0.3.37
langchain: 0.3.19
langchain_community: 0.3.16
langsmith: 0.3.6
langchain_deepseek: 0.1.2
langchain_experimental: 0.3.4
langchain_openai: 0.3.6
langchain_text_splitters: 0.3.6

Optional packages not installed

langserve

Other Dependencies

aiohttp: 3.11.12
aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
dataclasses-json: 0.6.7
httpx: 0.28.1
httpx-sse: 0.4.0
jsonpatch<2.0,>=1.33: Installed. No version info available.
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.34: Installed. No version info available.
langchain-core<1.0.0,>=0.3.35: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-openai<1.0.0,>=0.3.5: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.6: Installed. No version info available.
langchain-together;: Installed. No version info available.
langchain-xai;: Installed. No version info available.
langsmith-pyo3: Installed. No version info available.
langsmith<0.4,>=0.1.125: Installed. No version info available.
langsmith<0.4,>=0.1.17: Installed. No version info available.
numpy: 1.26.4
numpy<2,>=1.26.4;: Installed. No version info available.
numpy<3,>=1.26.2;: Installed. No version info available.
openai<2.0.0,>=1.58.1: Installed. No version info available.
orjson: 3.10.15
packaging<25,>=23.2: Installed. No version info available.
pydantic: 2.10.2
pydantic-settings: 2.8.0
pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
pytest: Installed. No version info available.
PyYAML: 6.0.2
PyYAML>=5.3: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
requests<3,>=2: Installed. No version info available.
rich: 13.9.4
SQLAlchemy: 2.0.38
SQLAlchemy<3,>=1.4: Installed. No version info available.
tenacity: 9.0.0
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
tiktoken<1,>=0.7: Installed. No version info available.
typing-extensions>=4.7: Installed. No version info available.
zstandard: 0.23.0

dosubot (bot) added the 🤖:bug label Feb 24, 2025
@iharshlalakiya

Hi @M0rtzz, I can help you with this problem.


M0rtzz commented Mar 2, 2025

> Hi @M0rtzz, I can help you with this problem.

How can we solve this?

@iharshlalakiya

Try this code:

from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_community.graphs.graph_document import GraphDocument
from langchain.docstore.document import Document
from langchain_deepseek import ChatDeepSeek
from dotenv import load_dotenv
from typing import List
import os

env_path = ".env"
load_dotenv(env_path)

class TextToGraph:
    def __init__(self, model: str = None, api_base: str = None, api_key: str = None):
        
        if not api_key:
            raise ValueError("DeepSeek API key is missing. Please set it in the .env file.")

        print(f"Initializing ChatDeepSeek with API Base: {api_base}")

        self.llm = ChatDeepSeek(
            model=model,
            api_base=api_base,
            api_key=api_key
        )

        if self.llm is None:
            raise ValueError("Failed to initialize ChatDeepSeek. Check your API credentials.")

        self.llm_transformer = LLMGraphTransformer(llm=self.llm)

    def process_text(self, text: str) -> List[GraphDocument]:
        if not text.strip():
            raise ValueError("Input text is empty. Provide valid content.")
        
        doc = Document(page_content=text)
        return self.llm_transformer.convert_to_graph_documents([doc])

    def process_file(self, file_path: str) -> List[GraphDocument]:
        if not os.path.exists(file_path):
            raise FileNotFoundError(f"File '{file_path}' not found.")

        with open(file_path, "r", encoding="utf-8", errors="ignore") as file:
            text = file.read()

        return self.process_text(text)


if __name__ == "__main__":
    deepseek_api_key = os.getenv("DEEPSEEK_API_KEY")
    deepseek_api_base = "https://api.deepseek.com"

    processor = TextToGraph(
        model="deepseek-chat",
        api_base=deepseek_api_base,
        api_key=deepseek_api_key
    )

    file_path = "test.txt"
    try:
        graph_documents = processor.process_file(file_path)
        for graph_doc in graph_documents:
            print(graph_doc)
    except Exception as e:
        print(f"Error: {e}")

Other Dependencies

aiohappyeyeballs==2.4.6
aiohttp==3.11.13
aiosignal==1.3.2
annotated-types==0.7.0
anyio==4.8.0
attrs==25.1.0
certifi==2025.1.31
charset-normalizer==3.4.1
colorama==0.4.6
dataclasses-json==0.6.7
distro==1.9.0
dotenv==0.9.9
frozenlist==1.5.0
greenlet==3.1.1
h11==0.14.0
httpcore==1.0.7
httpx==0.28.1
httpx-sse==0.4.0
idna==3.10
jiter==0.8.2
jsonpatch==1.33
jsonpointer==3.0.0
langchain==0.3.19
langchain-community==0.3.18
langchain-core==0.3.40
langchain-deepseek==0.1.2
langchain-experimental==0.3.4
langchain-openai==0.3.7
langchain-text-splitters==0.3.6
langsmith==0.3.11
marshmallow==3.26.1
multidict==6.1.0
mypy-extensions==1.0.0
numpy==1.26.4
openai==1.65.2
orjson==3.10.15
packaging==24.2
propcache==0.3.0
pydantic==2.10.6
pydantic-settings==2.8.1
pydantic_core==2.27.2
python-dotenv==1.0.1
PyYAML==6.0.2
regex==2024.11.6
requests==2.32.3
requests-toolbelt==1.0.0
sniffio==1.3.1
SQLAlchemy==2.0.38
tenacity==9.0.0
tiktoken==0.9.0
tqdm==4.67.1
typing-inspect==0.9.0
typing_extensions==4.12.2
urllib3==2.3.0
yarl==1.18.3
zstandard==0.23.0


M0rtzz commented Mar 2, 2025

@iharshlalakiya

Thank you, the issue has been resolved. I found that certain platform APIs were causing this bug.

M0rtzz closed this as completed Mar 2, 2025