ValueError: "Minimax" object has no field "_client" #11648

Closed
2 of 14 tasks
zirenlegend opened this issue Oct 11, 2023 · 2 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature Ɑ: models Related to LLMs or chat model modules

Comments


zirenlegend commented Oct 11, 2023

System Info

langchain==0.0.312

code:

from langchain.llms import Minimax

minimax = Minimax(minimax_api_key=minimax_api_key, minimax_group_id=minimax_group_id)

print(minimax("What is the difference between panda and bear?"))

or:

from langchain.chat_models import MiniMaxChat
from langchain.schema import HumanMessage

chat = MiniMaxChat()
res = chat(
    [
        HumanMessage(
            content="Translate this sentence from English to French. I love programming."
        )
    ]
)
print(res)

error log:

File "E:\AI\Projects\AIProject\AIApiTest\langchain_minimax.py", line 43, in <module>
    llm = Minimax()
File "D:\anaconda3\envs\ai\lib\site-packages\langchain\llms\minimax.py", line 113, in __init__
    self._client = _MinimaxEndpointClient(
File "pydantic\main.py", line 357, in pydantic.main.BaseModel.__setattr__
ValueError: "Minimax" object has no field "_client"
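The traceback points at Pydantic v1's strict BaseModel.__setattr__: assignment is only allowed to declared model fields, and names with a leading underscore are never collected as fields, so self._client = ... raises. The following is a minimal, dependency-free sketch that mimics that mechanism in plain Python (the FieldCheckingModel class and its field-collection logic are illustrative stand-ins, not Pydantic's actual implementation):

```python
# Pure-Python mimic of why Pydantic v1 raises here: __setattr__ only
# accepts declared fields, and underscore-prefixed annotations are
# skipped when fields are collected. Class names are hypothetical.

class FieldCheckingModel:
    """Mimics Pydantic v1's strict attribute assignment."""

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Collect annotated names as "fields", skipping private
        # (underscore-prefixed) names, as Pydantic v1 does.
        cls.__fields__ = {
            name for name in getattr(cls, "__annotations__", {})
            if not name.startswith("_")
        }

    def __setattr__(self, name, value):
        if name not in self.__fields__:
            raise ValueError(f'"{type(self).__name__}" object has no field "{name}"')
        object.__setattr__(self, name, value)


class Minimax(FieldCheckingModel):
    minimax_api_key: str = ""
    _client: object = None  # underscore name -> NOT collected as a field

    def __init__(self):
        self._client = object()  # rejected, like the report above


try:
    Minimax()
except ValueError as e:
    print(e)  # -> "Minimax" object has no field "_client"
```

This reproduces the exact shape of the error message without needing Pydantic installed.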

Who can help?

No response

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

code:

from langchain.llms import Minimax

minimax = Minimax(minimax_api_key=minimax_api_key, minimax_group_id=minimax_group_id)

print(minimax("What is the difference between panda and bear?"))

or:

from langchain.chat_models import MiniMaxChat
from langchain.schema import HumanMessage

chat = MiniMaxChat()
res = chat(
    [
        HumanMessage(
            content="Translate this sentence from English to French. I love programming."
        )
    ]
)
print(res)

Expected behavior

Minimax and MiniMaxChat should initialize without raising this ValueError.

@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature labels Oct 11, 2023

dosubot bot commented Oct 11, 2023

🤖

Hello,

Thank you for reaching out and providing detailed information about the issue you're facing. The error occurs because "_client" is not declared as a field on the "Minimax" class (or any of its parent classes): Pydantic v1 models reject attribute assignment to undeclared names, and leading-underscore attributes are never collected as model fields.

The "_client" attribute is an instance of the "_MinimaxEndpointClient" class, which is used to interact with the Minimax API. It is responsible for making HTTP requests to the Minimax API and handling the responses.

In the "Minimax" class, the "_client" attribute is used in the "_call" method to make a POST request to the Minimax API. The request is made with the parameters defined in the "request" dictionary. The response from the API is then returned as the output of the "_call" method.

Here is the relevant code:

def __init__(self, **data: Any):
    super().__init__(**data)
    self._client = _MinimaxEndpointClient(
        host=self.minimax_api_host,
        api_key=self.minimax_api_key,
        group_id=self.minimax_group_id,
    )

def _call(
    self,
    prompt: str,
    stop: Optional[List[str]] = None,
    run_manager: Optional[CallbackManagerForLLMRun] = None,
    **kwargs: Any,
) -> str:
    request = self._default_params
    request["messages"] = [{"sender_type": "USER", "text": prompt}]
    request.update(kwargs)
    text = self._client.post(request)
    if stop is not None:
        text = enforce_stop_tokens(text, stop)

    return text
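The last step of _call above truncates the completion at the first stop sequence via enforce_stop_tokens. A minimal stdlib sketch of that behavior (illustrative; the real helper lives in langchain's utilities and may differ in detail):

```python
# Cut a completion off at the earliest occurrence of any stop string,
# as _call does after receiving the API response. Regex-escaping the
# stop strings keeps characters like "." or "?" literal.
import re
from typing import List


def enforce_stop_tokens(text: str, stop: List[str]) -> str:
    """Return `text` truncated at the first occurrence of any stop string."""
    pattern = "|".join(re.escape(s) for s in stop)
    return re.split(pattern, text)[0]


print(enforce_stop_tokens("Hello world\nHuman: hi", ["\nHuman:"]))  # -> Hello world
```

If no stop string occurs in the text, the input is returned unchanged.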

You can find this code in the libs/langchain/langchain/llms/minimax.py file in the LangChain repository.

To resolve this issue, you need to ensure that the "_client" attribute is correctly initialized in the "Minimax" class. If you're still facing the issue after checking this, please provide more details about your environment and the steps you're taking so we can better assist you.

Best regards,
Dosu

baskaryan added a commit that referenced this issue Oct 30, 2023
e #11648 Minimax llm failed to initialize

The idea of this fix is #10917 (comment):
do not use a leading underscore in the Pydantic model class attribute

---------
Co-authored-by: [email protected] <[email protected]>
Co-authored-by: Bagatur <[email protected]>
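The fix direction described in the commit (drop the leading underscore so the attribute is a real Pydantic field) can be sketched without any dependencies by reusing a strict, Pydantic-v1-style setattr mimic. All class names below are illustrative stand-ins for the langchain source, not the actual implementation:

```python
# Sketch of the "do not use underscore" fix: with strict field-checked
# assignment, renaming _client to client makes it a declared field, so
# __init__ can assign it. Classes here are hypothetical stand-ins.

class FieldCheckingModel:
    """Mimics Pydantic v1: only declared, non-underscore names are fields."""

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.__fields__ = {
            name for name in getattr(cls, "__annotations__", {})
            if not name.startswith("_")
        }

    def __setattr__(self, name, value):
        if name not in self.__fields__:
            raise ValueError(f'"{type(self).__name__}" object has no field "{name}"')
        object.__setattr__(self, name, value)


class EndpointClient:
    """Stand-in for the real _MinimaxEndpointClient HTTP wrapper."""

    def __init__(self, api_key: str, group_id: str):
        self.api_key = api_key
        self.group_id = group_id


class Minimax(FieldCheckingModel):
    minimax_api_key: str = ""
    minimax_group_id: str = ""
    client: EndpointClient = None  # renamed from _client -> now a field

    def __init__(self, api_key: str = "", group_id: str = ""):
        self.minimax_api_key = api_key
        self.minimax_group_id = group_id
        self.client = EndpointClient(api_key=api_key, group_id=group_id)


llm = Minimax(api_key="key", group_id="gid")
print(type(llm.client).__name__)  # -> EndpointClient
```

In real Pydantic v1, an alternative would be declaring the attribute as a private one (e.g. via PrivateAttr), but the commit opted for the simpler rename.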
xieqihui pushed a commit to xieqihui/langchain that referenced this issue Nov 21, 2023

dosubot bot commented Feb 1, 2024

Hi, @zirenlegend,

I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, you raised an issue regarding a bug in the Langchain library related to the "Minimax" object not having the "_client" field, resulting in a ValueError. Dosubot provided a detailed response explaining the cause of the error and suggested a resolution, which seems to have resolved the issue.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you!

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Feb 1, 2024
hoanq1811 pushed a commit to hoanq1811/langchain that referenced this issue Feb 2, 2024
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Feb 8, 2024
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Feb 8, 2024