Can't use custom baseUrl #3565

Closed
weirdo21371480 opened this issue Dec 6, 2023 · 3 comments
Labels
auto:bug: Related to a bug, vulnerability, unexpected error with an existing feature
auto:question: A specific question about the codebase, product, project, or how to use a feature

Comments

@weirdo21371480

import { OpenAI } from "langchain/llms/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { loadQAStuffChain } from "langchain/chains";

// LLM pointed at a custom OpenAI-compatible endpoint
const llm = new OpenAI({
  modelName: "gpt-3.5-turbo-instruct",
  openAIApiKey: process.env.Open_API_KEY,
  configuration: {
    baseURL: "https://api.132999.xyz/v1/",
  },
  temperature: 0.5,
});
const chain = loadQAStuffChain(llm);

// Embeddings configured the same way
const fields = {
  modelName: "gpt-3.5-turbo-instruct",
  openAIApiKey: process.env.Open_API_KEY,
  configuration: {
    baseURL: "https://api.132999.xyz/v1/",
  },
};
const embeddings = new OpenAIEmbeddings(fields);

Is this the same as langchain/embeddings/openai? It still doesn't work.


dosubot bot commented Dec 6, 2023

🤖

Hello @weirdo21371480,

I'm Dosu, a friendly bot here to assist while you wait for a human maintainer. I'm here to help you solve bugs, answer questions, and guide you in becoming a contributor to langchainjs. Let's get started!

Based on the information you've provided, it seems like you're trying to use a custom baseURL with the OpenAIEmbeddings instance. This is indeed possible with the langchainjs framework.

Here's how you can do it:

import { OpenAIEmbeddings } from "langchain/embeddings/openai";

const embeddings = new OpenAIEmbeddings({
  openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
  batchSize: 512, // Default value if omitted is 512. Max is 2048
  configuration: {
    baseURL: "https://your_custom_url.com",
  },
});

In this code, the baseURL is set to "https://your_custom_url.com". You can replace this with your own custom URL. The model will then use this URL for all API requests. If you're part of an organization, you can set process.env.OPENAI_ORGANIZATION to your OpenAI organization id, or pass it in as organization when initializing the model. This solution was suggested in the issue OpenAIEmbeddings configuration of baseURL is not possible.
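If you also need to pass an organization, a minimal sketch might look like the following. This is only a sketch; it assumes the organization field of the underlying OpenAI v4 client options, which the configuration object is forwarded to, and the URL is a placeholder:

import { OpenAIEmbeddings } from "langchain/embeddings/openai";

const embeddings = new OpenAIEmbeddings({
  openAIApiKey: process.env.OPENAI_API_KEY,
  configuration: {
    baseURL: "https://your_custom_url.com/v1", // placeholder; must expose the OpenAI-compatible /embeddings route
    organization: process.env.OPENAI_ORGANIZATION, // optional; only needed if your key belongs to an org
  },
});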

If you're trying to use a custom API endpoint with the same structure as Azure OpenAI, you can adjust the basePath option when creating a new instance of the OpenAI class. Here's how you can do it:

import { OpenAI } from "langchain/llms/openai";

const model = new OpenAI(
  { temperature: 0 },
  { basePath: "https://your-custom-api.com/v1" } // replace with your API endpoint
);

const res = await model.call(
  "What would be a good company name a company that makes colorful socks?"
);
console.log(res);

In this example, LangChain will send the request to https://your-custom-api.com/v1 instead of the default Azure OpenAI endpoint. The request body and headers will remain the same as if you were making a direct call to Azure OpenAI. This solution was suggested in the issue Call custom api with same structure as Azure OpenAI.
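One way to check whether requests actually reach the custom endpoint is to pass a logging fetch through configuration. This is only a sketch: it assumes the underlying OpenAI v4 client's fetch option (which LangChain forwards) and a Node 18+ global fetch, and the URL is a placeholder:

import { OpenAI } from "langchain/llms/openai";

const model = new OpenAI({
  temperature: 0,
  configuration: {
    baseURL: "https://your-custom-api.com/v1", // placeholder for your endpoint
    // Log every outgoing request URL to confirm the custom host is being used.
    fetch: async (url, init) => {
      console.log("OpenAI request ->", url);
      return fetch(url, init);
    },
  },
});

const res = await model.call("Say hello");
console.log(res);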

If you're still encountering issues, could you please provide more details about the error message or behavior you're experiencing? This will help me better understand the problem and provide a more accurate solution.

Sources


This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@dosubot dosubot bot added the auto:bug and auto:question labels Dec 6, 2023
@weirdo21371480
Author

@dosu-bot
I have tried that, but it doesn't work.


dosubot bot commented Dec 6, 2023

🤖

I'm here to help! Could you please provide more details about what you've tried and what specific issue you're encountering? This will help me understand the problem better and provide you with the most accurate assistance.


This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@dosubot dosubot bot added the stale label (issue has not had recent activity or appears to be solved; stale issues will be automatically closed) Mar 6, 2024
@dosubot dosubot bot closed this as not planned (won't fix, can't repro, duplicate, stale) Mar 13, 2024
@dosubot dosubot bot removed the stale label Mar 13, 2024