[Feature request] Support Azure OpenAI API #382
Comments
I completely agree. Allowing users to customize the API base URL, rather than hard-coding it to "https://api.openai.com/v1/chat/completions", would be extremely helpful in the current situation. Students living outside the service area of OpenAI would benefit from this.
+1. Can't wait to use Azure API.
> I completely agree. Allowing users to customize the API base URL, rather than hard-coding it to "https://api.openai.com/v1/chat/completions", would be extremely helpful in the current situation. Students living outside the service area of OpenAI would benefit from this.

Yes, I think this is very important. I have built an API myself, and I hope to be able to switch the plugin over to my own endpoint.
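For illustration only, here is a minimal sketch of what a user-configurable endpoint could look like; the setting names (`chatEndpoint`, `apiKey`) are hypothetical and not the plugin's actual settings schema.

```typescript
// Hypothetical sketch: the chat-completions URL becomes a setting instead of a constant.
interface ChatApiSettings {
  chatEndpoint: string; // defaults to "https://api.openai.com/v1/chat/completions"
  apiKey: string;
}

// Any proxy or self-hosted API that speaks the same request/response format
// could then be used simply by changing chatEndpoint in the settings UI.
const settings: ChatApiSettings = {
  chatEndpoint: "https://my-proxy.example.com/v1/chat/completions",
  apiKey: "sk-...",
};
```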
@gaojunyang666 Thanks for the feedback. This is already partially implemented. For non-OpenAI formats, there will be the ability to contribute additional formats to the module I created for implementing these models. This module will be released as an open-source NPM module very soon. 🌴
This is almost there using …
Hey @ChrisRomp, this should be a pretty simple adapter, something similar to what's described here: brianpetro/jsbrains#1 (comment). I don't currently have an Azure account for the necessary testing, but maybe someone will see this and be able to handle building and testing the adapter 😊 🌴
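For anyone picking this up, a rough sketch of what such an adapter might look like; the module's real interface isn't published yet, so every name below (`AzureConfig`, `AzureOpenAIAdapter`, etc.) is hypothetical.

```typescript
// Hypothetical adapter sketch: Azure OpenAI differs from api.openai.com mainly
// in the URL shape (resource, deployment, api-version) and the auth header.
interface AzureConfig {
  resourceName: string;  // {your-resource-name}
  deploymentId: string;  // {deployment-id}
  apiVersion: string;    // e.g. "2023-05-15"
  apiKey: string;
}

class AzureOpenAIAdapter {
  constructor(private config: AzureConfig) {}

  endpoint(): string {
    const { resourceName, deploymentId, apiVersion } = this.config;
    return `https://${resourceName}.openai.azure.com/openai/deployments/${deploymentId}/chat/completions?api-version=${apiVersion}`;
  }

  headers(): Record<string, string> {
    // Key-based auth on Azure uses an "api-key" header instead of "Authorization: Bearer ...".
    return { "Content-Type": "application/json", "api-key": this.config.apiKey };
  }
}
```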
I can (eventually) take a crack at it. I just want to test this in my Obsidian workflow today; if it works well for me, then I'll try to make some time for this.

The gist for anyone else is that it should work almost exactly like an OpenAI endpoint, except you need to provide a couple of params in the URL (see docs), e.g.:

POST https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}/chat/completions?api-version={api-version}

For anyone wanting to use this on Azure today, I've put Azure API Management in front of AOAI, and I'm using a policy to compare a key to a named value in APIM. Then I use Azure RBAC auth to the AOAI service. You could optionally extract the Bearer value and pass that into an …

```xml
<policies>
  <inbound>
    <base />
    <cors allow-credentials="false" terminate-unmatched-request="false">
      <allowed-origins>
        <origin>*</origin>
      </allowed-origins>
      <allowed-methods preflight-result-max-age="600">
        <method>POST</method>
      </allowed-methods>
      <allowed-headers>
        <header>authorization</header>
        <header>content-type</header>
      </allowed-headers>
    </cors>
    <check-header name="Authorization" failed-check-httpcode="401" failed-check-error-message="Unauthorized">
      <value>Bearer {{aoai-custom-key}}</value>
    </check-header>
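    <!-- Acquire an Azure AD token via the APIM managed identity and swap it into the Authorization header before forwarding to AOAI -->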
    <authentication-managed-identity resource="https://cognitiveservices.azure.com" output-token-variable-name="msi-access-token" ignore-error="false" />
    <set-header name="Authorization" exists-action="override">
      <value>@("Bearer " + (string)context.Variables["msi-access-token"])</value>
    </set-header>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
  <on-error>
    <base />
  </on-error>
</policies>
```

You can import the Swagger spec for AOAI into APIM from the Azure docs: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
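For reference, a minimal sketch of what a client call through that APIM front end could look like; the gateway hostname, path, and key below are placeholders whose exact shape depends on how the API was imported into APIM.

```typescript
// Sketch only: calling Azure OpenAI through the APIM front end described above.
const APIM_URL =
  "https://my-apim-instance.azure-api.net/openai/deployments/my-deployment/chat/completions?api-version=2023-05-15";

async function main() {
  const res = await fetch(APIM_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // APIM's check-header policy compares this value to the aoai-custom-key named value,
      // then replaces it with a managed-identity token before forwarding to AOAI.
      Authorization: "Bearer <your-custom-key>",
    },
    body: JSON.stringify({
      messages: [{ role: "user", content: "Hello from Obsidian!" }],
    }),
  });
  console.log(await res.json());
}

main();
```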
@brianpetro I could also give you short-term access to an Azure OpenAI instance.
Hey Brian, any news on allowing us to enter our own API keys and URL for Azure OpenAI?
Hi Brian,
Thank you very much for developing this amazing plugin. Would you update it to support the Azure OpenAI API?