
[Feature request] Support Azure OpenAI API #382

Open
tuntunhuang opened this issue Dec 7, 2023 · 9 comments


@tuntunhuang

Hi Brian,
Thank you very much for developing this amazing plugin. Would you update it to support the Azure OpenAI API?

@carllx

carllx commented Dec 11, 2023

I completely agree. Allowing users to customize the API base URL, rather than hard-coding it to "https://api.openai.com/v1/chat/completions", would be extremely helpful in the current situation. Students living outside the service area of OpenAI would benefit from this.

@u3588064

+1. Can't wait to use Azure API.

@gaojunyang666

> I completely agree. Allowing users to customize the API base URL, rather than hard-coding it to "https://api.openai.com/v1/chat/completions", would be extremely helpful in the current situation. Students living outside the service area of OpenAI would benefit from this.

Yes, I think this is very important. I've built my own API, and I'd like to be able to point the plugin at my own endpoint.

@brianpetro
Owner

@gaojunyang666 thanks for the feedback

This is already partially implemented in v2.1 for use with localhost APIs. A custom non-localhost endpoint, assuming that it uses the OpenAI API format, can be implemented rather quickly, but so far this hasn't been requested by anyone helping with beta testing the early release.

For non-OpenAI formats, there will be the ability to contribute additional formats to the module I created for implementing these models. This module will be released as an open-source NPM module very soon.

🌴

@ChrisRomp

This is almost there using Custom API (OpenAI format), except that the plugin sends the key as an Authorization: Bearer xxxx header rather than in Azure's api-key header. If we could override the auth header, I could make this work with Azure OpenAI.
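To illustrate the mismatch: OpenAI-format endpoints read the key from a Bearer token, while Azure OpenAI expects it in an api-key header. A minimal sketch (buildAuthHeaders is a hypothetical helper for illustration, not the plugin's actual code):

```javascript
// Build the auth headers each provider expects.
// Header names are per each API's public docs.
function buildAuthHeaders(provider, key) {
  if (provider === "azure") {
    // Azure OpenAI reads the key from an `api-key` header
    return { "api-key": key, "Content-Type": "application/json" };
  }
  // OpenAI-format endpoints (what the plugin currently sends) use a Bearer token
  return { "Authorization": `Bearer ${key}`, "Content-Type": "application/json" };
}
```

An "override auth header" option would essentially let users pick the first branch.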

@brianpetro
Owner

Hey @ChrisRomp

This should be a pretty simple adapter, something similar to what's described here brianpetro/jsbrains#1 (comment)

I don't currently have an Azure account for the necessary testing. But, maybe someone will see this and be able to handle building and testing the adapter 😊

🌴

@ChrisRomp

ChrisRomp commented Apr 26, 2024

I can (eventually) take a crack at it. I just want to test this in my Obsidian workflow today. If it works well for me then I'll try to make some time for this.

But the gist, for anyone else, is that it should work almost exactly like an OpenAI endpoint, except that a couple of parameters go in the URL (see the docs). E.g.:

POST https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}/chat/completions?api-version={api-version}
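The URL template above can be sketched as a small helper (azureChatUrl is hypothetical; the resource name, deployment id, and api-version below are placeholders):

```javascript
// Assemble the Azure OpenAI chat-completions URL from its three parts:
// the resource name, the deployment id, and the api-version query param.
function azureChatUrl(resourceName, deploymentId, apiVersion) {
  return `https://${resourceName}.openai.azure.com` +
    `/openai/deployments/${deploymentId}/chat/completions` +
    `?api-version=${apiVersion}`;
}
```

The request body is then the same OpenAI-format JSON; only the URL and auth header differ.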

For anyone wanting to use this on Azure today, I've put Azure API Management (APIM) in front of AOAI, using a policy to compare a key against a named value in APIM, then using Azure RBAC auth to the AOAI service. You could optionally extract the Bearer value and pass it to AOAI in an api-key header instead of storing it as a secret in APIM. Also disable the subscription-key requirement on the API.

<policies>
    <inbound>
        <base />
        <!-- CORS: allow cross-origin POSTs with authorization/content-type headers -->
        <cors allow-credentials="false" terminate-unmatched-request="false">
            <allowed-origins>
                <origin>*</origin>
            </allowed-origins>
            <allowed-methods preflight-result-max-age="600">
                <method>POST</method>
            </allowed-methods>
            <allowed-headers>
                <header>authorization</header>
                <header>content-type</header>
            </allowed-headers>
        </cors>
        <!-- Reject requests whose Bearer token doesn't match the APIM named value -->
        <check-header name="Authorization" failed-check-httpcode="401" failed-check-error-message="Unauthorized">
            <value>Bearer {{aoai-custom-key}}</value>
        </check-header>
        <!-- Acquire a managed-identity token for Azure OpenAI (RBAC auth)... -->
        <authentication-managed-identity resource="https://cognitiveservices.azure.com" output-token-variable-name="msi-access-token" ignore-error="false" />
        <!-- ...and replace the client's Authorization header with it before forwarding -->
        <set-header name="Authorization" exists-action="override">
            <value>@("Bearer " + (string)context.Variables["msi-access-token"])</value>
        </set-header>
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>

You can import the Swagger spec for AOAI into APIM from the Azure docs: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
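From the plugin's side, a call through this APIM front end would look like a plain OpenAI-format request with a custom key. A sketch (buildApimRequest is hypothetical; the gateway host and key are placeholders — APIM checks the Bearer value against the aoai-custom-key named value and swaps in a managed-identity token before forwarding to AOAI):

```javascript
// Build a request object for the APIM gateway fronting Azure OpenAI.
// The client never sees the real AOAI credentials; it only needs the
// custom key configured in the APIM policy.
function buildApimRequest(gatewayHost, customKey, deploymentId, apiVersion, messages) {
  return {
    url: `https://${gatewayHost}/openai/deployments/${deploymentId}` +
         `/chat/completions?api-version=${apiVersion}`,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Must match the check-header value in the APIM policy
      "Authorization": `Bearer ${customKey}`,
    },
    body: JSON.stringify({ messages }),
  };
}
```

The object maps directly onto a fetch() call; the point is that the client-facing shape stays OpenAI-like.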

@ChrisRomp

@brianpetro I could also give you short term access to an Azure OpenAI instance.

@infinitelyloopy-bt

> @gaojunyang666 thanks for the feedback
>
> This is already partially implemented in v2.1 for use with localhost APIs. A custom non-localhost endpoint, assuming that it uses the OpenAI API format, can be implemented rather quickly, but so far this hasn't been requested by anyone helping with beta testing the early release.
>
> For non-OpenAI formats, there will be the ability to contribute additional formats to the module I created for implementing these models. This module will be released as an open-source NPM module very soon.
>
> 🌴

Hey Brian,

Any news on allowing us to enter our own API keys and URL for Azure OpenAI?
