Support ChatCompletion streaming over Completions #642
Closed
aaronpowell started this conversation in 2. Feature requests
Replies: 2 comments
- Ah, I just realised that I need to be using …
- Correct, in the settings you can set the `APIType` to `TextCompletion` or `ChatCompletion`, depending on the model used.
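As a minimal sketch of that setting (the `AzureOpenAIConfig` class name, namespace, and the other property names are assumptions; only the `APIType` values `TextCompletion` and `ChatCompletion` are confirmed by this thread):

```csharp
// Minimal sketch, not verified against a specific Semantic Memory release.
// Only APIType = TextCompletion / ChatCompletion is confirmed by this thread;
// the class, namespace, and remaining property names are assumptions.
using Microsoft.SemanticMemory; // assumed namespace

var azureOpenAITextConfig = new AzureOpenAIConfig
{
    // ChatCompletion targets chat-capable deployments (e.g. gpt-35-turbo 0613);
    // TextCompletion targets the legacy Completions API (gpt-35-turbo 0301 on Azure).
    APIType = AzureOpenAIConfig.APITypes.ChatCompletion,
    Auth = AzureOpenAIConfig.AuthTypes.APIKey,
    APIKey = "<your-api-key>",
    Endpoint = "https://<your-resource>.openai.azure.com/",
    Deployment = "<your-deployment-name>",
};
```

Exactly where this setting is applied (per text-generation service or globally) will depend on the release in use; the point is simply that the text connector can be pointed at a chat deployment instead of a legacy completion one.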
- The OpenAI/AzureOpenAI `TextGeneration` classes use the legacy Completions API (which is being deprecated by OpenAI; see the announcement and the Azure model compatibility notes). This poses a challenge when working with Azure OpenAI Service in particular, as you are required to deploy a `gpt-35-turbo` model of version `0301`, which is deprecated, but due to quota limits on standard accounts you are unlikely to be able to deploy that plus the embeddings model and a `0613` model of `gpt-35-turbo` for the application to use.

  Having Semantic Memory move off the Completions API to ChatCompletions would unblock usage in AOAI applications and ensure that applications aren't caught in the upcoming deprecations.
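For context, a rough wire-level sketch of the difference between the two Azure OpenAI endpoints involved; the deployment names, API key, and `api-version` value are placeholders/assumptions, not taken from this thread:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Sketch of the two request shapes only; error handling and response parsing omitted.
class ChatVsCompletions
{
    static async Task Main()
    {
        var http = new HttpClient();
        http.DefaultRequestHeaders.Add("api-key", "<your-api-key>");
        var endpoint = "https://<your-resource>.openai.azure.com";

        // Legacy Completions API: a single "prompt" string
        // (on Azure this requires the gpt-35-turbo 0301 deployment described above).
        await http.PostAsync(
            $"{endpoint}/openai/deployments/<deployment>/completions?api-version=2023-05-15",
            new StringContent("""{ "prompt": "Summarise this text: ...", "max_tokens": 200 }""",
                Encoding.UTF8, "application/json"));

        // ChatCompletions API: a "messages" array
        // (works with 0613 and later gpt-35-turbo / gpt-4 deployments).
        await http.PostAsync(
            $"{endpoint}/openai/deployments/<deployment>/chat/completions?api-version=2023-05-15",
            new StringContent("""{ "messages": [ { "role": "user", "content": "Summarise this text: ..." } ] }""",
                Encoding.UTF8, "application/json"));
    }
}
```

At the HTTP level, moving off Completions means the connectors would send the `messages`-style payload (second call) rather than the `prompt`-style payload (first call), which is what removes the dependency on the deprecated `0301` deployment.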