Commit
Minor updates to package README.md. Remove /en-us/ from links shown on samples and in source code (Azure#38554)
dargilco authored Nov 15, 2024
1 parent b63200a commit 62ccd06
Showing 14 changed files with 94 additions and 59 deletions.
3 changes: 2 additions & 1 deletion .vscode/cspell.json
@@ -1330,7 +1330,8 @@
"aiservices",
"OTEL",
"GENAI",
"fspath"
"fspath",
"azureopenai"
]
},
{
111 changes: 75 additions & 36 deletions sdk/ai/azure-ai-projects/README.md
@@ -5,16 +5,66 @@ Use the AI Projects client library (in preview) to:

* **Enumerate connections** in your Azure AI Studio project and get connection properties.
For example, get the inference endpoint URL and credentials associated with your Azure OpenAI connection.
* **Get an already-authenticated Inference client** for the default Azure OpenAI or AI Services connections in your Azure AI Studio project. Supports the AzureOpenAI client from the `openai` package, or clients from the `azure-ai-inference` package.
* **Get an authenticated Inference client** to do chat completions, for the default Azure OpenAI or AI Services connections in your Azure AI Studio project. Supports the AzureOpenAI client from the `openai` package, or clients from the `azure-ai-inference` package.
* **Develop Agents using the Azure AI Agent Service**, leveraging an extensive ecosystem of models, tools, and capabilities from OpenAI, Microsoft, and other LLM providers. The Azure AI Agent Service enables the building of Agents for a wide range of generative AI use cases. The package is currently in private preview.
* **Run Evaluation tools** to assess the performance of generative AI applications using various evaluators and metrics. It includes built-in evaluators for quality, risk, and safety, and allows custom evaluators for specific needs.
* **Run Evaluations** to assess the performance of generative AI applications using various evaluators and metrics. It includes built-in evaluators for quality, risk, and safety, and allows custom evaluators for specific needs.
* **Enable OpenTelemetry tracing**.

[Product documentation](https://aka.ms/azsdk/azure-ai-projects/product-doc)
| [Samples][samples]
| [API reference documentation](https://aka.ms/azsdk/azure-ai-projects/python/reference)
| [Package (PyPI)](https://aka.ms/azsdk/azure-ai-projects/python/package)
| [SDK source code](https://aka.ms/azsdk/azure-ai-projects/python/code)
| [AI Starter Template](https://aka.ms/azsdk/azure-ai-projects/python/ai-starter-template)

## Table of contents

- [Getting started](#getting-started)
- [Prerequisite](#prerequisite)
- [Install the package](#install-the-package)
- [Key concepts](#key-concepts)
- [Create and authenticate the client](#create-and-authenticate-the-client)
- [Examples](#examples)
- [Enumerate connections](#enumerate-connections)
- [Get properties of all connections](#get-properties-of-all-connections)
- [Get properties of all connections of a particular type](#get-properties-of-all-connections-of-a-particular-type)
- [Get properties of a default connection](#get-properties-of-a-default-connection)
- [Get properties of a connection by its connection name](#get-properties-of-a-connection-by-its-connection-name)
- [Get an authenticated ChatCompletionsClient](#get-an-authenticated-chatcompletionsclient)
- [Get an authenticated AzureOpenAI client](#get-an-authenticated-azureopenai-client)
- [Agents (Private Preview)](#agents-private-preview)
- [Create an Agent](#create-agent) with:
- [File Search](#create-agent-with-file-search)
- [Code interpreter](#create-agent-with-code-interpreter)
- [Bing grounding](#create-agent-with-bing-grounding)
- [Azure AI Search](#create-agent-with-azure-ai-search)
- [Function call](#create-agent-with-function-call)
- [Create thread](#create-thread) with
- [Tool resource](#create-thread-with-tool-resource)
- [Create message](#create-message) with:
- [File search attachment](#create-message-with-file-search-attachment)
- [Code interpreter attachment](#create-message-with-code-interpreter-attachment)
- [Execute Run, Run_and_Process, or Stream](#create-run-run_and_process-or-stream)
- [Retrieve message](#retrieve-message)
- [Retrieve file](#retrieve-file)
- [Tear down by deleting resource](#teardown)
- [Tracing](#tracing)
- [Evaluation](#evaluation)
- [Evaluator](#evaluator)
- [Run Evaluation in the cloud](#run-evaluation-in-the-cloud)
- [Evaluators](#evaluators)
- [Data to be evaluated](#data-to-be-evaluated)
- [[Optional] Azure OpenAI Model](#optional-azure-openai-model)
- [Example Remote Evaluation](#example-remote-evaluation)
- [Tracing](#tracing)
- [Installation](#installation)
- [Tracing example](#tracing-example)
- [Troubleshooting](#troubleshooting)
- [Exceptions](#exceptions)
- [Logging](#logging)
- [Reporting issues](#reporting-issues)
- [Next steps](#next-steps)
- [Contributing](#contributing)

## Getting started

@@ -25,8 +75,7 @@ For example, get the inference endpoint URL and credentials associated with your
- A [project in Azure AI Studio](https://learn.microsoft.com/azure/ai-studio/how-to/create-projects?tabs=ai-studio).
- The project connection string. It can be found in your Azure AI Studio project overview page, under "Project details". Below we will assume the environment variable `PROJECT_CONNECTION_STRING` was defined to hold this value.
- Entra ID is needed to authenticate the client. Your application needs an object that implements the [TokenCredential](https://learn.microsoft.com/python/api/azure-core/azure.core.credentials.tokencredential) interface. Code samples here use [DefaultAzureCredential](https://learn.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential). To get that working, you will need:
* The role `Azure AI Developer` assigned to you. Role assigned can be done via the "Access Control (IAM)" tab of your Azure AI Project resource in the Azure portal.
* The token must have the scope `https://management.azure.com/.default` or `https://ml.azure.com/.default`, depending on the set of client operation you will execute.
* The `Contributor` role. Role assigned can be done via the "Access Control (IAM)" tab of your Azure AI Project resource in the Azure portal.
* [Azure CLI](https://learn.microsoft.com/cli/azure/install-azure-cli) installed.
* You are logged into your Azure account by running `az login`.
  * Note that if you have multiple Azure subscriptions, the subscription that contains your Azure AI Project resource must be your default subscription. Run `az account list --output table` to list all your subscriptions and see which one is the default. Run `az account set --subscription "Your Subscription ID or Name"` to change your default subscription.
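
Once these prerequisites are in place, a minimal client-creation sketch looks like the following (assuming the `PROJECT_CONNECTION_STRING` environment variable described above; the `conn_str` keyword name follows the package samples):

```python
import os
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

# DefaultAzureCredential picks up your `az login` session, per the prerequisites above.
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)
```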
@@ -78,9 +127,9 @@ project_client = AIProjectClient.from_connection_string(

### Enumerate connections

You Azure AI Studio project has a "Management center". When you enter it, you will see a tab named "Connected resources" under your project. The `.connections` operations on the client allow you to enumerate the connections and get connection properties. Connection properties include the resource URL and authentication credentials, among other things.
Your Azure AI Studio project has a "Management center". When you enter it, you will see a tab named "Connected resources" under your project. The `.connections` operations on the client allow you to enumerate the connections and get connection properties. Connection properties include the resource URL and authentication credentials, among other things.

Below are code examples of some simple connection operations. Additional samples can be found under the "connections" folder in the [package samples][samples].
Below are code examples of the connection operations. Full samples can be found under the "connections" folder in the [package samples][samples].

#### Get properties of all connections
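
A minimal sketch of this operation, assuming the `project_client` created earlier (the `list` method name follows the package's connection samples):

```python
# Enumerate the properties of every connection in the Azure AI Studio project.
connections = project_client.connections.list()
for connection in connections:
    print(connection)
```

Passing a `connection_type` argument (for example `ConnectionType.AZURE_OPEN_AI`) narrows the listing to one connection type, which is what the next sub-section covers.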

@@ -113,7 +162,7 @@ with its authentication credentials:
```python
connection = project_client.connections.get_default(
connection_type=ConnectionType.AZURE_OPEN_AI,
include_credentials=True, # Optional. Defaults to "False"
include_credentials=True, # Optional. Defaults to "False".
)
print(connection)
```
@@ -123,18 +172,21 @@ will be populated. Otherwise both will be `None`.

#### Get properties of a connection by its connection name

To get the connection properties of a connection with name `connection_name`:
To get the connection properties of a connection named `connection_name`:

```python
connection = project_client.connections.get(
connection_name=connection_name, include_credentials=True # Optional. Defaults to "False"
connection_name=connection_name,
include_credentials=True # Optional. Defaults to "False"
)
print(connection)
```
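
As a hedged follow-up sketch, the returned properties can be used directly; the attribute names below (`authentication_type`, `endpoint_url`, `key`) mirror the ones used in the async connection sample further down in this commit:

```python
from azure.ai.projects.models import AuthenticationType

# Credentials are only populated when include_credentials=True was passed.
print(f"Endpoint: {connection.endpoint_url}")
if connection.authentication_type == AuthenticationType.API_KEY:
    print(f"API key: {connection.key}")
```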

### Get an authenticated ChatCompletionsClient

Your Azure AI Studio project may have one or more AI models deployed that support chat completions. These could be OpenAI models, Microsoft models, or models from other providers. Use the code below to get an already authenticated [ChatCompletionsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.chatcompletionsclient?view=azure-python-preview) from the [azure-ai-inference](https://pypi.org/project/azure-ai-inference/) package, and execute a chat completions call. First, install the package:
Your Azure AI Studio project may have one or more AI models deployed that support chat completions. These could be OpenAI models, Microsoft models, or models from other providers. Use the code below to get an already authenticated [ChatCompletionsClient](https://learn.microsoft.com/python/api/azure-ai-inference/azure.ai.inference.chatcompletionsclient?view=azure-python-preview) from the [azure-ai-inference](https://pypi.org/project/azure-ai-inference/) package, and execute a chat completions call.

First, install the package:

```bash
pip install azure-ai-inference
@@ -157,13 +209,15 @@ See the "inference" folder in the [package samples][samples] for additional samp
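
A hedged sketch of the chat-completions call described above (the deployment name `gpt-4o` and the `get_chat_completions_client` helper are assumptions drawn from the package samples):

```python
from azure.ai.inference.models import UserMessage

# Get an already-authenticated ChatCompletionsClient, then run one chat completion.
inference_client = project_client.inference.get_chat_completions_client()
response = inference_client.complete(
    model="gpt-4o",  # replace with your model deployment name
    messages=[UserMessage(content="How many feet are in a mile?")],
)
print(response.choices[0].message.content)
```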

### Get an authenticated AzureOpenAI client

Your Azure AI Studio project may have one or more OpenAI models deployed that support chat completions. Use the code below to get an already authenticated [AzureOpenAI](https://github.com/openai/openai-python?tab=readme-ov-file#microsoft-azure-openai) from the [openai](https://pypi.org/project/openai/) package, and execute a chat completions call. First, install the package:
Your Azure AI Studio project may have one or more OpenAI models deployed that support chat completions. Use the code below to get an already authenticated [AzureOpenAI](https://github.com/openai/openai-python?tab=readme-ov-file#microsoft-azure-openai) from the [openai](https://pypi.org/project/openai/) package, and execute a chat completions call.

First, install the package:

```bash
pip install openai
```

Then run this code (replace "gpt-4o" with your model deployment name):
Then run the code below. Replace `gpt-4o` with your model deployment name, and update the `api_version` value with one found in the "Data plane - inference" row [in this table](https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs).

```python
aoai_client = project_client.inference.get_azure_openai_client(api_version="2024-06-01")
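# Continuation sketch (an assumption, not shown in this hunk): with the client above,
# a chat completion uses the standard `openai` package surface. Replace "gpt-4o"
# with your model deployment name.
response = aoai_client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "How many feet are in a mile?"}],
)
print(response.choices[0].message.content)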
@@ -189,23 +243,6 @@ Agents in the Azure AI Projects client library are designed to facilitate variou

Agents are actively being developed. A sign-up form for private preview is coming soon.

- <a href='#create-agent'>Create an Agent</a> with:
- <a href='#create-agent-with-file-search'>File Search</a>
- <a href='#create-agent-with-code-interpreter'>Code interpreter</a>
- <a href='#create-agent-with-bing-grounding'>Bing grounding</a>
- <a href='#create-agent-with-azure-ai-search'>Azure AI Search</a>
- <a href='#create-agent-with-function-call'>Function call</a>
- <a href='#create-thread'>Create thread</a> with
- <a href='#create-thread-with-tool-resource'>Tool resource</a>
- <a href='#create-message'>Create message</a> with:
- <a href='#create-message-with-file-search-attachment'>File search attachment</a>
- <a href='#create-message-with-code-interpreter-attachment'>Code interpreter attachment</a>
- <a href='#create-run-run_and_process-or-stream'>Execute Run, Run_and_Process, or Stream</a>
- <a href='#retrieve-message'>Retrieve message</a>
- <a href='#retrieve-file'>Retrieve file</a>
- <a href='#teardown'>Tear down by deleting resource</a>
- <a href='#tracing'>Tracing</a>

#### Create Agent

Here is an example of how to create an Agent:
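
A minimal sketch of such a call, assuming a `gpt-4o` model deployment and the `create_agent` operation from the package's agent samples:

```python
# Create an Agent backed by a model deployed in the project.
agent = project_client.agents.create_agent(
    model="gpt-4o",  # assumed model deployment name
    name="my-assistant",
    instructions="You are a helpful assistant",
)
print(f"Created agent, agent ID: {agent.id}")
```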
@@ -730,19 +767,19 @@ Evaluators are made available via [azure-ai-evaluation][azure_ai_evaluation] SDK

More details on built-in and custom evaluators can be found [here][evaluators].

#### Run Evaluation in cloud:
#### Run Evaluation in the cloud

To run evaluation in cloud the following are needed:
To run evaluation in the cloud, the following are needed:

- Evaluators
- Data to be evaluated
- [Optional] Azure OpenAI model.

##### Evaluators

For running evaluator in cloud, evaluator `ID` is needed. To get it via code you use [azure-ai-evaluation][azure_ai_evaluation]
To run an evaluator in the cloud, its evaluator `ID` is needed. To get it via code, use [azure-ai-evaluation][azure_ai_evaluation]:

```
```python
# pip install azure-ai-evaluation

from azure.ai.evaluation import RelevanceEvaluator
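
# Continuation sketch (assumption): the evaluator ID used for cloud evaluation is
# typically read off the evaluator class; confirm the exact attribute name in the
# azure-ai-evaluation documentation before relying on it.
relevance_evaluator_id = RelevanceEvaluator.id
print(relevance_evaluator_id)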
@@ -831,11 +868,11 @@ print("----------------------------------------------------------------")

NOTE: For running evaluators locally refer to [Evaluate with the Azure AI Evaluation SDK][evaluators].

#### Tracing
### Tracing

You can add an Application Insights Azure resource to your Azure AI Studio project. See the Tracing tab in your studio. If one was enabled, you can get the Application Insights connection string, configure your Agents, and observe the full execution path through Azure Monitor. Typically, you might want to start tracing before you create an Agent.

##### Installation
#### Installation

Make sure to install OpenTelemetry and the Azure SDK tracing plugin via

@@ -852,7 +889,7 @@ To connect to Aspire Dashboard or another OpenTelemetry compatible backend, inst
pip install opentelemetry-exporter-otlp
```

##### Tracing example
#### Tracing example

Here is a code sample to be included above `create_agent`:
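
A hedged sketch of such a setup (the `telemetry.get_connection_string` helper and the `azure-monitor-opentelemetry` package are assumptions; check the tracing samples for the exact surface):

```python
from azure.monitor.opentelemetry import configure_azure_monitor

# Assumed helper: fetch the Application Insights connection string attached to the
# Azure AI Studio project, then route OpenTelemetry data to Azure Monitor.
application_insights_connection_string = project_client.telemetry.get_connection_string()
if application_insights_connection_string:
    configure_azure_monitor(connection_string=application_insights_connection_string)
```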

@@ -960,6 +997,8 @@ To report issues with the client library, or request additional features, please

Have a look at the [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-projects/samples) folder, containing fully runnable Python code for synchronous and asynchronous clients.

Explore the [AI Starter Template](https://aka.ms/azsdk/azure-ai-projects/python/ai-starter-template). This template creates an Azure AI Studio hub, project and connected resources including Azure OpenAI Service, AI Search and more. It also deploys a simple chat application to Azure Container Apps.

## Contributing

This project welcomes contributions and suggestions. Most contributions require
@@ -222,7 +222,7 @@ async def get_azure_openai_client(self, *, api_version: Optional[str] = None, **
:keyword api_version: The Azure OpenAI api-version to use when creating the client. Optional.
See "Data plane - Inference" row in the table at
https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#api-specs. If this keyword
https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs. If this keyword
is not specified, you must set the environment variable `OPENAI_API_VERSION` instead.
:paramtype api_version: str
:return: An authenticated AsyncAzureOpenAI client
@@ -211,7 +211,7 @@ def get_azure_openai_client(self, *, api_version: Optional[str] = None, **kwargs
:keyword api_version: The Azure OpenAI api-version to use when creating the client. Optional.
See "Data plane - Inference" row in the table at
https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#api-specs. If this keyword
https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs. If this keyword
is not specified, you must set the environment variable `OPENAI_API_VERSION` instead.
:paramtype api_version: str
:return: An authenticated AzureOpenAI client
@@ -253,7 +253,7 @@ def get_azure_openai_client(self, *, api_version: Optional[str] = None, **kwargs
auth = "Creating AzureOpenAI using SAS authentication"
logger.debug("[InferenceOperations.get_azure_openai_client] %s", auth)
client = AzureOpenAI(
# See https://learn.microsoft.com/en-us/python/api/azure-identity/azure.identity?view=azure-python#azure-identity-get-bearer-token-provider # pylint: disable=line-too-long
# See https://learn.microsoft.com/python/api/azure-identity/azure.identity?view=azure-python#azure-identity-get-bearer-token-provider # pylint: disable=line-too-long
azure_ad_token_provider=get_bearer_token_provider(
connection.token_credential, "https://cognitiveservices.azure.com/.default"
),
7 changes: 1 addition & 6 deletions sdk/ai/azure-ai-projects/azure_ai_projects_tests.env
@@ -5,11 +5,6 @@
# but do not commit these changes to the repository.
#

# The default here should be to run tests from recordings:
AZURE_TEST_RUN_LIVE=false
AZURE_SKIP_LIVE_RECORDING=true
PROXY_URL=http://localhost:5000

########################################################################################################################
# Connection tests
#
@@ -35,7 +30,7 @@ AZURE_AI_PROJECTS_CONNECTIONS_TESTS_AISERVICES_CONNECTION_NAME=
# - A default AIServices resource with at least one chat-completions model deployed (from OpenAI or non-OpenAI)
# - A default Azure OpenAI resource connected with at least one chat-completions OpenAI model deployed
# Populate the Azure OpenAI api-version and model deployment names below.
# Note: See Azure OpenAI api-versions here: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#api-specs
# Note: See Azure OpenAI api-versions here: https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs
#
AZURE_AI_PROJECTS_INFERENCE_TESTS_PROJECT_CONNECTION_STRING=
AZURE_AI_PROJECTS_INFERENCE_TESTS_AOAI_API_VERSION=
@@ -86,20 +86,20 @@ async def sample_connections_async() -> None:
aoai_client = AsyncAzureOpenAI(
api_key=connection.key,
azure_endpoint=connection.endpoint_url,
api_version="2024-06-01", # See "Data plane - inference" row in table https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#api-specs
api_version="2024-06-01", # See "Data plane - inference" row in table https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs
)
elif connection.authentication_type == AuthenticationType.ENTRA_ID:
print("====> Creating AzureOpenAI client using Entra ID authentication")
from azure.identity.aio import get_bearer_token_provider

aoai_client = AsyncAzureOpenAI(
# See https://learn.microsoft.com/en-us/python/api/azure-identity/azure.identity?view=azure-python#azure-identity-get-bearer-token-provider
# See https://learn.microsoft.com/python/api/azure-identity/azure.identity?view=azure-python#azure-identity-get-bearer-token-provider
azure_ad_token_provider=get_bearer_token_provider(
cast(AsyncTokenCredential, connection.token_credential),
"https://cognitiveservices.azure.com/.default",
),
azure_endpoint=connection.endpoint_url,
api_version="2024-06-01", # See "Data plane - inference" row in table https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#api-specs
api_version="2024-06-01", # See "Data plane - inference" row in table https://learn.microsoft.com/azure/ai-services/openai/reference#api-specs
)
else:
raise ValueError(f"Authentication type {connection.authentication_type} not supported.")
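# Continuation sketch (an assumption, not shown in this hunk): with either client above,
# the sample would typically run one chat completion and then close the client.
response = await aoai_client.chat.completions.create(
    model="gpt-4o",  # assumed model deployment name
    messages=[{"role": "user", "content": "How many feet are in a mile?"}],
)
print(response.choices[0].message.content)
await aoai_client.close()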