
Update samples and readme for Content Safety sdk #36784

Merged Jun 6, 2023 · 14 commits
2 changes: 1 addition & 1 deletion sdk/contentsafety/Azure.AI.ContentSafety/CHANGELOG.md
@@ -1,5 +1,5 @@
# Release History

-## 1.0.0-beta.1 (2023-05-22)
+## 1.0.0-beta.1 (2023-06-06)

- Initial version
356 changes: 327 additions & 29 deletions sdk/contentsafety/Azure.AI.ContentSafety/README.md


20 changes: 20 additions & 0 deletions sdk/contentsafety/Azure.AI.ContentSafety/samples/README.md
@@ -0,0 +1,20 @@
---
page_type: sample
languages:
- csharp
products:
- azure
- azure-cognitive-services
name: Azure AI Content Safety samples for .NET
description: Samples for the Azure.AI.ContentSafety client library
---

# Azure AI Content Safety client SDK Samples

These code samples show common scenario operations with the Content Safety client library.

|**Sample Name**|**Description**|
|----------------|-------------|
|[Sample1_AnalyzeText](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/samples/Sample1_AnalyzeText.md) |Scans text for sexual content, violence, hate, and self-harm with multi-severity levels.|
|[Sample2_AnalyzeImage](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/samples/Sample2_AnalyzeImage.md) |Scans images for sexual content, violence, hate, and self-harm with multi-severity levels.|
|[Sample3_ManageTextBlocklist](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/samples/Sample3_ManageTextBlocklist.md) |The default AI classifiers are sufficient for most content safety needs; however, you might need to screen for terms that are specific to your use case. You can create blocklists of terms to use with the Text API.|
@@ -0,0 +1,46 @@
# Analyze Text

This sample shows how to analyze text using Azure AI Content Safety.
To get started, make sure you have satisfied all the prerequisites and obtained all the resources required by the [README][README].

## Create a ContentSafetyClient

To create a new `ContentSafetyClient` you need the endpoint and credentials from your resource. In the sample below you'll use a Content Safety API key credential by creating an `AzureKeyCredential` object.

You can set `endpoint` and `key` based on an environment variable, a configuration setting, or any way that works for your application.

```C# Snippet:Azure_AI_ContentSafety_CreateClient
string endpoint = TestEnvironment.Endpoint;
string key = TestEnvironment.Key;

ContentSafetyClient client = new ContentSafetyClient(new Uri(endpoint), new AzureKeyCredential(key));
```
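
The snippet above reads `endpoint` and `key` from a test environment helper. As a minimal sketch of the environment-variable approach mentioned above (the variable names `CONTENT_SAFETY_ENDPOINT` and `CONTENT_SAFETY_KEY` are illustrative, not names the SDK requires):

```C#
using System;
using Azure;
using Azure.AI.ContentSafety;

// Illustrative variable names; use whatever names your deployment defines.
string endpoint = Environment.GetEnvironmentVariable("CONTENT_SAFETY_ENDPOINT");
string key = Environment.GetEnvironmentVariable("CONTENT_SAFETY_KEY");

ContentSafetyClient client = new ContentSafetyClient(new Uri(endpoint), new AzureKeyCredential(key));
```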

## Load text and analyze text

You can download our [sample data](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/tests/Samples/sample_data), read in the text, and initialize `AnalyzeTextOptions` with it. Then call `AnalyzeText` to get the analysis result.

```C# Snippet:Azure_AI_ContentSafety_AnalyzeText
string datapath = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), "Samples", "sample_data", "text.txt");
string text = File.ReadAllText(datapath);

var request = new AnalyzeTextOptions(text);

Response<AnalyzeTextResult> response;
try
{
    response = client.AnalyzeText(request);
}
catch (RequestFailedException ex)
{
    Console.WriteLine("Analyze text failed.\nStatus code: {0}, Error code: {1}, Error message: {2}", ex.Status, ex.ErrorCode, ex.Message);
    throw;
}

Console.WriteLine("Hate severity: {0}", response.Value.HateResult?.Severity ?? 0);
Console.WriteLine("SelfHarm severity: {0}", response.Value.SelfHarmResult?.Severity ?? 0);
Console.WriteLine("Sexual severity: {0}", response.Value.SexualResult?.Severity ?? 0);
Console.WriteLine("Violence severity: {0}", response.Value.ViolenceResult?.Severity ?? 0);
```

[README]: https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/README.md
@@ -0,0 +1,46 @@
# Analyze Image

This sample shows how to analyze an image using Azure AI Content Safety.
To get started, make sure you have satisfied all the prerequisites and obtained all the resources required by the [README][README].

## Create a ContentSafetyClient

To create a new `ContentSafetyClient` you need the endpoint and credentials from your resource. In the sample below you'll use a Content Safety API key credential by creating an `AzureKeyCredential` object.

You can set `endpoint` and `key` based on an environment variable, a configuration setting, or any way that works for your application.

```C# Snippet:Azure_AI_ContentSafety_CreateClient
string endpoint = TestEnvironment.Endpoint;
string key = TestEnvironment.Key;

ContentSafetyClient client = new ContentSafetyClient(new Uri(endpoint), new AzureKeyCredential(key));
```

## Load image and analyze image

You can download our [sample data](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/tests/Samples/sample_data), read in the image, and initialize `AnalyzeImageOptions` with it. Then call `AnalyzeImage` to get the analysis result.

```C# Snippet:Azure_AI_ContentSafety_AnalyzeImage
string datapath = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), "Samples", "sample_data", "image.jpg");
ImageData image = new ImageData() { Content = BinaryData.FromBytes(File.ReadAllBytes(datapath)) };

var request = new AnalyzeImageOptions(image);

Response<AnalyzeImageResult> response;
try
{
    response = client.AnalyzeImage(request);
}
catch (RequestFailedException ex)
{
    Console.WriteLine("Analyze image failed.\nStatus code: {0}, Error code: {1}, Error message: {2}", ex.Status, ex.ErrorCode, ex.Message);
    throw;
}

Console.WriteLine("Hate severity: {0}", response.Value.HateResult?.Severity ?? 0);
Console.WriteLine("SelfHarm severity: {0}", response.Value.SelfHarmResult?.Severity ?? 0);
Console.WriteLine("Sexual severity: {0}", response.Value.SexualResult?.Severity ?? 0);
Console.WriteLine("Violence severity: {0}", response.Value.ViolenceResult?.Severity ?? 0);
```

[README]: https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/README.md
@@ -0,0 +1,163 @@
# Manage Text Blocklist

This sample shows how to create a text blocklist and analyze text with blocklists using Azure AI Content Safety.
To get started, make sure you have satisfied all the prerequisites and obtained all the resources required by the [README][README].

## Create a ContentSafetyClient

To create a new `ContentSafetyClient` you need the endpoint and credentials from your resource. In the sample below you'll use a Content Safety API key credential by creating an `AzureKeyCredential` object.

You can set `endpoint` and `key` based on an environment variable, a configuration setting, or any way that works for your application.

```C# Snippet:Azure_AI_ContentSafety_CreateClient
string endpoint = TestEnvironment.Endpoint;
string key = TestEnvironment.Key;

ContentSafetyClient client = new ContentSafetyClient(new Uri(endpoint), new AzureKeyCredential(key));
```

## Create a text blocklist

You can create or update a text blocklist with `CreateOrUpdateTextBlocklist`; the blocklist name must be unique. If a new blocklist is created successfully, the call returns `201`; if an existing blocklist is updated successfully, it returns `200`.

```C# Snippet:Azure_AI_ContentSafety_CreateNewBlocklist
var blocklistName = "TestBlocklist";
var blocklistDescription = "Test blocklist management";

var data = new
{
    description = blocklistDescription,
};

var createResponse = client.CreateOrUpdateTextBlocklist(blocklistName, RequestContent.Create(data));
if (createResponse.Status == 201)
{
Console.WriteLine("\nBlocklist {0} created.", blocklistName);
}
else if (createResponse.Status == 200)
{
Console.WriteLine("\nBlocklist {0} updated.", blocklistName);
}
```

## Add blockItems to text blocklist

You can add multiple blockItems at once by calling `AddBlockItems`. There is a maximum limit of **10,000 items** in total across all lists, and you can add at most **100 blockItems** in one request.

```C# Snippet:Azure_AI_ContentSafety_AddBlockItems
string blockItemText1 = "k*ll";
string blockItemText2 = "h*te";

var blockItems = new TextBlockItemInfo[] { new TextBlockItemInfo(blockItemText1), new TextBlockItemInfo(blockItemText2) };
var addedBlockItems = client.AddBlockItems(blocklistName, new AddBlockItemsOptions(blockItems));

if (addedBlockItems != null && addedBlockItems.Value != null)
{
    Console.WriteLine("\nBlockItems added:");
    foreach (var addedBlockItem in addedBlockItems.Value.Value)
    {
        Console.WriteLine("BlockItemId: {0}, Text: {1}, Description: {2}", addedBlockItem.BlockItemId, addedBlockItem.Text, addedBlockItem.Description);
    }
}
```

## Load text and analyze text with blocklists

You can read in the text and initialize `AnalyzeTextOptions` with it, then attach the blocklists you would like to use by adding their names and call `AnalyzeText` to get the analysis result. Note that after you edit your blocklist, it usually takes about **5 minutes** for the changes to take effect, so wait before analyzing with the edited blocklist.

```C# Snippet:Azure_AI_ContentSafety_AnalyzeTextWithBlocklist
// After you edit your blocklist, it usually takes effect in 5 minutes, please wait some time before analyzing with blocklist after editing.
var request = new AnalyzeTextOptions("I h*te you and I want to k*ll you");
request.BlocklistNames.Add(blocklistName);
request.BreakByBlocklists = true;

Response<AnalyzeTextResult> response;
try
{
    response = client.AnalyzeText(request);
}
catch (RequestFailedException ex)
{
    Console.WriteLine("Analyze text failed.\nStatus code: {0}, Error code: {1}, Error message: {2}", ex.Status, ex.ErrorCode, ex.Message);
    throw;
}

if (response.Value.BlocklistsMatchResults != null)
{
    Console.WriteLine("\nBlocklist match result:");
    foreach (var matchResult in response.Value.BlocklistsMatchResults)
    {
        Console.WriteLine("Blockitem was hit in text: Offset: {0}, Length: {1}", matchResult.Offset, matchResult.Length);
        Console.WriteLine("BlocklistName: {0}, BlockItemId: {1}, BlockItemText: {2}", matchResult.BlocklistName, matchResult.BlockItemId, matchResult.BlockItemText);
    }
}
```

## Other text blocklist management samples

### List text blocklists

```C# Snippet:Azure_AI_ContentSafety_ListBlocklists
var blocklists = client.GetTextBlocklists();
Console.WriteLine("\nList blocklists:");
foreach (var blocklist in blocklists)
{
    Console.WriteLine("BlocklistName: {0}, Description: {1}", blocklist.BlocklistName, blocklist.Description);
}
```

### Get text blocklist

```C# Snippet:Azure_AI_ContentSafety_GetBlocklist
var getBlocklist = client.GetTextBlocklist(blocklistName);
if (getBlocklist != null && getBlocklist.Value != null)
{
    Console.WriteLine("\nGet blocklist:");
    Console.WriteLine("BlocklistName: {0}, Description: {1}", getBlocklist.Value.BlocklistName, getBlocklist.Value.Description);
}
```

### List blockItems

```C# Snippet:Azure_AI_ContentSafety_ListBlockItems
var allBlockitems = client.GetTextBlocklistItems(blocklistName);
Console.WriteLine("\nList BlockItems:");
foreach (var blocklistItem in allBlockitems)
{
    Console.WriteLine("BlockItemId: {0}, Text: {1}, Description: {2}", blocklistItem.BlockItemId, blocklistItem.Text, blocklistItem.Description);
}
```

### Get blockItem

```C# Snippet:Azure_AI_ContentSafety_GetBlockItem
var getBlockItemId = addedBlockItems.Value.Value[0].BlockItemId;
var getBlockItem = client.GetTextBlocklistItem(blocklistName, getBlockItemId);
Console.WriteLine("\nGet BlockItem:");
Console.WriteLine("BlockItemId: {0}, Text: {1}, Description: {2}", getBlockItem.Value.BlockItemId, getBlockItem.Value.Text, getBlockItem.Value.Description);
```

### Remove blockItems

```C# Snippet:Azure_AI_ContentSafety_RemoveBlockItems
var removeBlockItemId = addedBlockItems.Value.Value[0].BlockItemId;
var removeBlockItemIds = new List<string> { removeBlockItemId };
var removeResult = client.RemoveBlockItems(blocklistName, new RemoveBlockItemsOptions(removeBlockItemIds));

if (removeResult != null && removeResult.Status == 204)
{
    Console.WriteLine("\nBlockItem removed: {0}.", removeBlockItemId);
}
```

### Delete text blocklist

```C# Snippet:Azure_AI_ContentSafety_DeleteBlocklist
var deleteResult = client.DeleteTextBlocklist(blocklistName);
if (deleteResult != null && deleteResult.Status == 204)
{
    Console.WriteLine("\nDeleted blocklist.");
}
```

[README]: https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/contentsafety/Azure.AI.ContentSafety/README.md
@@ -5,6 +5,7 @@
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Azure.Core;
using Azure.Core.TestFramework;
using NUnit.Framework;

@@ -68,7 +69,30 @@ public async Task TestAnalyzeImage()
    var response = await client.AnalyzeImageAsync(request);

    Assert.IsNotNull(response);
    Assert.IsNotNull(response.Value.ViolenceResult);
    Assert.Greater(response.Value.ViolenceResult.Severity, 0);
    Assert.IsNotNull(response.Value.HateResult);
    Assert.IsNotNull(response.Value.SexualResult);
    Assert.IsNotNull(response.Value.SelfHarmResult);
}

[RecordedTest]
public async Task TestCreateOrUpdateBlocklist()
{
    var client = CreateContentSafetyClient();

    var blocklistName = "TestBlocklist";
    var blocklistDescription = "Test blocklist management";

    var data = new
    {
        description = blocklistDescription,
    };

    Response response = await client.CreateOrUpdateTextBlocklistAsync(blocklistName, RequestContent.Create(data));

    Assert.IsNotNull(response);
    Assert.GreaterOrEqual(response.Status, 200);
}
}
}