Introduction to Azure Cognitive Services Content Moderator API in C#
Working with Azure Cognitive Services means harnessing powerful AI models to process your data. The Azure Content Moderator API helps you filter out harmful, offensive, or unwanted content in text and images. To integrate this API into a C# application, you use the client library Azure provides to call the service endpoints.
Setting Up the Environment
- Ensure the .NET SDK is installed on your machine; it is required to compile and run C# applications.
- Use your preferred Integrated Development Environment (IDE), such as Visual Studio or JetBrains Rider, and make sure it is configured for .NET projects. If you prefer the command line, the CLI commands shown after this list work as well.
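If you are starting from scratch, the .NET CLI can scaffold a console project for you (the project name here is just an example):

dotnet new console -n ContentModerationDemo
cd ContentModerationDemo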
Installing Required Packages
- Start by installing the Microsoft.Azure.CognitiveServices.ContentModerator package, which provides the necessary tools to interact with the API.
dotnet add package Microsoft.Azure.CognitiveServices.ContentModerator
Initializing the Content Moderator Client
- Create an instance of the Content Moderator client using your API key and endpoint, both of which you can retrieve from the Azure portal.
using Microsoft.Azure.CognitiveServices.ContentModerator;

// Replace 'your-subscription-key' and 'your-endpoint' with your actual key and endpoint,
// e.g. https://<your-resource-name>.cognitiveservices.azure.com
string subscriptionKey = "your-subscription-key";
string endpoint = "your-endpoint";

// Create a new client
ContentModeratorClient client = new ContentModeratorClient(new ApiKeyServiceClientCredentials(subscriptionKey))
{
    Endpoint = endpoint
};
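Hard-coding credentials in source is easy to leak. A minimal alternative sketch, assuming you have exported two environment variables yourself (the names CONTENT_MODERATOR_KEY and CONTENT_MODERATOR_ENDPOINT are hypothetical, not anything the SDK requires):

using System;
using Microsoft.Azure.CognitiveServices.ContentModerator;

// Hypothetical variable names; use whatever your deployment defines
string subscriptionKey = Environment.GetEnvironmentVariable("CONTENT_MODERATOR_KEY");
string endpoint = Environment.GetEnvironmentVariable("CONTENT_MODERATOR_ENDPOINT");

ContentModeratorClient client = new ContentModeratorClient(new ApiKeyServiceClientCredentials(subscriptionKey))
{
    Endpoint = endpoint
};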
Text Moderation Example
- To moderate text content, use the ScreenText method, which identifies potentially offensive terms within a piece of text.
using System;
using System.IO;
using System.Text;
using Microsoft.Azure.CognitiveServices.ContentModerator;

namespace ContentModerationDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            var client = new ContentModeratorClient(new ApiKeyServiceClientCredentials("<subscription-key>"))
            {
                Endpoint = "<endpoint-url>"
            };

            // Text moderation example
            var text = "This is some sample text to analyze.";
            var language = "eng"; // Three-letter ISO 639-3 language code

            // ScreenText expects the text as a stream, not a plain string
            using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(text)))
            {
                var result = client.TextModeration.ScreenText("text/plain", stream, language);

                Console.WriteLine("Terms detected:");
                if (result.Terms != null)
                {
                    foreach (var term in result.Terms)
                    {
                        Console.WriteLine($"Term: {term.Term}, ListId: {term.ListId}, Offset: {term.Index}");
                    }
                }
            }
        }
    }
}
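ScreenText also takes optional parameters for autocorrection, personal data (PII) detection, and machine-assisted classification. The sketch below reuses the client, text, and stream setup from the example above; the parameter names (autocorrect, pII, classify) and result properties (PII, Classification) come from the generated SDK models, so verify them against the package version you installed:

using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(text)))
{
    var screen = client.TextModeration.ScreenText(
        "text/plain", stream, "eng",
        autocorrect: true,   // normalize common misspellings before screening
        pII: true,           // detect emails, phone numbers, addresses
        classify: true);     // request machine-assisted category scores

    if (screen.PII != null)
    {
        Console.WriteLine($"Email addresses detected: {screen.PII.Email?.Count ?? 0}");
    }
    if (screen.Classification != null)
    {
        Console.WriteLine($"Review recommended: {screen.Classification.ReviewRecommended}");
    }
}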
Image Moderation Example
- For image moderation, use the ImageModeration operations, such as EvaluateUrlInput, to detect adult or racy content.
using System;
using Microsoft.Azure.CognitiveServices.ContentModerator;
using Microsoft.Azure.CognitiveServices.ContentModerator.Models;

namespace ContentModerationDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            var client = new ContentModeratorClient(new ApiKeyServiceClientCredentials("<subscription-key>"))
            {
                Endpoint = "<endpoint-url>"
            };

            // Image moderation example
            string imageUrl = "https://example.com/sample.jpg";

            // "URL" tells the service the value is a link rather than raw image bytes;
            // BodyModel lives in the ...ContentModerator.Models namespace
            var imageResult = client.ImageModeration.EvaluateUrlInput("application/json", new BodyModel("URL", imageUrl));

            Console.WriteLine($"Adult Content Score: {imageResult.AdultClassificationScore}");
            Console.WriteLine($"Racy Content Score: {imageResult.RacyClassificationScore}");
        }
    }
}
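The Evaluate response also carries boolean verdicts alongside the raw scores. Continuing from the example above, a minimal way to act on them (the 0.5 threshold is purely illustrative; tune it for your own risk tolerance):

// IsImageAdultClassified / IsImageRacyClassified are the service's own verdicts
bool flagged = (imageResult.IsImageAdultClassified ?? false)
               || (imageResult.IsImageRacyClassified ?? false);

// You can also apply your own threshold to the raw scores
if (flagged || imageResult.RacyClassificationScore > 0.5)
{
    Console.WriteLine("Image requires human review.");
}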
Handling Results and Errors
- When handling results, check fields such as the confidence scores (values between 0 and 1) to judge how likely the content is to be inappropriate.
- Always wrap API calls in try-catch blocks so that failed requests are handled gracefully.
try
{
    // Your moderation logic here
}
catch (APIErrorException e)
{
    // APIErrorException is defined in Microsoft.Azure.CognitiveServices.ContentModerator.Models
    Console.WriteLine($"API Error: {e.Message}");
}
catch (Exception e)
{
    Console.WriteLine($"Error: {e.Message}");
}