Preparation

The first step in moderating content by using Azure AI Content Safety is to provision an Azure AI services resource in your Azure subscription. You can get started by creating a hub resource in Azure AI Studio.

In Azure, resources enable access to Azure services for individuals and teams. Resources also provide a container for billing, security configuration, and monitoring.

A hub is the top-level Azure resource for Azure AI Studio. It provides the working environment for a team to build and manage AI applications.

Create an Azure AI Studio hub resource

Important

This exercise uses the East US region because that region supports all features that this module includes. If you select a different region, you might not be able to complete the exercises. To learn more, see Region availability.

  1. In Azure AI Studio, under Management, select All resources.
  2. Select + New hub.
  3. Complete the following fields:
    • Hub name: Provide a name for your hub.
    • Subscription: Select your Azure subscription.
    • Resource group: Select an existing resource group or create a new one.
    • Location: Select East US.
    • Connect Azure AI Services or Azure OpenAI: Select Create new AI Services, enter a name, and then select Create.
    • Connect Azure AI Search: Select Skip connecting.
  4. Select Next.
  5. Review the hub details, and then select Create.
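
If you prefer to script hub creation instead of using the portal, the following minimal sketch shows the equivalent step with the Azure Machine Learning Python SDK. It assumes the azure-ai-ml package and its Hub entity (available in recent versions of the v2 SDK); the subscription ID, resource group, and hub name are placeholders, not values from this module.

  # Minimal sketch: create an AI Studio hub programmatically.
  # Assumes the azure-ai-ml (v2) package with its Hub entity; all names
  # and IDs below are placeholders.
  from azure.identity import DefaultAzureCredential
  from azure.ai.ml import MLClient
  from azure.ai.ml.entities import Hub

  ml_client = MLClient(
      credential=DefaultAzureCredential(),
      subscription_id="<your-subscription-id>",
      resource_group_name="<your-resource-group>",
  )

  # Hubs are built on the Azure Machine Learning workspace resource type,
  # so they are created through the workspaces operations group.
  hub = Hub(
      name="<your-hub-name>",
      location="eastus",  # East US, as the exercise requires
  )

  created_hub = ml_client.workspaces.begin_create(hub).result()
  print(created_hub.name, created_hub.location)

The portal flow in the preceding steps remains the path this module uses; the script only illustrates what hub creation does behind the scenes.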

Go to Content Safety

You access the Content Safety capabilities within Azure AI Studio.

  1. In Azure AI Studio, under Get started, select AI Services.
  2. On the Integrate with generative AI page, select Content Safety.
  3. On the Content Safety page, you can view the features that you'll use in the exercises of this module.

Screenshot of the Content Safety page of Azure AI Studio. It shows the built-in features of moderate text content, groundedness detection, protected material detection, and prompt shields.
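
The features shown on this page are also available programmatically through the Content Safety client libraries. As a preview of the text moderation exercise, the following minimal sketch assumes the azure-ai-contentsafety Python package; the endpoint and key placeholders come from the AI Services resource that you connected to your hub.

  # Minimal sketch: analyze a text sample with the Content Safety client library.
  # Assumes the azure-ai-contentsafety package; the endpoint and key are
  # placeholders from your connected AI Services resource.
  from azure.core.credentials import AzureKeyCredential
  from azure.ai.contentsafety import ContentSafetyClient
  from azure.ai.contentsafety.models import AnalyzeTextOptions

  client = ContentSafetyClient(
      endpoint="https://<your-ai-services-resource>.cognitiveservices.azure.com/",
      credential=AzureKeyCredential("<your-key>"),
  )

  # Moderate a sample string across the built-in harm categories
  # (hate, sexual, violence, and self-harm) and print the severity scores.
  response = client.analyze_text(AnalyzeTextOptions(text="Sample text to moderate"))

  for item in response.categories_analysis:
      print(item.category, item.severity)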

Download the project files

Project files are available in the repository's data folder. Download the repository to get the files required for the upcoming text moderation and image moderation exercises. To download the repository, select Code > Download ZIP.

Note

Don't unzip the bulk-image-moderation-dataset.zip file in the data folder. The .zip format is required for the image moderation exercise.
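
If you'd rather script the download, the following minimal sketch fetches and extracts the repository archive with Python's standard library. The URL is a placeholder for the repository's Download ZIP link; extracting the outer archive leaves bulk-image-moderation-dataset.zip in the data folder zipped, which is what the exercise requires.

  # Minimal sketch: download and extract the repository archive.
  # The URL is a placeholder for the Code > Download ZIP link.
  import io
  import urllib.request
  import zipfile

  repo_zip_url = "https://github.com/<owner>/<repo>/archive/refs/heads/main.zip"

  with urllib.request.urlopen(repo_zip_url) as response:
      archive = zipfile.ZipFile(io.BytesIO(response.read()))

  # Extracting the outer archive writes bulk-image-moderation-dataset.zip to
  # the data folder as a file; it stays zipped, as the note above requires.
  archive.extractall("project-files")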

Warning

The Image Moderation folder contains graphic images that depict scenes of harm. These image files include the word blood in their file names and are used in the image moderation exercise. You don't need to view the images to complete the exercise. The image moderation feature includes a Blur image toggle that hides uploaded images by default.