Adult content detection

Azure AI Vision can detect adult material in images so that developers can restrict the display of these images in their software. Content flags are applied with a score between zero and one so developers can interpret the results according to their own preferences.

Try out the adult content detection features quickly and easily in your browser using Vision Studio.

Tip

Azure AI Content Safety is the latest offering in AI content moderation. For more information, see the Azure AI Content Safety overview.

Content flag definitions

The "adult" classification contains several different categories:

  • Adult images are explicitly sexual in nature and often show nudity and sexual acts.
  • Racy images are sexually suggestive in nature and often contain less sexually explicit content than images tagged as Adult.
  • Gory images show blood/gore.

Use the API

You can detect adult content with the Analyze Image 3.2 API. When you add Adult to the visualFeatures query parameter, the API returns three boolean properties in its JSON response: isAdultContent, isRacyContent, and isGoryContent. It also returns the corresponding properties adultScore, racyScore, and goreScore, which are confidence scores between zero and one for each category.
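The following is a minimal sketch of calling the Analyze Image 3.2 REST API with the Adult visual feature, assuming a resource endpoint, key, and image URL of your own (the placeholder values shown are illustrative, not real).

```python
import requests

# Placeholders: replace with your Azure AI Vision resource endpoint and key.
endpoint = "https://<your-resource-name>.cognitiveservices.azure.com"
key = "<your-key>"

# Request only the Adult visual feature for a publicly accessible image URL.
response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Adult"},
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    },
    json={"url": "https://example.com/sample-image.jpg"},  # placeholder image
)
response.raise_for_status()

# The "adult" object in the JSON response carries the boolean flags and the
# corresponding confidence scores between zero and one.
adult = response.json()["adult"]
print(adult["isAdultContent"], adult["adultScore"])
print(adult["isRacyContent"], adult["racyScore"])
print(adult["isGoryContent"], adult["goreScore"])
```

Because the raw scores are returned alongside the boolean flags, you can apply your own thresholds instead of relying on the default flag values, for example treating anything with a racyScore above a value you choose as restricted in your application.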