Making media content accessible (XAML)

[This article is for Windows 8.x and Windows Phone 8.x developers writing Windows Runtime apps. If you’re developing for Windows 10, see the latest documentation]

Looking for the HTML/JavaScript version of this topic? See Making media content accessible (HTML).

If your app includes media content, ensure that you provide captions for the content, provide an alternate audio track that describes the key visual elements in the video portion, or both. Captions are visible text equivalents of the speech and essential non-speech audio portions of a media presentation. They include all of the spoken material and any audio effects that are required for understanding the media. Note that captions are not just for traditional video-based material; they are also required for audio-only media, such as podcasts, voice-overs in animations (including games) and presentations (including screen captures), and other, similar content.

Providing accessible transport controls for media

Many accessibility standards or recommendations state that users should have control over any media playback that is part of an app's behavior. This control should be quick and easy to discover as part of the UI. The practical reason for this is that many users with accessibility needs are using screen readers. If media starts playing an audio track and the user cannot quickly discover how to stop the media, it becomes impossible for the user to hear the screen reader's descriptions that would help them find the necessary control. In fact, the user cannot do anything else useful with your app through the screen reader until the media playback ends.

Some accessibility recommendations state that you should never automatically play media when the app first starts. Instead, give users a chance to review the overall app structure through techniques such as traversing the tab order or relying on other assistive technologies. Users can then discover how to play the media themselves, at a time when they have enough information about how to interact with your app.
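For example, if you display media with the MediaElement object described next, you can keep playback from starting on its own and let the user start it explicitly. This is a minimal sketch; the element name media, the button, and the handler name are assumptions for illustration, not part of this topic's samples.

public MainPage()
{
    this.InitializeComponent();

    // Assumed: "media" is a MediaElement declared in XAML.
    // Don't start playback automatically when the source loads.
    media.AutoPlay = false;
}

// Assumed: PlayButton_Click is wired to a button the user can reach in the tab order.
private void PlayButton_Click(object sender, RoutedEventArgs e)
{
    // Playback starts only when the user asks for it.
    media.Play();
}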

The main element that displays video content in a Windows Runtime app using C++, C#, or Visual Basic is a MediaElement object. A MediaElement can use either a default set of transport controls or custom transport controls that drive playback by calling methods on the associated MediaElement instance. Using the default transport controls is probably more common; you enable them by setting the AreTransportControlsEnabled property to true.
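For example, assuming a MediaElement named media (the same name used in the caption example later in this topic), enabling the default controls is a one-line setting:

// Enable the system-provided transport controls (Play/Pause, Seek, Volume, and so on).
media.AreTransportControlsEnabled = true;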

The default transport controls have accessibility support built in as part of their system-provided template. For example, the Play/Pause button has a Name and a tooltip that describe the current action, it is focusable and invokable, the Seek slider supports arrow-key navigation and reports its value through the RangeValue pattern, and so on. But if you're not using these default controls, make sure that your transport controls provide the basic accessibility information and support. You might want to examine the existing accessibility support in the default transport controls and make sure the accessibility of your own transport controls is at least as good.

One common design metaphor for media transport controls is to use icons only. This is typically done with a Button that has Image content, or possibly a graphics composite such as a Path-defined shape. Whenever you do this, make sure that the Button also has an AutomationProperties.Name value that describes the action, and/or a tooltip with that same description, so that screen readers can read this info to users. Also make sure that all your transport controls, including the buttons, are in a useful tab order so they can be focused with the keyboard alone, and that buttons can be invoked with the Spacebar or Enter key when they have focus.
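As a sketch of what that looks like in code for an icon-only custom control (the PauseButton name and the strings here are placeholders, not from a sample):

// Requires the Windows.UI.Xaml.Automation and Windows.UI.Xaml.Controls namespaces.
// Give an icon-only Pause button a UI Automation name and a matching tooltip
// so that screen readers can announce what it does.
AutomationProperties.SetName(PauseButton, "Pause");
ToolTipService.SetToolTip(PauseButton, "Pause");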

Captions

Captions can be closed (the display of the captions can be switched on and off by the user) or open (the captions are always visible to all users—typically burned into the video). Open captions don't require any additional work by the app other than accessing the correct media. But closed captions rely on finding the text information in a sideband text file or format and then displaying those captions in a prominent part of the UI.

If your application includes audio or audiovisual media, provide text alternatives or captions for the hearing impaired. You typically add captions directly to your media files by using media production tools.

The MediaElement class has no built-in UI area for displaying the text of closed captions. It is up to you to specify a control for that area and to implement the behavior that displays the caption text in synchronization with the media. The synchronization is achieved by handling the MarkerReached event. Each time a MarkerReached event fires, there is more caption text to display. You can then get the text from the event data and render it into a text area or control. It is up to you whether to display only the most recent caption text or instead to maintain an automatically scrolling area where successive captions replace each other.

You can display captions in your app by using code like this:

<MediaElement x:Name="media" Width="300" Height="200"
  MarkerReached="OnMarkerReached" Source="media.wmv"/>
...
<TextBlock x:Name="CaptionTextBlock" />
' Visual Basic
Public Sub OnMarkerReached(ByVal sender As Object,
    ByVal e As TimelineMarkerRoutedEventArgs)

    ' Display the caption text carried by the marker.
    CaptionTextBlock.Text = e.Marker.Text

End Sub

// C#
public void OnMarkerReached(object sender,
    TimelineMarkerRoutedEventArgs e)
{
    // Display the caption text carried by the marker.
    CaptionTextBlock.Text = e.Marker.Text;
}
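If your caption timing data comes from a sideband file rather than from markers embedded in the media itself, one possible approach, sketched below, is to add TimelineMarker entries in a MediaOpened handler so that MarkerReached fires at the times you choose. The timing value and text here are placeholders; loading and parsing the sideband caption data is up to you.

// Sketch only. Requires the Windows.UI.Xaml.Media namespace for TimelineMarker.
private void media_MediaOpened(object sender, RoutedEventArgs e)
{
    // Placeholder values; in a real app these come from your caption data source.
    TimelineMarker caption = new TimelineMarker();
    caption.Time = TimeSpan.FromSeconds(5);
    caption.Type = "Caption";
    caption.Text = "Placeholder caption text";
    media.Markers.Add(caption);
}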

Alternate audio streams

The original producers of the media that you play in an app can add alternate audio streams to media files by using media production tools. Support for these streams is part of the definition of the video format being used (WMV and so on). Alternate streams are useful for providing spoken commentary, including descriptions of the video content for use by the visually impaired.

To access these alternate streams from your app code, use the AudioStreamCount and AudioStreamIndex properties. In this next example, media is a named instance of the MediaElement class, and it has loaded a source media file that is expected to contain multiple audio streams. This code is a Click handler for a button that enables the user to toggle between the audio streams after the media is loaded.

private void AltAudioBtn_Click(object sender, RoutedEventArgs e)
{
    if (media.AudioStreamCount > 1)
    {
        // Toggle between the default audio stream (index 0) and the alternate,
        // full-description stream (index 1), and update the button label to
        // describe the action that the next click performs.
        if (media.AudioStreamIndex == 1)
        {
            media.AudioStreamIndex = 0;
            (sender as Button).Content = "Play full-description audio";
        }
        else
        {
            media.AudioStreamIndex = 1;
            (sender as Button).Content = "Play default audio";
        }
    }
    else
    {
        // The loaded media has only one audio stream; disable the toggle.
        (sender as Control).IsEnabled = false;
    }
}

For more info about alternate audio streams, see How to select audio tracks in different languages. That topic is written from the perspective of providing streams for different languages rather than a stream that includes audio descriptions, but the technical concepts are the same.

Related topics

XAML accessibility sample

XAML media playback sample

MediaElement

Playing and previewing audio and video