Tutorial: Using Azure AI Studio with a screen reader

Note

Azure AI Studio is currently in public preview. This preview is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.

This article is for people who use screen readers such as Microsoft's Narrator, JAWS, NVDA, or Apple's VoiceOver. You learn how to use the Azure AI Studio with a screen reader.

Getting started in the Azure AI Studio

Most Azure AI Studio pages are composed of the following sections:

  • Banner (contains Azure AI Studio app title, settings, and profile information)
  • Primary navigation (contains Home, Explore, Build, and Manage)
  • Secondary navigation
  • Main page content
    • Contains a breadcrumb navigation element
    • Usually contains a command toolbar

To navigate efficiently, it might be helpful to use landmarks to move between these sections of the page.

Explore

In Explore, you can browse the different capabilities of Azure AI before creating a project. You can find this page in the primary navigation landmark.

Within Explore, you can browse many capabilities from the secondary navigation. These include the model catalog, model benchmarks, and pages for Azure AI services such as Speech, Vision, and Content Safety.

  • The model catalog contains three main areas: Announcements, Models, and Filters. You can use Search and Filters to narrow down the model selection.
  • Azure AI service pages such as Speech consist of many cards containing links. These cards lead you to demo experiences where you can sample our AI capabilities; some cards link out to another webpage.

Projects

To work within the Azure AI Studio, you must first create a project:

  1. In Azure AI Studio, navigate to the Build tab in the primary navigation.
  2. Press the Tab key until you hear New project and select this button.
  3. Enter the information requested in the Create a new project dialog.

You're then taken to the project details page.

Within a project, you can explore many capabilities found within the secondary navigation. These include playground, prompt flow, evaluation, and deployments. The secondary navigation contains an H2 heading with the project title, which can be used for efficient navigation.

Using the playground

The playground is where you can chat with models and experiment with different prompts and parameters.

From the Build tab, navigate to the secondary navigation landmark and press the down arrow until you hear playground.

Playground structure

When you first arrive, the playground mode dropdown is set to Chat by default. In this mode, the playground is composed of the command toolbar and three main panes: Assistant setup, Chat session, and Configuration. If you added your own data in the playground, the Citations pane also appears when you select a citation in the model response.

You can navigate by heading to move between these panes, as each pane has its own H2 heading.

Assistant setup pane

The assistant setup pane is where you can set up the chat assistant according to your organization's needs.

If you edit the system message or examples, your changes aren't saved automatically. Press the Save changes button to ensure your changes are saved.

Chat session pane

The chat session pane is where you can chat with the model and test out your assistant.

After you send a message, the model might take some time to respond, especially if the response is long. You hear a screen reader announcement "Message received from the chatbot" when the model finishes composing a response.

Using prompt flow

Prompt flow is a tool to create executable flows, linking LLMs, prompts, and Python tools through a visualized graph. You can use this to prototype, experiment, and iterate on your AI applications before deploying.
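
A Python tool node in a flow is essentially a Python function that the flow calls with the node's inputs. As an illustration only, here's a minimal sketch of such a tool, assuming the promptflow Python package and its tool decorator; the function name and parameters are hypothetical.

    from promptflow import tool

    # A Python tool is a plain function marked with the @tool decorator.
    # The flow passes the node's inputs as arguments and uses the return
    # value as the node's output.
    @tool
    def format_reply(question: str, answer: str) -> str:
        """Combine the user question and the model answer into one reply."""
        return f"Q: {question}\nA: {answer}"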

With the Build tab selected, navigate to the secondary navigation landmark and press the down arrow until you hear prompt flow.

The prompt flow UI in Azure AI Studio is composed of the following main sections: the command toolbar, Flow (which includes a list of the flow nodes), Files, and the Graph view. The Flow, Files, and Graph sections each have their own H2 headings that can be used for navigation.

Flow

  • This is the main working area where you can edit your flow, for example by adding a new node, editing a prompt, or selecting input data.
  • You can also choose to work in code instead of the editor by navigating to the Raw file mode toggle button, which displays the flow as code.
  • You can also open your flow in VS Code Web by selecting the Open project in VS Code (Web) button.
  • Each node has its own H3 heading, which can be used for navigation.

Files

  • This section contains the file structure of the flow. Each flow has a folder that contains a flow.dag.yaml file, source code files, and system folders.
  • You can easily export or import a flow for testing, deployment, or collaboration by navigating to the Add button and the Zip and download all files button.

Graph view

  • The graph is a visual representation of the flow. This view isn't editable or interactive.
  • You hear the following alt text to describe the graph: "Graph view of [flow name] – for visualization only." We don't currently provide a full screen reader description for this graphical chart. To get all equivalent information, you can read and edit the flow by navigating to Flow, or by toggling on the Raw file view. 

Evaluations

Evaluation is a tool to help you assess the performance of your generative AI application. You can use it to prototype, experiment, and iterate on your applications before deploying.

Creating an evaluation

To review evaluation metrics, you must first create an evaluation.

  1. Navigate to the Build tab in the primary navigation.
  2. Navigate to the secondary navigation landmark and press the down arrow until you hear evaluation.
  3. Press the Tab key until you hear New evaluation and select this button.
  4. Enter the information requested in the Create a new evaluation dialog. Once complete, your focus is returned to the evaluations list.

Viewing evaluations

Once you create an evaluation, you can access it from the list of evaluations.

Evaluation runs are listed as links within the Evaluations grid. Selecting a link takes you to a dashboard view with information about your specific evaluation run.

You might prefer to export the data from your evaluation run so that you can view it in an application of your choosing. To do this, select your evaluation run link, then navigate to the Export result button and select it.
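
If you export the results to a file, you can inspect them with a short script instead of the results grid. The following is a minimal sketch, assuming the exported file was saved locally as a CSV; the file name and columns are placeholders, and the actual export format depends on your evaluation run.

    import pandas as pd

    # Load the exported evaluation results.
    # "evaluation_results.csv" is a placeholder for the downloaded file.
    results = pd.read_csv("evaluation_results.csv")

    # Print summary statistics for the numeric metric columns.
    print(results.describe())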

A dashboard view is also provided so that you can compare evaluation runs. From the main Evaluations list page, navigate to the Switch to dashboard view button.

Technical support for customers with disabilities

Microsoft wants to provide the best possible experience for all our customers. If you have a disability or questions related to accessibility, contact the Microsoft Disability Answer Desk for technical assistance. The Disability Answer Desk support team is trained in using many popular assistive technologies. They can offer assistance in English, Spanish, French, and American Sign Language. Go to the Microsoft Disability Answer Desk site to find out the contact details for your region.

If you're a government, commercial, or enterprise customer, contact the enterprise Disability Answer Desk.

Next steps