Study guide for Exam DP-100: Designing and Implementing a Data Science Solution on Azure

Purpose of this document

This study guide should help you understand what to expect on the exam and includes a summary of the topics the exam might cover and links to additional resources. The information and materials in this document should help you focus your studies as you prepare for the exam.

Useful links

  • How to earn the certification: Some certifications only require passing one exam, while others require passing multiple exams.

  • Certification renewal: Microsoft associate, expert, and specialty certifications expire annually. You can renew by passing a free online assessment on Microsoft Learn.

  • Your Microsoft Learn profile: Connecting your certification profile to Microsoft Learn allows you to schedule and renew exams and share and print certificates.

  • Exam scoring and score reports: A score of 700 or greater is required to pass.

  • Exam sandbox: You can explore the exam environment by visiting our exam sandbox.

  • Request accommodations: If you use assistive devices, require extra time, or need modification to any part of the exam experience, you can request an accommodation.

  • Take a free Practice Assessment: Test your skills with practice questions to help you prepare for the exam.

Updates to the exam

We always update the English language version of the exam first. Some exams are localized into other languages, and those are updated approximately eight weeks after the English version is updated. While Microsoft makes every effort to update localized versions as noted, there may be times when the localized versions of an exam are not updated on this schedule. Other available languages are listed in the Schedule Exam section of the Exam Details webpage. If the exam isn't available in your preferred language, you can request an additional 30 minutes to complete the exam.

Note

The bullets that follow each of the skills measured are intended to illustrate how we are assessing that skill. Related topics may be covered in the exam.

Note

Most questions cover features that are general availability (GA). The exam may contain questions on Preview features if those features are commonly used.

Skills measured as of April 11, 2025

Audience profile

As a candidate for this exam, you should have subject matter expertise in applying data science and machine learning to implement and run machine learning workloads on Azure. Additionally, you should have knowledge of optimizing language models for AI applications using Azure AI.

Your responsibilities for this role include:

  • Designing and creating a suitable working environment for data science workloads.

  • Exploring data.

  • Training machine learning models.

  • Implementing pipelines.

  • Running jobs to prepare for production.

  • Managing, deploying, and monitoring scalable machine learning solutions.

  • Using language models for building AI applications.

As a candidate for this exam, you should have knowledge and experience in data science by using:

  • Azure Machine Learning

  • MLflow

  • Azure AI services, including Azure AI Search

  • Azure AI Foundry

Skills at a glance

  • Design and prepare a machine learning solution (20–25%)

  • Explore data, and run experiments (20–25%)

  • Train and deploy models (25–30%)

  • Optimize language models for AI applications (25–30%)

Design and prepare a machine learning solution (20–25%)

Design a machine learning solution

  • Identify the structure and format for datasets

  • Determine the compute specifications for a machine learning workload

  • Select the development approach to train a model

Create and manage resources in an Azure Machine Learning workspace

  • Create and manage a workspace

  • Create and manage datastores

  • Create and manage compute targets

  • Set up Git integration for source control
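
For hands-on practice with the skills in this group, a minimal sketch using the Azure Machine Learning Python SDK v2 (azure-ai-ml) might look like the following. The subscription, resource group, workspace, and cluster names, as well as the VM size and scaling limits, are placeholders, not exam requirements.

    # Minimal sketch: connect to an existing workspace and create a compute cluster.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient
    from azure.ai.ml.entities import AmlCompute

    ml_client = MLClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",
        resource_group_name="<resource-group>",
        workspace_name="<workspace-name>",
    )

    # Autoscaling CPU cluster used as a compute target for training jobs.
    cpu_cluster = AmlCompute(
        name="cpu-cluster",
        size="Standard_DS3_v2",
        min_instances=0,
        max_instances=2,
        idle_time_before_scale_down=120,
    )
    ml_client.compute.begin_create_or_update(cpu_cluster).result()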

Create and manage assets in an Azure Machine Learning workspace

  • Create and manage data assets

  • Create and manage environments

  • Share assets across workspaces by using registries
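
As an illustration of the asset-related skills above, a data asset and a custom environment can be registered with the SDK v2 along these lines; the asset names, versions, file paths, and base image are placeholders.

    # Minimal sketch: register a data asset and a conda-based environment.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient
    from azure.ai.ml.entities import Data, Environment
    from azure.ai.ml.constants import AssetTypes

    ml_client = MLClient(DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace-name>")

    data_asset = Data(
        name="diabetes-data",                  # placeholder asset name
        version="1",
        type=AssetTypes.URI_FILE,
        path="./data/diabetes.csv",            # placeholder local path; datastore or web URIs also work
        description="Example tabular dataset",
    )
    ml_client.data.create_or_update(data_asset)

    sklearn_env = Environment(
        name="sklearn-env",                    # placeholder environment name
        image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
        conda_file="./environment/conda.yml",  # placeholder conda specification file
    )
    ml_client.environments.create_or_update(sklearn_env)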

Explore data, and run experiments (20–25%)

Use automated machine learning to explore optimal models

  • Use automated machine learning for tabular data

  • Use automated machine learning for computer vision

  • Use automated machine learning for natural language processing

  • Select and understand training options, including preprocessing and algorithms

  • Evaluate an automated machine learning run, including responsible AI guidelines
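
A hedged sketch of submitting an automated machine learning experiment for tabular data with the SDK v2 is shown below; the MLTable path, target column, compute target, experiment name, and limits are placeholders, and the choice of primary metric depends on your task.

    # Minimal sketch: submit an AutoML classification job on tabular (MLTable) data.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient, Input, automl
    from azure.ai.ml.constants import AssetTypes

    ml_client = MLClient(DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace-name>")

    classification_job = automl.classification(
        compute="cpu-cluster",                 # placeholder compute target
        experiment_name="automl-diabetes",     # placeholder experiment name
        training_data=Input(type=AssetTypes.MLTABLE, path="azureml:diabetes-training:1"),
        target_column_name="Diabetic",         # placeholder label column
        primary_metric="accuracy",
        n_cross_validations=5,
        enable_model_explainability=True,      # supports responsible AI review of the best model
    )
    classification_job.set_limits(timeout_minutes=60, max_trials=10, enable_early_termination=True)

    returned_job = ml_client.jobs.create_or_update(classification_job)
    print(returned_job.studio_url)  # open the run in the studio to evaluate the trained models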

Use notebooks for custom model training

  • Use the terminal to configure a compute instance

  • Access and wrangle data in notebooks

  • Wrangle data interactively with attached Synapse Spark pools and serverless Spark compute

  • Retrieve features from a feature store to train a model

  • Track model training by using MLflow

  • Evaluate a model, including responsible AI guidelines
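
For the MLflow tracking bullet above, a minimal sketch of an interactive training run in a notebook might look like the following. On an Azure Machine Learning compute instance the MLflow tracking URI is typically preconfigured to the workspace; the dataset file, column name, and parameter values are placeholders.

    # Minimal sketch: track an interactive scikit-learn training run with MLflow.
    import mlflow
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("diabetes.csv")              # placeholder dataset file
    X_train, X_test, y_train, y_test = train_test_split(
        df.drop(columns="Diabetic"), df["Diabetic"], test_size=0.3, random_state=0
    )

    mlflow.set_experiment("notebook-experiments")  # placeholder experiment name
    with mlflow.start_run():
        reg_rate = 0.1
        mlflow.log_param("regularization_rate", reg_rate)
        model = LogisticRegression(C=1 / reg_rate, solver="liblinear").fit(X_train, y_train)
        acc = accuracy_score(y_test, model.predict(X_test))
        mlflow.log_metric("accuracy", acc)
        mlflow.sklearn.log_model(model, artifact_path="model")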

Automate hyperparameter tuning

  • Select a sampling method

  • Define the search space

  • Define the primary metric

  • Define early termination options
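
These four elements map onto a sweep job in the SDK v2. In the hedged sketch below, the training script, its arguments, the metric name, and all resource names are placeholders, and the script is assumed to log the primary metric with MLflow.

    # Minimal sketch: wrap a command job in a sweep job for hyperparameter tuning.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient, command
    from azure.ai.ml.sweep import Choice, Uniform, BanditPolicy

    ml_client = MLClient(DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace-name>")

    job = command(
        code="./src",                              # placeholder folder containing train.py
        command="python train.py --learning_rate ${{inputs.learning_rate}} --boosting ${{inputs.boosting}}",
        inputs={"learning_rate": 0.01, "boosting": "gbdt"},
        environment="azureml:sklearn-env@latest",  # placeholder registered environment
        compute="cpu-cluster",                     # placeholder compute target
    )

    # Define the search space by calling the job with sweep expressions in place of fixed inputs.
    job_for_sweep = job(
        learning_rate=Uniform(min_value=0.01, max_value=0.1),
        boosting=Choice(values=["gbdt", "dart"]),
    )

    sweep_job = job_for_sweep.sweep(
        sampling_algorithm="random",                # sampling method
        primary_metric="training_accuracy_score",   # must match a metric logged by train.py
        goal="Maximize",
    )
    sweep_job.set_limits(max_total_trials=20, max_concurrent_trials=4, timeout=7200)
    sweep_job.early_termination = BanditPolicy(slack_factor=0.1, evaluation_interval=2)

    ml_client.jobs.create_or_update(sweep_job)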

Train and deploy models (25–30%)

Run model training scripts

  • Consume data in a job

  • Configure compute for a job run

  • Configure an environment for a job run

  • Track model training with MLflow in a job run

  • Define parameters for a job

  • Run a script as a job

  • Use logs to troubleshoot job run errors
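
A hedged sketch of running a script as a command job that consumes a registered data asset is shown below; the code folder, data asset, environment, compute, and experiment names are placeholders, and the script is assumed to parse the listed arguments.

    # Minimal sketch: run a training script as a command job that consumes a data asset.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient, command, Input
    from azure.ai.ml.constants import AssetTypes

    ml_client = MLClient(DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace-name>")

    job = command(
        code="./src",                              # placeholder folder containing train.py
        command="python train.py --training_data ${{inputs.training_data}} --reg_rate ${{inputs.reg_rate}}",
        inputs={
            "training_data": Input(type=AssetTypes.URI_FILE, path="azureml:diabetes-data:1"),
            "reg_rate": 0.01,
        },
        environment="azureml:sklearn-env@latest",  # placeholder registered environment
        compute="cpu-cluster",                     # placeholder compute target
        experiment_name="train-diabetes",
        display_name="diabetes-train-job",
    )

    returned_job = ml_client.jobs.create_or_update(job)
    print(returned_job.studio_url)  # inspect the run's logs in the studio to troubleshoot errors

Inside the training script, calling mlflow.autolog() or explicit mlflow.log_param and mlflow.log_metric calls records parameters and metrics for the job run.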

Implement training pipelines

  • Create custom components

  • Create a pipeline

  • Pass data between steps in a pipeline

  • Run and schedule a pipeline

  • Monitor and troubleshoot pipeline runs
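
A hedged sketch of a two-step pipeline built with the SDK v2 @pipeline decorator, where the preparation step's output is passed to the training step, is shown below; the component code folders, scripts, environment, compute, and data paths are all placeholders.

    # Minimal sketch: a two-step pipeline where the prep step's output feeds the training step.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient, command, Input, Output
    from azure.ai.ml.constants import AssetTypes
    from azure.ai.ml.dsl import pipeline

    ml_client = MLClient(DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace-name>")

    prep_component = command(
        name="prep_data",
        code="./prep",                             # placeholder component code folder
        command="python prep.py --raw_data ${{inputs.raw_data}} --prepped_data ${{outputs.prepped_data}}",
        inputs={"raw_data": Input(type=AssetTypes.URI_FILE)},
        outputs={"prepped_data": Output(type=AssetTypes.URI_FOLDER)},
        environment="azureml:sklearn-env@latest",  # placeholder environment
    ).component

    train_component = command(
        name="train_model",
        code="./train",                            # placeholder component code folder
        command="python train.py --training_data ${{inputs.training_data}} --model_output ${{outputs.model_output}}",
        inputs={"training_data": Input(type=AssetTypes.URI_FOLDER)},
        outputs={"model_output": Output(type=AssetTypes.URI_FOLDER)},
        environment="azureml:sklearn-env@latest",
    ).component

    @pipeline(default_compute="cpu-cluster")
    def diabetes_pipeline(pipeline_input):
        prep_step = prep_component(raw_data=pipeline_input)
        train_step = train_component(training_data=prep_step.outputs.prepped_data)
        return {"trained_model": train_step.outputs.model_output}

    pipeline_job = diabetes_pipeline(
        pipeline_input=Input(type=AssetTypes.URI_FILE, path="azureml:diabetes-data:1")
    )
    ml_client.jobs.create_or_update(pipeline_job, experiment_name="diabetes-pipeline")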

Manage models

  • Define the signature in the MLmodel file

  • Package a feature retrieval specification with the model artifact

  • Register an MLflow model

  • Assess a model by using responsible AI principles
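
A hedged sketch of registering an MLflow model from a completed job's output follows; the job name, artifact path, and model name are placeholders. The MLmodel file, including its signature, is produced when the training code logs the model (for example with mlflow.models.infer_signature).

    # Minimal sketch: register an MLflow-format model produced by a training job.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient
    from azure.ai.ml.entities import Model
    from azure.ai.ml.constants import AssetTypes

    ml_client = MLClient(DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace-name>")

    model = Model(
        # Placeholder job name; "model" is the artifact_path used when the model was logged.
        path="azureml://jobs/<job-name>/outputs/artifacts/paths/model/",
        type=AssetTypes.MLFLOW_MODEL,
        name="diabetes-classifier",
        description="MLflow model registered from a job output",
    )
    registered = ml_client.models.create_or_update(model)
    print(registered.name, registered.version)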

Deploy a model

  • Configure settings for online deployment

  • Deploy a model to an online endpoint

  • Test an online deployed service

  • Configure compute for a batch deployment

  • Deploy a model to a batch endpoint

  • Invoke the batch endpoint to start a batch scoring job
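
A hedged sketch of the online-deployment bullets follows; the endpoint and deployment names, model reference, instance type, and request file are placeholders, and MLflow-format models can typically be deployed without a custom scoring script.

    # Minimal sketch: create a managed online endpoint, deploy a registered model, and test it.
    from azure.identity import DefaultAzureCredential
    from azure.ai.ml import MLClient
    from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment

    ml_client = MLClient(DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace-name>")

    endpoint = ManagedOnlineEndpoint(name="diabetes-endpoint", auth_mode="key")
    ml_client.online_endpoints.begin_create_or_update(endpoint).result()

    deployment = ManagedOnlineDeployment(
        name="blue",
        endpoint_name="diabetes-endpoint",
        model="azureml:diabetes-classifier:1",    # placeholder registered model
        instance_type="Standard_DS3_v2",
        instance_count=1,
    )
    ml_client.online_deployments.begin_create_or_update(deployment).result()

    # Route all traffic to the new deployment, then test it with a sample request file.
    endpoint.traffic = {"blue": 100}
    ml_client.online_endpoints.begin_create_or_update(endpoint).result()
    response = ml_client.online_endpoints.invoke(
        endpoint_name="diabetes-endpoint",
        deployment_name="blue",
        request_file="sample-data.json",          # placeholder JSON request file
    )
    print(response)

Batch scoring generally follows the same pattern with BatchEndpoint and a batch deployment backed by a compute cluster, and ml_client.batch_endpoints.invoke starts the batch scoring job.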

Optimize language models for AI applications (25–30%)

Prepare for model optimization

  • Select and deploy a language model from the model catalog

  • Compare language models using benchmarks

  • Test a deployed language model in the playground

  • Select an optimization approach

Optimize through prompt engineering and prompt flow

  • Test prompts with manual evaluation

  • Define and track prompt variants

  • Create prompt templates

  • Define chaining logic with the prompt flow SDK

  • Use tracing to evaluate your flow
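
Prompt variants can be compared with a simple harness even before moving to the prompt flow SDK. The sketch below is plain Python rather than the prompt flow SDK itself; call_model is a hypothetical helper standing in for whichever chat-completion client you use, and the template wording is illustrative only.

    # Minimal sketch: define prompt templates as variants and compare their outputs manually.
    from string import Template

    VARIANTS = {
        "variant_0": Template("Summarize the customer review in one sentence:\n$review"),
        "variant_1": Template(
            "You are a support analyst. Summarize the review in one sentence and "
            "label the sentiment as positive, negative, or neutral:\n$review"
        ),
    }

    def call_model(prompt: str) -> str:
        """Hypothetical stand-in for a deployed chat model; replace with a real client call."""
        return f"[model output for a prompt of {len(prompt)} characters]"

    def compare_variants(review: str) -> dict:
        """Render each variant and collect outputs for manual (human) evaluation."""
        return {name: call_model(t.substitute(review=review)) for name, t in VARIANTS.items()}

    print(compare_variants("The device stopped charging after two weeks."))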

Optimize through Retrieval Augmented Generation (RAG)

  • Prepare data for RAG, including cleaning, chunking, and embedding

  • Configure a vector store

  • Configure an Azure AI Search-based index store

  • Evaluate your RAG solution
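
The data-preparation bullets can be illustrated without tying the example to any one service. In the sketch below, embed() is a hypothetical placeholder for a real embedding model (for example, an Azure OpenAI embeddings deployment), and a brute-force cosine-similarity search stands in for a vector store such as Azure AI Search; chunk sizes and overlap are illustrative.

    # Minimal sketch: chunk a document, embed the chunks, and retrieve the best matches for a query.
    import numpy as np

    def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
        """Split text into overlapping character-based chunks."""
        step = chunk_size - overlap
        return [text[i:i + chunk_size] for i in range(0, max(len(text), 1), step)]

    def embed(texts: list[str]) -> np.ndarray:
        """Hypothetical embedding call; replace with your embeddings deployment."""
        rng = np.random.default_rng(0)
        return rng.normal(size=(len(texts), 8))   # fake 8-dimensional vectors for illustration

    def retrieve(query: str, chunks: list[str], chunk_vectors: np.ndarray, top_k: int = 3) -> list[str]:
        """Rank chunks by cosine similarity to the query embedding."""
        q = embed([query])[0]
        sims = chunk_vectors @ q / (np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q))
        return [chunks[i] for i in np.argsort(sims)[::-1][:top_k]]

    document = "..."                              # placeholder: cleaned source document text
    chunks = chunk_text(document)
    vectors = embed(chunks)
    print(retrieve("example question", chunks, vectors))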

Optimize through fine-tuning

  • Prepare data for fine-tuning

  • Select an appropriate base model

  • Run a fine-tuning job

  • Evaluate your fine-tuned model
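
Preparing data is the most hands-on of these bullets. The sketch below writes a chat-style JSONL training file of the general shape expected by most chat-model fine-tuning jobs; the file name, system message, and examples are placeholders, and the exact schema required depends on the base model you select.

    # Minimal sketch: write chat-formatted examples to a JSONL file for a fine-tuning job.
    import json

    examples = [
        {"question": "How do I reset my password?", "answer": "Open Settings > Account > Reset password."},
        {"question": "Where can I download invoices?", "answer": "Invoices are under Billing > History."},
    ]

    with open("fine_tune_train.jsonl", "w", encoding="utf-8") as f:
        for ex in examples:
            record = {
                "messages": [
                    {"role": "system", "content": "You are a concise product support assistant."},
                    {"role": "user", "content": ex["question"]},
                    {"role": "assistant", "content": ex["answer"]},
                ]
            }
            f.write(json.dumps(record) + "\n")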

Study resources

We recommend that you train and get hands-on experience before you take the exam. We offer self-study options and classroom training as well as links to documentation, community sites, and videos.

Links to learning and documentation:

  • Get trained: Choose from self-paced learning paths and modules or take an instructor-led course.

  • Find documentation: Azure Databricks, Azure Machine Learning, Azure Synapse Analytics, and MLflow and Azure Machine Learning.

  • Ask a question: Microsoft Q&A | Microsoft Docs.

  • Get community support: AI - Machine Learning - Microsoft Tech Community and AI - Machine Learning Blog - Microsoft Tech Community.

  • Follow Microsoft Learn: Microsoft Learn - Microsoft Tech Community.

  • Find a video: Microsoft Learn Shows.

Change log

The list below summarizes the changes between the previous version of the skills measured (prior to January 16, 2025) and the current version (as of January 16, 2025), along with the extent of each change.

  • Audience profile: Minor

  • Optimize language models for AI applications: No % change

  • Optimize through prompt engineering and Prompt flow, renamed Optimize through prompt engineering and prompt flow: Minor