Quickstart: Use the Copy Data tool in Azure Data Factory Studio to copy data

APPLIES TO: Azure Data Factory Azure Synapse Analytics

Tip

Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free!

In this quickstart, you use the Copy Data tool in Azure Data Factory Studio to create a pipeline that copies data from a source folder in Azure Blob Storage to a target folder.

Prerequisites

Azure subscription

If you don't have an Azure subscription, create a free account before you begin.

Prepare source data in Azure Blob Storage

To prepare source data by using a template:

  1. Select the following button.

    Try your first data factory demo

  2. You're directed to the configuration page to deploy the template. On this page:

    1. For Resource group, select Create new to create a resource group. You can leave all the other values with their defaults.

    2. Select Review + create, and then select Create to deploy the resources.

    Screenshot of the page for deploying the template that creates the resources.

Note

The user who deploys the template needs to assign a role to a managed identity. This step requires permissions that can be granted through the Owner, User Access Administrator, or Managed Identity Operator role.

A new Blob Storage account is created in the new resource group. The moviesDB2.csv file is stored in a folder called input in Blob Storage.
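If you'd rather stage the source file yourself than deploy the template, the following minimal sketch uploads it with the azure-storage-blob Python SDK. It assumes the adftutorial container name used later in this quickstart, a local copy of moviesDB2.csv, and an AZURE_STORAGE_CONNECTION_STRING environment variable; substitute your own values.

```python
# Sketch: upload the sample CSV to the container this quickstart expects.
# Assumes azure-storage-blob is installed (pip install azure-storage-blob)
# and AZURE_STORAGE_CONNECTION_STRING points at your storage account.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("adftutorial")
if not container.exists():
    container.create_container()

# Store moviesDB2.csv under the input folder, as the template does.
with open("moviesDB2.csv", "rb") as data:
    container.upload_blob(name="input/moviesDB2.csv", data=data, overwrite=True)
```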

Create a data factory

You can use your existing data factory, or you can create a new one as described in Quickstart: Create a data factory.
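If you prefer to script this step, here's a minimal sketch that creates a factory with the azure-mgmt-datafactory Python SDK. The subscription ID, resource group name, factory name, and region are placeholders, and the resource group is assumed to exist already.

```python
# Sketch: create a data factory programmatically.
# Assumes azure-identity and azure-mgmt-datafactory are installed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

subscription_id = "<your-subscription-id>"   # placeholder
resource_group = "ADFQuickStartRG"           # placeholder, must exist
factory_name = "ADFQuickStartFactory"        # placeholder, globally unique

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)
factory = adf_client.factories.create_or_update(
    resource_group, factory_name, Factory(location="eastus")
)
print(factory.provisioning_state)
```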

Use the Copy Data tool to copy data

The Copy Data tool has five pages that walk you through the task of copying data. To start the tool:

  1. In Azure Data Factory Studio, go to your data factory.

  2. Select the Ingest tile.

Screenshot that shows the page for a data factory and the Ingest tile in Azure Data Factory Studio.

Step 1: Select the task type

  1. On the Properties page of the Copy Data tool, choose Built-in copy task under Task type.

  2. Select Next.

Screenshot that shows the Properties page of the Copy Data tool.

Step 2: Complete source configuration

  1. On the Source page of the Copy Data tool, select + Create new connection to add a connection.

  2. Select the linked service type that you want to create for the source connection. (The example in this quickstart uses Azure Blob Storage.) Then select Continue.

    Screenshot that shows the gallery of service types in the dialog for a new connection, with Azure Blob Storage selected.

  3. In the New connection (Azure Blob Storage) dialog:

    1. For Name, specify a name for your connection.
    2. Under Account selection method, select From Azure subscription.
    3. In the Azure subscription list, select your Azure subscription.
    4. In the Storage account name list, select your storage account.
    5. Select Test connection and confirm that the connection is successful.
    6. Select Create.

    Screenshot that shows configuration details for an Azure Blob Storage account.

  4. Under Source data store:

    1. For Connection, select the newly created connection.
    2. In the File or folder section, select Browse to go to the adftutorial/input folder. Select the moviesDB2.csv file, and then select OK.
    3. Select the Binary copy checkbox to copy the file as is.
    4. Select Next.

    Screenshot that shows settings for a source data store.
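The connection you just created is stored in the factory as a linked service. As a rough scripted equivalent, reusing the adf_client, resource_group, and factory_name placeholders from the earlier sketch, you could register the same Azure Blob Storage connection like this (the AzureBlobStorage1 name and the connection string are placeholders):

```python
# Sketch: register an Azure Blob Storage linked service, the scripted
# counterpart of the connection created in the Copy Data tool.
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    LinkedServiceResource,
    SecureString,
)

connection_string = SecureString(  # placeholder values
    value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
)
linked_service = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(connection_string=connection_string)
)
adf_client.linked_services.create_or_update(
    resource_group, factory_name, "AzureBlobStorage1", linked_service
)
```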

Step 3: Complete destination configuration

  1. On the Target page of the Copy Data tool, for Connection, select the AzureBlobStorage connection that you created.

  2. In the Folder path section, enter adftutorial/output.

    Screenshot that shows settings for a destination data store.

  3. Leave other settings as default. Select Next.
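Behind the scenes, the source and destination you selected become datasets in the factory. The following sketch defines comparable binary datasets, assuming the AzureBlobStorage1 linked service from the previous sketch and the adftutorial paths used in this quickstart (the SourceDataset and SinkDataset names are placeholders):

```python
# Sketch: define binary datasets for the source file and the output folder.
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLocation,
    BinaryDataset,
    DatasetResource,
    LinkedServiceReference,
)

ls_ref = LinkedServiceReference(
    type="LinkedServiceReference", reference_name="AzureBlobStorage1"
)

# The single file to copy: adftutorial/input/moviesDB2.csv.
source_dataset = DatasetResource(
    properties=BinaryDataset(
        linked_service_name=ls_ref,
        location=AzureBlobStorageLocation(
            container="adftutorial", folder_path="input", file_name="moviesDB2.csv"
        ),
    )
)
# The destination folder: adftutorial/output.
sink_dataset = DatasetResource(
    properties=BinaryDataset(
        linked_service_name=ls_ref,
        location=AzureBlobStorageLocation(container="adftutorial", folder_path="output"),
    )
)
adf_client.datasets.create_or_update(
    resource_group, factory_name, "SourceDataset", source_dataset
)
adf_client.datasets.create_or_update(
    resource_group, factory_name, "SinkDataset", sink_dataset
)
```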

Step 4: Enter a name and description for the pipeline

  1. On the Settings page of the Copy Data tool, specify a name for the pipeline and its description.

  2. Select Next to use other default configurations.

    Screenshot that shows the Settings page of the Copy Data tool.

Step 5: Review settings and deploy

  1. On the Review and finish page, review all settings.

  2. Select Next.

The Deployment complete page shows whether the deployment is successful.
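The result of the wizard is a pipeline with a single copy activity. For reference, this sketch builds and triggers a comparable pipeline with the SDK, assuming the SourceDataset and SinkDataset names from the previous sketch (CopyPipeline and the activity name are placeholders):

```python
# Sketch: create a pipeline with one binary copy activity and start a run.
from azure.mgmt.datafactory.models import (
    BinarySink,
    BinarySource,
    CopyActivity,
    DatasetReference,
    PipelineResource,
)

copy_activity = CopyActivity(
    name="CopyFromBlobToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="SourceDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SinkDataset")],
    source=BinarySource(),
    sink=BinarySink(),
)
adf_client.pipelines.create_or_update(
    resource_group, factory_name, "CopyPipeline",
    PipelineResource(activities=[copy_activity]),
)
run_response = adf_client.pipelines.create_run(
    resource_group, factory_name, "CopyPipeline", parameters={}
)
print(run_response.run_id)
```

The binary source and sink mirror the Binary copy checkbox you selected earlier: the file is copied as is, with no schema parsing.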

Monitor the running results

After you finish copying the data, you can monitor the pipeline that you created:

  1. On the Deployment complete page, select Monitor.

    Screenshot of the page that shows a completed deployment.

  2. The application switches to the Monitor tab, which shows the status of the pipeline. Select Refresh to refresh the list of pipelines. Select the link under Pipeline name to view activity run details or to rerun the pipeline.

    Screenshot that shows the button for refreshing the list of pipelines.

  3. On the page that shows the details of the activity run, select the Details link (eyeglasses icon) in the Activity name column for more details about the copy operation. For information about the properties, see the overview article about the copy activity.
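You can retrieve the same status information programmatically. A minimal sketch, assuming the run_response value from the pipeline sketch earlier:

```python
# Sketch: check the pipeline run and its activity runs via the SDK.
from datetime import datetime, timedelta

from azure.mgmt.datafactory.models import RunFilterParameters

pipeline_run = adf_client.pipeline_runs.get(
    resource_group, factory_name, run_response.run_id
)
print(f"Pipeline run status: {pipeline_run.status}")

activity_runs = adf_client.activity_runs.query_by_pipeline_run(
    resource_group,
    factory_name,
    run_response.run_id,
    RunFilterParameters(
        last_updated_after=datetime.now() - timedelta(days=1),
        last_updated_before=datetime.now() + timedelta(days=1),
    ),
)
for run in activity_runs.value:
    print(run.activity_name, run.status)
```

The statuses reported here match what the Monitor tab shows in Azure Data Factory Studio.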

The pipeline in this sample copies data from one location to another location in Azure Blob Storage. To learn about using Data Factory in more scenarios, see the following tutorial:

