APPLIES TO:
Azure Data Factory
Azure Synapse Analytics
Tip
Try out Data Factory in Microsoft Fabric, an all-in-one analytics solution for enterprises. Microsoft Fabric covers everything from data movement to data science, real-time analytics, business intelligence, and reporting. Learn how to start a new trial for free!
In this quickstart, you use the Copy Data tool in Azure Data Factory Studio to create a pipeline that copies data from a source folder in Azure Blob Storage to a target folder.
If you don't have an Azure subscription, create a free account before you begin.
To prepare source data by using a template:
Select the Deploy to Azure button.
You're directed to the configuration page to deploy the template. On this page:
For Resource group, select Create new to create a resource group. You can leave all the other values with their defaults.
Select Review + create, and then select Create to deploy the resources.
Note
The user who deploys the template needs to assign a role to a managed identity. This step requires permissions that can be granted through the Owner, User Access Administrator, or Managed Identity Operator role.
A new Blob Storage account is created in the new resource group. The moviesDB2.csv file is stored in a folder called input in Blob Storage.
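If you'd rather stage the sample data yourself instead of deploying the template, a short script with the azure-storage-blob Python package can create the same layout. This is a minimal sketch under a few assumptions: the connection string is a placeholder for your own storage account's, and a local copy of moviesDB2.csv is assumed to be in the working directory.

```python
# Minimal sketch: recreate the template's output (an adftutorial container
# with input/moviesDB2.csv) in an existing storage account.
# The connection string is a placeholder; moviesDB2.csv is assumed local.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<your-storage-connection-string>")
container = service.get_container_client("adftutorial")
if not container.exists():
    container.create_container()

# Blob Storage has no real folders; the "input/" prefix acts as the folder name.
with open("moviesDB2.csv", "rb") as data:
    container.upload_blob(name="input/moviesDB2.csv", data=data, overwrite=True)
```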
You can use your existing data factory, or you can create a new one as described in Quickstart: Create a data factory.
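If you prefer to create the factory programmatically, here's a minimal sketch using the azure-mgmt-datafactory SDK. The subscription ID, resource group, factory name, and region are placeholders for your own values; the factory name must be globally unique.

```python
# Minimal sketch: create a data factory with the azure-mgmt-datafactory SDK.
# subscription_id, resource_group, factory_name, and the region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

subscription_id = "<your-subscription-id>"
resource_group = "<your-resource-group>"
factory_name = "<globally-unique-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)
factory = adf_client.factories.create_or_update(
    resource_group, factory_name, Factory(location="eastus"))
print(factory.provisioning_state)
```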
The Copy Data tool has five pages that walk you through the task of copying data. To start the tool:
In Azure Data Factory Studio, go to your data factory.
Select the Ingest tile.
On the Properties page of the Copy Data tool, choose Built-in copy task under Task type.
Select Next.
On the Source page of the Copy Data tool, select + Create new connection to add a connection.
Select the linked service type that you want to create for the source connection. (The example in this quickstart uses Azure Blob Storage.) Then select Continue.
In the New connection (Azure Blob Storage) dialog, enter AzureBlobStorage for Name, select your Azure subscription and your storage account, and then select Create.
Under Source data store, select the AzureBlobStorage connection that you just created. In the File or folder section, browse to the adftutorial/input folder, select the moviesDB2.csv file, and then select Choose. Select Next.
On the Target page of the Copy Data tool, for Connection, select the AzureBlobStorage connection that you created.
In the Folder path section, enter adftutorial/output.
Leave other settings as default. Select Next.
On the Settings page of the Copy Data tool, specify a name and description for the pipeline.
Select Next to use other default configurations.
On the Review and finish page, review all settings.
Select Next.
The Deployment complete page shows whether the deployment is successful.
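Behind the scenes, the Copy Data tool deploys a pipeline that contains a single copy activity. For reference, a roughly equivalent pipeline can be built with the azure-mgmt-datafactory Python SDK. This is a sketch, not the exact definition the tool generates: it reuses adf_client, resource_group, and factory_name from the earlier sketch, and the connection string and entity names are placeholders.

```python
# Sketch of a Blob-to-Blob copy pipeline, roughly what the Copy Data tool deploys.
# Reuses adf_client, resource_group, factory_name from the earlier sketch.
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService, LinkedServiceResource, LinkedServiceReference,
    AzureBlobDataset, DatasetResource, DatasetReference,
    BlobSource, BlobSink, CopyActivity, PipelineResource,
)

# Linked service: the connection to the storage account (placeholder string).
ls = LinkedServiceResource(properties=AzureBlobStorageLinkedService(
    connection_string="<your-storage-connection-string>"))
adf_client.linked_services.create_or_update(
    resource_group, factory_name, "AzureBlobStorage", ls)
ls_ref = LinkedServiceReference(
    type="LinkedServiceReference", reference_name="AzureBlobStorage")

# Input and output datasets pointing at the source and target folders.
ds_in = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="adftutorial/input",
    file_name="moviesDB2.csv"))
adf_client.datasets.create_or_update(resource_group, factory_name, "InputDataset", ds_in)
ds_out = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="adftutorial/output"))
adf_client.datasets.create_or_update(resource_group, factory_name, "OutputDataset", ds_out)

# Pipeline with a single copy activity from the input dataset to the output dataset.
copy_activity = CopyActivity(
    name="CopyFromBlobToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="InputDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="OutputDataset")],
    source=BlobSource(), sink=BlobSink())
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(resource_group, factory_name, "CopyPipeline", pipeline)

# Trigger a run; run_response.run_id identifies it for monitoring.
run_response = adf_client.pipelines.create_run(
    resource_group, factory_name, "CopyPipeline", parameters={})
```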
After you finish copying the data, you can monitor the pipeline that you created:
On the Deployment complete page, select Monitor.
The application switches to the Monitor tab, which shows the status of the pipeline. Select Refresh to update the list of pipeline runs. Select the link under Pipeline name to view activity run details or to rerun the pipeline.
On the page that shows the details of the activity run, select the Details link (eyeglasses icon) in the Activity name column for more details about the copy operation. For information about the properties, see the overview article about the copy activity.
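The same run and activity details are available programmatically. Continuing from the run_response in the earlier sketch, a hedged example of polling the run status and querying the copy activity's output:

```python
# Sketch: monitor the pipeline run created above, using run_response.run_id.
from datetime import datetime, timedelta
from azure.mgmt.datafactory.models import RunFilterParameters

pipeline_run = adf_client.pipeline_runs.get(
    resource_group, factory_name, run_response.run_id)
print("Pipeline run status:", pipeline_run.status)

# Query activity runs for the details the Monitor tab surfaces.
filter_params = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow() + timedelta(days=1))
activity_runs = adf_client.activity_runs.query_by_pipeline_run(
    resource_group, factory_name, run_response.run_id, filter_params)
print(activity_runs.value[0].output)  # data read/written, throughput, duration
```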
The pipeline in this sample copies data from one location to another in Azure Blob Storage. To learn about using Data Factory in more scenarios, see the Data Factory tutorials.