In this scenario, a Copy activity was used in a data pipeline to load 1 TB of Parquet table data stored in Azure Data Lake Storage (ADLS) Gen2 to a data warehouse with staging in Microsoft Fabric.
The prices used in the following example are hypothetical and don't imply exact actual pricing; they only demonstrate how you can estimate, plan, and manage cost for Data Factory projects in Microsoft Fabric. Also, since Fabric capacities are priced uniquely across regions, this example uses the pay-as-you-go pricing for a Fabric capacity in US West 2 (a typical Azure region), at $0.18 per CU per hour. Refer to Microsoft Fabric - Pricing to explore other Fabric capacity pricing options.
To accomplish this scenario, you need to create a pipeline with a Copy activity that reads the Parquet data from ADLS Gen2 and writes it to the data warehouse with staging enabled, along the lines of the illustrative sketch below.
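As a rough illustration only, the key Copy activity options for this scenario could be summarized as follows. The field names are illustrative placeholders, not the exact Fabric pipeline definition schema, and the connection and table names are hypothetical.

```python
# Illustrative sketch only: the key Copy activity settings for this scenario,
# expressed as a Python dict. Field names are placeholders and are not claimed
# to match the exact Fabric pipeline JSON schema.
copy_activity_config = {
    "name": "Copy 1 TB Parquet to Warehouse",   # hypothetical activity name
    "type": "Copy",
    "source": {
        "connection": "ADLS Gen2 storage account",  # assumed source connection
        "format": "Parquet",                        # 1 TB of Parquet table data
    },
    "destination": {
        "connection": "Fabric Data Warehouse",      # assumed destination connection
        "table": "dbo.DestinationTable",            # hypothetical target table
    },
    "enableStaging": True,  # staging is enabled for the warehouse load
}
```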
The data movement operation used 267,480 CU seconds over a duration of 1,504.42 seconds (25.07 minutes), while the activity run operation was null because the pipeline run contained no non-copy activities.
Note
Although the run duration is reported as a metric, it isn't needed when calculating effective CU hours with the Fabric Metrics app, because the CU seconds metric the app reports already accounts for the duration.
| Metric | Data movement operation |
|---|---|
| CU seconds | 267,480 CU seconds |
| Effective CU-hours | (267,480) / (60*60) CU-hours = 74.3 CU-hours |
Total run cost at $0.18/CU hour = (74.3 CU-hours) * ($0.18/CU hour) ~= $13.37
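For reference, the same arithmetic can be reproduced with a few lines of Python; the CU-second figure and the $0.18/CU-hour rate come from the scenario above.

```python
# Reproduce the cost estimate from the scenario above.
cu_seconds = 267_480          # CU seconds reported for the data movement operation
price_per_cu_hour = 0.18      # pay-as-you-go rate used in this example (US West 2)

cu_hours = cu_seconds / 3600  # convert CU seconds to effective CU hours
total_cost = cu_hours * price_per_cu_hour

print(f"Effective CU hours: {cu_hours:.1f}")     # ~74.3
print(f"Estimated run cost: ${total_cost:.2f}")  # ~$13.37
```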
Training
Module
Orchestrate processes and data movement with Microsoft Fabric - Training
Use Data Factory pipelines in Microsoft Fabric
Certification
Microsoft Certified: Fabric Data Engineer Associate - Certifications
As a Fabric Data Engineer, you should have subject matter expertise with data loading patterns, data architectures, and orchestration processes.
Documentation
Pricing scenario - Data pipelines load 1 TB of CSV data to Lakehouse files - Microsoft Fabric
This article provides an example pricing scenario for loading 1 TB of CSV data to Lakehouse files with binary copy using Data Factory in Microsoft Fabric.
Pricing scenario - Data pipelines load 1 TB of Parquet data to a Lakehouse table - Microsoft Fabric
This article provides an example pricing scenario for loading 1 TB of Parquet data to a Lakehouse table using Data Factory in Microsoft Fabric.
Pricing scenario - Data pipelines load 1 TB of CSV data to a Lakehouse table - Microsoft Fabric
This article provides an example pricing scenario for loading 1 TB of CSV data to a Lakehouse table using Data Factory in Microsoft Fabric.