Event
Mar 31, 11 PM - Apr 2, 11 PM
The biggest Fabric, Power BI, and SQL learning event. March 31 – April 2. Use code FABINSIDER to save $400.
Register now
The Lakehouse in Microsoft Fabric provides the Table maintenance feature to manage Delta tables efficiently and keep them ready for analytics. This guide describes the table maintenance feature in Lakehouse and its capabilities.
The following sections describe the key capabilities of the lakehouse table maintenance feature.
Note
For advanced maintenance tasks, such as grouping multiple table maintenance commands or orchestrating them on a schedule, a code-centric approach is recommended. To learn more, see the Delta Lake table optimization and V-Order article. You can also use the Lakehouse API to automate table maintenance operations; to learn more, see Manage the Lakehouse with Microsoft Fabric REST API.
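As a sketch of that code-centric approach, the helper below builds the Spark SQL maintenance statements a notebook could run for several tables. The helper name and the table names are hypothetical; in a Fabric notebook, each statement would be executed with `spark.sql(...)`.

```python
# Hypothetical helper: builds the Spark SQL maintenance statements
# (OPTIMIZE with V-Order, then VACUUM) for a list of delta tables.
def maintenance_statements(tables, retain_hours=168):
    stmts = []
    for table in tables:
        # Compact small files and apply V-Order during the rewrite.
        stmts.append(f"OPTIMIZE {table} VORDER")
        # Remove unreferenced files older than the retention window
        # (168 hours = the recommended 7-day minimum).
        stmts.append(f"VACUUM {table} RETAIN {retain_hours} HOURS")
    return stmts

# In a Fabric notebook, each statement would be run with spark.sql(stmt),
# and the notebook itself can be scheduled to orchestrate the batch.
for stmt in maintenance_statements(["sales", "orders"]):
    print(stmt)
```

Scheduling such a notebook covers the "grouping and orchestration" scenario the note describes without repeated manual runs from the user interface.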
Lakehouse table maintenance applies only to Delta Lake tables. Legacy Hive tables that use PARQUET, ORC, AVRO, CSV, or other formats aren't supported.
The table maintenance feature offers three operations:
Optimize: consolidates multiple small Parquet files into larger files with bin-compaction.
V-Order: applies the V-Order write optimization during compaction so Fabric engines read the data faster.
Vacuum: removes old files that are no longer referenced by the delta table log, subject to a retention threshold.
Important
Setting a shorter retention period impacts Delta's time travel capabilities. As a general best practice, set the retention interval to at least seven days, because old snapshots and uncommitted files can still be in use by concurrent table readers and writers. Cleaning up active files with the VACUUM command can lead to reader failures, or even table corruption, if uncommitted files are removed. Table maintenance runs from the user interface and the public APIs fail by default when the interval is less than seven days. To force a lower retention interval for the VACUUM command, set spark.databricks.delta.retentionDurationCheck.enabled to false in the workspace. Table maintenance jobs then pick up the configuration and allow the lower retention during job execution.
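As a sketch of those two steps (the table name is hypothetical, and a live `spark` session in a Fabric notebook is assumed), note that VACUUM expresses retention in hours, so a day count has to be converted:

```python
# Hypothetical helper: builds a VACUUM statement with a retention
# window given in days, converted to the HOURS unit VACUUM expects.
def vacuum_statement(table, retention_days):
    hours = retention_days * 24
    return f"VACUUM {table} RETAIN {hours} HOURS"

# In a Fabric notebook, forcing a 1-day retention would look like:
# spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
# spark.sql(vacuum_statement("my_lakehouse.dim_customer", 1))
print(vacuum_statement("my_lakehouse.dim_customer", 1))
```

Without the configuration change, the 24-hour VACUUM above would fail the default seven-day retention check.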
How to use the feature:
1. From your Microsoft Fabric account, navigate to the desired Lakehouse.
2. In the Lakehouse explorer's Tables section, either right-click the table or use the ellipsis to open the contextual menu.
3. Select the Maintenance menu entry.
4. Check the maintenance options in the dialog per your requirements. For more information, see the Table maintenance operations section of this article.
5. Select Run now to execute the table maintenance job.
6. Track the maintenance job's execution in the notifications pane or the Monitoring hub.
After Run now is selected, a Spark maintenance job is submitted for execution.
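The same job can also be triggered programmatically through the Lakehouse REST API mentioned earlier. The sketch below only constructs the request payload; the field names and the endpoint shape are assumptions based on the Fabric job scheduler API, and a real call would additionally need a workspace ID, a lakehouse ID, and a bearer token.

```python
import json

# Hypothetical payload shape for an on-demand table maintenance job
# (field names are assumptions; verify against the Fabric REST API reference).
def table_maintenance_payload(table, v_order=True, retention="7:00:00:00"):
    return {
        "executionData": {
            "tableName": table,
            "optimizeSettings": {"vOrder": v_order},
            "vacuumSettings": {"retentionPeriod": retention},
        }
    }

# This JSON would be POSTed to the lakehouse's job-instances endpoint, e.g.
# .../workspaces/{workspaceId}/items/{lakehouseId}/jobs/instances?jobType=TableMaintenance
print(json.dumps(table_maintenance_payload("dim_customer"), indent=2))
```

Polling the returned job instance then serves the same purpose as watching the notifications pane or the Monitoring hub.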
Training
Module
Work with Delta Lake tables in Microsoft Fabric - Training
Tables in a Microsoft Fabric lakehouse are based on the Delta Lake technology commonly used in Apache Spark. By using the enhanced capabilities of delta tables, you can create advanced analytics solutions.
Certification
Microsoft Certified: Fabric Data Engineer Associate - Certifications
As a Fabric Data Engineer, you should have subject matter expertise with data loading patterns, data architectures, and orchestration processes.