Real-Time Intelligence integrates with the lifecycle management capabilities in Microsoft Fabric, providing standardized collaboration between all development team members throughout the product's life. This functionality is delivered via git integration and deployment pipelines.
In this article, you learn about the configuration options available through Microsoft Fabric's lifecycle management for Real-Time Intelligence.
Real-Time Intelligence supports git integration for eventhouses, KQL databases, KQL querysets, and Real-Time dashboards. The git integration allows you to track changes to these items in a git-connected workspace. The integration provides a way to manage the lifecycle of these items, including versioning, branching, and merging.
All items include metadata, and eventhouses and KQL databases also contain data referenced by multiple objects in the workspace.
The following metadata elements are included within Real-Time Intelligence items:
From a development workflow perspective, the following dependent objects might reference an eventhouse or KQL database:
The git integration applies at the platform level for all items, and at the data level for eventhouses and KQL databases.
The following information is serialized and tracked in a git-connected workspace:

- Eventhouse
- KQL database
- KQL queryset
- Real-Time dashboard
Data-level integration is achieved by using a KQL script to create or modify database object schemas, properties, and policies. However, not all commands supported in a KQL script are compatible with Microsoft Fabric ALM.
KQL database
The following database objects are supported in the KQL script:
For information about supported commands, see the DatabaseSchema.kql file description under KQL database files.
Each eventhouse and KQL database item synced with git appears in its own folder, named using the format <ItemName>.<ItemType>, where <ItemName> is the name of the item and <ItemType> is the type of the item. For example, an eventhouse named Example that has a single KQL database named ExampleDB produces an Example.Eventhouse folder and an ExampleDB.KQLDatabase folder in the git repository.
The following files are contained in an eventhouse folder:
.platform
The file uses the following schema to define an eventhouse:
```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/gitIntegration/platformProperties/2.0.0/schema.json",
  "metadata": {
    "type": "Eventhouse",
    "displayName": "",
    "description": ""
  },
  "config": {
    "version": "2.0",
    "logicalId": ""
  }
}
```
EventhouseProperties.json
The file allows you to configure platform-level settings for the eventhouse item.
The following files are contained in a KQL database folder:
.platform
The file uses the following schema to define a KQL database:
```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/fabric/gitIntegration/platformProperties/2.0.0/schema.json",
  "metadata": {
    "type": "KQLDatabase",
    "displayName": "",
    "description": ""
  },
  "config": {
    "version": "2.0",
    "logicalId": ""
  }
}
```
DatabaseProperties.json
The file uses the following schema to configure platform-level settings for the KQL database item:
```json
{
  "databaseType": "ReadWrite",
  "parentEventhouseItemId": "",
  "oneLakeCachingPeriod": "P36500D",
  "oneLakeStandardStoragePeriod": "P36500D"
}
```
The following table describes the properties in the DatabaseProperties.json file:

| Property | Description |
|---|---|
| databaseType | Valid values: ReadWrite |
| parentEventhouseItemId | The logical ID of the parent eventhouse. This value shouldn't be modified. |
| oneLakeCachingPeriod | Database-level setting for the caching policy, expressed as an ISO 8601 duration (for example, P36500D is 36,500 days). |
| oneLakeStandardStoragePeriod | Database-level setting for the retention policy, expressed as an ISO 8601 duration. |
DatabaseSchema.kql
The file is a KQL script that configures the data-level settings for the KQL database. It's generated automatically when the KQL database is synced to git, and it runs when the git content is synced back to your Fabric workspace.
You can make changes to this script by adding or modifying the following supported commands:
| Database object | Supported commands |
|---|---|
| Table | Create or merge |
| Function | Create or alter |
| Table policy update | Alter |
| Column encoding policy | Alter |
| Materialized view | Create or alter |
| Table ingestion mapping | Create or alter |
The following example is a KQL script that creates a table and its ingestion mapping:
```kusto
// KQL script
// Use management commands in this script to configure your database items, such as tables, functions, materialized views, and more.
.create-merge table SampleTable (UsageDate:datetime, PublisherType:string, ChargeType:string, ServiceName:string, ServiceTier:string, Meter:string, PartNumber:string, CostUSD:real, Cost:real, Currency:string)
.create-or-alter table SampleTable ingestion csv mapping 'SampleTable_mapping' "[{'Properties':{'Ordinal':'0'},'column':'UsageDate','datatype':''},{'Properties':{'Ordinal':'1'},'column':'PublisherType','datatype':''}]"
```
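For illustration only, the other supported command forms look roughly like the following minimal sketch. SampleRawTable, SampleTransform, and SampleDailyCost are hypothetical names; the commands in a generated DatabaseSchema.kql reflect the objects actually defined in your database.

```kusto
// Function: create or alter (a hypothetical transformation used by the update policy below;
// assumes SampleRawTable exists with the same columns as SampleTable)
.create-or-alter function SampleTransform() {
    SampleRawTable
    | project UsageDate, PublisherType, ChargeType, ServiceName, ServiceTier, Meter, PartNumber, CostUSD, Cost, Currency
}

// Table policy update: alter (run SampleTransform() on new SampleRawTable rows and ingest the output into SampleTable)
.alter table SampleTable policy update @'[{"IsEnabled": true, "Source": "SampleRawTable", "Query": "SampleTransform()", "IsTransactional": false}]'

// Column encoding policy: alter
.alter column SampleTable.ServiceName policy encoding type='identifier'

// Materialized view: create or alter
.create-or-alter materialized-view SampleDailyCost on table SampleTable
{
    SampleTable
    | summarize TotalCostUSD = sum(CostUSD) by bin(UsageDate, 1d), ServiceName
}
```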
The KQL queryset item's definition file uses the following schema:
```json
{
  "queryset": {
    "version": "1.0.0",
    "tabs": [
      {
        "id": "",
        "title": "",
        "content": "",
        "dataSourceId": "Guid1"
      }
    ],
    "dataSources": [
      {
        "id": "",
        "clusterUri": "",
        "type": "AzureDataExplorer",
        "databaseName": ""
      },
      {
        "id": "Guid1",
        "clusterUri": "",
        "type": "Fabric",
        "databaseItemId": "",
        "databaseItemName": ""
      }
    ]
  }
}
```
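Each tab's dataSourceId points at an entry in dataSources. Assuming the content field carries the tab's query text, a minimal illustration (using the hypothetical SampleTable from the earlier script) could be:

```kusto
// Hypothetical query for a queryset tab; SampleTable is the table created in the
// DatabaseSchema.kql example above.
SampleTable
| where UsageDate > ago(30d)
| summarize TotalCostUSD = sum(CostUSD) by ServiceName
| top 10 by TotalCostUSD
```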
The Real-Time dashboard item's definition file uses the following schema:
```json
{
  "$schema": "",
  "id": "",
  "eTag": "\"\"",
  "schema_version": "",
  "title": "",
  "tiles": [
    {
      "id": "",
      "title": "",
      "visualType": "",
      "pageId": "",
      "layout": {
        "x": ,
        "y": ,
        "width": ,
        "height":
      },
      "queryRef": {
        "kind": "",
        "queryId": ""
      },
      "visualOptions": {
        "multipleYAxes": {
          "base": {
            "id": "",
            "label": "",
            "columns": [],
            "yAxisMaximumValue": ,
            "yAxisMinimumValue": ,
            "yAxisScale": "",
            "horizontalLines": []
          },
          "additional": [],
          "showMultiplePanels":
        },
        "hideLegend": ,
        "legendLocation": "",
        "xColumnTitle": "",
        "xColumn": ,
        "yColumns": ,
        "seriesColumns": ,
        "xAxisScale": "",
        "verticalLine": "",
        "crossFilterDisabled": ,
        "drillthroughDisabled": ,
        "crossFilter": [
          {
            "interaction": "",
            "property": "",
            "parameterId": "",
            "disabled":
          }
        ],
        "drillthrough": [],
        "selectedDataOnLoad": {
          "all": ,
          "limit":
        },
        "dataPointsTooltip": {
          "all": ,
          "limit":
        }
      }
    }
  ],
  "baseQueries": [],
  "parameters": [
    {
      "kind": "",
      "id": "",
      "displayName": "",
      "description": "",
      "variableName": "",
      "selectionType": "",
      "includeAllOption": ,
      "defaultValue": {
        "kind": ""
      },
      "dataSource": {
        "kind": "",
        "columns": {
          "value": ""
        },
        "queryRef": {
          "kind": "",
          "queryId": ""
        }
      },
      "showOnPages": {
        "kind": ""
      },
      "allIsNull":
    },
  ],
  "dataSources": [
    {
      "id": "",
      "name": "",
      "clusterUri": "",
      "database": "",
      "kind": "",
      "scopeId": ""
    }
  ],
  "pages": [
    {
      "name": "",
      "id": ""
    }
  ],
  "queries": [
    {
      "dataSource": {
        "kind": "",
        "dataSourceId": ""
      },
      "text": "",
      "id": "",
      "usedVariables": [
        "",
        ""
      ]
    }
  ]
}
```
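The text field of each entry in queries holds the KQL for that query, and usedVariables lists the variableName values of any dashboard parameters the query references. As a minimal sketch, assuming the hypothetical SampleTable from the earlier script and a parameter whose variableName is _selectedServices, such a query might look like:

```kusto
// Hypothetical dashboard query body (the value stored in queries[].text).
// _selectedServices is assumed to be the variableName of a multi-select
// dashboard parameter, so it would also appear under usedVariables.
SampleTable
| where ServiceName in (_selectedServices)
| summarize TotalCostUSD = sum(CostUSD) by bin(UsageDate, 1d), ServiceName
```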