This Microsoft Fabric workload development sample repository is a starting point for building applications that integrate with various services and with a lakehouse architecture. This article helps you set up the environment and configure the necessary components to get started, and it outlines the key components and their roles in the architecture.
The frontend is where you manage the user experience (UX) and behavior. It communicates with the Fabric frontend portal via an iFrame to facilitate seamless interaction.
For more information, see Microsoft Fabric Workload Development Kit frontend.
The backend stores both data and metadata. It uses Create, Read, Update, and Delete (CRUD) operations to create workload items and metadata, and it executes jobs to populate data in storage. Communication between the frontend and backend is established through public APIs.
Azure Relay enables communication between the local development environment and the Fabric backend in developer mode. In developer mode, the workload operates on the developer's machine.
The DevGateway utility has two roles: it registers the local workload instance with Fabric, and it channels workload API calls from Fabric to the workload over the Azure Relay channel.
Workload Control API calls are made directly from the workload to Fabric. The Azure Relay channel isn't required for the calls.
The workload development kit architecture integrates seamlessly with a lakehouse architecture for operations like saving, reading, and fetching data. The interaction is facilitated through Azure Relay and the Fabric SDK to help ensure secure and authenticated communication. For more information, see working with customer data.
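To make the lakehouse interaction concrete, the following sketch shows one way a .NET workload could write a file into a lakehouse's Files area through the OneLake DFS endpoint by using the Azure.Storage.Files.DataLake package that the sample project references. It's an illustration only: the placeholder IDs, the DefaultAzureCredential flow, and the path layout are assumptions, and the Boilerplate sample wraps this kind of access in its own services and token handling.

using System;
using Azure.Identity;
using Azure.Storage.Files.DataLake;

// Sketch only: IDs and the credential flow are placeholders, not values from the sample.
var serviceClient = new DataLakeServiceClient(
    new Uri("https://onelake.dfs.fabric.microsoft.com"),
    new DefaultAzureCredential());

// In OneLake, the workspace acts as the file system, and the item (lakehouse) ID starts the path.
var fileSystem = serviceClient.GetFileSystemClient("<workspace-id>");
var file = fileSystem.GetFileClient("<lakehouse-id>/Files/data.json");

// Upload a small JSON document, overwriting any existing file.
await file.UploadAsync(new BinaryData("{ \"hello\": \"world\" }").ToStream(), overwrite: true);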
Microsoft Entra ID is used for secure authentication, ensuring that all interactions within the architecture are authorized and secure.
The development kit overview provides a glimpse into our architecture. For more information about how projects are configured, for authentication guidelines, and to get started, see the related articles in this documentation.
The frontend establishes communication with the Fabric frontend portal via an iFrame. The portal in turn interacts with the Fabric backend by making calls to its exposed public APIs.
For interactions between the backend development box and the Fabric backend, the Azure Relay serves as a conduit. Additionally, the backend development box seamlessly integrates with Lakehouse. The communication is facilitated by using Azure Relay and the Fabric Software Development Kit (SDK) installed on the backend development box.
The authentication for all communication within these components is ensured through Microsoft Entra. Microsoft Entra provides a secure and authenticated environment for the interactions between the frontend, backend, Azure Relay, Fabric SDK, and Lakehouse.
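The sample's authentication code isn't shown in this overview, but as a rough sketch of the Microsoft Entra client-credentials flow that the referenced Microsoft.Identity.Client package supports, app-only token acquisition looks roughly like the following. The IDs, secret, and scope are placeholders, and the sample's own services handle token acquisition and validation differently.

using System;
using Microsoft.Identity.Client;

// Sketch only: all values are placeholders; the Boilerplate sample reads its settings from appsettings.json.
var app = ConfidentialClientApplicationBuilder
    .Create("<AppId>")                                        // client ID of the workload app registration
    .WithClientSecret("<client-secret-value>")                // the secret value, not the secret ID
    .WithAuthority("https://login.microsoftonline.com/<tenant-id>")
    .Build();

// Acquire an app-only token for a downstream resource.
AuthenticationResult result = await app
    .AcquireTokenForClient(new[] { "<ResourceId>/.default" })
    .ExecuteAsync();

Console.WriteLine($"Token expires on: {result.ExpiresOn}");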
Ensure that the NuGet Package Manager is integrated into your Visual Studio installation. This tool is required for streamlined management of external libraries and packages essential for our project.
<NuspecFile>Packages\manifest\ManifestPackageDebug.nuspec</NuspecFile> and <NuspecFile>Packages\manifest\ManifestPackageRelease.nuspec</NuspecFile>: These properties specify the path to the NuSpec files used for creating the NuGet package for Debug and Release modes. The NuSpec file contains metadata about the package, such as its ID, version, dependencies, and other relevant information.

<GeneratePackageOnBuild>true</GeneratePackageOnBuild>: When set to true, this property instructs the build process to automatically generate a NuGet package during each build. This property is useful to ensure that the package is always up-to-date with the latest changes in the project.

<IsPackable>true</IsPackable>: When set to true, this property indicates that the project can be packaged into a NuGet package. Being packable is an essential property for projects that are intended to produce NuGet packages during the build process.
The generated NuGet package for debug mode is located in the src\bin\Debug directory after the build process.
When you work in cloud mode, you can change the Visual Studio build configuration to Release and build your package. The generated package is located in the src\bin\Release directory. For more information, see the working in cloud mode guide.
The backend Boilerplate sample depends on several Azure SDK packages, which appear as PackageReference entries in the project file shown later in this section.
To configure NuGet Package Manager, specify the path in the Package Sources section before you begin the build process.
<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>net7.0</TargetFramework>
    <AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>
    <BuildDependsOn>PreBuild</BuildDependsOn>
    <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
    <IsPackable>true</IsPackable>
  </PropertyGroup>

  <PropertyGroup Condition="'$(Configuration)' == 'Release'">
    <NuspecFile>Packages\manifest\ManifestPackageRelease.nuspec</NuspecFile>
  </PropertyGroup>

  <PropertyGroup Condition="'$(Configuration)' == 'Debug'">
    <NuspecFile>Packages\manifest\ManifestPackageDebug.nuspec</NuspecFile>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Azure.Core" Version="1.38.0" />
    <PackageReference Include="Azure.Identity" Version="1.11.0" />
    <PackageReference Include="Azure.Storage.Files.DataLake" Version="12.14.0" />
    <PackageReference Include="Microsoft.AspNet.WebApi.Client" Version="5.2.9" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc.NewtonsoftJson" Version="7.0.5" />
    <PackageReference Include="Microsoft.Extensions.Logging.Debug" Version="7.0.0" />
    <PackageReference Include="Microsoft.Identity.Client" Version="4.60.3" />
    <PackageReference Include="Microsoft.IdentityModel.Protocols" Version="6.30.1" />
    <PackageReference Include="Microsoft.IdentityModel.Protocols.OpenIdConnect" Version="6.30.1" />
    <PackageReference Include="Microsoft.IdentityModel.Tokens" Version="6.30.1" />
    <PackageReference Include="Swashbuckle.AspNetCore" Version="6.5.0" />
  </ItemGroup>

  <ItemGroup>
    <Folder Include="Properties\ServiceDependencies\" />
  </ItemGroup>

  <Target Name="PreBuild" BeforeTargets="PreBuildEvent">
    <Exec Command="powershell.exe -ExecutionPolicy Bypass -File ValidationScripts\RemoveErrorFile.ps1 -outputDirectory ValidationScripts\" />
    <Exec Command="powershell.exe -ExecutionPolicy Bypass -File ValidationScripts\ManifestValidator.ps1 -inputDirectory .\Packages\manifest\ -inputXml WorkloadManifest.xml -inputXsd WorkloadDefinition.xsd -outputDirectory ValidationScripts\" />
    <Exec Command="powershell.exe -ExecutionPolicy Bypass -File ValidationScripts\ItemManifestValidator.ps1 -inputDirectory .\Packages\manifest\ -inputXsd ItemDefinition.xsd -outputDirectory ValidationScripts\" />
    <Exec Command="powershell.exe -ExecutionPolicy Bypass -File ValidationScripts\ValidateNoDefaults.ps1 -outputDirectory ValidationScripts\" />
    <Error Condition="Exists('ValidationScripts\ValidationErrors.txt')" Text="Validation errors with either manifests or default values" File="ValidationScripts\ValidationErrors.txt" />
  </Target>

</Project>
To set up the workload sample project on your local machine:
Clone the repository: Run git clone https://github.com/microsoft/Microsoft-Fabric-workload-development-sample.git.
In Visual Studio 2022, open the solution.
Set up an app registration by following instructions in the authentication tutorial. Ensure that both your frontend and backend projects have the necessary setup that's described in the article. Microsoft Entra is used for secure authentication to help ensure that all interactions within the architecture are authorized and secure.
Update the Microsoft OneLake DFS base URL. Depending on your Fabric environment, you might need to update the value for OneLakeDFSBaseURL in the src\Constants folder. The default is onelake.dfs.fabric.microsoft.com, but you can update the URL to reflect your environment. For more information about DFS paths, see the OneLake documentation.
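For illustration, the constant might look like the following sketch. The member name EnvironmentConstants.OneLakeDFSBaseUrl comes from the troubleshooting section later in this article, but the namespace and exact file layout here are assumptions rather than the sample's exact code.

namespace Fabric_Extension_BE_Boilerplate.Constants
{
    // Sketch only: the sample's actual Constants.cs may be organized differently.
    public static class EnvironmentConstants
    {
        // Default OneLake DFS endpoint; change this value to match your Fabric environment (for example, a PPE endpoint).
        public const string OneLakeDFSBaseUrl = "onelake.dfs.fabric.microsoft.com";
    }
}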
Set up the workload configuration.
<WorkspaceID>: The workspace ID. You can find this value in the workspace URL: https://app.powerbi.com/groups/<WorkspaceID>/.
<AppId>: The client ID (Application ID) of the workload Microsoft Entra application.
<RedirectUri>: The redirect URIs. You can find this value in the app registration that you created, under Authentication.
<ResourceId>: The audience for the incoming Microsoft Entra tokens. You can find this information in the app registration that you created, under Expose an API.
Generate a manifest package.
To generate a manifest package file, build Fabric_Extension_BE_Boilerplate. The build is a three-step process that generates the manifest package file: the PreBuild target runs the manifest validation scripts, the build fails and reports ValidationErrors.txt if the manifests or default values are invalid, and the validated manifests are packed into the manifest package.
Copy the ManifestPackage.1.0.0.nupkg file to the path that's defined in the workload-dev-mode.json configuration file.
Program.cs is the entry point and startup script for your application. In this file, you can configure various services, initialize the application, and start the web host.
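As a minimal sketch only (the sample's actual Program.cs registers many more services, and the commented-out registration below is hypothetical), an ASP.NET Core startup of this shape looks like the following:

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Register MVC controllers (for example, the generated ItemLifecycleController and JobsController)
// and use Newtonsoft.Json for serialization, matching the Microsoft.AspNetCore.Mvc.NewtonsoftJson package reference.
builder.Services.AddControllers().AddNewtonsoftJson();

// builder.Services.AddSingleton<IItemFactory, ItemFactory>();   // hypothetical service registration

var app = builder.Build();

// Map attribute-routed controllers and start the web host.
app.MapControllers();
app.Run();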
Build to ensure your project can access the required dependencies for compilation and execution.
Download the DevGateway from Microsoft's Download Center.
Run the Microsoft.Fabric.Workload.DevGateway.exe application and sign in with a user that has workspace admin privileges for the workspace specified in the WorkspaceGuid
field of workload-dev-mode.json.
After authentication, external workloads establish communication with the Fabric backend through Azure Relay. This process involves relay registration and communication management that's facilitated by a designated proxy node. The package that contains the workload manifest is uploaded and published.
At this stage, Fabric detects the workload and incorporates its allocated capacity.
You can monitor for potential errors in the console.
If no errors are shown, the connection is established, registration is successfully executed, and the workload manifest is systematically uploaded.
In Visual Studio, change your startup project to the Boilerplate project and select Run.
We use the workload Boilerplate C# ASP.NET Core sample to demonstrate how to build a workload by using REST APIs. The sample starts with generating server stubs and contract classes based on the Workload API Swagger specification. You can generate the code by using any of several Swagger code-generation tools. The Boilerplate sample uses NSwag. The sample contains the GenerateServerStub.cmd command-line script, which wraps the NSwag code generator. The script takes a single parameter, which is the full path to the NSwag installation directory. It also checks for the Swagger definition file (swagger.json) and the configuration file (nswag.json) in the folder.
Executing this script produces a C# file named WorkloadAPI_Generated.cs. The contents of this file can be logically divided into three parts as explained in the next sections.
The ItemLifecycleController and JobsController classes are thin implementations of ASP.NET Core controllers for two subsets of the Workload API: item lifecycle management and jobs. These classes plug into the ASP.NET Core HTTP pipeline. They serve as the entry points for the API methods that are defined in the Swagger specification. The classes forward the calls to the "real" implementation that's provided by the workload.
Here's an example of the CreateItem method:
/// <summary>
/// Called by Microsoft Fabric for creating a new item.
/// </summary>
/// <remarks>
/// Upon item creation Fabric performs some basic validations, creates the item with 'provisioning' state and calls this API to notify the workload. The workload is expected to perform required validations, store the item metadata, allocate required resources, and update the Fabric item metadata cache with item relations and ETag. To learn more see [Microsoft Fabric item update flow](https://updateflow).
/// <br/>
/// <br/>This API should accept [SubjectAndApp authentication](https://subjectandappauthentication).
/// <br/>
/// <br/>##Permissions
/// <br/>Permissions are checked by Microsoft Fabric.
/// </remarks>
/// <param name="workspaceId">The workspace ID.</param>
/// <param name="itemType">The item type.</param>
/// <param name="itemId">The item ID.</param>
/// <param name="createItemRequest">The item creation request.</param>
/// <returns>Successfully created.</returns>
[Microsoft.AspNetCore.Mvc.HttpPost, Microsoft.AspNetCore.Mvc.Route("workspaces/{workspaceId}/items/{itemType}/{itemId}")]
public System.Threading.Tasks.Task CreateItem(System.Guid workspaceId, string itemType, System.Guid itemId, [Microsoft.AspNetCore.Mvc.FromBody] CreateItemRequest createItemRequest)
{
    return _implementation.CreateItemAsync(workspaceId, itemType, itemId, createItemRequest);
}
IItemLifecycleController and IJobsController are interfaces for the previously mentioned "real" implementations. They define the same methods, which the controllers implement.
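For orientation, the generated item-lifecycle interface roughly has the following shape. Only the CreateItemAsync signature is taken from the controller code shown earlier; the exact namespace and any additional members are assumptions or omitted.

namespace Fabric_Extension_BE_Boilerplate.Contracts.FabricAPI.Workload
{
    // Sketch of the generated interface: the real WorkloadAPI_Generated.cs is produced by NSwag
    // and contains more members than shown here.
    public interface IItemLifecycleController
    {
        // The controller's CreateItem method forwards to this implementation method.
        System.Threading.Tasks.Task CreateItemAsync(
            System.Guid workspaceId,
            string itemType,
            System.Guid itemId,
            CreateItemRequest createItemRequest);
    }
}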
C# contract classes are classes that the APIs use.
The next step after generating code is implementing the IItemLifecycleController and IJobsController interfaces. In the Boilerplate sample, ItemLifecycleControllerImpl and JobsControllerImpl implement these interfaces.
For example, this code is the implementation of the CreateItem API:
/// <inheritdoc/>
public async Task CreateItemAsync(Guid workspaceId, string itemType, Guid itemId, CreateItemRequest createItemRequest)
{
    var authorizationContext = await _authenticationService.AuthenticateControlPlaneCall(_httpContextAccessor.HttpContext);
    var item = _itemFactory.CreateItem(itemType, authorizationContext);
    await item.Create(workspaceId, itemId, createItemRequest);
}
Several API methods accept various types of "payload" as part of the request body, or they return payloads as part of the response. For example, CreateItemRequest has the creationPayload property.
"CreateItemRequest": {
"description": "Create item request content.",
"type": "object",
"additionalProperties": false,
"required": [ "displayName" ],
"properties": {
"displayName": {
"description": "The item display name.",
"type": "string",
"readOnly": false
},
"description": {
"description": "The item description.",
"type": "string",
"readOnly": false
},
"creationPayload": {
"description": "Creation payload specific to the workload and item type, passed by the item editor or as Fabric Automation API parameter.",
"$ref": "#/definitions/CreateItemPayload",
"readOnly": false
}
}
}
The types for these payload properties are defined in the Swagger specification. There's a dedicated type for every kind of payload. These types don't define any specific properties, and they allow any property to be included.
Here's an example of the CreateItemPayload type:
"CreateItemPayload": {
"description": "Creation payload specific to the workload and item type.",
"type": "object",
"additionalProperties": true
}
The generated C# contract classes are defined as partial. They hold the payload properties in a dictionary.
Here's an example:
/// <summary>
/// Creation payload specific to the workload and item type.
/// </summary>
[System.CodeDom.Compiler.GeneratedCode("NJsonSchema", "13.20.0.0 (NJsonSchema v10.9.0.0 (Newtonsoft.Json v13.0.0.0))")]
public partial class CreateItemPayload
{
    private System.Collections.Generic.IDictionary<string, object> _additionalProperties;

    [Newtonsoft.Json.JsonExtensionData]
    public System.Collections.Generic.IDictionary<string, object> AdditionalProperties
    {
        get { return _additionalProperties ?? (_additionalProperties = new System.Collections.Generic.Dictionary<string, object>()); }
        set { _additionalProperties = value; }
    }
}
The code can use this dictionary to read and return properties. However, a better approach is to define specific properties by using corresponding types and names. You can use the partial declaration on the generated classes to efficiently define properties.
For example, the CreateItemPayload.cs file contains a complementary definition for the CreateItemPayload class. In this example, the definition adds the Item1Metadata property:
namespace Fabric_Extension_BE_Boilerplate.Contracts.FabricAPI.Workload
{
    /// <summary>
    /// Extend the generated class by adding item-type-specific fields.
    /// In this sample every type will have a dedicated property. Alternatively, polymorphic serialization could be used.
    /// </summary>
    public partial class CreateItemPayload
    {
        [Newtonsoft.Json.JsonProperty("item1Metadata", Required = Newtonsoft.Json.Required.Default, NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
        public Item1Metadata Item1Metadata { get; init; }
    }
}
However, if the workload supports multiple item types, the CreateItemPayload class must be able to handle different types of creation payload, one per item type. You have two options. The simpler way, used in the Boilerplate sample, is to define multiple optional properties, each representing the creation payload for a different item type. Every request then has just one of these sets of properties, according to the item type being created. Alternatively, you can implement polymorphic serialization, but this option isn't demonstrated in the sample because it doesn't provide any significant benefits.
For example, to support two item types, the class definition must be extended as in the following example:
namespace Fabric_Extension_BE_Boilerplate.Contracts.FabricAPI.Workload
{
    public partial class CreateItemPayload
    {
        [Newtonsoft.Json.JsonProperty("item1Metadata", Required = Newtonsoft.Json.Required.Default, NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
        public Item1Metadata Item1Metadata { get; init; }

        [Newtonsoft.Json.JsonProperty("item2Metadata", Required = Newtonsoft.Json.Required.Default, NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
        public Item2Metadata Item2Metadata { get; init; }
    }
}
Note
The payload that's sent to the workload is generated by the client. It can be the item editor iFrame or the Fabric Automation REST API. The client is responsible for sending the correct payload and matching the item type. The workload is responsible for verification. Fabric treats this payload as an opaque object and only transfers it from the client to the workload. Similarly, for a payload that's returned by the workload to the client, it is the responsibility of the workload and the client to handle the payload correctly.
For example, this code shows how the Boilerplate sample item1 implementation handles the payload:
protected override void SetDefinition(CreateItemPayload payload)
{
    if (payload == null)
    {
        Logger.LogInformation("No payload is provided for {0}, objectId={1}", ItemType, ItemObjectId);
        _metadata = Item1Metadata.Default.Clone();
        return;
    }

    if (payload.Item1Metadata == null)
    {
        throw new InvalidItemPayloadException(ItemType, ItemObjectId);
    }

    if (payload.Item1Metadata.Lakehouse == null)
    {
        throw new InvalidItemPayloadException(ItemType, ItemObjectId)
            .WithDetail(ErrorCodes.ItemPayload.MissingLakehouseReference, "Missing Lakehouse reference");
    }

    _metadata = payload.Item1Metadata.Clone();
}
The next sections describe how to troubleshoot and debug your deployment.
Get information about known issues and ways to resolve them.
Error:
Microsoft.Identity.Client.MsalServiceException: A configuration issue is preventing authentication. Check the error message from the server for details. You can modify the configuration in the application registration portal. See https://aka.ms/msal-net-invalid-client for details.
Original exception: AADSTS7000215: An invalid client secret was provided. Ensure that the secret that is sent in the request is the client secret value and not the client secret ID for a secret added to the app app_guid setting.
Resolution: Make sure that you have the correct client secret defined in appsettings.json.
Error:
Microsoft.Identity.Client.MsalUiRequiredException: AADSTS65001: The user or administrator didn't consent to use the application with ID <example ID>. Send an interactive authorization request for this user and resource.
Resolution:
In the item editor, go to the bottom of the pane and select Navigate to Authentication Page.
Under Scopes, enter .default, and then select Get Access token.
In the dialog, approve the revision.
Error:
PriorityPlacement: No core services are available for priority placement. Only name, guid, and workload-name are available.
Resolution:
As a user, you might have access only to Trial capacity. Make sure that you use a capacity you have access to.
Error:
Creating a new file failed for filePath: 'workspace-id'/'lakehouse-id'/Files/data.json. The response status code doesn't indicate success: 404 (NotFound).
Resolution:
Make sure that you're working with the OneLake DFS URL that fits your environment. For example, if you work with a PPE environment, change EnvironmentConstants.OneLakeDFSBaseUrl
in Constants.cs to the appropriate URL.
When you troubleshoot various operations, you can set breakpoints in the code to analyze and debug the behavior. Set breakpoints in the relevant code (for example, in OnCreateFabricItemAsync for CRUD operations, or in a controller endpoint for execute operations), and then run the workload. The debugger pauses execution at the specified breakpoints so that you can examine variables, step through the code, and identify issues.
If you're connecting a backend to the sample workload project, your item must belong to a workspace that's associated with a capacity. By default, the My workspace workspace isn't associated with a capacity, so using it produces an error.
Switch to a named workspace instead of the default My workspace.
From the correct workspace, load the sample workload and proceed with your tests.
We welcome contributions to this project. If you find any issues or want to add new features, follow the contribution steps described in the repository.