Automating Table Activation and Deployment for Synapse Link in Dynamics 365 Dataverse

Sriram M 40 Reputation points
2024-08-02T15:17:52.7333333+00:00

I am using Synapse Link for Dataverse to bring data from Dynamics 365 into Azure Data Lake Storage (ADLS) Gen2, and I currently have to activate each table manually before its data is loaded. I am looking for a way to automate this activation process and to deploy the same set of activated tables across different environments (development, test, pre-prod, prod) without manual intervention.

Details:

Automating Table Activation in Synapse Link for Dataverse:

  • Currently, I need to navigate through a list of tables in the Synapse Link for Dataverse interface and manually activate each one to enable data loading into ADLS Gen2.
  • This process is time-consuming and prone to human error, especially when dealing with a large number of tables.
  • Question: Is there a way to automate the activation of multiple tables in Synapse Link for Dataverse?

Automating Deployment Across Environments:

  • After setting up and activating the necessary tables in the development environment, I need to repeat the same activation process in the test, pre-production, and production environments.
  • Manually activating tables in each environment is not efficient and can lead to inconsistencies.
  • Question: Can the table activation settings from the development environment be exported and deployed to other environments (test, pre-prod, prod) in an automated manner? If yes, how can this be achieved?

Desired Outcome:

I am looking for a streamlined, automated approach to:

  • Activate multiple tables in Synapse Link for Dataverse without manual effort.
  • Deploy the activated table configurations across different environments seamlessly.

Any guidance, best practices, or references to relevant documentation/scripts would be greatly appreciated.

Thank you for your support!


1 answer

  1. Amira Bedhiafi 24,711 Reputation points
    2024-08-05T16:25:31.5033333+00:00

    To automate the activation of tables in Synapse Link for Dataverse and deploy these configurations across different environments, you can use a combination of Power Automate, PowerShell scripts, and the Dataverse API.

    Automating Table Activation

    1. Power Automate Flow: Create a Power Automate flow to automate the table activation process.
      • Step 1: Use the HTTP connector in Power Automate to call the Dataverse API.
      • Step 2: Fetch the list of tables using the GET /api/data/v9.0/entities endpoint.
      • Step 3: Iterate through the list of tables and activate each one by updating the respective table settings.
    2. PowerShell Script: Alternatively, you can use PowerShell to automate this process; a filtered variant that activates only a selected list of tables is sketched after the script below.
      
         # Connection details for the Dataverse environment and the app registration
         $ConnectionUri = "https://<your-organization>.crm.dynamics.com"
         $clientId = "<your-client-id>"
         $clientSecret = "<your-client-secret>"
         $tenantId = "<your-tenant-id>"

         # Get an access token via the OAuth 2.0 client credentials flow
         $body = @{
             client_id     = $clientId
             scope         = "$ConnectionUri/.default"
             client_secret = $clientSecret
             grant_type    = "client_credentials"
         }
         $response = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -ContentType "application/x-www-form-urlencoded" -Body $body
         $accessToken = $response.access_token

         # Get the list of tables from the environment
         $tablesUri = "$ConnectionUri/api/data/v9.0/entities"
         $headers = @{
             Authorization = "Bearer $accessToken"
         }
         $tablesResponse = Invoke-RestMethod -Method Get -Uri $tablesUri -Headers $headers

         # Activate each table by updating its settings
         foreach ($table in $tablesResponse.value) {
             $tableName = $table.LogicalName
             $activateTableUri = "$ConnectionUri/api/data/v9.0/$tableName"
             $body = @{
                 "isManaged" = $true
             } | ConvertTo-Json
             Invoke-RestMethod -Method Patch -Uri $activateTableUri -Headers $headers -Body $body -ContentType "application/json"
         }
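
    If you only need a specific subset of tables rather than every table returned by the metadata call, the same loop can be filtered against an allow-list. The following is a minimal sketch that reuses the $tablesResponse, $headers, and $ConnectionUri variables from the script above; the table names shown are placeholder examples to replace with your own:

         # Hypothetical allow-list of table logical names to activate; replace with your own tables
         $tablesToActivate = @("account", "contact", "opportunity")

         foreach ($table in $tablesResponse.value) {
             $tableName = $table.LogicalName

             # Skip any table that is not in the allow-list
             if ($tablesToActivate -notcontains $tableName) { continue }

             # Same activation call as in the full script above
             $activateTableUri = "$ConnectionUri/api/data/v9.0/$tableName"
             $body = @{ "isManaged" = $true } | ConvertTo-Json
             Invoke-RestMethod -Method Patch -Uri $activateTableUri -Headers $headers -Body $body -ContentType "application/json"
         }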
      
      

    Deploying Table Configurations Across Environments

    1. Export Configuration from Development Environment: Use the Configuration Migration tool from the Power Platform to export the table activation settings from the development environment.
      • Step 1: Download and install the Configuration Migration tool.
      • Step 2: Use the tool to create a schema file and export the data.
    2. Import Configuration to Other Environments: Use the Configuration Migration tool to import the exported configuration into the test, pre-production, and production environments.
      • Step 1: Connect to the target environment using the Configuration Migration tool.
      • Step 2: Import the previously exported schema and data.
    3. Automate the Deployment: Create a Power Automate flow or a PowerShell script to automate the import process in each environment; a parameterized version of the snippet below is sketched after it so the same script can be reused per environment.
      
         # Import configuration to the target environment
         $targetEnvironmentUri = "https://<target-organization>.crm.dynamics.com"

         # Update the connection details for the target environment: the access token must be
         # requested against this environment's URL so the Authorization header is valid here
         $headers = @{
             Authorization = "Bearer $accessToken"
         }

         # Import the exported schema and data files
         $importUri = "$targetEnvironmentUri/api/data/v9.0/Import"
         $importBody = @{
             "schemaFile" = "path-to-exported-schema-file"
             "dataFile"   = "path-to-exported-data-file"
         } | ConvertTo-Json
         Invoke-RestMethod -Method Post -Uri $importUri -Headers $headers -Body $importBody -ContentType "application/json"
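
    To run the same import against the test, pre-prod, and prod environments without editing the script each time, the connection details and file paths can be passed in as parameters, for example from a release pipeline step. The following is only a sketch under that assumption: the parameter names are placeholders of my own, and it reuses the client-credentials token request and the import call shown above.

         param(
             # All parameter names are hypothetical placeholders; supply one set per target environment
             [Parameter(Mandatory)] [string] $TargetEnvironmentUri,   # e.g. https://<target-organization>.crm.dynamics.com
             [Parameter(Mandatory)] [string] $ClientId,
             [Parameter(Mandatory)] [string] $ClientSecret,
             [Parameter(Mandatory)] [string] $TenantId,
             [Parameter(Mandatory)] [string] $SchemaFilePath,
             [Parameter(Mandatory)] [string] $DataFilePath
         )

         # Acquire an access token for the target environment (client credentials flow, as above)
         $tokenBody = @{
             client_id     = $ClientId
             scope         = "$TargetEnvironmentUri/.default"
             client_secret = $ClientSecret
             grant_type    = "client_credentials"
         }
         $tokenResponse = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" -ContentType "application/x-www-form-urlencoded" -Body $tokenBody
         $headers = @{ Authorization = "Bearer $($tokenResponse.access_token)" }

         # Post the exported schema and data file paths to the same import endpoint used above
         $importBody = @{
             "schemaFile" = $SchemaFilePath
             "dataFile"   = $DataFilePath
         } | ConvertTo-Json
         Invoke-RestMethod -Method Post -Uri "$TargetEnvironmentUri/api/data/v9.0/Import" -Headers $headers -Body $importBody -ContentType "application/json"

    Saved as a standalone script (for example Import-Config.ps1; the name is arbitrary), it can be called once per environment with that environment's values, so the exported schema and data files stay the single source of truth while per-environment details live in pipeline variables or a secret store.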
      
      

