Transform data in data.world (Preview) using Azure Data Factory or Synapse Analytics

APPLIES TO: Azure Data Factory Azure Synapse Analytics

This article outlines how to use Data Flow to transform data in data.world (Preview). To learn more, read the introductory article for Azure Data Factory or Azure Synapse Analytics.


This connector is currently in preview. You can try it out and give us feedback. If you want to take a dependency on preview connectors in your solution, please contact Azure support.

Supported capabilities

This connector is supported for the following capabilities:

Supported capabilities | IR
Mapping data flow (source/-) | ①

① Azure integration runtime ② Self-hosted integration runtime

For a list of data stores that are supported as sources/sinks, see the Supported data stores table.

Create a linked service using UI

Use the following steps to create a linked service in the Azure portal UI.

  1. Browse to the Manage tab in your Azure Data Factory or Synapse workspace and select Linked Services, then select New:

  2. Search for data.world (Preview) and select the data.world (Preview) connector.

    Screenshot showing selecting the data.world (Preview) connector.

  3. Configure the service details, test the connection, and create the new linked service.

    Screenshot of configuration for linked service.

Connector configuration details

The following sections provide information about properties that are used to define Data Factory and Synapse pipeline entities specific to data.world.

Linked service properties

The following properties are supported for the linked service:

Property | Description | Required
type | The type property must be set to Dataworld. | Yes
apiToken | Specify an API token for data.world. Mark this field as SecureString to store it securely. Or, you can reference a secret stored in Azure Key Vault. | Yes


{
    "name": "DataworldLinkedService",
    "properties": {
        "type": "Dataworld",
        "typeProperties": {
            "apiToken": {
                "type": "SecureString",
                "value": "<API token>"
            }
        }
    }
}
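As a sanity check, the linked service definition above can be assembled programmatically before deployment; a minimal Python sketch (the helper name is illustrative, not part of any SDK):

```python
import json


def dataworld_linked_service(name: str, api_token: str) -> dict:
    """Build a Dataworld linked service payload matching the example above.

    `type` must be "Dataworld", and the API token is wrapped as a
    SecureString per the properties table.
    """
    return {
        "name": name,
        "properties": {
            "type": "Dataworld",
            "typeProperties": {
                "apiToken": {
                    "type": "SecureString",
                    "value": api_token,
                },
            },
        },
    }


payload = dataworld_linked_service("DataworldLinkedService", "<API token>")
print(json.dumps(payload, indent=4))
```

In a real deployment you would pass the token from a secure store (for example, an Azure Key Vault reference) rather than a literal string.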

Mapping data flow properties

When transforming data in mapping data flow, you can read tables from data.world. For more information, see the source transformation in mapping data flows. You can only use an inline dataset as source type.

Source transformation

The table below lists the properties supported by a data.world source. You can edit these properties in the Source options tab.

Name | Description | Required | Allowed values | Data flow script property
Dataset name | The ID of the dataset in data.world. | Yes | String | datasetId
Table name | The ID of the table within the dataset in data.world. | No (if query is specified) | String | tableId
Query | Enter a SQL query to fetch data from data.world. An example is select * from MyTable. | No (if tableId is specified) | String | query
Owner | The owner of the dataset in data.world. | Yes | String | owner
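The required/optional rules in the table can be captured in a small validation helper; this is an illustrative Python sketch (the function name and shape are assumptions, not a product API):

```python
def source_options(owner, dataset_id, table_id=None, query=None):
    """Assemble data.world source options per the table above.

    owner and datasetId are always required; at least one of tableId
    or query must be provided (each is optional only when the other
    is specified).
    """
    if not owner or not dataset_id:
        raise ValueError("owner and datasetId are required")
    if table_id is None and query is None:
        raise ValueError("specify a tableId or a query")
    opts = {"owner": owner, "datasetId": dataset_id}
    if table_id is not None:
        opts["tableId"] = table_id
    if query is not None:
        opts["query"] = query
    return opts
```

For example, `source_options("owner1", "dataset1", table_id="MyTable")` yields the option set used in the script example below, while omitting both tableId and query raises an error.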

When you use data.world as the source type, the associated data flow script is:

source(allowSchemaDrift: true,
	validateSchema: false,
	store: 'dataworld',
	format: 'rest',
	owner: 'owner1',
	datasetId: 'dataset1',
	tableId: 'MyTable') ~> DataworldSource
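The script above is plain text, so it can also be rendered from the source options with simple templating; a minimal Python sketch (helper name illustrative):

```python
def dataworld_source_script(name, owner, dataset_id, table_id):
    """Render a data.world source transformation script like the
    example above. String templating only; values are not validated."""
    return (
        "source(allowSchemaDrift: true,\n"
        "\tvalidateSchema: false,\n"
        "\tstore: 'dataworld',\n"
        "\tformat: 'rest',\n"
        f"\towner: '{owner}',\n"
        f"\tdatasetId: '{dataset_id}',\n"
        f"\ttableId: '{table_id}') ~> {name}"
    )


script = dataworld_source_script("DataworldSource", "owner1", "dataset1", "MyTable")
print(script)
```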

Next steps

For a list of data stores supported as sources and sinks by the copy activity, see Supported data stores.