Can use a dataset in a pipeline but can't use it in a data flow in Azure Data Factory

Jerry Lee 10 Reputation points

I created a dataset for Azure SQL and it works fine in any pipeline of Azure Data Factory. However, it gives me a connection error in Data Flows - see the error below.

The Azure SQL is configured to be accessible from any Azure services.

Spark job failed: {"runId":"a92816d0-6e36-4fee-9a7f-31849c5f32b9","sessionId":"b9423c7d-6a3b-46c3-b7bf-fc30e54cc46f","status":"Failed","payload":{"statusCode":400,"shortMessage":"server is a required property for AzureSqlDatabase1. Could not extract value from AzureSqlDatabase1","detailedMessage":"Failure 2024-05-20 05:07:51.915 failed DebugManager.processJob, run=a92816d0-6e36-4fee-9a7f-31849c5f32b9, server is a required property for AzureSqlDatabase1. Could not extract value from AzureSqlDatabase1"}} - RunId: a92816d0-6e36-4fee-9a7f-31849c5f32b9

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

2 answers

  1. Aswin 332 Reputation points Microsoft Vendor

    @Jerry Lee @Mahesh chandak I am also facing the same issue. I guess this is a bug. You can edit the linked service and change the version from Recommended to Legacy. That works for me.


    2 people found this answer helpful.
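
    The legacy linked service version uses a single connection string instead of the separate `server`/`database` properties that the error complains about. A minimal sketch of what a legacy-style Azure SQL linked service definition might look like (server, database, and credential values here are placeholders, not from the original question):

    ```json
    {
      "name": "AzureSqlDatabase1",
      "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
          "connectionString": "Server=tcp:<your-server>.database.windows.net,1433;Database=<your-database>;User ID=<user>;Password=<password>;Encrypt=true;"
        }
      }
    }
    ```

    You can view and edit this JSON from the linked service's "Code" view in the ADF Studio UI.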

  2. ShaikMaheer-MSFT 38,311 Reputation points Microsoft Employee

    Hi Jerry Lee,

    Thank you for posting query in Microsoft Q&A Platform.

    From the error message, it seems the server name might not be provided. Could you please check your linked service and dataset to see whether the server name property is set correctly?
    If you are using a parameter for the server name, make sure you pass a value for that parameter.

    Hope this helps. Please let me know how it goes.
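
    To illustrate the parameterized case the answer above describes, here is a rough sketch of a linked service whose server name comes from a parameter; the parameter name `serverName` and the exact property names are assumptions for illustration, and the parameter must be given a value wherever the linked service is referenced:

    ```json
    {
      "name": "AzureSqlDatabase1",
      "properties": {
        "type": "AzureSqlDatabase",
        "parameters": {
          "serverName": { "type": "String" }
        },
        "typeProperties": {
          "server": "@{linkedService().serverName}",
          "database": "<your-database>"
        }
      }
    }
    ```

    If the dataset or data flow that references this linked service does not supply `serverName`, the `server` property resolves to nothing, which matches the "server is a required property" failure in the question.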