Getting error "{"code":"BadRequest","message":null,"target":"pipeline..." when trying to debug pipeline

Anonymous
2021-03-08T02:37:14.197+00:00

I have confirmed this by reproducing it a couple of times. I have three data factories: dev, UAT, and prod. I develop something on dev, then use Azure DevOps to release the ARM templates onto UAT and prod. This all works OK.

However, I then change the prod Key Vault linked service's default value to something else, using the UI. This linked service is used in other linked services to obtain login information.
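For reference, the parameterized Key Vault linked service looks roughly like this (the names and the default value are placeholders, not my actual setup):

```json
{
    "name": "LS_KeyVault",
    "properties": {
        "type": "AzureKeyVault",
        "parameters": {
            "KeyVaultName": {
                "type": "String",
                "defaultValue": "kv-dev-example"
            }
        },
        "typeProperties": {
            "baseUrl": "https://@{linkedService().KeyVaultName}.vault.azure.net/"
        }
    }
}
```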

After changing only the linked service's "Default value" parameter and publishing, I then get this error when trying to debug a pipeline:

{"code":"BadRequest","message":null,"target":"pipeline//runid/XXX Run ID XXX","details":null,"error":null} (note I removed the GUID run ID in the code)

It seems like this should work without issue; might it be a bug?

EDIT:

I have tried a simple pipeline that just sets some variables, and it works fine. However, any pipeline that requires pass-through to Key Vault doesn't seem to work. This was all working last week, so something seems to have been screwed up.
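To be specific, the failing pipelines all use linked services that pull their credentials from Key Vault, along these lines (linked service and secret names here are placeholders):

```json
{
    "name": "LS_AzureSql",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_KeyVault",
                    "type": "LinkedServiceReference"
                },
                "secretName": "sql-connection-string"
            }
        }
    }
}
```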


Accepted answer
  1. Saurabh Sharma 23,761 Reputation points Microsoft Employee
    2021-03-17T05:07:19.303+00:00

    @Anonymous The product team has identified this as a product bug; they have created an internal work item and are working on a fix. Meanwhile, you can use the workaround you have found to circumvent this issue. Thank you so much for reporting this, and I appreciate your patience.

    ----------

    Please do not forget to "Accept the answer" wherever the information provided helps you, so that it can help others in the community.

    3 people found this answer helpful.

5 additional answers

  1. AM 1 Reputation point
    2021-04-22T13:22:21.163+00:00

    I wanted to copy multiple tables, convert them to CSV, and transfer the CSV files into an Azure Storage Account using Azure Data Factory, following this documentation: https://www.modern-dataengineering.com/post/how-to-copy-multiple-tables-in-azure-data-factory

    The documentation says in the Sink part:

    7. Click source > Query
    8. Input the following dynamic SQL code: SELECT * FROM @{item().Table_Name}

    and below it there is a picture that suggests using @{item().Table_Name} in the File name part of the File path.
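    For context, the pattern the article describes is a ForEach over the table list, with the Copy activity's source query built from the current item. A trimmed-down sketch of what my pipeline looked like (activity and dataset names are placeholders; sink format settings are omitted):

    ```json
    {
        "name": "ForEachTable",
        "type": "ForEach",
        "typeProperties": {
            "items": {
                "value": "@activity('LookupTableList').output.value",
                "type": "Expression"
            },
            "activities": [
                {
                    "name": "CopyTableToCsv",
                    "type": "Copy",
                    "inputs": [ { "referenceName": "DS_AzureSql", "type": "DatasetReference" } ],
                    "outputs": [ { "referenceName": "DS_DelimitedText", "type": "DatasetReference" } ],
                    "typeProperties": {
                        "source": {
                            "type": "AzureSqlSource",
                            "sqlReaderQuery": {
                                "value": "SELECT * FROM @{item().Table_Name}",
                                "type": "Expression"
                            }
                        },
                        "sink": { "type": "DelimitedTextSink" }
                    }
                }
            ]
        }
    }
    ```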

    I did as instructed, using DelimitedText as the sink dataset. There, under Connection --> File path --> File name, I wrote @{item().Table_Name} as instructed. This produced the error {"code":"BadRequest","message":null,"target":"pipeline//runid/...

    After we changed @{item().Table_Name} to @{dataset().table_name}.csv, pipeline debugging started to work and the CSV files were transferred to the Storage Account with no problems. It seems the pipeline's item() expression must not be hard-coded into that field; the dataset needs its own parameter, referenced via @{dataset()...}, for the file name.
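    In other words, the sink dataset declares its own parameter and uses it in the file name. A trimmed-down sketch of what worked for us (names are placeholders):

    ```json
    {
        "name": "DS_DelimitedText",
        "properties": {
            "type": "DelimitedText",
            "linkedServiceName": {
                "referenceName": "LS_BlobStorage",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "table_name": { "type": "String" }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": "output",
                    "fileName": {
                        "value": "@{dataset().table_name}.csv",
                        "type": "Expression"
                    }
                }
            }
        }
    }
    ```

    The Copy activity's sink dataset reference then passes the loop value into that parameter:

    ```json
    "outputs": [
        {
            "referenceName": "DS_DelimitedText",
            "type": "DatasetReference",
            "parameters": { "table_name": "@item().Table_Name" }
        }
    ]
    ```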