How to fix MSSQL_101 "INSERTABLE columns list or UPDATABLE column list or key column list is not properly passed to the stored procedure." in Data Flow Sink?

Martin
2023-03-15T09:32:16.0933333+00:00

I am trying to load CSV rows from Data Lake into an Azure SQL Database. I use the upsert functionality of the data flow and parameterize it to be able to load the same data into different tables.

This is how I designed my data flow:
[screenshot of the data flow]

In the Alter Row transformation I set Upsert if to true(), allowed only upserts in the sink, and added the corresponding key columns.
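For reference, a minimal sketch of what the underlying data flow script looks like for an upsert-only sink (stream and column names here are illustrative, not taken from your pipeline). MSSQL_101 is typically raised when the `keys` list in this script is empty or does not resolve to actual sink columns:

```
source1 alterRow(upsertIf(true())) ~> alterRow1
alterRow1 sink(allowSchemaDrift: true,
	validateSchema: false,
	deletable: false,
	insertable: false,
	updateable: false,
	upsertable: true,
	keys: ['Id'],
	format: 'table',
	skipDuplicateMapInputs: true,
	skipDuplicateMapOutputs: true) ~> sink1
```

You can inspect your own generated script via the "Script" button in the data flow designer and check what `keys` contains for the failing tables.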

When I run this, I get the following error:

Error code: DFExecutorUserError
Failure type: User configuration issue
Details: Job failed due to reason: at Sink 'sink1': ERRORCODE:MSSQL_101, STAGE:INPUT-VALIDATION, MESSAGE:INSERTABLE columns list or UPDATABLE column list or key column list is not properly passed to the stored procedure.

The odd thing is that for some tables the data is upserted without any problems, while for a few tables this error comes up.

Any suggestions on how to fix it?
Thanks for your input! :)

Azure SQL Database
Azure Data Lake Storage
Azure Data Factory

Accepted answer
  1. AnnuKumari-MSFT (Microsoft Employee)
    2023-03-16T15:09:17.5966667+00:00

    @Martin,

    Thank you for using the Microsoft Q&A platform and for posting your question here.

    As I understand it, you are trying to perform an upsert while copying data from CSV files into multiple SQL tables, but for some tables it fails with the above error.

    From the screenshot of the data flow, it's not clear how you are passing the key column in the sink transformation. Kindly share a full screenshot of the pipeline.

    Since your requirement is to perform an upsert for multiple tables while copying data from multiple files, and the schema (number of columns, order of columns, or column names) might differ between sink tables, it's important to ensure two things:

    1. The mapping should be dynamic in order to handle the multiple-table scenario.
    2. The key column should be passed dynamically for each table using a data flow parameter. Pass the relevant value for keyColumn via the ADF pipeline.


    Relevant resources for point 1: Implement Upsert logic in Mapping data flow
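For point 2, a rough sketch of a data flow script with a parameterized key column (the parameter name keyColumn and its default value are illustrative; verify the exact syntax against the script your data flow generates):

```
parameters{
	keyColumn as string ('Id')
}
alterRow1 sink(allowSchemaDrift: true,
	deletable: false,
	insertable: false,
	updateable: false,
	upsertable: true,
	keys: [($keyColumn)],
	format: 'table') ~> sink1
```

The pipeline's Execute Data Flow activity then supplies the table-specific column name as the value of keyColumn for each run.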


    Hope it helps. For better guidance on the implementation, kindly share a screenshot of what you have tried so far. Please let me know if any further assistance is required. If this helps, kindly accept the answer. Thank you.


1 additional answer

  1. Mike Matthews
    2023-10-26T18:54:09.59+00:00

    I was seeing this error message when I had two key columns (an int and a guid) specified for my sink. I realised that the int column was not needed to uniquely identify rows, and removing it from the key solved the problem.
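One way to sanity-check this before touching the sink settings is to test, on a sample of the data, whether a candidate column set really is a unique key and whether every column in it is needed. A small sketch (the sample rows and column names are made up for illustration, echoing the int-plus-guid situation above):

```python
def is_unique_key(rows, columns):
    """Return True if the given columns uniquely identify every row."""
    seen = set()
    for row in rows:
        key = tuple(row[c] for c in columns)
        if key in seen:
            return False
        seen.add(key)
    return True

# Illustrative sample: the guid alone identifies each row, so adding
# the int column to the key is redundant.
rows = [
    {"IntId": 1, "Guid": "a1", "Value": 10},
    {"IntId": 2, "Guid": "b2", "Value": 10},
    {"IntId": 3, "Guid": "c3", "Value": 30},
]

print(is_unique_key(rows, ["Guid"]))    # True: guid alone is enough
print(is_unique_key(rows, ["Value"]))   # False: values repeat
```

If a strict subset of your configured key columns already returns True, try dropping the extra columns from the sink's key list, as described above.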