ADF mapping data flow using AlterRow sink error

Gayatri Krishnan 41 Reputation points
2022-01-04T04:34:29.36+00:00

Hi,

I want to perform a merge (upsert) into a destination Azure Database for MySQL table, using the AlterRow transformation in a mapping data flow. The source and destination column names and types match, but at run time the data flow tries to create a table in the destination, even though the table already exists with the correct columns and data types. The run fails with an error saying the user doesn't have permission to create the table. Why is a CREATE being issued when the sink options are not set to recreate the table? Am I missing something?

Error
{"message":"Job failed due to reason: at Sink 'sink1': java.sql.SQLSyntaxErrorException: CREATE command denied to user 'portal_data'@'20.211.6.147' for table 't_5175_f35eff520ab74d6faefe6744c5a95793'. Details:java.sql.SQLSyntaxErrorException: CREATE command denied to user 'portal_data'@'20.211.6.147' for table 't_5175_f35eff520ab74d6faefe6744c5a95793'\n\tat shaded.msdataflow.com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)\n\tat shaded.msdataflow.com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)\n\tat shaded.msdataflow.com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)\n\tat shaded.msdataflow.com.mysql.cj.jdbc.StatementImpl.executeUpdateInternal(StatementImpl.java:1335)\n\tat shaded.msdataflow.com.mysql.cj.jdbc.StatementImpl.executeLargeUpdate(StatementImpl.java:2108)\n\tat shaded.msdataflow.com.mysql.cj.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1245)\n\tat org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createTable(JdbcUtils.scala:863)\n\tat com.microsoft.dataflow.transformers.store.JDBCWriter.createTable(JDBCStore.scala:591)\n\tat com.microsoft.dataflow.transformers.store.JDBCWriter.createTable(JDBCStore.scala","failureType":"UserError","target":"Data flow1","errorCode":"DFExecutorUserError"}


Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

3 answers

  1. ShaikMaheer-MSFT 38,631 Reputation points Microsoft Employee Moderator
    2022-01-04T16:48:53.973+00:00

    Hi @Gayatri Krishnan ,

    Thank you for posting your query on the Microsoft Q&A platform.

    Could you please confirm the following:

    • Did you use any pre-SQL or post-SQL scripts in the sink settings?
    • Could you try using a normal dataset instead of an inline dataset?
    • In the data preview of the AlterRow transformation, are all the required rows getting the update policy?

  2. ShaikMaheer-MSFT 38,631 Reputation points Microsoft Employee Moderator
    2022-01-12T16:48:35.707+00:00

    Hi @Gayatri Krishnan ,

    It looks like the error message is misleading here.

    Could you please check manually, using a MySQL client (for example, MySQL Workbench), that the user ID used in the linked service is able to update records in your destination table?

    If not, please grant the required permissions to that user ID, then try updating records both from the client and from ADF.

    Please also make sure the user ID has the CREATE permission and try again.

    Hope that helps. Please let us know how it goes. Thank you.
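
    The check and grant described above can be sketched in MySQL as follows. This is a minimal sketch, not the confirmed fix: the database name `portal_db` and the `'%'` host pattern are placeholders, while the user `portal_data` is taken from the error message. CREATE and DROP are included because the error trace shows the data flow attempting to create an interim table (`t_5175_...`) in the sink database.

    ```sql
    -- Inspect the privileges the data-flow user currently holds
    SHOW GRANTS FOR 'portal_data'@'%';

    -- Grant the DML privileges needed for an upsert, plus CREATE/DROP
    -- for the interim table the data flow sink creates
    GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP
        ON portal_db.* TO 'portal_data'@'%';
    FLUSH PRIVILEGES;
    ```

    In production you may want to replace `'%'` with the outbound IP range of your integration runtime rather than allowing all hosts.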


  3. DavidR 40 Reputation points
    2024-02-10T11:11:18.5833333+00:00

    I have a very similar problem and I am not able to solve it.

    My user has grants to read/create/update/delete records on the sink table.

    I'm using an inline dataset, I don't have any pre/post scripts, and my data preview works fine; the problem comes up when I execute the pipeline. I have posted a question here if you want to see more details: https://learn.microsoft.com/en-us/answers/questions/1527655/azure-data-factory-denied-permission-when-trying-t

    It seems to be a bug in Azure Data Factory.

