Getting an error when we try to run our data flow in Azure Data Factory.

Rithika S 20 Reputation points
2023-11-13T03:57:30.0333333+00:00

In Azure Data Factory, we are getting an error when we try to run a data flow that takes information from Citus and sends it to Snowflake. The data flow works up to the point of writing the data into Snowflake, and the data can be previewed at the sink.

The error is as follows:

Operation on target Data flow1 failed: {"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: at Sink 'sink1': Status of query associated with resultSet is FAILED_WITH_ERROR. Results not generated.","Details":"java.sql.SQLException: Status of query associated with resultSet is FAILED_WITH_ERROR. Results not generated.\n\tat net.snowflake.client.jdbc.SFAsyncResultSet.getRealResults(SFAsyncResultSet.java:127)\n\tat net.snowflake.client.jdbc.SFAsyncResultSet.getMetaData(SFAsyncResultSet.java:262)\n\tat net.snowflake.spark.snowflake.io.StageWriter$.executeCopyIntoTable(StageWriter.scala:547)\n\tat net.snowflake.spark.snowflake.io.StageWriter$.writeToTableWithStagingTable(StageWriter.scala:430)\n\tat net.snowflake.spark.snowflake.io.StageWriter$.writeToTable(StageWriter.scala:286)\n\tat net.snowflake.spark.snowflake.io.StageWriter$.writeToStage(StageWriter.scala:231)\n\tat net.snowflake.spark.snowflake.io.package$.writeRDD(package.scala:51)\n\tat net.snowflake.spark.snowflake.SnowflakeWriter.save(SnowflakeWriter.scala:74)\n\tat net.snowflake.spark.snowflake.DefaultSource.createRelation(DefaultSource.scala:141)\n\tat org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)\n\tat org.apache.s"}

"sink1" is the part of the data flow where we are sinking the data into Snowflake. Were are not able to figure out why we are getting this error. Thank you!


Accepted answer
  1. Harishga 6,000 Reputation points Microsoft External Staff
    2023-11-13T11:38:46.96+00:00

    Hi @Rithika S,
    Welcome to Microsoft Q&A platform and thanks for posting your question here.

    As per my understanding, you are encountering an error when trying to run a data flow in Azure Data Factory. The error message indicates that the status of the query associated with the resultSet is FAILED_WITH_ERROR, and the results are not generated at the sink. Please let me know if that is not the correct understanding.
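
    The FAILED_WITH_ERROR status comes from the Snowflake JDBC driver, so the underlying cause is usually recorded in Snowflake's own query history. As a first diagnostic step, you can look up the failed COPY INTO statement there. Below is a minimal sketch using the snowflake-connector-python package; every connection value is a placeholder for what your linked service uses:

        # Hedged sketch: surface the real error behind the failed COPY INTO.
        # Assumes the snowflake-connector-python package; values in angle
        # brackets are placeholders, not real settings.
        import snowflake.connector

        conn = snowflake.connector.connect(
            account="<your_account>",   # e.g. xy12345.east-us-2.azure
            user="<user>",
            password="<password>",
            warehouse="<warehouse>",
            database="<database>",
            schema="<schema>",
        )
        try:
            cur = conn.cursor()
            # QUERY_HISTORY keeps recent queries; filter to failed COPY statements.
            cur.execute("""
                SELECT query_id, error_code, error_message
                FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 200))
                WHERE execution_status = 'FAILED_WITH_ERROR'
                  AND query_text ILIKE 'COPY INTO%'
                ORDER BY start_time DESC
            """)
            for query_id, error_code, error_message in cur.fetchall():
                print(query_id, error_code, error_message)
        finally:
            conn.close()

    The error_message column usually names the exact problem (for example, a value that cannot be coerced to the target column type, or a missing privilege), which tells you which of the checks below applies.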

    The issue might be occurring due to one of several reasons, such as incorrect credentials at the sink, network connectivity issues, or a datatype mismatch between the source and the sink.

    Kindly try the troubleshooting options below and see if they resolve the issue:

    Check the credentials: Ensure that the credentials used to connect to Snowflake are correct in the linked service referenced by the sink dataset.
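
    If you want to confirm those credentials independently of ADF, one hedged way is to open a connection with the same values from any machine; the placeholders below stand in for whatever your linked service holds:

        # Hedged sketch: confirm the linked-service credentials outside ADF.
        # Assumes snowflake-connector-python; angle-bracket values are placeholders.
        import snowflake.connector

        conn = snowflake.connector.connect(
            account="<your_account>", user="<user>", password="<password>",
            warehouse="<warehouse>", database="<database>", schema="<schema>",
        )
        cur = conn.cursor()
        # A successful round trip proves authentication and shows the effective context.
        cur.execute("SELECT CURRENT_USER(), CURRENT_ROLE(), CURRENT_WAREHOUSE()")
        print(cur.fetchone())
        conn.close()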

    Check the network connectivity: Ensure that firewall settings are not blocking traffic between Azure Data Factory and Snowflake.
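
    A quick reachability test from the network where your integration runtime sits can rule out a blocked route; the hostname below is a placeholder for your Snowflake account URL:

        # Hedged sketch: verify TCP reachability to the Snowflake endpoint.
        # The hostname is a placeholder for your account's URL.
        import socket

        host = "<your_account>.snowflakecomputing.com"
        try:
            with socket.create_connection((host, 443), timeout=10):
                print(f"TCP connection to {host}:443 succeeded")
        except OSError as exc:
            print(f"Cannot reach {host}:443 -> {exc}")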

    Reimport the projection: Try reimporting the projections from the source.

    Ensure that there is no datatype mismatch: Check that all datatypes in the sink match those of the destination table in Snowflake.
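
    One way to do that comparison is to pull the destination table's column types straight from Snowflake and check them against the data flow's sink projection; this sketch is self-contained, with placeholder names throughout:

        # Hedged sketch: list the destination table's columns and types so they
        # can be compared with the sink projection. Placeholders in angle brackets.
        import snowflake.connector

        conn = snowflake.connector.connect(
            account="<your_account>", user="<user>", password="<password>",
            warehouse="<warehouse>", database="<database>", schema="<schema>",
        )
        cur = conn.cursor()
        cur.execute("""
            SELECT column_name, data_type, is_nullable
            FROM INFORMATION_SCHEMA.COLUMNS
            WHERE table_schema = '<SCHEMA>' AND table_name = '<TABLE>'
            ORDER BY ordinal_position
        """)
        for name, dtype, nullable in cur.fetchall():
            print(f"{name}: {dtype} (nullable={nullable})")
        conn.close()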

    Follow these best practices to avoid this issue:

    Clean up duplicate columns after performing transforms that can introduce them, such as joins.

    Import your schema projection at the source level every time there are major/minor schema changes.

    Make sure there is no mismatch between source and sink schema.

    I hope this information helps you. Let me know if you have any further questions or concerns.

