Copy Activity fails due to empty data returned from Source data store

Manish Patel 46 Reputation points
2020-12-29T16:20:26.693+00:00

I created a data pipeline scheduled to run once per day. It suddenly started failing because the source returned no data, which caused the Copy activity to fail.

Below is the error message:

Failure happened on 'Source' side. ErrorCode=UserErrorTypeInSchemaTableNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to get the type from schema table. This could be caused by missing Sql Server System CLR Types.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.InvalidCastException,Message=Unable to cast object of type 'System.DBNull' to type 'System.Type'.,Source=Microsoft.DataTransfer.ClientLibrary,'

What are the suggestions or recommendations in such scenario?

Azure Data Factory

Accepted answer
  1. Nasreen Akter 10,811 Reputation points Volunteer Moderator
    2020-12-29T19:05:45.263+00:00

    Hi @Manish Patel ,

    You can add a Lookup activity before the Copy activity to check the total number of rows in the source table and only run the copy when there is data (please see the screenshots below, followed by a short sketch of the expressions). Thank you!

    51979-lookup-1.jpg

    52103-lookup-2.jpg
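    A minimal sketch of the idea, assuming a SQL source; the activity name CountRows, the column alias RowCount, and the table name are placeholders, not from the original post:

        Lookup "CountRows" (First row only):
            SELECT COUNT(*) AS RowCount FROM dbo.SourceTable

        If Condition expression:
            @greater(int(activity('CountRows').output.firstRow.RowCount), 0)

        True activities:  the Copy activity
        False activities: leave empty (or log/notify), so the pipeline still succeeds when there is nothing to copy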

    ----------

    If the above response helps, please "accept the answer" and "up-vote" it! Thank you!

    2 people found this answer helpful.

3 additional answers

  1. JoyDutt 831 Reputation points
    2020-12-29T16:38:39.18+00:00

    In default/auto mapping, the copy activity maps source data to the sink by column name in a case-sensitive manner. But it is strange that you are seeing this error even with explicit mapping. One additional verification you can do is:

    1. Clear your data schema in your source dataset
    2. Import schema in your source dataset
    3. Clear your mapping in your copy settings
    4. Import schemas on mapping of copy
    5. Now verify that the failing column name in the source file, the imported schema of your dataset, and the column name in the copy mapping are exactly the same.

    Other things to check --

    Add two steps before the Copy Data activity (see the sketch after this list):

    1. a Web activity that calls the API with the values of @Items
    2. an If Condition activity that evaluates whether the output of the previous step is empty, something like @bool(empty(activity('CalculateLenOfData').output.tables[0].rows)).
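    A minimal pipeline sketch of that gate, assuming the Web activity is named CalculateLenOfData as in the expression above; the dataset names and source/sink types are placeholders and would need to match your connectors:

        {
            "name": "IfDataReturned",
            "type": "IfCondition",
            "dependsOn": [
                { "activity": "CalculateLenOfData", "dependencyConditions": [ "Succeeded" ] }
            ],
            "typeProperties": {
                "expression": {
                    "value": "@not(empty(activity('CalculateLenOfData').output.tables[0].rows))",
                    "type": "Expression"
                },
                "ifTrueActivities": [
                    {
                        "name": "CopyData",
                        "type": "Copy",
                        "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
                        "outputs": [ { "referenceName": "SinkDataset", "type": "DatasetReference" } ],
                        "typeProperties": {
                            "source": { "type": "AzureSqlSource" },
                            "sink": { "type": "ParquetSink" }
                        }
                    }
                ],
                "ifFalseActivities": []
            }
        }

    With this structure the Copy activity only runs when the previous step actually returned rows, so an empty response no longer fails the pipeline.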

    In my opinion, the Copy Data activity could handle this more gracefully; there will not always be data to copy. Sometimes the source returns no object, or an empty dataframe or table. In those cases the Copy Data activity shouldn't block the execution of the following steps in the data factory. This is a clear point of improvement for me.

    (Please don't forget to "Accept the answer" & "up-vote"; this can be beneficial to other members. Thank you.)

    Regards,
    J.D.

    2 people found this answer helpful.

  2. HarithaMaddi-MSFT 10,146 Reputation points
    2020-12-30T10:49:54.467+00:00

    Hi @Manish Patel ,

    Thanks for posting the query. I looked at similar issues from customers in the past, and removing the explicit mapping helped them resolve the error. I was unable to reproduce this; the copy works fine for me even when the source returns no data. Please remove the mapping (a sketch of the section in question is below), and if the issue persists, share the source type (SQL?) and the pipeline run ID with us so we can investigate further.
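    For reference, explicit mapping lives in the Copy activity's translator property; a rough sketch of the section you would remove, where the column names are placeholders and not from the original pipeline:

        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "SourceColumn1" }, "sink": { "name": "SinkColumn1" } },
                { "source": { "name": "SourceColumn2" }, "sink": { "name": "SinkColumn2" } }
            ]
        }

    Deleting this section (or clearing the Mapping tab in the UI) makes the copy fall back to default/auto mapping by column name.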

    1 person found this answer helpful.

  3. Андрей 16 Reputation points
    2022-02-03T13:22:42.63+00:00

    Hello!
    It is 2022-02-03 and I'm facing the same issue: trying to load from a REST API (JSON) to Parquet.
    With no mapping it works great, but it copies only the upper-level JSON fields. When I use the "Import schema" button I can load the full JSON structure, but I get an error in one of the load iterations (see the mapping sketch at the end of this answer).
    Maybe an issue should have been raised back then in 2020, and it would be fixed by now? =)
    Maybe some wise MS Azure developers should start working on this BUG in the near future? ;)

    Error code 2200  
    Failure type User configuration issue  
    Details  
    ErrorCode=UserErrorTypeInSchemaTableNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to get the type from schema table. This could be caused by missing Sql Server System CLR Types.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.InvalidCastException,Message=Unable to cast object of type 'System.DBNull' to type 'System.Type'.,Source=Microsoft.DataTransfer.ClientLibrary,'  
    

    Please note that all requests except one are successful, and the data really is loaded into Parquet:
    171008-image.png
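    Since the default mapping only picks up the top-level JSON fields, the usual way to copy nested REST JSON into a tabular sink such as Parquet is a hierarchical mapping with a collection reference. A rough sketch, where the JSON paths and column names are assumptions for illustration only:

        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "path": "$['id']" },    "sink": { "name": "Id" } },
                { "source": { "path": "['name']" },   "sink": { "name": "ItemName" } },
                { "source": { "path": "['value']" },  "sink": { "name": "ItemValue" } }
            ],
            "collectionReference": "$['items']"
        }

    Here $['items'] is the array to iterate over, the relative paths refer to fields inside each array element, and paths outside the collection (like $['id']) are taken from the root object.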

