Question

rajivgandhiveerabathiran-3463 asked; ShaikMaheer-MSFT commented

How to correct a string-to-date data type conversion error in an Azure Data Factory pipeline

We loaded data from BW into the data lake, which loaded all fields as string types. Now, while trying to load an Azure Synapse table, we get an invalid data type error because the Azure table has dob as a date data type.

azure-data-factory

Hi, thanks for posting your question in Microsoft Q&A. Please elaborate on your question with the information below; that would help us narrow down the solution:
1. Where is the error happening? While inserting records into the Synapse tables from the data lake files?
2. How are you moving records: using a Copy activity or a data flow?


Yes, through a Copy activity in a pipeline.

SubashriVasudevan-1752 replied to rajivgandhiveerabathiran-3463:

Hi @rajivgandhiveerabathiran-3463 ,

Are you doing explicit mapping?

KiranKolte answered

Hi @rajivgandhiveerabathiran-3463

Thank you for using the Microsoft Q&A platform and posting your query.

While creating or defining a dataset, you can explicitly specify the schema and data type mapping. Please refer to:

https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping
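To make this concrete, here is a minimal sketch of a Copy activity `translator` that maps the string columns from the lake file and lets the copy engine convert `dob` to a date via `typeConversionSettings`. The activity and dataset names are hypothetical; only the `sno`/`dob` column names come from this thread, and the exact properties should be checked against the schema-and-type-mapping doc linked above:

```json
{
  "name": "CopyLakeToSynapse",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": { "type": "SqlDWSink" },
    "translator": {
      "type": "TabularTranslator",
      "mappings": [
        { "source": { "name": "sno", "type": "String" }, "sink": { "name": "sno" } },
        { "source": { "name": "dob", "type": "String" }, "sink": { "name": "dob" } }
      ],
      "typeConversion": true,
      "typeConversionSettings": {
        "allowDataTruncation": false,
        "dateTimeFormat": "yyyy/MM/dd"
      }
    }
  }
}
```

Note that this only helps when every source value is parseable; placeholder values such as "00000" in the data would still fail the conversion and need cleansing first.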


SubashriVasudevan-1752 answered; ShaikMaheer-MSFT commented

Hi @rajivgandhiveerabathiran-3463 ,

My CSV file data:

"sno","dob"
"1","00000"
"2","1989/07/21"
"3","1999/07/21"

Data flow:
source (CSV) --> derived column --> sink (Azure SQL table)

Source schema:
sno -- string
dob -- string

Expression in the derived column: toDate(replace(dob,'00000','9999/12/31'),'yyyy/MM/dd')

This properly loads the date column into the sink.

Please revert if this helps.
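The replace() trick above targets the specific '00000' placeholder. As a variant (a sketch, not from the thread), a regex guard in the derived column could catch any non-date value, assuming the same yyyy/MM/dd source format and the same 9999/12/31 sentinel:

```
iif(
    regexMatch(dob, '^[0-9]{4}/[0-9]{2}/[0-9]{2}$'),
    toDate(dob, 'yyyy/MM/dd'),
    toDate('9999/12/31', 'yyyy/MM/dd')
)
```

This way malformed values other than '00000' also map to the sentinel date instead of failing the conversion.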


Yes, it would work through a data flow, but we are doing it through a Copy activity in the pipeline (bulk insert).

SubashriVasudevan-1752 replied to rajivgandhiveerabathiran-3463:

Hi @rajivgandhiveerabathiran-3463

In that case, please replace the file content using az. Functions, then use it in the Copy activity.


Thanks for your help. Sorry, what does "az" mean?
