Hi Zhenqi Liu,
I'm glad that you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same thing can easily reference this! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others", I'll repost your solution in case you'd like to accept the answer.
Ask:
I think it relates to this issue [http://michaelcummings.net/mathoms/using-a-custom-jsonconverter-to-fix-bad-json-results/]
Our procedure was as follows:
- Create a Copy activity in an ADF pipeline that calls the Workday API and sinks the response into Azure Data Lake Storage Gen2 as JSON
- Create a Data Flow in ADF to perform some transformations, using the JSON files created in step 1
Problem:
When importing the schema in step 2, some fields that were expected to be arrays were inferred as strings. We found that the fields with the bad schema have inconsistent types in the JSON files: when a field contains two or more elements, it is serialized as an array, e.g. {"Email_Address_Data":[{"name":"AAAA"},{"name":"BBB"}]}, but when it contains fewer than two elements, it is serialized as a plain JSON object, e.g. {"Email_Address_Data":{"name":"AAAA"}}.
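The inconsistency described above can be reproduced with Python's standard json module; the sample payloads below simply mirror the two shapes from the example, so a downstream consumer sees a list in one case and a dict in the other:

```python
import json

# Two elements: Workday serializes the field as a JSON array.
many = json.loads('{"Email_Address_Data": [{"name": "AAAA"}, {"name": "BBB"}]}')

# One element: the same field is serialized as a plain object instead.
one = json.loads('{"Email_Address_Data": {"name": "AAAA"}}')

print(type(many["Email_Address_Data"]).__name__)  # list
print(type(one["Email_Address_Data"]).__name__)   # dict
```

This type flip between records is what breaks schema inference in the Data Flow, which expects every record to have the same shape for a given field.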
Solution:
We switched from ADF to Python code, as the transformation logic required for the Workday API response is quite complex.
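As a minimal sketch of one way such Python code could normalize the payload before any downstream transformation (in the spirit of the custom-JsonConverter approach in the linked article): the `normalize` helper and the `array_fields` list below are hypothetical names, and in a real response the list would need to cover every affected field, not just `Email_Address_Data` from the example.

```python
import json

def ensure_list(value):
    """Wrap a lone object in a list so the field always has array shape."""
    if isinstance(value, list):
        return value
    return [value]

def normalize(record, array_fields):
    """Force the named fields to always be arrays, regardless of how
    many elements the API happened to return for this record."""
    for field in array_fields:
        if field in record:
            record[field] = ensure_list(record[field])
    return record

# Sample records in the two shapes described above.
two = json.loads('{"Email_Address_Data": [{"name": "AAAA"}, {"name": "BBB"}]}')
one = json.loads('{"Email_Address_Data": {"name": "AAAA"}}')

print(normalize(one, ["Email_Address_Data"]))
print(normalize(two, ["Email_Address_Data"]))
```

After normalization, both records expose `Email_Address_Data` as a list, so schema inference sees a consistent array type.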
If I missed anything please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.
I hope this helps!
If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.
Please don't forget to Accept Answer and mark Yes for "was this answer helpful" wherever the information provided helps you, as this can benefit other community members.