@Gerald Rupp Thanks for using Microsoft Q&A forum and posting your query.
As per my understanding, you would like to split large JSON files into smaller JSON files before copying them from Container A to Container B. Please correct me if I've missed anything.
I'm not sure of the structure of your source JSON files or what structure you expect in the sink after splitting them, so I'm sharing a generic approach using Azure Data Factory mapping data flows. You can choose the specific options in the data flow settings based on your needs.
You can use a mapping data flow to read your large source JSON files, then set the partition type on the sink transformation's Optimize tab (for example, Round robin with a fixed number of partitions) so the output is written as multiple smaller JSON files.
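For reference, when you configure partitioning on the sink's Optimize tab, the underlying data flow script captures it roughly like this (a sketch only; the sink name, other sink options, and the partition count of 20 are illustrative):

```
/* Sink with round-robin partitioning so the output is
   written across 20 smaller JSON files */
source1 sink(allowSchemaDrift: true,
    validateSchema: false,
    partitionBy('roundRobin', 20)) ~> sink1
```

You normally don't edit this script by hand; picking the partition type and count in the Optimize tab of the sink transformation produces the equivalent setting.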
For more details about each of these partition types, please refer to this document: Data flow performance - Optimize tab
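If you'd like to prototype the split logic locally before building the data flow, here is a minimal sketch in Python. It assumes each source file holds a top-level JSON array; the function name, output naming pattern, and chunk size are all illustrative, not part of any Data Factory API:

```python
import json


def split_json_array(src_path, dst_prefix, chunk_size):
    """Split a JSON file containing a top-level array into
    smaller JSON files of at most chunk_size records each.

    Returns the list of output file paths.
    """
    with open(src_path) as f:
        records = json.load(f)

    paths = []
    # Write each slice of records to its own numbered output file.
    for i in range(0, len(records), chunk_size):
        dst = f"{dst_prefix}_{i // chunk_size:04d}.json"
        with open(dst, "w") as out:
            json.dump(records[i:i + chunk_size], out)
        paths.append(dst)
    return paths
```

For example, a file with 10 records split with `chunk_size=4` produces three files holding 4, 4, and 2 records. In Data Factory itself, the sink partition count plays the role of `chunk_size` here, controlling how many output files are produced.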
Hope this info helps.
Please don't forget to Accept Answer and mark Yes for "Was this answer helpful" wherever the information provided helps you, as this can benefit other community members.