Hello @Abhay Chandramouli,
You can use a combination of a mapping data flow and a copy activity to achieve your requirement. Note that with this approach you need to know the schema of the JSON strings in advance.
The data flow builds the real JSON objects from the JSON strings, but it writes the outer level as individual objects (one per line) rather than as an array. A copy activity after the data flow activity is then used to convert that output into an array.
I took your sample JSON data as the input file:
[
{
"requests": ["{\"action\":\"UPDATE\",\"endpoint\":\"v75.0/records/Product__c/b07XY0000098z5BZU\"}","{\"action\":\"UPDATE\",\"endpoint\":\"v75.0/records/Product__c/b07XY0000098z5AAB\"}"]
},
{
"requests": ["{\"action\":\"UPDATE\",\"endpoint\":\"v75.0/records/Product__c/b07XY0000098z5BZU\"}","{\"action\":\"UPDATE\",\"endpoint\":\"v75.0/records/Product__c/b07XY0000098z5AAB\"}"]
}
]
First, create a source dataset for the above file and a target JSON dataset with the required file name in the desired location.
Use the source dataset as the source of the data flow and, under Source options -> JSON settings, select Array of documents as the document form.
Then add a derived column transformation and create a new column named one with the below expression.
replace(replace(replace(toString(requests),'"{','{'),'}"','}'),'\\"','"')

This creates a new string column that holds the JSON array as a single JSON string.
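For example, for the first row of the sample data, toString(requests) gives the escaped string

["{\"action\":\"UPDATE\",\"endpoint\":\"v75.0/records/Product__c/b07XY0000098z5BZU\"}","{\"action\":\"UPDATE\",\"endpoint\":\"v75.0/records/Product__c/b07XY0000098z5AAB\"}"]

and after the replaces the value of the new column one should be

[{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5BZU"},{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5AAB"}]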

Now, apply a Parse transformation on this column with the schema of the JSON strings.
(action as string,endpoint as string)[]

This will create the required JSON array column.
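If it helps to reason about these two steps outside ADF, below is a rough Python sketch of the equivalent logic. This is only for understanding; it is not data flow code, and the sample values are taken from the input file above.

import json

requests = [
    '{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5BZU"}',
    '{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5AAB"}',
]

# Derived column equivalent: serialize the array of JSON strings,
# then undo the extra quoting and escaping around each element.
one = json.dumps(requests)
one = one.replace('"{', '{').replace('}"', '}').replace('\\"', '"')

# Parse transformation equivalent: parse the resulting string,
# whose schema is (action as string, endpoint as string)[].
parsed = json.loads(one)
print(parsed[0]["action"], parsed[0]["endpoint"])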

Now, use a Select transformation to remove the extra columns requests and one, and rename the new parsed column to requests.

Add the data flow sink with the target JSON dataset. Give the file name in the dataset and, to get a single output file, set the file name option to Output to single file in the sink settings and give the same file name there as well.

When the data flow is run from the pipeline, the result file jsontarget1.json is generated as shown below.
{"requests":[{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5BZU"},{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5AAB"}]}
{"requests":[{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5BZU"},{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5AAB"}]}
To convert the outer objects into a JSON array, add a copy activity after the data flow activity, using the same target JSON dataset for both its source and sink. In the copy activity sink, set the File pattern to Array of objects.

Now, when the pipeline is run, the target JSON file is generated with the desired JSON structure.
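With the sample input above, the final file should look something like the below (the exact whitespace and formatting may differ):

[
{"requests":[{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5BZU"},{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5AAB"}]},
{"requests":[{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5BZU"},{"action":"UPDATE","endpoint":"v75.0/records/Product__c/b07XY0000098z5AAB"}]}
]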

Hope this helps.
If the answer is helpful, please click Accept Answer and kindly upvote it. If you have any further questions about this answer, please click Comment.