You can create a new linked service for the REST API where you provide the base URL and any necessary authentication details.
Then, create a dataset for the Data Lake Storage where the JSON files will be stored.
Configure it to use the linked service you created earlier.
Create a dataset for the REST API: set the request method and configure it to use the REST linked service you created earlier.
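As a rough sketch of what the REST dataset definition might look like (the dataset and linked service names here are illustrative), the relative URL can be exposed as a dataset parameter so that each API call can pass in a different value:

```json
{
    "name": "RestApiDataset",
    "properties": {
        "type": "RestResource",
        "linkedServiceName": {
            "referenceName": "RestApiLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "relativeUrl": { "type": "string" }
        },
        "typeProperties": {
            "relativeUrl": {
                "value": "@dataset().relativeUrl",
                "type": "Expression"
            }
        }
    }
}
```

Parameterizing the dataset this way keeps the base URL in the linked service while letting the pipeline supply the per-call portion of the URL.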
In ADF, drag the ForEach activity into the pipeline canvas.
Set the items property of the ForEach activity to the list of IDs you want to use in the API calls. This can be a JSON array defined as a pipeline parameter or variable, or the output of a Lookup activity.
Add a Copy Data activity inside the ForEach activity.
Source Configuration:
- Set the source dataset to the REST API dataset you created.
- In the dataset settings, use dynamic content to construct the URL with the current item from the ForEach loop. For example, if your list of IDs is called IDList, you can use @concat('https://url.com/api/', item().ID) to dynamically set the URL for each API call.
Sink Configuration:
- Set the sink dataset to the Azure Data Lake Storage dataset.
- Configure the file path dynamically to store each JSON result as a separate file. For example, you could use @concat('output/', item().ID, '.json') to create a unique file for each API response.
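A matching sketch of the sink dataset, assuming JSON files in Azure Data Lake Storage Gen2 with the file name exposed as a parameter (all names illustrative):

```json
{
    "name": "DataLakeJsonDataset",
    "properties": {
        "type": "Json",
        "linkedServiceName": {
            "referenceName": "DataLakeLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "fileName": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "output",
                "fileName": {
                    "value": "@dataset().fileName",
                    "type": "Expression"
                }
            }
        }
    }
}
```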
Your dynamic content expressions:
- Source Dataset URL:
@concat('https://url.com/api/', item().ID)
- Sink Dataset File Path:
@concat('output/', item().ID, '.json')
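Putting the pieces together, the ForEach activity in the pipeline JSON might look roughly like this (a sketch only; the dataset names and the IDList parameter are assumptions, and since the base URL lives in the linked service, only the relative part of the URL is passed to the source dataset):

```json
{
    "name": "ForEachID",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.IDList",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "CopyApiResponse",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "RestApiDataset",
                        "type": "DatasetReference",
                        "parameters": {
                            "relativeUrl": "@concat('api/', item().ID)"
                        }
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "DataLakeJsonDataset",
                        "type": "DatasetReference",
                        "parameters": {
                            "fileName": "@concat(item().ID, '.json')"
                        }
                    }
                ],
                "typeProperties": {
                    "source": { "type": "RestSource", "requestMethod": "GET" },
                    "sink": { "type": "JsonSink" }
                }
            }
        ]
    }
}
```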
If your list of IDs is static and you want to define it directly within the pipeline, create a pipeline parameter (for example, IDList) of type Array with a default value such as:
[
{"ID": "123"},
{"ID": "456"},
{"ID": "789"}
]
Then set the ForEach activity's items property to @pipeline().parameters.IDList. If you instead retrieve the IDs at runtime with a Lookup activity named GetIDList, reference its output with @activity('GetIDList').output.value.
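For reference, a sketch of how such a static list could be declared as an Array parameter in the pipeline JSON (the parameter name IDList is illustrative):

```json
"parameters": {
    "IDList": {
        "type": "array",
        "defaultValue": [
            { "ID": "123" },
            { "ID": "456" },
            { "ID": "789" }
        ]
    }
}
```

With the parameter in place, the ForEach items expression can reference it as @pipeline().parameters.IDList.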