So it looks like you want to set up an Azure Logic App pipeline that does the following every day:
- Fetch the JSON metadata (the one at https://data.cms.gov/provider-data/api/...).
- Extract the `distribution` → `data` → `downloadURL` value (the actual CSV link).
- Download the CSV file from that link.
- Store the CSV in Azure (e.g., in Blob Storage, Data Lake, or SQL, depending on your use case).
If so, here's a high-level flow:
**1. Trigger**

Recurrence trigger → runs daily.
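In code view, the trigger looks roughly like this (a minimal sketch; the trigger name is illustrative):

```json
"triggers": {
  "Daily_recurrence": {
    "type": "Recurrence",
    "recurrence": {
      "frequency": "Day",
      "interval": 1
    }
  }
}
```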
**2. HTTP Request (GET)**

Use an HTTP action to call the JSON metadata API (https://data.cms.gov/provider-data/api/...).
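The corresponding action in code view, keeping the truncated URL from your question as a placeholder (the action name is illustrative):

```json
"HTTP_Get_metadata": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://data.cms.gov/provider-data/api/..."
  },
  "runAfter": {}
}
```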
**3. Parse JSON**

Use a Parse JSON action with the schema from the API response. This lets you navigate to `distribution[0].data.downloadURL`.
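A sketch of that action, assuming the response follows the `distribution[0].data.downloadURL` shape described above. In practice, let the designer build the schema for you via "Use sample payload to generate schema":

```json
"Parse_JSON": {
  "type": "ParseJson",
  "inputs": {
    "content": "@body('HTTP_Get_metadata')",
    "schema": {
      "type": "object",
      "properties": {
        "distribution": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "data": {
                "type": "object",
                "properties": {
                  "downloadURL": { "type": "string" }
                }
              }
            }
          }
        }
      }
    }
  },
  "runAfter": {
    "HTTP_Get_metadata": [ "Succeeded" ]
  }
}
```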
**4. Extract URL**

From the parsed JSON, get the `downloadURL` field.
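In the designer you can just pick `downloadURL` from the dynamic content list; the underlying expression (using the action name from the sketch above) looks like this:

```
@{body('Parse_JSON')['distribution'][0]['data']['downloadURL']}
```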
**5. HTTP Request (GET CSV)**

Use the extracted `downloadURL` in another HTTP action to download the CSV file.
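That expression then goes straight into the URI of the second HTTP action, e.g.:

```json
"HTTP_Get_CSV": {
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "@{body('Parse_JSON')['distribution'][0]['data']['downloadURL']}"
  },
  "runAfter": {
    "Parse_JSON": [ "Succeeded" ]
  }
}
```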
**6. Save File to Azure**

Use the Azure Blob Storage "Create blob" action (or the Data Lake equivalent). The file name can include a timestamp (e.g., `NH_HlthInsp_@{utcNow()}.csv`).
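A rough sketch of the Create blob action. The `host`/`path` values are generated for you when you add the Blob connector in the designer, and the folder name here is just an example; passing a format string to `utcNow` keeps colons out of the blob name:

```json
"Create_blob": {
  "type": "ApiConnection",
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['azureblob']['connectionId']"
      }
    },
    "method": "post",
    "path": "/datasets/default/files",
    "queries": {
      "folderPath": "/cms-data",
      "name": "NH_HlthInsp_@{utcNow('yyyyMMdd')}.csv"
    },
    "body": "@body('HTTP_Get_CSV')"
  },
  "runAfter": {
    "HTTP_Get_CSV": [ "Succeeded" ]
  }
}
```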
```
+------------------+
|    Recurrence    |
| (Daily Trigger)  |
+------------------+
          |
          v
+---------------------+
|   HTTP - GET JSON   |
| (CMS API Metadata)  |
+---------------------+
          |
          v
+----------------------+
|      Parse JSON      |
| (extract downloadURL)|
+----------------------+
          |
          v
+---------------------+
|   HTTP - GET CSV    |
|  (use downloadURL)  |
+---------------------+
          |
          v
+--------------------------+
|   Azure Blob Storage     |
|    (Save CSV file)       |
+--------------------------+
```
---
If the above response helps answer your question, remember to "Accept Answer" so that others in the community facing similar issues can easily find the solution. Your contribution is highly appreciated.
hth
Marcin