Start by creating an Azure Data Factory (ADF) instance in the Azure portal. Then create a linked service for Dataverse and another one for the storage or database where you want to store the report data.
After that, create datasets for the source data (Dataverse entities) and the destination data (storage or database).
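If you would rather script this setup than click through the portal, the azure-mgmt-datafactory Python SDK can create the linked services and datasets. The snippet below is only a minimal sketch: the subscription, resource group, factory, URLs and credentials are placeholders, and the exact model and property names (especially for the Dataverse "Common Data Service for Apps" connector) should be double-checked against the SDK reference for the version you install.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, CommonDataServiceForAppsLinkedService,
    AzureBlobStorageLinkedService, SecureString, LinkedServiceReference,
    DatasetResource, CommonDataServiceForAppsEntityDataset,
    DelimitedTextDataset, AzureBlobStorageLocation,
)

# Placeholders -- replace with your own subscription, resource group and factory name.
SUB, RG, DF = "<subscription-id>", "<resource-group>", "<data-factory-name>"
client = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Linked service for Dataverse (the "Common Data Service for Apps" connector),
# authenticated with a service principal that has access to the environment.
client.linked_services.create_or_update(RG, DF, "ls_dataverse", LinkedServiceResource(
    properties=CommonDataServiceForAppsLinkedService(
        deployment_type="Online",
        service_uri="https://<org>.crm.dynamics.com",
        authentication_type="AADServicePrincipal",
        service_principal_id="<app-id>",
        service_principal_credential_type="ServicePrincipalKey",
        service_principal_credential=SecureString(value="<app-secret>"),
    )))

# Linked service for the destination (here: a Blob Storage account).
client.linked_services.create_or_update(RG, DF, "ls_blob", LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string="DefaultEndpointsProtocol=https;AccountName=<acct>;AccountKey=<key>",
    )))

# Source dataset: one Dataverse table (e.g. the "account" table).
client.datasets.create_or_update(RG, DF, "ds_dataverse_account", DatasetResource(
    properties=CommonDataServiceForAppsEntityDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="ls_dataverse"),
        entity_name="account",
    )))

# Destination dataset: a CSV file in Blob Storage.
client.datasets.create_or_update(RG, DF, "ds_blob_accounts", DatasetResource(
    properties=DelimitedTextDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="ls_blob"),
        location=AzureBlobStorageLocation(
            container="reports", folder_path="dataverse", file_name="accounts.csv"),
        column_delimiter=",",
        first_row_as_header=True,
    )))
```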
Now, to build your pipeline, you need to:
- Add activities (typically a copy activity) to the pipeline to extract data from Dataverse.
- Transform the data if needed, using data flow activities or custom scripts.
- Load the transformed data into the storage or database (a minimal copy-pipeline sketch follows this list).
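As a sketch of the pipeline itself (same placeholder names and the same caveat about verifying model names against the SDK reference), a single copy activity that reads the Dataverse dataset and writes the CSV dataset could look like this; a mapping data flow or other activities would sit alongside it if you need transformations:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference,
    CommonDataServiceForAppsSource, DelimitedTextSink,
)

SUB, RG, DF = "<subscription-id>", "<resource-group>", "<data-factory-name>"
client = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Copy activity: Dataverse "account" table -> CSV in Blob Storage,
# using the datasets created earlier.
copy = CopyActivity(
    name="CopyAccountsToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="ds_dataverse_account")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="ds_blob_accounts")],
    source=CommonDataServiceForAppsSource(),
    sink=DelimitedTextSink(),
)

client.pipelines.create_or_update(RG, DF, "pl_export_dataverse",
                                  PipelineResource(activities=[copy]))

# Optional: kick off one on-demand run to validate the setup before automating it.
run = client.pipelines.create_run(RG, DF, "pl_export_dataverse", parameters={})
print("Started pipeline run:", run.run_id)
```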
For the automation part, you can:
- Schedule the pipeline to run at the desired interval (e.g., daily or weekly).
- Use triggers in Azure Data Factory (schedule, tumbling window, or event triggers) to automate the execution of the pipeline (see the schedule-trigger sketch after this list).
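And a hedged sketch of the trigger part (same placeholders as above): a daily schedule trigger attached to the export pipeline. Note that triggers are created in a stopped state and must be started explicitly, and that older SDK versions expose triggers.start instead of triggers.begin_start.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    TriggerResource, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, PipelineReference,
)

SUB, RG, DF = "<subscription-id>", "<resource-group>", "<data-factory-name>"
client = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Run the export pipeline once a day, starting shortly after the trigger exists.
trigger = TriggerResource(properties=ScheduleTrigger(
    recurrence=ScheduleTriggerRecurrence(
        frequency="Day",
        interval=1,
        start_time=datetime.now(timezone.utc) + timedelta(minutes=15),
        time_zone="UTC",
    ),
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(
            type="PipelineReference", reference_name="pl_export_dataverse"),
        parameters={},
    )],
))
client.triggers.create_or_update(RG, DF, "tr_daily_export", trigger)

# Triggers start out stopped; start it so the schedule actually takes effect.
client.triggers.begin_start(RG, DF, "tr_daily_export").result()
```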
To create a report:
- Use a reporting tool (for example, Power BI) to report on the data stored in the storage or database.
- Connect the reporting tool to that storage or database (a quick data-check sketch follows this list).
- Design the report layout and add the necessary visualizations.
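Before connecting the reporting tool, it can help to sanity-check the exported file straight from the storage account. A small sketch, assuming the container and blob names used above and the azure-storage-blob and pandas packages:

```python
import io

import pandas as pd
from azure.storage.blob import BlobClient

# Placeholder connection string for the destination storage account.
CONN = "DefaultEndpointsProtocol=https;AccountName=<acct>;AccountKey=<key>"

blob = BlobClient.from_connection_string(
    CONN, container_name="reports", blob_name="dataverse/accounts.csv"
)

# Download the CSV the pipeline produced and peek at it, to confirm row counts
# and columns look right before building report visuals on top of it.
df = pd.read_csv(io.BytesIO(blob.download_blob().readall()))
print(df.shape)
print(df.head())
```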
Links to help you:
https://learn.microsoft.com/en-us/power-apps/maker/data-platform/export-to-data-lake-data-adf
https://community.dynamics.com/blogs/post/?postid=4a59cba1-3887-4c7c-9a1c-6b89e9c071da