If the out-of-the-box solutions do not meet your needs, consider using Azure Log Analytics custom logs:
- Modifying your Spark notebooks to write output details to a log file, or directly to Log Analytics via the Log Analytics Data Collector API (see the sketch after this list).
- Using Azure Functions or Automation Runbooks to periodically read these outputs and send them to Log Analytics.
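To illustrate the first option, here is a minimal sketch of posting a custom log record from a notebook to the Data Collector API's HTTP endpoint. The workspace ID, shared key, log type, and the example record are placeholders you would replace with your own values:

```python
import base64
import datetime
import hashlib
import hmac
import json

import requests

# Placeholders: replace with your workspace ID and primary key (ideally from Key Vault).
WORKSPACE_ID = "<log-analytics-workspace-id>"
SHARED_KEY = "<log-analytics-primary-key>"
LOG_TYPE = "SparkNotebookLogs"  # records land in the custom table SparkNotebookLogs_CL

def _build_signature(date, content_length):
    # HMAC-SHA256 SharedKey signature as required by the Data Collector API.
    string_to_sign = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    decoded_key = base64.b64decode(SHARED_KEY)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode('utf-8')}"

def post_to_log_analytics(records):
    # POST a batch of JSON records to the workspace's custom log endpoint.
    body = json.dumps(records).encode("utf-8")
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": _build_signature(rfc1123_date, len(body)),
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    url = f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(url, data=body, headers=headers).raise_for_status()

# Example: emit one status record at the end of the notebook.
post_to_log_analytics([{"notebook": "transform_sales", "status": "Succeeded", "rowCount": 120000}])
```

After ingestion (which can take a few minutes), the records are queryable in the workspace under the custom table named after your log type, with a `_CL` suffix.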
For real-time processing and logging, you can use Azure Event Grid to subscribe to events from Azure Data Factory or Synapse Analytics, or simply trigger an Azure Function on activity completion to capture the output and log it to Azure Log Analytics.
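As a sketch of the event-driven path, assuming you have events flowing to an Event Grid subscription whose endpoint is the function, an Event Grid-triggered Azure Function could reuse the `post_to_log_analytics` helper above. The module path and the payload field names are illustrative; align them with the event schema you actually receive:

```python
import logging

import azure.functions as func

# Reuses the post_to_log_analytics helper sketched earlier (packaged with the app);
# the module path is hypothetical.
from shared.log_analytics import post_to_log_analytics

def main(event: func.EventGridEvent):
    # The eventGridTrigger binding in function.json routes subscribed events here.
    payload = event.get_json()
    logging.info("Event %s (%s)", event.id, event.event_type)

    # Field names below are illustrative; match them to your actual event payload.
    post_to_log_analytics([{
        "pipelineRunId": payload.get("runId"),
        "activityName": payload.get("activityName"),
        "status": payload.get("status"),
    }])
```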
For your specific case of capturing outputs from Spark notebook activities:
- Modify your Spark notebooks to include logging statements that write directly to Azure Log Analytics through the Data Collector API (as in the first sketch above).
- Alternatively, write the outputs, with detailed logging information, to intermediate storage (such as Azure Blob Storage), and then use a scheduled process (Azure Functions or Logic Apps) to ingest these logs into Log Analytics (see the sketch after this list).
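For the intermediate-storage variant, one possible shape is a timer-triggered Azure Function that sweeps a blob container for pending log files and forwards them. The `notebook-logs` container, `pending/` prefix, connection string, and helper module path are all assumptions for this sketch:

```python
import json
import logging

import azure.functions as func
from azure.storage.blob import ContainerClient

# Hypothetical module path for the post_to_log_analytics helper sketched earlier.
from shared.log_analytics import post_to_log_analytics

# Placeholder: in practice, read the connection string from app settings.
CONN_STR = "<storage-connection-string>"

def main(timer: func.TimerRequest):
    # Timer-triggered sweep (schedule defined in function.json).
    container = ContainerClient.from_connection_string(CONN_STR, "notebook-logs")
    for blob in container.list_blobs(name_starts_with="pending/"):
        records = json.loads(container.download_blob(blob.name).readall())
        post_to_log_analytics(records)
        container.delete_blob(blob.name)  # remove the blob so it is not re-ingested
        logging.info("Ingested %s", blob.name)
```

Deleting (or moving) each blob after a successful post keeps the sweep idempotent across runs; if you need an audit trail, copy processed blobs to an archive prefix instead of deleting them.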