What about using Azure Data Factory (ADF), Event Grid, Logic Apps, or a Service Bus queue to simplify triggering a Python Databricks notebook when a file arrives in any of the specified folders? Here's an approach that combines ADF, Event Grid, and Logic Apps:
Set up an Event Grid subscription: Create an Event Grid subscription on the storage account that holds the parent folder, listening for Microsoft.Storage.BlobCreated events. A subject prefix filter limits the subscription to blobs created under that folder.
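A minimal sketch of creating that subscription programmatically, assuming the azure-identity and azure-mgmt-eventgrid packages and placeholder resource names (the Logic Apps Event Grid trigger in the next step can also create the subscription for you from the designer):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import (
    EventSubscription,
    EventSubscriptionFilter,
    WebHookEventSubscriptionDestination,
)

subscription_id = "<azure-subscription-id>"  # placeholder
storage_account_id = (                       # placeholder resource ID of the storage account
    "/subscriptions/<azure-subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)

client = EventGridManagementClient(DefaultAzureCredential(), subscription_id)

# Fire only for new blobs under the parent folder (subject prefix filter).
subscription = EventSubscription(
    destination=WebHookEventSubscriptionDestination(
        endpoint_url="<logic-app-or-webhook-endpoint>"  # placeholder
    ),
    filter=EventSubscriptionFilter(
        included_event_types=["Microsoft.Storage.BlobCreated"],
        subject_begins_with="/blobServices/default/containers/<container>/blobs/parent-folder/",
    ),
)

client.event_subscriptions.begin_create_or_update(
    storage_account_id, "file-arrival-subscription", subscription
).result()
```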
Create an Event Grid trigger in Logic Apps: Create a Logic App with an Event Grid trigger and configure it to listen for the events from the subscription created in the previous step.
Add a condition in Logic Apps: In the Logic App workflow, add a condition step that checks whether the file arrival event occurred in one of the specified folders. The event's subject field contains the blob path, so an expression can compare it against the folders you care about.
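The check itself lives in the Logic App condition, but the logic is simple; here is the equivalent in Python for illustration, with hypothetical folder names:

```python
# Hypothetical watched folders; the real list comes from your requirements.
WATCHED_FOLDERS = ("incoming/vendor-a/", "incoming/vendor-b/")

def is_watched(subject: str) -> bool:
    """Return True if the Event Grid subject points at one of the watched folders.

    Blob events use subjects shaped like
    '/blobServices/default/containers/<container>/blobs/<path>'.
    """
    blob_path = subject.split("/blobs/", 1)[-1]
    return blob_path.startswith(WATCHED_FOLDERS)

print(is_watched("/blobServices/default/containers/data/blobs/incoming/vendor-a/file1.csv"))  # True
print(is_watched("/blobServices/default/containers/data/blobs/archive/file2.csv"))            # False
```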
Trigger an ADF pipeline: If the condition from the previous step evaluates to true, use the Azure Data Factory "Create a pipeline run" action in Logic Apps to trigger an ADF pipeline, passing the necessary parameters, such as the file path, to the pipeline.
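For reference, here is a minimal sketch of starting the pipeline programmatically, assuming the azure-identity and azure-mgmt-datafactory packages, placeholder resource names, and a hypothetical pipeline parameter called filePath; the Logic Apps action performs the equivalent call for you:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "<azure-subscription-id>"  # placeholder
resource_group = "<resource-group>"          # placeholder
factory_name = "<data-factory-name>"         # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# "filePath" is a hypothetical parameter and must be declared on the pipeline.
run = client.pipelines.create_run(
    resource_group,
    factory_name,
    "process-arriving-file",                 # placeholder pipeline name
    parameters={"filePath": "incoming/vendor-a/file1.csv"},
)
print(f"Started pipeline run {run.run_id}")
```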
Execute the Python Databricks notebook: In the triggered ADF pipeline, add a Databricks Notebook activity to run the notebook on your Databricks workspace, passing the required values as base parameters.
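Inside the notebook, base parameters arrive as widgets. A short sketch, assuming a hypothetical parameter name file_path that matches the base parameter configured on the Databricks Notebook activity (dbutils and spark are provided by the Databricks runtime):

```python
# Declare the widget with an empty default, then read the value passed by ADF.
dbutils.widgets.text("file_path", "")
file_path = dbutils.widgets.get("file_path")

# Example processing: read the newly arrived file and inspect it.
df = spark.read.option("header", "true").csv(file_path)
df.show()
```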
P.S. The above solution assumes you have connectivity between your on-premises environment and Azure; set up an appropriate option, such as Azure ExpressRoute or an Azure Virtual Network Gateway, to establish a secure connection.