Azure Databricks provides a rich set of REST APIs that you can use for this purpose. As I understand from your question, you want to run a Databricks job that you wrote as a SQL notebook.
- You first need to authenticate to Databricks. There are multiple ways to do this: you can use either a PAT (personal access token) or Azure Active Directory (Azure AD) authentication.
- Once authenticated, you can call the Jobs API and invoke either `Run now` (for an existing job) or `Runs submit` (for a one-time run), depending on your scenario, to trigger the job; see the sketch after this list.
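As a minimal sketch of the `Run now` call, here is a Python example that triggers an existing job by ID. The workspace URL, token, and job ID below are placeholders you would replace with your own values; the same `Authorization: Bearer` header works whether the token is a PAT or an Azure AD token.

```python
import requests

# Placeholder values -- substitute your workspace URL, token, and job ID.
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
DATABRICKS_TOKEN = "<personal-access-token-or-aad-token>"
JOB_ID = 123

def trigger_job(host: str, token: str, job_id: int) -> int:
    """Trigger an existing Databricks job via the Jobs API 'Run now'
    endpoint and return the run_id of the new run."""
    response = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},  # PAT or AAD token
        json={"job_id": job_id},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["run_id"]

if __name__ == "__main__":
    run_id = trigger_job(DATABRICKS_HOST, DATABRICKS_TOKEN, JOB_ID)
    print(f"Triggered run {run_id}")
```

The returned `run_id` is what you would use afterwards with the `Runs get` endpoint (`/api/2.1/jobs/runs/get`) if you need to poll for the run's status.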
From the Azure Function perspective, you will need to maintain the configuration required for authentication and your Databricks API endpoint (secrets should be stored in Key Vault and surfaced to the Function, for example via Key Vault references in app settings). The code involved depends on the language you are using in your Function app.
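To illustrate how this fits together, here is a hedged sketch of an HTTP-triggered Azure Function using the Python v2 programming model. The app setting names (`DATABRICKS_HOST`, `DATABRICKS_TOKEN`, `DATABRICKS_JOB_ID`) are my own assumptions, not a convention; with Key Vault references configured, the secret values still arrive through these environment variables.

```python
import os
import logging

import requests
import azure.functions as func

app = func.FunctionApp()

@app.route(route="run-job", auth_level=func.AuthLevel.FUNCTION)
def run_job(req: func.HttpRequest) -> func.HttpResponse:
    # App settings; in production, back the token with a Key Vault reference.
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    job_id = int(os.environ["DATABRICKS_JOB_ID"])

    # Same 'Run now' call as above, now made from inside the Function.
    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        json={"job_id": job_id},
        timeout=30,
    )
    resp.raise_for_status()
    run_id = resp.json()["run_id"]

    logging.info("Triggered Databricks run %s", run_id)
    # 202 Accepted: the job was triggered, but the run completes asynchronously.
    return func.HttpResponse(f"Triggered run {run_id}", status_code=202)
```

Returning `202 Accepted` rather than waiting for the job keeps the Function fast; if you need the job's result, poll the run status in a separate call or use a Durable Function pattern instead.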