@Rohit Dobbanaboina - Thanks for the question and using MS Q&A platform.
As mentioned in the official documentation: Orchestrate Azure Databricks jobs with Apache Airflow
You can access the Airflow UI locally at [http://localhost:8080/]. The login credentials are printed to the command line when you install and start the Airflow Azure Databricks integration (the username is normally admin).
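If you followed the quickstart's local setup, starting Airflow and finding those credentials looks roughly like this (a sketch, assuming a local Airflow 2.x installation in a Python virtual environment):

```shell
# Start all Airflow components (webserver, scheduler, etc.) locally.
# On first run this creates an admin user and prints its password to the console.
airflow standalone

# The generated admin password is also written to a file under the Airflow home:
cat ~/airflow/standalone_admin_password.txt
```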
You can configure an Azure Databricks connection using a PAT token as mentioned here: Create an Azure Databricks personal access token for Airflow.
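Alternatively, the same connection can be managed from the Airflow CLI instead of the UI (a sketch; the host and token values are placeholders you must replace with your own):

```shell
# Remove the default Databricks connection, then recreate it with your
# workspace details. <workspace-instance-name> and PERSONAL_ACCESS_TOKEN
# are placeholders for your own values.
airflow connections delete databricks_default
airflow connections add databricks_default \
    --conn-type databricks \
    --conn-host "https://<workspace-instance-name>" \
    --conn-extra '{"token": "PERSONAL_ACCESS_TOKEN"}'
```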
Your Airflow installation contains a default connection for Azure Databricks. To update the connection to connect to your workspace using the personal access token you created above:
- In a browser window, open [http://localhost:8080/connection/list/].
- Under Conn ID, locate databricks_default and click the Edit record button.
- Replace the value in the Host field with the workspace instance name of your Azure Databricks deployment.
- In the Extra field, enter the following value:
{"token": "PERSONAL_ACCESS_TOKEN"}
Replace PERSONAL_ACCESS_TOKEN with your Azure Databricks personal access token.
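Once the connection is configured, a minimal DAG can trigger an existing Databricks job through it. Below is a sketch, assuming the apache-airflow-providers-databricks package is installed; the DAG name is hypothetical, and JOB_ID is a placeholder for a job ID from your workspace:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
)

with DAG(
    dag_id="databricks_job_example",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule=None,  # trigger manually from the Airflow UI
    catchup=False,
) as dag:
    # Uses the databricks_default connection configured in the steps above.
    run_job = DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",
        job_id="JOB_ID",  # replace with your Azure Databricks job ID
    )
```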
Hope this helps. If this answers your query, do click Accept Answer
and Yes
for "Was this answer helpful". And, if you have any further queries, do let us know.