Expose Spark metrics to Prometheus
I want to expose Spark cluster metrics from Azure Databricks to Prometheus using the PrometheusServlet sink, so I tried editing the metrics.properties file to something like this
and copying it into place with this init script:
cp /dbfs/databricks/metrics-conf/metrics.properties /databricks/spark/conf/
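For reference, a metrics.properties that enables Spark 3's built-in PrometheusServlet sink typically looks like this (the class name and paths below are the ones documented in the Spark monitoring guide, not copied verbatim from my cluster):

```properties
# Enable the built-in PrometheusServlet sink for all metric instances
*.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
*.sink.prometheusServlet.path=/metrics/prometheus
master.sink.prometheusServlet.path=/metrics/master/prometheus
applications.sink.prometheusServlet.path=/metrics/applications/prometheus
```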
The file is edited successfully, but I still cannot access the Prometheus endpoint. When I checked the Spark logs, I didn't see any PrometheusServlet being created.
Hello and welcome to Microsoft Q&A @Long Tran.
I haven't worked with Prometheus before, but to my understanding it doesn't come with Azure Databricks by default. If nothing is installed or configured to serve it, then the endpoint can't be created.
To sink metrics to Prometheus, you can use this third-party library: https://github.com/banzaicloud/spark-metrics.
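If you go that route, the banzaicloud library is also configured through metrics.properties, but it pushes metrics to a Prometheus Pushgateway instead of exposing a servlet. A sketch based on that project's README (the Pushgateway address is a placeholder you would need to fill in):

```properties
# Push metrics to a Prometheus Pushgateway via the banzaicloud sink
*.sink.prometheus.class=org.apache.spark.banzaicloud.metrics.sink.PrometheusSink
*.sink.prometheus.pushgateway-address-protocol=http
*.sink.prometheus.pushgateway-address=<pushgateway-host>:9091
```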
Our Prometheus is already set up on an Azure VM. I just need to expose Spark metrics in Prometheus format at a REST endpoint, so our Prometheus can call that endpoint and collect the metrics. As this link shows, https://dzlab.github.io/bigdata/2020/07/03/spark3-monitoring-1/, Spark 3 supports exposing metrics at a REST endpoint in Prometheus format, but I cannot get that to work with Spark inside Azure Databricks.
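For context, once such an endpoint is reachable, the Prometheus side only needs an ordinary scrape job pointed at it. A minimal sketch (the target host is a placeholder, and the port assumes the driver UI's default of 4040):

```yaml
scrape_configs:
  - job_name: 'spark-databricks'
    metrics_path: '/metrics/prometheus'
    static_configs:
      - targets: ['<driver-host>:4040']
```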