Expose Spark metrics to Prometheus

I want to expose Spark cluster metrics from Azure Databricks to Prometheus using the PrometheusServlet sink. So I tried editing the metrics.properties file to something like this:
*.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
*.sink.prometheusServlet.path=/metrics/prometheus
master.sink.prometheusServlet.path=/metrics/master/prometheus
applications.sink.prometheusServlet.path=/metrics/applications/prometheus
using this init script:
#!/bin/bash
cp /dbfs/databricks/metrics-conf/metrics.properties /databricks/spark/conf/
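As a sketch of an alternative, the init script could write metrics.properties directly instead of copying it from DBFS, which removes one moving part when debugging. The target directory /databricks/spark/conf comes from the cp command above; the SPARK_CONF_DIR override is an assumption added for testing outside Databricks:

```shell
#!/bin/bash
# Hypothetical init script sketch: write metrics.properties in place.
# On Databricks the Spark conf dir is /databricks/spark/conf (as used above);
# SPARK_CONF_DIR lets you redirect the write when trying this elsewhere.
CONF_DIR="${SPARK_CONF_DIR:-/databricks/spark/conf}"
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/metrics.properties" <<'EOF'
*.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
*.sink.prometheusServlet.path=/metrics/prometheus
master.sink.prometheusServlet.path=/metrics/master/prometheus
applications.sink.prometheusServlet.path=/metrics/applications/prometheus
EOF
```

After the cluster restarts, the driver should serve the metrics path on its Spark UI port if the sink was picked up.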
The file is edited successfully, but I still cannot access the Prometheus endpoint. When I check the Spark logs, I don't see that any Prometheus servlet was created.
Hi MartinJaffer,
Our Prometheus is already set up on an Azure VM. I just need to expose Spark metrics in Prometheus format at some REST endpoint, so our Prometheus can call that endpoint and collect the metrics. As shown in this link https://dzlab.github.io/bigdata/2020/07/03/spark3-monitoring-1/, Spark 3 supports exposing metrics at a REST endpoint in Prometheus format, but I cannot get that working with Spark inside Azure Databricks.
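For the Prometheus side, a minimal scrape-config fragment for this setup might look like the following. This is a sketch: the target host is a placeholder, and port 4040 is the default Spark UI port; on Databricks the driver UI is usually proxied, so reaching it directly may need additional network configuration:

```yaml
scrape_configs:
  - job_name: 'spark-driver'          # hypothetical job name
    metrics_path: '/metrics/prometheus'  # path set in metrics.properties above
    static_configs:
      - targets: ['<driver-host>:4040']  # placeholder host; 4040 = default Spark UI port
```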