Hello Todd Lazure,
Your understanding is correct: Azure Data Factory Mapping Data Flows are executed on managed Spark clusters (comparable to Databricks clusters) behind the scenes. However, Mapping Data Flows are authored in a dedicated data flow script language that enables programmatic development of these flows, and this language is different from the Scala you would write in Databricks.
While you can see some of the generated Spark code when debugging Mapping Data Flows in Azure Data Factory, the full generated code is not exposed, so there is no supported way to convert Mapping Data Flow code directly into Scala for Azure Databricks.
To transition your Mapping Data Flows to Azure Databricks, you will need to manually re-implement their logic using the Spark APIs available in Databricks (for example, DataFrame transformations in Scala or Python). Depending on the complexity of your Mapping Data Flows, this may require significant rework of your existing pipelines.
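To give a rough idea of what that re-implementation looks like, here is a sketch of a simple hypothetical data flow (a source, a filter, and a derived column) rewritten as Spark Scala DataFrame code. The stream names (`source1`, `filter1`, `derive1`), the column names, and the Delta paths are all placeholders for illustration, not taken from your actual flow, and the snippet assumes a running Spark environment such as a Databricks cluster:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder.appName("converted-flow").getOrCreate()

// source1: read the dataset the data flow's source transformation pointed at
val source1 = spark.read.format("delta").load("/mnt/data/source") // placeholder path

// filter1: equivalent of a filter transformation, e.g. filter(amount > 100)
val filter1 = source1.filter(col("amount") > 100)

// derive1: equivalent of a derived column, e.g. derive(tax = amount * 0.08)
val derive1 = filter1.withColumn("tax", col("amount") * 0.08)

// sink: write the result where the data flow's sink wrote it
derive1.write.mode("overwrite").format("delta").save("/mnt/data/sink") // placeholder path
```

Each transformation stream in the data flow script generally maps to one DataFrame variable like this, which makes the conversion mechanical for simple flows; the effort grows with features such as schema drift handling, joins, and window transformations.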
I hope this answers your question.