How to run .NET Spark jobs on Databricks from Azure Data Factory?

nikhil.sharma3 1 Reputation point
2020-08-05T06:43:23.207+00:00

In Azure Data Factory, there is a Databricks activity. This activity supports running Python scripts, JARs, and notebooks, and those notebooks can be written in Scala, Python, SQL, or R, but not C#/.NET.

Is there inherent or direct support for writing my .NET for Apache Spark code and running it on Databricks from Data Factory?

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.
Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

2 answers

  1. Vaibhav Chaudhari 38,746 Reputation points
    2020-08-05T07:15:41.147+00:00

    I'm not sure whether this is possible with Azure Databricks and Data Factory.

    However, this might be possible in an Azure Synapse notebook, where you can write .NET for Apache Spark code and schedule that notebook in a Synapse pipeline.

    See this video: Scaling-NET-for-Apache-Spark-processing-jobs-with-Azure-Synapse
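
    For illustration, a .NET for Apache Spark cell in a Synapse notebook could look roughly like the sketch below. It is a minimal word-count example using the Microsoft.Spark API; the app name and input path are placeholders, not values from the question.

        using Microsoft.Spark.Sql;
        using static Microsoft.Spark.Sql.Functions;

        // In a Synapse .NET notebook a Spark session is already available;
        // GetOrCreate() simply returns it (or creates one in a standalone app).
        SparkSession spark = SparkSession
            .Builder()
            .AppName("dotnet-word-count")   // placeholder app name
            .GetOrCreate();

        // Read a text file (placeholder path) and count word occurrences.
        DataFrame lines = spark.Read().Text("/tmp/input.txt");
        DataFrame words = lines.Select(Explode(Split(Col("value"), " ")).Alias("word"));
        words.GroupBy("word").Count().Show();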

    ===============================================
    If the response helped, please "Accept Answer" and upvote it -- Vaibhav


  2. HimanshuSinha-msft 19,476 Reputation points Microsoft Employee
    2020-08-10T15:53:06.977+00:00

    Hello,

    Thanks for asking the question and using the forum.

    We have something in preview at this time; we don't have a timeline for when it will GA:
    https://learn.microsoft.com/en-us/dotnet/spark/tutorials/databricks-deployment
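
    The approach in that tutorial publishes the .NET app to DBFS and launches it through the DotnetRunner class shipped in the Microsoft.Spark JAR, so from Data Factory it could in principle be driven by the Databricks Jar activity. The sketch below is only an illustration under that assumption; the JAR version, DBFS paths, app zip name, app name, and linked service name are placeholders that depend on how you package and deploy the app (see the tutorial for the exact steps).

        {
            "name": "RunDotnetSparkOnDatabricks",
            "type": "DatabricksSparkJar",
            "typeProperties": {
                "mainClassName": "org.apache.spark.deploy.dotnet.DotnetRunner",
                "parameters": [
                    "/dbfs/spark-dotnet/publish.zip",
                    "MySparkApp"
                ],
                "libraries": [
                    { "jar": "dbfs:/spark-dotnet/microsoft-spark-2-4_2.11-1.0.0.jar" }
                ]
            },
            "linkedServiceName": {
                "referenceName": "AzureDatabricksLinkedService",
                "type": "LinkedServiceReference"
            }
        }

    Note that the Databricks cluster also needs Microsoft.Spark.Worker installed (the tutorial does this with an init script) before such a job will run.
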
    Thanks
    Himanshu

