Can I specify command line arguments when deploying .NET for Apache Spark with Spark Submit in Azure Databricks?

大介 西野 21 Reputation points

I've modified the "Get started in 10 minutes" tutorial so that the input file name can be specified as a command-line argument:

spark-submit ^
--class org.apache.spark.deploy.dotnet.DotnetRunner ^
--master local ^
microsoft-spark-2-4_2.11-1.0.0.jar ^
dotnet MySparkApp.dll input.txt

How do I get this to work with Azure Databricks?



Accepted answer
  PRADEEPCHEEKATLA-MSFT 76,361 Reputation points Microsoft Employee

    Hello @大介 西野 ,

    Thanks for the question and using the MS Q&A platform.

    You can run your .NET for Apache Spark jobs on Databricks clusters, but this is not available out of the box. There are two ways to deploy a .NET for Apache Spark job to Databricks: spark-submit and Set Jar.
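
    With the spark-submit option, the arguments you pass locally go instead into the `spark_submit_task.parameters` array of a Databricks job definition. A minimal sketch of the job JSON, assuming the Microsoft.Spark jar and your published app have already been uploaded to DBFS (the `/dbfs/spark-dotnet/` paths, `publish.zip`, the `MySparkApp` executable name, and the cluster settings are placeholder assumptions, not values confirmed in this thread):

    ```json
    {
      "name": "dotnet-spark-submit-job",
      "new_cluster": {
        "spark_version": "6.4.x-scala2.11",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 1
      },
      "spark_submit_task": {
        "parameters": [
          "--class", "org.apache.spark.deploy.dotnet.DotnetRunner",
          "/dbfs/spark-dotnet/microsoft-spark-2-4_2.11-1.0.0.jar",
          "/dbfs/spark-dotnet/publish.zip",
          "MySparkApp",
          "input.txt"
        ]
      }
    }
    ```

    Note that `input.txt` here appears after the app name, mirroring how extra arguments trail `dotnet MySparkApp.dll` in the local command; check the linked Microsoft guide for the exact parameter ordering DotnetRunner expects on Databricks.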

    For more details, refer to Submit a .NET for Apache Spark job to Databricks.
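
    For the Set Jar option, the same DotnetRunner class becomes the job's main class and the Microsoft.Spark jar is attached as a library, with your arguments again in `parameters`. A hedged sketch, with the same placeholder DBFS paths and app name as above:

    ```json
    {
      "name": "dotnet-spark-set-jar-job",
      "new_cluster": {
        "spark_version": "6.4.x-scala2.11",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 1
      },
      "libraries": [
        { "jar": "dbfs:/spark-dotnet/microsoft-spark-2-4_2.11-1.0.0.jar" }
      ],
      "spark_jar_task": {
        "main_class_name": "org.apache.spark.deploy.dotnet.DotnetRunner",
        "parameters": [
          "/dbfs/spark-dotnet/publish.zip",
          "MySparkApp",
          "input.txt"
        ]
      }
    }
    ```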

    Hope this helps. Do let us know if you have any further queries.

