Call notebook from Spark job

Ryan Abbey 1,186 Reputation points
2021-08-09T04:29:51.837+00:00

Something I didn't think would be particularly complex, but there are no details on how to do this in the MS help files.

Tag: apache-spark-job-definitions

I've created a Spark notebook that I want to call from a Spark job definition — how do I do that? Or am I overthinking it, and should I follow exactly how it's done for Databricks?

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

1 answer

  1. HimanshuSinha-msft 19,486 Reputation points Microsoft Employee Moderator
    2021-08-10T03:14:12.653+00:00

    Hello @Ryan Abbey ,
    Thanks for the question, and for using the Microsoft Q&A platform.
    You can do this from Synapse Studio; the screenshot below should help.

    [Screenshot: 121845-image.png — Synapse Studio]
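    If you prefer to invoke the notebook from code rather than wire it up in the Studio UI, Synapse Spark pools expose `mssparkutils.notebook.run`, which a Spark job definition's main Python file can call. The sketch below assumes a workspace notebook named "MyNotebook"; that name, the timeout, and the parameter map are hypothetical placeholders, and the script only runs on a Synapse Spark pool where `notebookutils` is available.

    ```python
    # Sketch of a main definition file for a Spark job definition that
    # invokes an existing Synapse notebook from code. "MyNotebook" and
    # the parameters below are hypothetical placeholders.
    from notebookutils import mssparkutils  # available on Synapse Spark pools

    # Run the notebook with a 300-second timeout and an example parameter map.
    # The call returns whatever the notebook passes to mssparkutils.notebook.exit(...).
    exit_value = mssparkutils.notebook.run(
        "MyNotebook",
        300,
        {"input_date": "2021-08-09"},
    )
    print(exit_value)
    ```

    Note that this runs the notebook synchronously inside the same Spark session as the job, so failures in the notebook surface as exceptions in the calling script.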

    Please do let me know how it goes.
    Thanks,
    Himanshu
    Please do consider clicking on "Accept Answer" and "Up-vote" on the post that helps you, as it can be beneficial to other community members.
