Adding external Spark packages/JARs to a Spark pool during deployment

2022-11-11T01:09:59.557+00:00

By following the steps here: https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-manage-pool-packages, I am able to upload an external JAR as a Spark pool package.

May I know the right way to add such packages to a Spark pool during deployment using Azure DevOps (ADO)?
We currently use ARM templates to deploy service artifacts to different environments.


1 answer
  1. KranthiPakala-MSFT 46,642 Reputation points Microsoft Employee Moderator
    2022-11-23T02:53:10.863+00:00

    Hello,

    Are you looking for a non-UI way of adding packages to your Spark pool? If so, the Azure CLI (az) and REST APIs can be used in CI/CD.
    You can find the documentation in this section: https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-azure-portal-add-libraries#manage-your-packages-outside-synapse-analytics-ui
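
    As a concrete starting point, here is a minimal sketch of how the CLI approach could run inside an Azure DevOps pipeline, using the `az synapse workspace-package upload` and `az synapse spark pool update` commands from the documentation above. The service connection name, pipeline variables, and JAR file name are hypothetical placeholders, not values from your environment:

    ```yaml
    # Minimal sketch: upload a JAR as a workspace package, then attach it to a
    # Spark pool. Service connection, variables, and JAR name are placeholders.
    steps:
    - task: AzureCLI@2
      displayName: Add workspace package to Spark pool
      inputs:
        azureSubscription: 'my-synapse-service-connection'  # hypothetical
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          # Upload the JAR to the workspace as a workspace package
          az synapse workspace-package upload \
            --workspace-name "$(workspaceName)" \
            --package "$(Build.SourcesDirectory)/libs/my-library.jar"

          # Reference the uploaded package by name from the target Spark pool
          az synapse spark pool update \
            --name "$(sparkPoolName)" \
            --workspace-name "$(workspaceName)" \
            --resource-group "$(resourceGroupName)" \
            --package-action Add \
            --package my-library.jar
    ```

    Note that the package file has to be uploaded to the workspace before a pool can reference it by name, which is why this runs as a pipeline step alongside your ARM template deployment rather than inside the template itself.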

    Hope this info helps.

    ----------

    Please do consider clicking on "Accept Answer" and "Upvote" on the post that helps you, as it can be beneficial to other community members.

