How to add external JARs to a Spark pool deployment using ADO

Gabriel-2005 405 Reputation points
2024-03-20T11:11:13.33+00:00

After successfully following the steps outlined in this guide: https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-manage-pool-packages, I've managed to upload an external JAR file as a Spark pool package.

Could you please provide guidance on the correct method to include these packages within the Spark pool during deployment using Azure DevOps (ADO)? Currently, our deployment process relies on ARM templates to deploy service artifacts across various environments.

Azure Synapse Analytics

Accepted answer
    Smaran Thoomu 19,225 Reputation points Microsoft Vendor
    2024-03-20T12:06:40.8633333+00:00

    Hi @Gabriel-2005

    Thank you for reaching out to the community forum with your query.

    If you are looking for non-UI ways of adding packages to your Spark pool, you can use the Azure CLI or REST APIs.
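
    For example, a minimal Azure CLI sketch looks like this (the workspace, pool, resource group, and file names below are placeholders for your own values):

    ```
    # Upload the JAR as a workspace package (skip if it is already uploaded)
    az synapse workspace-package upload \
        --workspace-name myworkspace \
        --package ./libs/my-library.jar

    # Attach the uploaded workspace package to the Spark pool
    az synapse spark pool update \
        --name mysparkpool \
        --workspace-name myworkspace \
        --resource-group my-resource-group \
        --package-action Add \
        --package my-library.jar
    ```

    In an Azure DevOps pipeline, these commands can run from an AzureCLI@2 task, once per target environment.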

    You can find detailed documentation on managing packages outside of the Synapse Analytics UI in the Azure documentation. These methods are useful in CI/CD scenarios where you want to automate the package-management process.
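
    Since your current deployment relies on ARM templates, note that the Spark pool resource (Microsoft.Synapse/workspaces/bigDataPools) also exposes a customLibraries property in which workspace packages can be declared, so the package assignment can travel with the template. A rough sketch is below; the property values are illustrative, and the most reliable way to confirm the exact path and containerName values is to export the template of a pool that already has the package applied through the UI:

    ```
    {
      "type": "Microsoft.Synapse/workspaces/bigDataPools",
      "apiVersion": "2021-06-01",
      "name": "[concat(parameters('workspaceName'), '/', parameters('sparkPoolName'))]",
      "location": "[parameters('location')]",
      "properties": {
        "customLibraries": [
          {
            "name": "my-library.jar",
            "type": "jar",
            "path": "[concat(parameters('workspaceName'), '/libraries/my-library.jar')]",
            "containerName": "prep"
          }
        ]
      }
    }
    ```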

    Hope this helps. Do let us know if you have any further queries.


