Updating Python Packages in a Synapse Workspace (Preview) Spark Pool

Jordan Fox 1 Reputation point
2020-10-22T18:07:47.343+00:00

I've followed the steps outlined here exactly:

https://techcommunity.microsoft.com/t5/azure-synapse-analytics/add-manage-libraries-in-spark-pool-after-the-deployment/bc-p/1809603#M143

Namely: generate a requirements.txt file with py -m pip freeze > requirements.txt, then upload that file to the Spark pool under the Packages section of the Synapse workspace. After restarting the session, listing the locally installed packages should show that the pool picked up the requirements, but so far I haven't been able to get pandas to update.
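For what it's worth, this is roughly the check I run in a Synapse notebook cell after the session restarts to see whether the pool actually picked up the new versions; the extra package names (numpy, pyarrow) are just examples, not part of my original setup:

```python
# Run in a Synapse notebook cell after the Spark session has restarted.
# Confirms which pandas version the session actually picked up.
import pandas as pd
import pkg_resources

print("pandas version:", pd.__version__)

# List a few installed distributions to verify the requirements were applied.
for dist in sorted(pkg_resources.working_set, key=lambda d: d.project_name.lower()):
    if dist.project_name.lower() in {"pandas", "numpy", "pyarrow"}:
        print(dist.project_name, dist.version)
```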

Any thoughts?

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

1 answer

  1. Jordan Fox 1 Reputation point
    2020-10-22T23:50:25.003+00:00

    Turns out my issue was that I had named the file requirements with an s, and it needed to be requirement without an s.
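    For anyone hitting the same thing, the contents of the file itself are standard pip requirements syntax; the pandas pin below is only an illustration, not a version I've verified against a specific Synapse runtime:

    ```
    pandas==1.1.3
    ```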

