Is there a way to install Python / Scala libraries from within notebooks?

Dimitri B 66 Reputation points
2020-06-15T06:18:49.193+00:00

In Databricks it is possible to install Python (but not Scala) libraries from within a notebook, into the notebook's local scope, e.g.:

!pip install 'library-name'

or

dbutils.library.installPyPI("library-name")

Is something similar possible with Synapse?

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

Accepted answer
  1. Euan Garden 136 Reputation points
    2020-06-15T06:21:00.25+00:00

    Not currently, no. Python libraries can be added via a requirements.txt file (the output of a pip freeze) at the Spark pool level. We will add more options in the future.

    -Euan
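
    A minimal sketch of that requirements.txt workflow, for reference; the package names and versions below are illustrative placeholders, not anything Synapse requires:

        # Run locally to capture the packages and versions in your environment
        pip freeze > requirements.txt

        # Example requirements.txt contents (placeholders)
        pandas==1.0.4
        numpy==1.18.5
        requests==2.23.0

    The resulting file is then uploaded in the Apache Spark pool's package settings (via the Azure portal or Synapse Studio), and the listed packages are installed for sessions that start on that pool.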

    1 person found this answer helpful.

0 additional answers
