Import Python functions and variables from other notebooks in Azure Synapse Analytics

Farrukh Tashpulatov 81 Reputation points
2021-10-27T08:21:05.707+00:00

Is there a way to import functions/variables from other PySpark notebooks (within the same workspace) into a PySpark notebook?

I tried the following:

from ipynb.fs.defs.some_other_notebook import some_function

but I get a KeyError: '__package__' error, even though the ipynb package is installed in my workspace.

I also tried

from ipynb.fs.full.some_other_notebook import some_function

and that resulted in the same error.


Accepted answer
  Saurabh Sharma 23,846 Reputation points, Microsoft Employee Moderator
    2021-11-10T00:02:40.447+00:00

    Hi @Farrukh Tashpulatov ,

    As per the internal conversation, one solution you have is to package all the reusable functions into a Python wheel and import it in whichever notebook you need to call them from; this worked great. This might be your solution if your requirement is modularity and reusability.
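
    For illustration, here is a minimal sketch of that wheel approach. The package, module, and function names below are hypothetical placeholders, not part of the original answer:

    # Hypothetical project layout before building the wheel (e.g. with `python -m build`):
    #
    #   my_synapse_utils/
    #   +-- pyproject.toml
    #   +-- my_synapse_utils/
    #       +-- __init__.py
    #       +-- helpers.py

    # my_synapse_utils/helpers.py -- the reusable code that used to live in the other notebook
    def some_function(df, column_name):
        """Placeholder logic: drop rows with nulls in the given column."""
        return df.dropna(subset=[column_name])

    After building the .whl, upload it as a workspace package and attach it to the Spark pool; then any notebook on that pool can import it like any other installed library:

    from my_synapse_utils.helpers import some_function

    cleaned = some_function(df, "customer_id")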

    Thanks
    Saurabh

