@Abhiram Duvvuru - Thanks for the question and using MS Q&A platform.
There are two primary ways to install a library on a Spark pool:
- Install a workspace library that has been uploaded as a workspace package.
- For Python libraries, provide a requirements.txt or Conda environment.yml environment specification to install packages from repositories like PyPI and Conda-Forge. Read the section about environment specifications for further information.
You can easily upload the ODBC .jar file as a workspace package and install it directly on the Apache Spark pool.
Here is an example of how to add a workspace package and install it directly on the Apache Spark pool:
Step 1: Add your .whl file to the workspace packages.
Note: You can add artifacts such as .whl, .jar, or .tar.gz files as workspace packages.
Step 2: Associate the package with the Spark pool as below.
Go to the Apache Spark pool, click More options on your Spark pool, then Packages.
You will get the option to upload in different ways, as below. First, enable the Allow session level packages option, then under Workspace packages select the file you uploaded to the workspace earlier and click Apply settings.
If you don't have any .whl files, just create a requirements.txt with the required packages and upload it under Requirements files. After clicking Apply settings, wait some time for the settings to be applied successfully. Then go to the notebook, stop the current session, and run the notebook in a new Spark session.
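For reference, a requirements.txt is a plain-text file with one package per line, optionally pinned to a version. The package names and versions below are only illustrative; use the packages your notebook actually needs:

```
pyodbc==4.0.39
requests>=2.28
```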
Note: Make sure you run the notebook in a new session after adding packages.
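Once the new session starts, you can verify from a notebook cell that the package is available. This is a minimal sketch using only the Python standard library; the module names checked here are illustrative, so replace them with the package you actually installed:

```python
from importlib.util import find_spec


def is_installed(module_name: str) -> bool:
    """Return True if the module can be imported in the current session."""
    return find_spec(module_name) is not None


# Check a module that should exist and one that should not.
print(is_installed("json"))                # stdlib module, always present
print(is_installed("my_missing_package"))  # illustrative name, not installed
```

If the check returns False for your package, confirm the pool settings were applied and that the notebook is attached to a fresh session.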
Hope this helps.
If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And, if you have any further query, do let us know.