Hello @Nithin Gowrav, welcome to Microsoft Q&A!
To access your utility modules and configuration files in your workspace when using serverless Spark compute or a Synapse Spark pool, you can manage pool-level libraries for Apache Spark. Libraries installed on a Spark pool are available to all notebooks and jobs running on that pool, and can be removed from the pool in the same way. There are two primary ways to install a library on a Spark pool:
1. Install a workspace library that has been uploaded as a workspace package.
2. For Python libraries, provide a requirements.txt or Conda environment.yml environment specification to install packages from repositories such as PyPI and Conda-Forge (see the sketch below).
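As an illustrative sketch of the second approach, you can apply a requirements.txt to a pool with the Azure CLI. The pool, workspace, and resource group names below are placeholders, and the package pins are examples only; adjust them to your environment:

```bash
# Sketch: install pool-level Python packages from PyPI via a requirements.txt.
# mySparkPool, myWorkspace, and myResourceGroup are placeholder names.

# 1. List the packages your notebooks and jobs need (example pins only):
cat > requirements.txt <<'EOF'
pandas==2.0.3
requests==2.31.0
EOF

# 2. Apply the specification to the Spark pool; the packages become
#    available to notebooks and jobs running on that pool:
az synapse spark pool update \
  --name mySparkPool \
  --workspace-name myWorkspace \
  --resource-group myResourceGroup \
  --library-requirements requirements.txt
```

You can also upload the same file through the pool's Packages settings in Synapse Studio; either way, note that pool library updates only take effect for new Spark sessions.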
You can read more about managing Spark pool level libraries for Apache Spark in Azure Synapse Analytics here:
https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-manage-pool-packages
I hope this helps!
Portions of this answer may have been assisted by AI (Microsoft Copilot).
If this helped, kindly mark the answer as Accepted and upvote it!
Regards,