Hi
I'm wondering if it's possible to upgrade/downgrade Python libraries for an Apache Spark application when they're part of the runtime's core set: https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-24-runtime
When I try to change any of these libraries using a requirements.txt, I get the following error, which leads me to believe it may not be possible (which is annoying):
Error code: 1 (LIVY_JOB_STATE_DEAD)
Message: [plugins.ultsynapse.systemreservedpool-librarymanagement.16 CCID:<d3b34775-25fb-46dc-bc88-907c81fb1ab5>] [Monitoring] Livy Endpoint=[https://hubservice.uksouth.azuresynapse.net:8001/api/v1.0/publish/18978b88-bb52-4e21-b7fe-997ef98d9a6f]. Livy Id=[0] Job failed during run time with state=[dead].
Edit: It seems to be working now; I believe I made a typo in the requirements.txt (a single = instead of ==). Sorry, I don't know if this question will still be useful to other people or not.
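In case anyone else hits the same error: pip requirement specifiers use == for an exact version pin, and a single = is not valid specifier syntax, which presumably makes the library-management job fail before it installs anything. A minimal sketch of a valid requirements.txt (the package names and versions below are just illustrative, not the ones from my pool):

    # Pin an exact version with == (a single =, e.g. numpy=1.19.4, is invalid and caused the failure for me)
    numpy==1.19.4
    # Other valid specifiers include >=, <=, and ~= (compatible release)
    pandas>=1.0.0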