Hi @Anonymous,
I received an update internally. Please check the pointers below, which should help with your case.
1. Regarding the update failure: can you check whether a Spark job was triggered and, if so, review the output it generated? This will help you identify the root cause and confirm whether the order of the packages is in fact breaking the update.
2. Updating libraries through the storage account is a legacy feature, and there are no plans to make further changes to it. Is there a reason you are using this method? This process is not supported on the latest Spark 3+ runtimes.
3. geopandas is available in Anaconda, so the best way to install it is through a YML file and let conda resolve all the dependencies (see the sketch below). You can also use the whl file directly, but I believe the YML approach is the most convenient. You can find everything about the latest library management process here: https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-manage-python-packages#install-wheel-files
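For reference, here is a minimal sketch of what such an environment file could look like. The environment name and channel choice are illustrative assumptions, not values from your workspace; adjust them as needed before uploading the file to your Spark pool's package settings.

```yaml
# environment.yml - illustrative sketch only; the name and channel are assumptions.
# Upload this file in the Spark pool's package settings and let conda resolve
# the dependency tree (fiona, shapely, pyproj, etc.) required by geopandas.
name: example-env        # hypothetical name for illustration
channels:
  - conda-forge
dependencies:
  - geopandas            # conda pulls in the geospatial native dependencies
```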
Hope this helps. Thank you.
----------
Please consider hitting Accept Answer. Accepted answers help the community as well.