Hello @Tian, Xinyong,
Welcome to the MS Q&A platform.
You can use the mssparkutils.session.stop() API to stop the current interactive session asynchronously in the background. It stops the Spark session and releases the resources occupied by the session so they become available to other sessions in the same pool.
You can add the mssparkutils.session.stop() command at the end of your notebook code. The Spark session will be stopped regardless of whether the notebook execution succeeds or fails.
Please note: this will stop the entire Spark session associated with the notebook run, which may impact other running jobs or notebooks that are using the same Spark pool.
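A common way to make sure the stop call runs even when an earlier cell raises an error is a try/finally block. This is a minimal sketch of that pattern; stop_session is a stand-in for mssparkutils.session.stop, which only exists inside a Synapse notebook, and notebook_body simulates your notebook logic:

```python
# Sketch: ensure the session-stop call runs whether the notebook body
# succeeds or fails. `stop_session` is a local stub standing in for
# mssparkutils.session.stop (only available inside a Synapse notebook).
session_stopped = False

def stop_session():
    global session_stopped
    session_stopped = True  # in a real notebook: mssparkutils.session.stop()

def notebook_body():
    # Simulated notebook logic that fails part-way through.
    raise RuntimeError("simulated cell failure")

try:
    notebook_body()
except RuntimeError as exc:
    print(f"notebook failed: {exc}")
finally:
    stop_session()  # runs on success and on failure alike

print("session stopped:", session_stopped)
```

In a notebook you would put mssparkutils.session.stop() in the finally branch so the pool resources are released even on a failed run.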
Alternatively, you can use the Spark Session - Cancel Spark Session REST API to cancel a running Spark session:
DELETE {endpoint}/livyApi/versions/{livyApiVersion}/sparkPools/{sparkPoolName}/sessions/{sessionId}
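As a rough sketch of calling this endpoint from Python using only the standard library: the endpoint, pool name, session ID, API version string, and bearer token below are all placeholder assumptions you would replace with your own values (the token must be an Azure AD token authorized for your Synapse workspace).

```python
import urllib.request

def build_cancel_url(endpoint: str, livy_api_version: str,
                     spark_pool_name: str, session_id: int) -> str:
    # Mirrors the DELETE route shown above; all values are placeholders.
    return (f"{endpoint}/livyApi/versions/{livy_api_version}"
            f"/sparkPools/{spark_pool_name}/sessions/{session_id}")

def cancel_spark_session(endpoint: str, livy_api_version: str,
                         spark_pool_name: str, session_id: int,
                         bearer_token: str) -> int:
    # Sends the DELETE request; assumes a valid Azure AD bearer token.
    url = build_cancel_url(endpoint, livy_api_version,
                           spark_pool_name, session_id)
    req = urllib.request.Request(
        url, method="DELETE",
        headers={"Authorization": f"Bearer {bearer_token}"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example URL built from placeholder values (not a real workspace):
print(build_cancel_url("https://myworkspace.dev.azuresynapse.net",
                       "2022-03-01-preview", "mypool", 42))
```

Only build_cancel_url is exercised here; cancel_spark_session performs the actual network call and needs a live workspace and token.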
Reference documents:
https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/microsoft-spark-utilities?pivots=programming-language-python#stop-an-interactive-session
I hope this helps. Please let me know if you have any further questions.
If this answers your question, please consider accepting the answer by hitting "Accept answer" and up-voting, as it helps the community find answers to similar questions.