Hello Janvi Majmundar,
Welcome to the Microsoft Q&A forum.
As per the error message, the vCores available to your Spark session have been exhausted. When you define a Spark pool, you are effectively defining a per-user quota for that pool.
The vCore limit depends on the node size and the number of nodes. To resolve this, reduce your usage of the pool's resources (for example, by stopping idle sessions) before submitting a new resource request by running a notebook or a job.
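To make the quota arithmetic concrete, here is a small sketch that estimates a pool's total vCores and whether a new session request would fit. The per-node vCore counts and the helper functions are illustrative assumptions, not an official API; verify the sizes against your pool's actual configuration.

```python
# Illustrative sketch (not an official API): estimate a Spark pool's total
# vCore quota and check whether a new session request fits within it.
# The vCores-per-node values reflect typical Synapse node sizes; confirm
# them against your own pool's configuration.
VCORES_PER_NODE = {
    "Small": 4,
    "Medium": 8,
    "Large": 16,
    "XLarge": 32,
    "XXLarge": 64,
}

def pool_vcore_quota(node_size: str, node_count: int) -> int:
    """Total vCores available in the pool: vCores per node * node count."""
    return VCORES_PER_NODE[node_size] * node_count

def session_fits(node_size: str, node_count: int,
                 vcores_in_use: int, vcores_requested: int) -> bool:
    """True if a new session requesting vcores_requested can still start."""
    return vcores_in_use + vcores_requested <= pool_vcore_quota(node_size, node_count)

# Example: a Medium pool with 3 nodes has 8 * 3 = 24 vCores. With 20 vCores
# already in use, an 8-vCore session request exceeds the quota and fails,
# which is the situation the error message describes.
print(pool_vcore_quota("Medium", 3))     # 24
print(session_fits("Medium", 3, 20, 8))  # False
```

This is why either freeing up running sessions or scaling the pool resolves the error: both change the left or right side of that comparison.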
Alternatively, scale up the node size and/or the number of nodes.
To scale up: go to the Spark pool and open Scale settings, where you can see and adjust the node size and the number of nodes.
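If you prefer scripting over the portal, the same scale-up can be done with the Azure CLI's `az synapse spark pool update` command. This is a sketch: the workspace, pool, and resource group names are placeholders, and it assumes the Azure CLI is installed and you are signed in with permissions on the workspace.

```shell
# Sketch with placeholder names: scale an existing Synapse Spark pool
# to larger nodes and a higher node count. Requires an authenticated
# Azure CLI session with rights on the target workspace.
az synapse spark pool update \
  --name mysparkpool \
  --workspace-name myworkspace \
  --resource-group myresourcegroup \
  --node-size Large \
  --node-count 6
```

Note that scaling up increases cost, so consider whether stopping idle sessions is enough before resizing the pool.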
A similar issue has been discussed in the threads below.
I hope this answers your question.
If this answers your question, please consider accepting it by clicking Accept answer and up-voting, as that helps other community members find answers to similar questions.