Synapse spark session LIVY_JOB_STATE_DEAD

When I call .toPandas() on a Spark DataFrame, I receive an error that references
org.apache.spark.SparkException: Kryo serialization failed: Buffer overflow. Available: 0, required: 995464. To avoid this, increase spark.kryoserializer.buffer.max value.
Following suggestions I found online, I used the Apache Spark configurations interface to set spark.kryoserializer.buffer.max to 512. However, once this configuration is applied and I start my notebook (or any notebook with such a config), I receive the error
LIVY_JOB_STATE_DEAD: Livy session has failed. Session state: Dead. Error code: LIVY_JOB_STATE_DEAD. [plugins.ws-syn-pdm.pdmspark2.10 WorkspaceType:
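For context, serializer settings like this must be in place before the Spark session starts, so in a Synapse notebook they are typically supplied through the `%%configure` magic in the first cell rather than changed at runtime. A minimal sketch (the 512m value mirrors the configuration described above; note that Spark size properties expect a unit suffix such as `m`, so a bare `512` may not be interpreted as megabytes):

```
%%configure -f
{
    "conf": {
        "spark.kryoserializer.buffer.max": "512m"
    }
}
```

Whether the missing unit suffix is what kills the session here is not certain, but it is worth ruling out before changing pool settings.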
Try reading the executor logs for this error. If it is related to a Vegas cache memory error, try the options below in the Synapse workspace:
1. Set the Spark configuration for your pool: spark.synapse.vegas.useCache = false
2. Modify the pool scale settings and set the intelligent cache slider to 0%.
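If you want to test option 1 per-notebook before changing the pool-level configuration, the same property can be passed through the session config in the first cell. A sketch, assuming the property name from the answer above:

```
%%configure -f
{
    "conf": {
        "spark.synapse.vegas.useCache": "false"
    }
}
```

Disabling the cache this way affects only the session started by that notebook, which makes it a low-risk way to confirm the diagnosis before touching the pool's intelligent cache slider.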