need to set "spark.rpc.message.maxSize"

uuu zhu

In a Synapse Notebook, I tried to run:

    spark.conf.get("spark.rpc.message.maxSize")

and had the following error:

    Py4JJavaError                             Traceback (most recent call last)
    Cell In [9], line 1
    ----> 1 spark.conf.get("spark.rpc.message.maxSize")

    File /opt/spark/python/lib/, in RuntimeConfig.get(self, key, default)
         47 self._checkType(key, "key")
         48 if default is _NoValue:
    ---> 49     return self._jconf.get(key)
         50 else:
         51     if default is not None:

    File ~/cluster-env/env/lib/python3.10/site-packages/py4j/, in, args)
       1315 command = proto.CALL_COMMAND_NAME +\
       1316     self.command_header +\
       1317     args_command +\
       1318     proto.END_COMMAND_PART
       1320 answer = self.gateway_client.send_command(command)
    -> 1321 return_value = get_return_value(
       1322     answer, self.gateway_client, self.target_id,
       1324 for temp_arg in temp_args:
       1325     temp_arg._detach()

    File /opt/spark/python/lib/, in capture_sql_exception.<locals>.deco(*a, **kw)
        188 def deco(*a: Any, **kw: Any) -> Any:
        189     try:
    --> 190         return f(*a, **kw)
        191     except Py4JJavaError as e:
        192         converted = convert_exception(e.java_exception)

    File ~/cluster-env/env/lib/python3.10/site-packages/py4j/, in get_return_value(answer, gateway_client, target_id, name)
        324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
        325 if answer[1] == REFERENCE_TYPE:
    --> 326     raise Py4JJavaError(
        327         "An error occurred while calling {0}{1}{2}.\n".
        328         format(target_id, ".", name), value)
        329 else:
        330     raise Py4JError(
        331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
        332         format(target_id, ".", name, value))

    Py4JJavaError: An error occurred while calling o1433.get.
    : java.util.NoSuchElementException: spark.rpc.message.maxSize
        at org.apache.spark.sql.errors.QueryExecutionErrors$.noSuchElementExceptionError(QueryExecutionErrors.scala:1678)
        at org.apache.spark.sql.internal.SQLConf.$anonfun$getConfString$3(SQLConf.scala:5194)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.internal.SQLConf.getConfString(SQLConf.scala:5194)
        at org.apache.spark.sql.RuntimeConfig.get(RuntimeConfig.scala:72)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(
        at java.lang.reflect.Method.invoke(
        at py4j.reflection.MethodInvoker.invoke(
        at py4j.reflection.ReflectionEngine.invoke(
        at py4j.Gateway.invoke(
        at py4j.commands.AbstractCommand.invokeMethod(
        at py4j.commands.CallCommand.execute(
        at
        at
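For readers hitting the same traceback: `spark.conf.get(key)` with no default raises `java.util.NoSuchElementException` whenever the key has never been set on the session, while `spark.conf.get(key, default)` returns the default instead. A small pure-Python stub (our illustration, not Spark's code) mimicking that default-handling behaviour:

```python
# Stub (ours, not PySpark's) mimicking how RuntimeConfig.get treats a
# missing key: no default -> raise; default supplied -> return it.
_NO_VALUE = object()  # sentinel, in the spirit of pyspark's _NoValue


class StubConf:
    def __init__(self):
        self._values = {}

    def set(self, key, value):
        self._values[key] = value

    def get(self, key, default=_NO_VALUE):
        if key in self._values:
            return self._values[key]
        if default is _NO_VALUE:
            # PySpark surfaces this as a Py4JJavaError wrapping
            # java.util.NoSuchElementException
            raise KeyError(key)
        return default


conf = StubConf()
print(conf.get("spark.rpc.message.maxSize", "128"))  # "128": key unset, default used
conf.set("spark.rpc.message.maxSize", "512")
print(conf.get("spark.rpc.message.maxSize"))         # "512"
```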

Azure Synapse Analytics

Accepted answer
  1. PRADEEPCHEEKATLA-MSFT, Microsoft Employee

    @uuu zhu - Thanks for the question and using MS Q&A platform.

    UPDATE (30/04/2024, 8:30 PM IST):

    When I tried to run this code on Apache Spark 3.3, I got the same error message as shown above:
    [screenshot: the error output]

    As per the error message: AnalysisException: Cannot modify the value of a Spark config: spark.rpc.message.maxSize. See also ''. If you check the documentation, it shows what is expected on Apache Spark versions >3.0:

    [screenshot: the relevant documentation]

    To resolve the issue, you can disable this check by setting spark.sql.legacy.setCommandRejectsSparkCoreConfs to false:

    spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", False)

    [screenshot: the set call succeeding]
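    Putting the two calls together, a minimal sketch (this assumes an active PySpark session bound to `spark` in the Synapse notebook; the 512 MiB value is just an example):

```python
# Sketch, assuming an active Synapse/PySpark session bound to `spark`.
# Step 1: relax the Spark 3 guard that rejects core configs at runtime.
spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", "false")
# Step 2: the set call no longer raises AnalysisException.
spark.conf.set("spark.rpc.message.maxSize", "512")
```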

    The solution works without any issue on Apache Spark 2.4:

    [screenshot: output on Spark 2.4]

    In Synapse Notebook, you can set the spark configuration as shown below:

    spark.conf.set('spark.rpc.message.maxSize','512') # To set the spark configuration  
    spark.conf.get('spark.rpc.message.maxSize') # To get the spark configuration   
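    One caveat worth adding: spark.rpc.message.maxSize is a core RPC setting that Spark reads when the driver and executors start, which is why Spark 3 rejects changing it at runtime in the first place. A more reliable approach in Synapse is to set it at session creation, for example with the notebook's %%configure magic in the first cell before the session starts (sketch; the 512 value is just an example):

```
%%configure -f
{
    "conf": {
        "spark.rpc.message.maxSize": "512"
    }
}
```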


    Hope this helps. Do let us know if you have any further queries.

    If this answers your query, please click "Accept Answer" and "Yes" for "Was this answer helpful". And if you have any further query, do let us know.

