need to set "spark.rpc.message.maxSize"

uuu zhu 20 Reputation points
2024-04-30T11:21:00.2066667+00:00

In Synapse Notebook, I tried to run

spark.conf.get("spark.rpc.message.maxSize")

and got the following error:

Py4JJavaError                             Traceback (most recent call last)
Cell In [9], line 1
----> 1 spark.conf.get("spark.rpc.message.maxSize")

File /opt/spark/python/lib/pyspark.zip/pyspark/sql/conf.py:49, in RuntimeConfig.get(self, key, default)
     47 self._checkType(key, "key")
     48 if default is _NoValue:
---> 49     return self._jconf.get(key)
     50 else:
     51     if default is not None:

File ~/cluster-env/env/lib/python3.10/site-packages/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)
   1315 command = proto.CALL_COMMAND_NAME +\
   1316     self.command_header +\
   1317     args_command +\
   1318     proto.END_COMMAND_PART
   1320 answer = self.gateway_client.send_command(command)
-> 1321 return_value = get_return_value(
   1322     answer, self.gateway_client, self.target_id, self.name)
   1324 for temp_arg in temp_args:
   1325     temp_arg._detach()

File /opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py:190, in capture_sql_exception.<locals>.deco(*a, **kw)
    188 def deco(*a: Any, **kw: Any) -> Any:
    189     try:
--> 190         return f(*a, **kw)
    191     except Py4JJavaError as e:
    192         converted = convert_exception(e.java_exception)

File ~/cluster-env/env/lib/python3.10/site-packages/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
    324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325 if answer[1] == REFERENCE_TYPE:
--> 326     raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
    329 else:
    330     raise Py4JError(
    331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
    332         format(target_id, ".", name, value))

Py4JJavaError: An error occurred while calling o1433.get.
: java.util.NoSuchElementException: spark.rpc.message.maxSize
    at org.apache.spark.sql.errors.QueryExecutionErrors$.noSuchElementExceptionError(QueryExecutionErrors.scala:1678)
    at org.apache.spark.sql.internal.SQLConf.$anonfun$getConfString$3(SQLConf.scala:5194)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.internal.SQLConf.getConfString(SQLConf.scala:5194)
    at org.apache.spark.sql.RuntimeConfig.get(RuntimeConfig.scala:72)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:750)

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 78,986 Reputation points Microsoft Employee
    2024-04-30T12:44:41.2533333+00:00

    @uuu zhu - Thanks for the question and using MS Q&A platform.

    UPDATE (30/04/2024, 8:30 PM IST):

    When I tried to run this code on Apache Spark version 3.3, I got the following error:

    As per the error message: AnalysisException: Cannot modify the value of a Spark config: spark.rpc.message.maxSize. See also 'https://spark.apache.org/docs/latest/sql-migration-guide.html#ddl-statements'. If you check that document, it shows what is expected on Apache Spark versions 3.0 and above: since Spark 3.0, the SET command rejects modifying core (non-SQL) Spark configs such as spark.rpc.message.maxSize.

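    For example, on Spark 3.x, attempting to set this core config directly is what raises that exception (a minimal sketch; the error text is as quoted above):

    spark.conf.set("spark.rpc.message.maxSize", "512")
    # AnalysisException: Cannot modify the value of a Spark config: spark.rpc.message.maxSize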

    To resolve the issue, you can disable this check by setting spark.sql.legacy.setCommandRejectsSparkCoreConfs to false:

    spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", False)
    
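    Putting it together, a minimal sketch of the full sequence on Spark 3.x would look like this (assuming the legacy flag is honored at runtime in your session; whether the running cluster actually picks up a new RPC limit can still depend on when the setting is applied):

    # Allow SET on core Spark configs (rejected by default since Spark 3.0)
    spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", False)
    # The core config can now be written and read back from the session conf
    spark.conf.set("spark.rpc.message.maxSize", "512")
    spark.conf.get("spark.rpc.message.maxSize")  # returns '512'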



    On Apache Spark version 2.4, the same code works without any issue.

    In Synapse Notebook, you can set the Spark configuration as shown below:

    spark.conf.set('spark.rpc.message.maxSize','512') # To set the spark configuration  
    spark.conf.get('spark.rpc.message.maxSize') # To get the spark configuration   
    
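    Note that spark.conf.get(key) raises the java.util.NoSuchElementException from the question whenever the key has not been set in the session, because spark.rpc.message.maxSize is a core Spark config with no entry in the SQL conf. A minimal sketch of a safe read is to pass a fallback value (128 MB is Spark's documented default for this setting):

    # Returns the session value if it has been set, otherwise the supplied fallback
    spark.conf.get("spark.rpc.message.maxSize", "128")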


    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes if this answer was helpful. And, if you have any further queries, do let us know.

    1 person found this answer helpful.

0 additional answers
