@Matthias Tauber - Thanks for the question and for using the MS Q&A platform.
Spark configuration: You can specify values for the Spark configuration properties listed in the topic Spark Configuration - Application properties. Both the default configuration and a customized configuration are supported.
To pass Spark config options to a Spark Job Definition in a Synapse pipeline, add the configuration in the "Advanced" section of the Spark Job Definition activity in your pipeline.
When you run the pipeline, the Spark Job Definition activity inherits the Spark config options you specified in the "Advanced" section.
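As an illustrative sketch only (the exact activity schema may differ between service versions, and names such as `MySparkJobDefinition` and the property values are placeholders), the Spark job definition activity in the pipeline JSON could carry the configuration roughly like this:

```json
{
  "name": "RunSparkJob",
  "type": "SparkJob",
  "typeProperties": {
    "sparkJob": {
      "referenceName": "MySparkJobDefinition",
      "type": "SparkJobDefinitionReference"
    },
    "conf": {
      "spark.sql.shuffle.partitions": "200",
      "spark.dynamicAllocation.enabled": "true"
    }
  }
}
```

The key-value pairs under `conf` correspond to the entries you add in the "Advanced" section of the activity, using the same property names as in Spark Configuration - Application properties.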
For more details, refer to Quickstart: Transform data using Apache Spark job definition.
Hope this helps. If this answers your query, do click "Accept Answer" and "Yes" for "Was this answer helpful". And if you have any further queries, do let us know.