The best approach is likely to package all the dependencies, including a Java runtime, into a Docker image and deploy the function as a custom container. The AWS project Spark on AWS Lambda (SoAL) can be a good starting point for configuring an equivalent setup on Azure Functions.
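As a rough illustration of that approach, here is a minimal Dockerfile sketch for a Python Azure Function that adds a JVM so PySpark can start. This is untested; the base image tag, the JDK package name, and the `JAVA_HOME` path are assumptions and may need adjusting for your runtime version and base OS:

```dockerfile
# Sketch only: base image tag, JDK package, and JAVA_HOME are assumptions.
FROM mcr.microsoft.com/azure-functions/python:4-python3.10

# PySpark needs a JVM; the default Python worker image does not ship one.
RUN apt-get update && \
    apt-get install -y --no-install-recommends openjdk-17-jre-headless && \
    rm -rf /var/lib/apt/lists/*
ENV JAVA_HOME=/usr/lib/jvm/java-17-openjdk-amd64

ENV AzureWebJobsScriptRoot=/home/site/wwwroot

# Install PySpark alongside the function's other dependencies.
COPY requirements.txt /
RUN pip install --no-cache-dir -r /requirements.txt pyspark

COPY . /home/site/wwwroot
```

Inside the function code you would then create a local-mode session, e.g. `SparkSession.builder.master("local[*]").getOrCreate()`, since there is no cluster to connect to from within a Function App.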
How can I run PySpark in an Azure Function App?
Eduardo Soares Penido · 11 Reputation points
Hi.
I'm struggling to run PySpark in an Azure Function App. It fails with a Java runtime error.
Is there a way to configure an Azure Function App so it can run PySpark?