python_worker_logs

Returns a DataFrame of logs collected from Python workers.

Syntax

spark.tvf.python_worker_logs()

Examples

Example 1: Collect and display Python worker logs

from pyspark.sql import functions as sf
import logging

@sf.udf("string")
def my_udf(x):
    logger = logging.getLogger("my_custom_logger")
    logger.warning("This is a warning")
    return str(x)

# Enable collection of logs emitted by Python workers
spark.conf.set("spark.sql.pyspark.worker.logging.enabled", "true")
spark.range(1).select(my_udf("id")).show()
+----------+
|my_udf(id)|
+----------+
|         0|
+----------+

Example 2: Display the collected logs

spark.tvf.python_worker_logs().select(
    "level", "msg", "context", "logger"
).show(truncate=False)
+-------+-----------------+---------------------+----------------+
|level  |msg              |context              |logger          |
+-------+-----------------+---------------------+----------------+
|WARNING|This is a warning|{func_name -> my_udf}|my_custom_logger|
+-------+-----------------+---------------------+----------------+
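
The collected logs come back as an ordinary DataFrame, so they can be filtered and transformed further. The sketch below is illustrative rather than part of the reference: it assumes only the columns shown in the output above (level, msg, logger) and keeps warning-or-higher entries emitted by the custom logger from Example 1.

from pyspark.sql import functions as sf

logs = spark.tvf.python_worker_logs()
(
    logs
    # Keep only warning-or-higher entries (assumes standard Python log level names)
    .filter(sf.col("level").isin("WARNING", "ERROR", "CRITICAL"))
    # Restrict to the logger used in Example 1
    .filter(sf.col("logger") == "my_custom_logger")
    .select("level", "msg", "logger")
    .show(truncate=False)
)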