How to send logs to stderr and also show them in cell output

Jiacheng Zhang 5 Reputation points
2024-09-12T20:31:49.4966667+00:00

Hi Team,

May I know, in a Synapse PySpark notebook, how we can write logs to stderr and also show them in the cell output?

When I write logs with log4j, they go to the stderr log but do not show in the cell output.

But if we try different logging approaches, such as logging.getLogger, sys.stderr.write('log test'), or print("test log", file=sys.stderr), the logs show in the cell output and in stdout, but do not go to stderr.

May I know if there is any way logs can go to stderr and also show in the cell output? Thanks so much!

Azure Synapse Analytics

1 answer

  1. Amira Bedhiafi 24,556 Reputation points
    2024-09-12T21:15:03.0366667+00:00

    Use Python's logging module to send logs to stderr and also display them in the notebook output.

    
    import logging

    # Create a logger
    logger = logging.getLogger('synapse_logger')
    logger.setLevel(logging.DEBUG)

    # Create a StreamHandler; it writes to sys.stderr by default
    stderr_handler = logging.StreamHandler()
    stderr_handler.setLevel(logging.DEBUG)

    # Create a formatter and add it to the handler
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    stderr_handler.setFormatter(formatter)

    # Add the handler to the logger
    logger.addHandler(stderr_handler)

    # Example log messages
    logger.info("This is an info message.")
    logger.error("This is an error message.")
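
    One caveat: re-running this cell attaches a fresh handler each time, so every message is then emitted once per run. A minimal, self-contained guard (a sketch, assuming the same logger name as above):

    import logging

    logger = logging.getLogger('synapse_logger')
    # Only attach a handler on the first run of the cell
    if not logger.handlers:
        handler = logging.StreamHandler()  # stderr by default
        handler.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
        logger.addHandler(handler)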
    
    

    If you want to use Spark's log4j logging and ensure the logs are also displayed in the notebook output, you can reach Spark's built-in logger through the JVM gateway that PySpark exposes.

    
    from pyspark.sql import SparkSession

    # Initialize the Spark session if not already done
    spark = SparkSession.builder.getOrCreate()

    # Get a logger from log4j via the JVM gateway
    log4jLogger = spark._jvm.org.apache.log4j
    logger = log4jLogger.LogManager.getLogger("synapse_pyspark")

    # Log some messages
    logger.info("This is an info message from log4j.")
    logger.error("This is an error message from log4j.")
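
    If the goal is a single call whose output lands both in the stderr driver log and in the cell output, one workaround (a sketch based on the behavior described in the question; log_both is a hypothetical helper) is to write to both sinks:

    import sys
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    jvm_logger = spark._jvm.org.apache.log4j.LogManager.getLogger("synapse_pyspark")

    def log_both(msg):
        # log4j output goes to the stderr driver log
        jvm_logger.info(msg)
        # printing to sys.stderr is rendered in the notebook cell output
        print(msg, file=sys.stderr)

    log_both("This message should appear in both places.")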
    
    
    

    Normally, both approaches make the logs appear in the notebook's cell output by default. If they don't, check that the Spark or Python log levels aren't filtering the messages out in your Synapse workspace configuration.
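
    As a quick check, you can inspect and adjust both levels from the notebook itself (a minimal sketch; setLogLevel controls the Spark/log4j side, the logging module controls the Python side):

    import logging
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Raise or lower the log4j threshold on the Spark side
    spark.sparkContext.setLogLevel("INFO")

    # Inspect the effective level of the Python logger
    print(logging.getLogger('synapse_logger').getEffectiveLevel())  # 10 == DEBUG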

