Cannot Save Python Logger Output to Unity Catalog Volume in Databricks

Yuma Taniguchi 20 Reputation points
2025-11-11T13:58:26.7366667+00:00

Hi everyone,

I’m using a Python logger in Databricks to save application logs, and I’d like to store them in a Unity Catalog volume. However, when I specify the volume path (e.g., /Volumes/catalog/schema/volume/logs/app.log), the log file is not created and no error is shown.

Here’s an example of my code (simplified):

import os
import logging


def main(cfg):
    # 1. Define configurations
    log_mode = cfg.logging.mode
    dbx_path_log_file = "/Volumes/catalog_name/schema_name/log/train"

    # 2. Set logger
    # Note: the file handler never writes anything to the Databricks volume
    logger = set_logger(name="example", log_file_path=dbx_path_log_file, mode=log_mode)


def set_logger(name, log_file_path, mode="DEBUG"):
    os.makedirs(log_file_path, exist_ok=True)
    log_path = os.path.join(log_file_path, f"{name}.log")

    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)

    # File handler -- should write into the Unity Catalog volume
    fh = logging.FileHandler(log_path, mode="w", encoding="utf-8")
    fh.setLevel(logging.DEBUG)
    fh.setFormatter(logging.Formatter("%(asctime)s %(levelname)s: %(message)s"))
    logger.addHandler(fh)

    # Console handler
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG if mode == "DEBUG" else logging.INFO)
    ch.setFormatter(logging.Formatter("example: %(message)s"))
    logger.addHandler(ch)
    return logger


if __name__ == "__main__":
    main(cfg)  # cfg construction omitted in this simplified example


The code runs without error, but the file never appears in the specified volume. Has anyone managed to write logs directly to a Unity Catalog volume from Python in Databricks? If not supported, what’s the recommended way to persist application logs securely (e.g., ADLS Gen2 path, UC external location, etc.)?

Thanks for any advice!

Azure Databricks
An Apache Spark-based analytics platform optimized for Azure.

1 answer

  1. PRADEEPCHEEKATLA 91,656 Reputation points Moderator
    2025-11-11T19:20:49.6266667+00:00

    Yuma Taniguchi - Thanks for the question and for using the MS Q&A platform.

    Writing application logs directly to a Unity Catalog volume using Python’s logging.FileHandler is not supported in Databricks. Unity Catalog volumes are designed for storing data files, not for direct file I/O operations from Python. When you point the file handler at a path like:

    /Volumes/catalog_name/schema_name/volume_name/logs/app.log

    the logger does not throw an error, but the file is never created, because the volume is not mounted as a standard local filesystem.
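
    If you want to confirm what the driver process can actually see, a quick check from a notebook cell looks like this (a minimal sketch; the catalog, schema, and volume names are placeholders, and it assumes dbutils and display are available in the notebook):

    import os

    # List the volume through the Databricks filesystem utilities
    display(dbutils.fs.ls("/Volumes/catalog_name/schema_name/volume_name"))

    # Check whether the same path is visible as a local (FUSE) directory --
    # this is the view that logging.FileHandler relies on
    print(os.path.isdir("/Volumes/catalog_name/schema_name/volume_name"))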

    You can still work with the volume from inside a Databricks notebook, where the /Volumes directory is available. Here’s a snippet to run in your notebook:

    # Define the directory and file path
    log_dir = "/Volumes/workspace/default/chepra/"
    dbutils.fs.mkdirs(log_dir)  # Use Databricks utility for creating directories
    file_path = f"{log_dir}logger.py"
    
    # Define the logger file content
    logger_code = """import logging
    import os
    
    # Define log file path inside the Databricks volume
    log_dir = "/Volumes/workspace/default/chepra/"
    os.makedirs(log_dir, exist_ok=True)
    log_file = os.path.join(log_dir, "app.log")
    
    # Configure logger
    logger = logging.getLogger("databricks_logger")
    logger.setLevel(logging.INFO)
    
    # Create file handler
    file_handler = logging.FileHandler(log_file)
    file_handler.setLevel(logging.INFO)
    
    # Create formatter
    formatter = logging.Formatter("%(asctime)s - %(levelname)s - %(message)s")
    file_handler.setFormatter(formatter)
    
    # Add handler to logger
    logger.addHandler(file_handler)
    
    # Example usage
    if __name__ == "__main__":
        logger.info("Logger initialized successfully!")
        logger.warning("This is a warning message.")
        logger.error("This is an error message.")
    """
    
    # Write the file to the volume
    with open(file_path, "w") as f:
        f.write(logger_code)
    
    print(f"Logger file created successfully at: {file_path}")
    

    ✅ This creates logger.py in /Volumes/workspace/default/chepra/; when that script is run, it logs to app.log in the same directory.
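
    If the goal is to have the application log file itself end up in the volume, one common workaround is to let logging.FileHandler write to driver-local storage and copy the finished file into the volume at the end of the run. Below is a minimal sketch of that pattern (not part of the snippet above); the volume path is a placeholder and the target logs directory is assumed to already exist in the volume:

    import logging
    import shutil

    # Write the log to driver-local storage first; local appends always work
    local_log = "/tmp/app.log"
    volume_log = "/Volumes/catalog_name/schema_name/volume_name/logs/app.log"

    logger = logging.getLogger("app")
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(local_log, mode="w", encoding="utf-8")
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s: %(message)s"))
    logger.addHandler(handler)

    logger.info("Job started")
    # ... application code ...
    logger.info("Job finished")

    # Flush and close the handler so the file is complete before copying
    handler.close()

    # Copy through the FUSE path, or use dbutils.fs.cp from a notebook
    shutil.copy(local_log, volume_log)
    # dbutils.fs.cp(f"file:{local_log}", volume_log)

    Only the final copy touches the volume, and it is a single sequential write rather than a file handle that stays open and appends over time, which is generally the safer access pattern for volumes.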

    (screenshot: ADB1111-02)

    Hope this helps. Let me know if you have any further questions or need additional assistance. If this answers your query, please click "Upvote" and "Accept the answer" so it can help other community members reading this thread.


    To stay informed about the latest updates and insights on Azure Databricks, data engineering, and Data & AI innovations, follow me on LinkedIn.

