Livy session becomes "dead" when reading large .las file (~2GB) in Fabric Notebook

Gabriela Betancur Montoya 0 Reputation points
2026-02-23T13:42:54.4666667+00:00

Hi everyone,

I'm running a notebook in Microsoft Fabric where I read a large .las file (around 2GB) using a library specialized for LAS files.

The notebook takes a long time to execute, and eventually I receive the following error:

InvalidHttpRequestToLivy: Session ------ of workspace ------ on pool ------- is not running. It is either completed or not fully yet ready. Scheduler state : Scheduled. Plugin state : Cleanup. Livy state : dead HTTP status code: 400. Trace ID: ------.

My question is: Can this issue be handled from the code side, or is it more likely related to Spark/Livy configuration settings? What would be the best way to fix it?

Has anyone experienced something similar when processing large LAS files in Fabric?

Any guidance would be greatly appreciated.

Thanks in advance!

Community Center | Not monitored

1 answer

  1. Q&A Assist
    2026-02-23T13:43:34.1733333+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    The issue you're experiencing with the Livy session becoming "dead" when processing a large .las file is likely related to resource constraints or configuration settings in Spark/Livy. Here are some potential approaches to address the problem:

    1. Resource Allocation: Make sure the Spark session has enough driver and executor memory for the file. A ~2GB LAS file can expand to several times its on-disk size once parsed into memory, and a driver out-of-memory condition is a common cause of a Livy session going "dead". In Fabric, available memory is governed by the pool's node size and the session's Spark settings, which must be raised at session start.
    2. Session Timeout: Check if the session timeout settings are appropriate for your workload. If your notebook takes a long time to execute, consider adjusting the session timeout settings at the notebook or workspace level to prevent the session from timing out.
    3. Batch Processing: Instead of loading the entire 2GB file into memory at once, read and process it in smaller chunks if your LAS library supports it. This keeps peak driver memory bounded and avoids the out-of-memory failures that can kill the session.
    4. Monitoring and Logs: Monitor the Spark application logs to identify any specific errors or warnings that may indicate what is causing the session to fail. This can provide insights into whether the issue is related to memory, timeouts, or other factors.
    5. Retry Logic: Implement retry logic in your code to handle transient failures. If the session fails due to temporary resource unavailability, retrying the operation after a short delay may succeed.
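    For points 1 and 2, note that driver and executor memory must be set before the Spark session starts. In a Fabric notebook this is typically done with the `%%configure` magic in the first cell; the values below are placeholders you would tune to your pool's node size:

    ```
    %%configure -f
    {
        "driverMemory": "28g",
        "driverCores": 4,
        "executorMemory": "28g",
        "executorCores": 4
    }
    ```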
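    Point 3 can be sketched generically: the loop below processes an iterator in fixed-size chunks so the whole file never has to be in memory at once. The `iter_chunks` helper here is illustrative, not part of any LAS library; if you happen to use laspy, files opened with `laspy.open` expose a similar built-in `chunk_iterator`.

    ```python
    from itertools import islice

    def iter_chunks(iterator, chunk_size):
        """Yield lists of up to chunk_size items from an iterator."""
        while True:
            chunk = list(islice(iterator, chunk_size))
            if not chunk:
                return
            yield chunk

    # Example: computing an aggregate chunk by chunk instead of loading
    # everything at once. With a real LAS file, you would replace
    # iter(range(10)) with the record/point iterator your library provides.
    total = 0
    count = 0
    for chunk in iter_chunks(iter(range(10)), 3):
        total += sum(chunk)
        count += len(chunk)
    ```
    
    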

    By addressing these aspects, you may be able to resolve the issue with the Livy session and improve the performance of your notebook when processing large files.
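    Point 5 might look like the minimal backoff helper below. This is a sketch with illustrative names, not a Fabric or Livy API; it simply retries a callable with exponentially increasing delays:

    ```python
    import time

    def with_retries(fn, attempts=3, base_delay=2.0):
        """Call fn(), retrying on any exception with exponential backoff.

        Re-raises the last exception if every attempt fails.
        """
        for attempt in range(attempts):
            try:
                return fn()
            except Exception:
                if attempt == attempts - 1:
                    raise
                time.sleep(base_delay * (2 ** attempt))
    ```

    Note that a retry only helps with transient failures (e.g. temporary resource unavailability); if the session dies deterministically from running out of memory, the memory and chunking fixes above are the ones that matter.
    
    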


