InvalidHttpRequestToLivy: Submission failed due to error content =["requirement failed: Session isn't active."] HTTP status code: 400.

Daniel Wang 5 Reputation points
2024-12-30T09:10:35.3833333+00:00

I searched the web high and low, and cannot find a viable solution.

I am using Azure Synapse.

I have 24 files in an ADLS Gen2 storage account. I loop over them, calling read_csv on each file and writing the result into a Delta table on each iteration.

After around 40 minutes, the message "InvalidHttpRequestToLivy: Submission failed due to error content =["requirement failed: Session isn't active."] HTTP status code: 400." appears.

My company is on the lowest tier with a limited number of nodes and cores. How do I extend the time allowed for this job?

Azure Synapse Analytics

1 answer

  1. Anonymous
    2024-12-30T18:43:27.38+00:00

    @Daniel Wang

    Welcome to the Microsoft Q&A and thank you for posting your questions here.

    Please check the steps below and let us know how it goes.

    Increase the Session Timeout: You can extend the session timeout by setting the livy.server.session.timeout property in your Spark configuration. For example, you can set it to 4 hours:

    spark.conf.set("livy.server.session.timeout", "4h")
    

    This should give your long-running job more time to complete.
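
    Note that Livy session-level properties generally need to be in place before the Spark session is created; a spark.conf.set call inside an already-running session may not take effect. One way to apply the setting at session start in a Synapse notebook is the %%configure magic in the first cell. A minimal sketch, assuming your pool honors the livy.server.session.timeout property:

    %%configure -f
    {
        "conf": {
            "livy.server.session.timeout": "4h"
        }
    }

    If your workspace exposes it, the session timeout can also be adjusted in the notebook's "Configure session" settings in Synapse Studio.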

    Optimize Your Spark Job: Try to optimize your Spark job to reduce execution time. This can include:

    • Improving the efficiency of your Spark code (see the sketch after this list).
    • Tuning Spark configurations.
    • Increasing the resources allocated to your Spark job.
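
    Since the job loops over 24 CSV files and writes to the Delta table once per file, one common optimization is to let Spark read all of the files in a single call (a folder or wildcard path) and perform one write, instead of 24 separate read/write cycles. A minimal PySpark sketch, assuming the files share the same schema; the storage account, container, and paths below are placeholders to replace with your own:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical ADLS Gen2 locations; adjust to your account/container/folders.
    source_path = "abfss://<container>@<account>.dfs.core.windows.net/input/*.csv"
    delta_path = "abfss://<container>@<account>.dfs.core.windows.net/delta/my_table"

    # Read all 24 CSV files in one pass instead of looping file by file.
    df = (spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv(source_path))

    # Single Delta write instead of one write per file.
    df.write.format("delta").mode("append").save(delta_path)

    Fewer, larger reads and a single write reduce per-file overhead and total runtime, which also makes it less likely that the session times out partway through the loop.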

    Check Network Connectivity: Ensure there are no network connectivity issues between the Livy server and the Spark driver. This includes checking firewall and network settings.

    Use a Different Spark Pool: If possible, try running your Spark job on a different Spark pool to see if that resolves the issue.

    Review Spark Logs: Check the Spark logs for any errors or warnings that might be causing the issue. You can access these logs from the Azure Synapse Analytics workspace or the Azure portal.

    I hope the above steps resolve the issue. Please let us know if the issue persists. Thank you.

