Error using concurrent.futures (ThreadPoolExecutor) in a Synapse Apache Spark notebook

alexander tikhomirov 31 Reputation points
2022-01-21T08:35:09.79+00:00

I am getting an intermittent error when using the concurrent.futures.ThreadPoolExecutor approach to parallelize tasks in my Python code in an Apache Spark notebook:

com.fasterxml.jackson.databind.exc.MismatchedInputException: No content to map due to end-of-input
at [Source: (String)""; line: 1, column: 0]

For example, with the simple code below: sometimes it works, sometimes it fails.

(screenshot: 167097-image.png)

import concurrent.futures  
import time  
  
def subjob(i):  
    print(i)  
  
  
def main(p):  
    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:  
        futures = {executor.submit(subjob, i): i for i in p}  
        for future in concurrent.futures.as_completed(futures):  
            print("outer job done\n")  
  
if __name__ == "__main__":  
    p = [1,2]  
    main(p)  
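As a side note on the example above: an exception raised inside a submitted task is stored on its future and only re-raised when `.result()` is called, so a loop that never calls `.result()` can silently swallow failures. A minimal sketch of the same toy example, extended (as an assumption about where the error might surface) to call `.result()` and report per-task failures:

```python
import concurrent.futures


def subjob(i):
    # Placeholder work; replace with the real per-task logic.
    print(i)
    return i


def main(p):
    with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
        # Map each future back to its input so failures can be attributed.
        futures = {executor.submit(subjob, i): i for i in p}
        for future in concurrent.futures.as_completed(futures):
            i = futures[future]
            try:
                result = future.result()  # re-raises any exception from subjob
            except Exception as exc:
                print(f"job {i} failed: {exc!r}")
            else:
                print(f"job {i} done with result {result}")


if __name__ == "__main__":
    main([1, 2])
```

With this pattern, an intermittent failure in one task is printed with its input instead of being lost, which can help narrow down when and why the Jackson error occurs.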
Azure Synapse Analytics

1 answer

  1. ShaikMaheer-MSFT 38,501 Reputation points Microsoft Employee
    2022-01-24T10:52:00.597+00:00

    Hi @alexander tikhomirov ,

    Thank you for posting query in Microsoft Q&A Platform.

    I used the same code you shared. In my case, I am not seeing any errors. Please check the screenshot below.
    (screenshot: 167835-image.png)

    Here are my Spark pool configurations:
    (screenshot: 167864-image.png)

    Could you please cross-verify your Spark pool configurations against the above and see if that helps?

    Hope this helps. Please let us know if you have any further queries.

    ---------------

    Please consider hitting Accept Answer. Accepted answers help the community as well.

    1 person found this answer helpful.
