Hello parag kesar,
Welcome to the MS Q&A platform.
When you run a Spark job definition through a pipeline, the activity timeout defaults to 12 hours and can be raised to a maximum of seven days. This limit exists because a pipeline is a batch orchestration service rather than a streaming service, so it is not designed to run indefinitely.
For batch workloads, the recommended approach is to split the work into multiple smaller jobs that each finish well within that limit.
If a streaming application needs to run beyond the seven-day limit, you can automate the restart process with Azure Functions or Logic Apps, which give you more flexibility in scheduling and re-submitting the job, as sketched below.
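As a rough illustration of that approach, the Python Azure Function below uses a timer trigger to re-submit the Spark job definition on a fixed schedule. The workspace ID, item ID, job-type value, and Fabric REST endpoint shown here are assumptions based on the public job scheduler API, so please verify them against the current documentation and adjust the authentication to your environment before relying on this.

```python
import logging

import azure.functions as func
import requests  # add to the function app's requirements.txt
from azure.identity import DefaultAzureCredential  # assumed for token acquisition

app = func.FunctionApp()

# Placeholder IDs -- replace with your own workspace and Spark job definition item.
WORKSPACE_ID = "<workspace-id>"
ITEM_ID = "<spark-job-definition-id>"


@app.timer_trigger(schedule="0 0 */12 * * *", arg_name="timer")  # every 12 hours
def restart_spark_job(timer: func.TimerRequest) -> None:
    """Re-submit the Spark job definition so long-running work is restarted
    before it hits the pipeline's seven-day limit."""
    credential = DefaultAzureCredential()
    token = credential.get_token("https://api.fabric.microsoft.com/.default").token

    # Assumed on-demand job endpoint -- confirm the path and jobType value
    # against the current Fabric REST API reference.
    url = (
        f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
        f"/items/{ITEM_ID}/jobs/instances?jobType=sparkjob"
    )
    response = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()
    logging.info("Submitted new Spark job run, status code %s", response.status_code)
```

A Logic App can achieve the same result with a recurrence trigger followed by an HTTP action that calls the same endpoint, if you prefer a low-code option.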
I hope this helps. Please let me know if you have any further questions.