Hello Mayur Patil,
Thanks for posting your question here. I can see two approaches: a short-term fix and a long-term solution.
Short-term fix:
Implement chunked uploads with a 4 MB chunk size and a 300-second per-request timeout in the SDK. Increase the function timeout to 10 minutes in host.json, and add retry logic around each chunk upload so a transient failure doesn't restart the whole file.
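To raise the timeout, set `functionTimeout` in host.json (10 minutes is the maximum on the Consumption plan):

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```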
Long-term solutions:
Switch to the Premium plan for longer timeouts and better resource allocation. Optimize memory usage by streaming the file and uploading it incrementally instead of buffering it all in memory. Monitor your ADLS Gen2 metrics and adjust concurrency if needed.
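Here is a minimal, SDK-agnostic sketch of the chunked streaming upload with retries described above. Note that `upload_chunk` is a hypothetical callable standing in for whatever append/flush call your storage SDK exposes (for ADLS Gen2 that would be the append/flush operations of the Data Lake file client); the chunk size and retry count are just the values suggested earlier:

```python
import time

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB, as suggested above
MAX_RETRIES = 3


def upload_in_chunks(stream, upload_chunk, chunk_size=CHUNK_SIZE):
    """Read `stream` in fixed-size chunks and upload each with retries.

    `upload_chunk(data, offset)` is a placeholder for your SDK's
    append call; only one chunk is held in memory at a time, so the
    function's memory use stays flat regardless of file size.
    Returns the total number of bytes uploaded.
    """
    offset = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                upload_chunk(chunk, offset)
                break
            except Exception:
                if attempt == MAX_RETRIES:
                    raise  # give up after the last retry
                time.sleep(2 ** attempt)  # simple exponential backoff
        offset += len(chunk)
    return offset
```

Because only one chunk is in flight at a time, a failed chunk is retried in isolation rather than forcing the whole upload to start over.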
By addressing both the timeout (via chunking and retries) and the 500 error (via a longer timeout or a plan upgrade), your function should handle large files more reliably.
Best regards,
Alex
P.S. If my answer helped you, please accept it as the answer.