@Magnus Welcome to Microsoft Q&A Forum, Thank you for posting your query here!
The 413 response indicates that the direct-upload endpoint rejected the request because its body exceeded a size limit. Since you're using Ruby on Rails with the Active Storage JavaScript library, the framework generates the direct upload URL and performs the upload for you, and the XMLHttpRequest it issues sends the entire file in a single request — which is likely what triggers the error for very large files.
It's worth noting that uploading large files is challenging because of limits on network bandwidth and server resources. A common approach is to split the file into smaller chunks and upload them in parallel, which reduces the likelihood of a single oversized request failing and improves upload speed.
In any case, if direct upload worked with large files a month ago, something may have changed on the service side. You may want to reach out to Azure support to check whether there have been any changes to the direct-upload endpoint. You could also try a different upload path, such as one of the Azure Storage client libraries, to see if that resolves the issue.
HTTP status code 413 means "Payload Too Large": the server refuses the request because the body exceeds the size it is configured to accept. In this case, the file being uploaded appears to be larger than what the endpoint allows in a single request. One solution is to split the file into smaller pieces and upload each piece separately; Azure Blob Storage supports this natively for block blobs, where each piece is staged with a Put Block call and the full set is then committed with Put Block List.
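As a rough illustration of the chunking step, here is a minimal Ruby sketch that splits data into fixed-size blocks, each paired with a base64-encoded block ID of uniform length as the Put Block API requires. The `build_blocks` helper and the 4 MiB chunk size are my own illustrative choices, and the commented-out upload calls assume the community azure-storage-blob gem with placeholder account and container names:

```ruby
require "base64"
require "stringio"

CHUNK_SIZE = 4 * 1024 * 1024 # 4 MiB per block; Azure allows much larger blocks

# Split an IO (or String) into fixed-size chunks, each paired with a
# base64-encoded block ID. All IDs must have the same encoded length.
def build_blocks(io, chunk_size: CHUNK_SIZE)
  io = StringIO.new(io) if io.is_a?(String)
  blocks = []
  index = 0
  while (chunk = io.read(chunk_size))
    block_id = Base64.strict_encode64(format("block-%08d", index))
    blocks << [block_id, chunk]
    index += 1
  end
  blocks
end

blocks = build_blocks("a" * (10 * 1024 * 1024)) # 10 MiB of sample data
# With a real client (azure-storage-blob gem, placeholder names), each chunk
# would be staged and then committed, e.g.:
#   client = Azure::Storage::Blob::BlobService.create(
#     storage_account_name: "<account>", storage_access_key: "<key>")
#   blocks.each { |id, data| client.put_blob_block("container", "blob", id, data) }
#   client.commit_blob_blocks("container", "blob",
#                             blocks.map { |id, _| [id, :uncommitted] })
```

Uploading the staged blocks can then be parallelized, and a failed chunk can be retried individually instead of restarting the whole upload.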
There is no restriction on .zip files as such. In our test, a .zip file larger than 7 GB uploaded successfully.
Note: We also recommend the AzCopy command-line tool for uploading files from on-premises or cloud environments. It copies blobs and files to and from Azure Storage with optimal performance, supports concurrency and parallelism, and can resume copy operations when interrupted, which makes it well suited for uploading and downloading large files. Please consider using it for larger files.
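For reference, a typical AzCopy upload looks like the following; the local path, storage account, container, and SAS token are placeholders you would replace with your own values:

```shell
# Upload a large local file to a block blob. AzCopy parallelizes the transfer
# across blocks; --block-size-mb tunes the size of each staged block.
azcopy copy "./archive.zip" \
  "https://<account>.blob.core.windows.net/<container>/archive.zip?<SAS-token>" \
  --block-size-mb 8
```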
Storage limits: https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits#storage-limits
- If you want to upload larger files to a file share or blob storage, there is also the Azure Storage Data Movement Library.
Please let us know if you have any further queries. I’m happy to assist you further.
Please do not forget to "Accept the answer" and "Up-Vote" wherever the information provided helps you, as this can be beneficial to other community members.