You are probably hitting the maximum request size for an Azure App Service, so your action is never called. How this is configured depends on your setup; it could also be a firewall setting (firewalls often have their own request size limits).
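As a hedged sketch of how those limits are typically raised in ASP.NET Core (the 500 MB figure is an assumption for illustration, not a value from this thread):

```csharp
// Program.cs — a sketch; the 500 MB values are assumptions, adjust to your payloads.
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http.Features;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

// Raise Kestrel's default (~30 MB) request body limit.
builder.WebHost.ConfigureKestrel(options =>
{
    options.Limits.MaxRequestBodySize = 500L * 1024 * 1024;
});

// Raise the multipart form limit used when binding IFormFile.
builder.Services.Configure<FormOptions>(options =>
{
    options.MultipartBodyLengthLimit = 500L * 1024 * 1024;
});

var app = builder.Build();
app.MapControllers();
app.Run();
```

Note that on a Windows App Service fronted by IIS, the IIS `maxAllowedContentLength` request limit in `web.config` must also be raised, since IIS can reject oversized requests before Kestrel ever sees them.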
API not able to process 400mb JSON file
Hello there,
I'm trying to upload a model and JSON file using an API named UploadModel API (code below)
The issue I'm facing is:
If the model is 230 MB and the JSON is 40 MB, everything works absolutely fine, but when I try to upload a model of 250 MB and a JSON of 400 MB, it doesn't even print the first log, "Started Model Upload".
*Note: All the code works fine on localhost (local machine); I only see the issue when the application is deployed to Azure App Services.
I tried to modify the code as below:
It still fails to assign filesection.FileStream to JsonFile.Bytes (stream). In short, the API cannot handle the 400 MB JSON file; the request crashes and a bad-request result is returned.
This is a very high-priority issue for us, so any solution would be very helpful.
Sample JSON file is attached (the sample has one model object; the original JSON file has 62k model objects and more than 10 million (1 crore) lines):
1485220439.txt
Any help will be appreciated
Original code :
public class UploadModelForm
{
    [FromForm(Name = "json-file")]
    public IFormFile JsonFile { get; set; }

    [FromForm(Name = "model-file")]
    public IFormFile ModelFile { get; set; }
}

[HttpPost("{projectId}/{directUpload}/Upload")]
public async Task<dynamic> CompareAndUploadNewModel(string projectId, string directUpload, [FromForm] UploadModelForm form)
{
    _logger.LogInformation("Started Model Upload.");
    try
    {
        string fileExtension = Path.GetExtension(form.ModelFile.FileName).ToLower();
        if (fileExtension != ".rvt")
        {
            return new JsonResult("Uploaded file is not of .rvt format");
        }
        _logger.LogInformation("Moving towards database update.");
        var response = await _modelService.ProcessDirectPost(int.Parse(projectId), form.JsonFile, form.ModelFile);
        _logger.LogInformation("Uploading model to forge bucket.");
        var uploadStatus = await PostModelToForge(0, form, projectId);
        _logger.LogInformation("Exiting model upload");
        return uploadStatus;
    }
    catch (Exception ex)
    {
        // Note: wrapping and rethrowing like this loses the original stack trace.
        Exception exception = new($"An error occurred: {ex.Message} , Detailed error : {ex}");
        throw exception;
    }
}
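One way to raise the limits for just this endpoint is ASP.NET Core's per-action attributes. A sketch follows; the 500 MB values are assumptions, and any App Service front-end or firewall limits still apply on top of them:

```csharp
// A sketch: per-action request-size limits via ASP.NET Core attributes.
// The 524_288_000 (500 MB) values are assumptions, not values from the thread.
[HttpPost("{projectId}/{directUpload}/Upload")]
[RequestSizeLimit(524_288_000)]                               // body limit for this action only
[RequestFormLimits(MultipartBodyLengthLimit = 524_288_000)]   // multipart form limit for this action only
public async Task<dynamic> CompareAndUploadNewModel(
    string projectId, string directUpload, [FromForm] UploadModelForm form)
{
    // ... unchanged body ...
}
```

This keeps the larger limit scoped to the upload endpoint instead of raising it application-wide.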
2 answers
P a u l 10,751 Reputation points
Oct 26, 2023, 7:56 PM

Rather than streaming the JSON form field into a string, you could give the JSON deserialise method the input stream directly. This avoids loading the full JSON string representation into memory in one go, on top of the memory allocated for the object you're deserialising into.
If you're prepared to use System.Text.Json over Newtonsoft.Json, then you could do this quite compactly:

public class HomeController : Controller
{
    public IActionResult Index()
    {
        return View();
    }

    [HttpPost("/")]
    public async Task<IActionResult> Submit([FromForm(Name = "JsonFile")] IFormFile file)
    {
        var data = await JsonSerializer.DeserializeAsync<Item[]>(
            file.OpenReadStream(),
            cancellationToken: HttpContext.RequestAborted);
        return View("Index");
    }

    public class Item
    {
        [JsonPropertyName("id")]
        public string Id { get; set; }
        // ...
    }
}
Just a few extra points about this code: it uses the Async variant of the Deserialize method, so the request doesn't block a thread while waiting for the IO-bound work to complete. Because it's async, we can also pass the cancellation token provided by the context, HttpContext.RequestAborted, so if you navigate away from the page while the upload is taking place, the serialiser will abort and stop reading the form field into memory.
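Since the original file reportedly holds 62k model objects, a further option worth sketching (assuming .NET 6+ and that the JSON root is an array; the `Item` type here is the hypothetical one from the answer above) is System.Text.Json's `DeserializeAsyncEnumerable`, which yields elements one at a time instead of materialising the whole 400 MB array in memory:

```csharp
// A sketch, assuming .NET 6+ and a JSON file whose root element is an array.
// "Item" is the hypothetical type from the answer above.
[HttpPost("/stream")]
public async Task<IActionResult> SubmitStreaming([FromForm(Name = "JsonFile")] IFormFile file)
{
    await using var stream = file.OpenReadStream();

    // Yields one Item at a time; only a small read buffer is held in memory.
    await foreach (var item in JsonSerializer.DeserializeAsyncEnumerable<Item>(
        stream, cancellationToken: HttpContext.RequestAborted))
    {
        // Process each item individually, e.g. accumulate batches for database inserts.
    }

    return Ok();
}
```

This trades a single large allocation for incremental processing, which also pairs naturally with batched database writes.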