.NET 6 application published on Linux performs much worse than on a Windows Web App.

Luan Gabriel Duarte 0 Reputation points
2024-10-10T19:20:12.7233333+00:00

My application receives more than 700 million requests per month, averaging 300–350 requests per second during peak usage.

The point is: on Windows, it consumes 400 MB of memory and 45% CPU (P2v2).
On Linux, it consumes 2.5 GB of memory and 80% CPU.

The Linux plan has better hardware (500 MB more memory and one more vCPU) than the Windows plan, yet the application's performance on Linux is terrible. Instances crash, availability drops to 80%, and requests take much longer to return a response.

Stack: .NET 6

Is there a solution for this? What do I need to do to get the same results on Linux Web Apps as on Windows Web Apps?

Do you have any suggestions?


1 answer

  1. Bruce (SqlWork.com) 78,006 Reputation points Volunteer Moderator
    2024-10-10T20:37:55.11+00:00

    In general, Linux performs as well as or better than Windows. However, any library (NuGet package) that uses native resources may not be ported to Linux as cleanly. You will need to track down the offending code, as it appears you have a memory leak. See this thread:

    https://github.com/dotnet/runtime/issues/96091
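    A quick way to see what the runtime actually observes on each platform is to log the GC mode and memory limits at startup and compare the Windows and Linux instances. A minimal sketch, assuming top-level statements in the app's Program.cs (the placement and logging target are just an illustration):

    ```csharp
    using System;
    using System.Runtime;
    using System.Runtime.InteropServices;

    // Log what the runtime sees on this instance so Windows and Linux can be compared.
    var gcInfo = GC.GetGCMemoryInfo();

    Console.WriteLine($"OS: {RuntimeInformation.OSDescription}");
    Console.WriteLine($"Logical CPUs: {Environment.ProcessorCount}");
    Console.WriteLine($"Server GC: {GCSettings.IsServerGC}");
    Console.WriteLine($"Total available memory: {gcInfo.TotalAvailableMemoryBytes / (1024 * 1024)} MB");
    Console.WriteLine($"High memory load threshold: {gcInfo.HighMemoryLoadThresholdBytes / (1024 * 1024)} MB");
    ```

    If the Linux instance reports a different GC mode or memory limit than expected, that points at configuration; if the figures match and memory still climbs, capturing a heap dump of the offending process is the next step in tracking down the leak.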

    Note: .NET 6 support ends November 12 (less than 5 weeks away), so you might want to upgrade to .NET 8 and the latest packages and run the test again.
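    For the upgrade itself, it is usually just a matter of bumping the target framework in the SDK-style project file and then updating the NuGet package references to match; a minimal sketch of the change:

    ```xml
    <!-- .csproj: retarget the app to .NET 8, then update package references accordingly. -->
    <PropertyGroup>
      <TargetFramework>net8.0</TargetFramework>
    </PropertyGroup>
    ```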

