We host some of our .NET web apps in an Azure Web App on a shared plan. A security audit showed that in certain cases an HTTP Server header is returned, identifying the server as "Microsoft-IIS/10.0". We already have the "removeServerHeader" attribute set to "true" in web.config, and we use customHeaders remove elements to strip certain other headers, so in most cases no Server header is sent.
However, there is one specific case in which the Server header is still sent: when the client uses an HTTP/1.0 request. When I open a Telnet connection to port 80 of my web app and send the following GET request:
GET / HTTP/1.0
I get the following response:
HTTP/1.1 404 Not Found
Content-Type: text/html
Server: Microsoft-IIS/10.0
Date: Tue, 06 Oct 2020 13:47:22 GMT
Connection: close
Content-Length: 2778
Can this be prevented?
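For anyone who wants to reproduce the probe above without Telnet, here is a minimal Python sketch that sends the same bare HTTP/1.0 request over a raw socket and reports the Server header if one comes back. The hostname is a placeholder, not a real app; substitute your own.

```python
import socket

def server_header(raw_response: bytes):
    """Return the value of the Server header in a raw HTTP response, or None."""
    head, _, _ = raw_response.partition(b"\r\n\r\n")
    for line in head.split(b"\r\n")[1:]:  # skip the status line
        name, _, value = line.partition(b":")
        if name.strip().lower() == b"server":
            return value.strip().decode()
    return None

def probe(host: str, port: int = 80) -> bytes:
    """Send the same host-less HTTP/1.0 GET as in the question and collect the raw response."""
    # HTTP/1.0 does not require a Host header; the question's Telnet session
    # omitted it too, which is why Azure answered with a 404.
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(b"GET / HTTP/1.0\r\n\r\n")
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    HOST = "example.azurewebsites.net"  # placeholder hostname
    print(server_header(probe(HOST)))
```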
"removeServerHeader" attribute set to false in web.config
Shouldn't it be set to true if you don't want to send the Server header?
Sorry for my delayed response, but I feel this question deserves a good answer.
One approach I can think of is to configure this under system.webServer rather than system.web, which is application-specific. I've tested this on a plain HTML file and it works as expected. Let me know if this works for you.
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<system.webServer>
<security>
<requestFiltering removeServerHeader="true" />
</security>
</system.webServer>
</configuration>