"Creating and maintaining secure, publicly accessible Internet resources: a changing, modern-day challenge."

Web Server Protection


Let me start off by saying I admire how Microsoft has dealt with security issues over the last decade, both in its operating systems and in its Internet-facing services, particularly where vulnerable home users are concerned.

The concept of a unified global network for the world, as conceived for the future in the 1980s (yes, I know the idea is older than that!), was realised in such a rush and with such a total belief in freedom (the flower-power effect) that its designers failed to examine the downside: they assumed humanity was inherently good.

Today we know this is not the case. For example, the fact that spam ('junk mail') is still a problem in the twenty-first century is a clear sign that all is far from well; the Internet is currently the 'New Wild West' of the world. Another pointer is that things have got so bad that even governments now want to seize control of it; the French are (as always) aggressively taking steps to achieve this. Enough rambling, on with the business of good, solid Internet practices.

Don't blame the creators or programmers of the Internet; almost all of them have your functionality and usability at heart, not evil, because security is for the cops... right? When was the last time you tried to hack your own website (steal information you should not be able to reach, then destroy it), or asked a trusted relation (an ethical hacker) to do so? If the answer is never, and you run on a low- or no-budget open-source platform, is it worth waiting until an external force does it for you? No one I know enters the Internet with evil intent and therefore acts accordingly, but science has all too often shown me that for every action there is a reaction. Evil prevails on the Internet just the same, and perhaps in more perverse ways than many care to comprehend or try to understand, extending from common Internet 'street' criminals to global, government-driven exploits.

Detecting and dealing with 'unknowns' in IIS before they hit your website.

For all you know, it may just start World War Three. What is certain is that it will do neither you nor your Internet connection any good. The method here is to know exactly what HTTP/S traffic you should, and should not, allow into your website.

It's about setting up boundaries and eliminating potential threats: prevention is better than cure. We will ensure we only allow requests into the web server that we know we can deal with, starting by blocking requests for bad paths before they get in.

We do this right at the beginning of each and every request by using the global website application class, 'Global.asax'. (If you don't have this file in your site, create it.)

Open this file for editing and, in 'Application_BeginRequest', the subroutine that fires every time a request is made, add the path-blocking code.
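A minimal sketch of what such a handler could look like, in C#. The original code is not reproduced in this article, so the blocked extensions and paths below are illustrative assumptions only; tailor the lists to what your site actually serves.

```csharp
// Global.asax -- runs before any page or handler sees the request.
void Application_BeginRequest(object sender, EventArgs e)
{
    string path = Request.Url.AbsolutePath.ToLowerInvariant();

    // Requests for technologies this site does not run are almost
    // certainly automated probes, not visitors. (Example lists.)
    string[] badExtensions = { ".php", ".cgi", ".jsp" };
    string[] badPaths = { "/wp-admin", "/phpmyadmin", "/cgi-bin" };

    bool blocked = false;
    foreach (string ext in badExtensions)
        if (path.EndsWith(ext)) blocked = true;
    foreach (string p in badPaths)
        if (path.StartsWith(p)) blocked = true;

    if (blocked)
    {
        // End the request immediately: no body, no further pipeline work.
        Response.StatusCode = 404;
        Response.SuppressContent = true;
        CompleteRequest();
    }
}
```

`CompleteRequest()` is available here because Global.asax derives from `HttpApplication`; it short-circuits the rest of the pipeline rather than throwing, which keeps blocked probes cheap to reject.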

This way, requests for places that certainly do not exist on your website are stopped before they get any further, and anywhere near your actual site and its contents.

Don't advertise.

It should be rather obvious, but it is nearly always ignored: the headers the web server responds with are a perfect way to quickly and accurately detect which web server technology is in use, and therefore perfect for targeting specific types of servers when choosing an attack vector.

Typically, IIS places "X-Powered-By" in the response header: totally useless data sent with each response and a dead giveaway that you're dealing with Microsoft IIS. To remove this from the response headers, place the following in the 'web.config' file of the site(s):
<system.webServer>
  <modules runAllManagedModulesForAllRequests="true">
    <add name="ServerCloak" type="ServerCloak.Class1" preCondition="managedHandler" />
  </modules>
  <httpProtocol>
    <customHeaders>
      <remove name="X-Powered-By"/>
    </customHeaders>
  </httpProtocol>
</system.webServer>

More attention needed.

There are more tell-tale signs in the response headers that you cannot change via settings or config files. Have a look at a typical IIS response header and you'll see exactly what I mean: it shows exactly which web server technology is used, and even the version number. If you use MVC, that will be advertised as well.

Standard Response Headers.

Server header before
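As an illustration (the exact values here are assumptions for the example, not taken from any particular server), an out-of-the-box IIS/ASP.NET MVC response typically looks something like this:

```http
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/html; charset=utf-8
Server: Microsoft-IIS/8.5
X-AspNet-Version: 4.0.30319
X-AspNetMvc-Version: 5.2
X-Powered-By: ASP.NET
```

Four of those headers exist purely to tell the world what software, and which versions, you are running.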

Optimized Response Headers.

Server header after
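After cloaking, the same response carries only the headers the browser actually needs; for example:

```http
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/html; charset=utf-8
```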

Server 'Cloaking' advantages.

Less non-functional information in each response the server sends: the headers no longer advertise what type of server and which software versions are running, reducing the attack surface for external parties targeting known flaws in specific versions of specific server software.

To get this result, every response from IIS needs to go through a 'filter' that removes the unnecessary information. To achieve this I wrote a class that attaches itself to the site's response pipeline and does the filtering. For IIS running .NET 2.0+, download the 'ServerCloak.dll' zip file, unzip it, place the DLL in the 'bin' folder of the site(s), and add a reference to it using 'Add Reference' (right-click on the main site icon).
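The DLL itself is not listed in this article, but a module of this kind is straightforward to sketch. Below is my reconstruction of the idea as an `IHttpModule`, using the `ServerCloak.Class1` name that the web.config snippet above registers; the body is an assumption about how it works, not the author's actual code.

```csharp
using System;
using System.Web;

namespace ServerCloak
{
    public class Class1 : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            // PreSendRequestHeaders fires just before the headers go on
            // the wire -- the last chance to edit them.
            app.PreSendRequestHeaders += (sender, e) =>
            {
                HttpResponse response = ((HttpApplication)sender).Response;

                // Strip the headers that identify the server technology.
                // Note: editing Response.Headers requires the IIS 7+
                // integrated pipeline; it is not supported in classic mode.
                response.Headers.Remove("Server");
                response.Headers.Remove("X-AspNet-Version");
                response.Headers.Remove("X-AspNetMvc-Version");
            };
        }

        public void Dispose() { }
    }
}
```

With the module compiled into the site's 'bin' folder, the `<modules>` entry in web.config shown earlier is what actually wires it into the request pipeline.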

You now have an anonymous web server. To complete the task you can implement the URL Rewrite Module to use extensionless URLs; after that, it is anyone's guess how the HTML is being created.