"If the Internet is a public utility, then it is best compared to the sewer: a huge pipe with loads of excrement infesting your ports."

Extreme, but true (unfortunately).
How to protect your email addresses when used in HTML pages. Bots (robots, crawlers, scrapers), such as Google's, are automated programs that visit websites and examine all the content available.
Bots with bad intent look for email addresses, collect them and sell them to spam circuits to exploit.
Prevention is better than cure. Having said that, in the 'Email Address Protection' article I will show you how to prevent bots from harvesting your email addresses while maintaining full user functionality.
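One widely used technique for this (a minimal sketch, not necessarily the exact method the article describes) is to encode the address as HTML character entities: browsers render it as normal text and the mailto: link keeps working, but simple scrapers that grep pages for plain-text addresses miss it. The helper names below are my own illustration.

```python
def obfuscate_email(address: str) -> str:
    """Encode every character as a decimal HTML entity so simple
    scrapers looking for plain-text addresses miss it, while
    browsers still render it normally."""
    return "".join(f"&#{ord(c)};" for c in address)

def mailto_link(address: str, label: str = "") -> str:
    """Build a clickable mailto: link from the encoded address."""
    enc = obfuscate_email(address)
    return f'<a href="mailto:{enc}">{label or enc}</a>'
```

Note that determined scrapers can decode entities too, so this raises the bar rather than making harvesting impossible.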
'What you say, you give away', and the same goes for devices and applications attached to the Internet. A simple HTTP request to a web server (even for a non-existent page) can provide a lot of information about the web server and the system(s) it runs on, making it easy to identify and target specific web server versions for exploits.
In the article 'HTTP Server Protection' I'll show you how you can hide a lot of information from prying eyes. And because less unnecessary information is sent with each request, it may even speed up your service.
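To give a flavour of the kind of hardening involved, assuming an Apache server (the article itself may cover other servers too), two standard directives reduce what the Server header and generated error pages reveal:

```apache
# httpd.conf — reduce version information leaked in responses
ServerTokens Prod        # send only "Server: Apache", no version or OS details
ServerSignature Off      # no server details appended to generated error pages
```

Equivalent settings exist for other web servers; the principle is the same: send nothing a visitor does not need.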
The rules of conduct for web bots and spiders are commonly known, but you'll be surprised just how many of these snoops are around, and many don't give a hoot about any rules, nor any instructions you offer them. You may also be surprised to see which companies actually fall into the simple and clearly marked trap.
In the 'Bad Bot' article I'll show you how to set up a web page to capture these crawlers and log their details for possible inclusion in the IP address blocker. The bot's User-Agent also allows blocking by identifiable name.
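The core of such a trap can be sketched in a few lines, assuming a WSGI setup (the endpoint and log file names here are hypothetical, not the ones the article uses): a page that is never linked visibly and is disallowed in robots.txt, so only rule-ignoring bots ever reach it, and everything that does reach it gets logged.

```python
from wsgiref.simple_server import make_server

TRAP_LOG = "bad_bots.log"  # hypothetical log file name

def trap_app(environ, start_response):
    """WSGI app served at a trap URL that robots.txt forbids and no human
    ever sees; anything that reaches it is logged as a rule-ignoring bot."""
    ip = environ.get("REMOTE_ADDR", "unknown")
    agent = environ.get("HTTP_USER_AGENT", "unknown")
    with open(TRAP_LOG, "a") as log:
        log.write(f"{ip}\t{agent}\n")
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Nothing to see here."]

# To serve the trap page locally on port 8000:
# make_server("", 8000, trap_app).serve_forever()
```

The resulting log of IP address and User-Agent pairs is exactly the raw material the IP blocker described below consumes.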
Once you have identified the IP addresses of intruders, bad bots and other abusive systems, you need to set up a way to detect these and take appropriate action.
I prefer to have each site deal with this itself instead of relying on a hosts or .htaccess file. That way you won't have any problems hosting it with external providers, and you can manage and control the list as you please.
In the 'Implementing IP Blocking' article I will show how to set this up.
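As a rough sketch of the per-site approach (helper names are my own; the article's implementation may differ), assume the block list is plain text with one IP address or CIDR range per line, like the downloadable lists further down this page. The standard library's ipaddress module does the heavy lifting:

```python
import ipaddress

def parse_blocklist(lines):
    """Parse blocklist lines into network objects, skipping blanks and
    comment lines starting with '#'."""
    nets = []
    for line in lines:
        line = line.strip()
        if line and not line.startswith("#"):
            # strict=False accepts both single IPs and CIDR ranges
            nets.append(ipaddress.ip_network(line, strict=False))
    return nets

def is_blocked(client_ip, nets):
    """True if the client address falls inside any blocked network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in nets)
```

Each site loads its own list at startup and checks is_blocked() at the top of every request, which keeps the list fully under your control regardless of the hosting provider.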
The robots.txt file is supposed to be read by all 'robot' or automated HTTP extraction systems, like Google, Yahoo, Bing and too many more. Its purpose is to give these robots instructions about what they may or may not take from your system. In theory you can even tell certain robots, or all of them, not to visit your site at all.
Largely or totally ignored by today's automated Internet systems, it can still be used to lay 'red herring' traps and help proactively protect your website.
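As an illustration, a minimal robots.txt that bars all bots from a trap directory (the path here is a hypothetical example): well-behaved crawlers will stay out, and anything that requests it anyway has identified itself as a rule-ignoring bot.

```
# robots.txt — served from the site root
User-agent: *
# Hypothetical honeypot path; compliant bots never request it
Disallow: /trap/
```

Pair this with the trap page from the 'Bad Bot' approach above and the file earns its keep even though most bots ignore it.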
It's the next best (eco-friendly) thing to sliced bread...or is it? Next time a country (like China) complains there are not enough IP addresses to go around, my advice is to tell them to either adapt or die, or take a flying leap!
When it comes to data centers and cloud computing there is quite a giant out there that gets my total all-round respect...IBM. But let's have a look at the other side of the story: 'Every silver lining has a dark cloud', the dangers and pitfalls and what you can do about them.
An implementable free 2014 IP data set of IPv4 Internet addresses our systems have detected that should be blocked from accessing any information. You may download and use these files as you see fit. Although this IP data is updated regularly, please do not automate requests against this page.
2014 IP Block List - Plain text version
2014 IP Block List - Apache .htaccess
2014 IP Block List - Microsoft Threat Management Gateway computer set
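For a sense of the format, here is a short sample in the style of the Apache .htaccess version, using the classic Apache 2.2 / mod_access_compat syntax. The addresses below are documentation placeholders, not entries from the actual list:

```apache
# .htaccess fragment — sample format only; IPs shown are placeholders
Order Allow,Deny
Allow from all
Deny from 192.0.2.10
Deny from 198.51.100.0/24
```

On Apache 2.4 without mod_access_compat, the equivalent is done with Require directives instead.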
Comprehensive, (just about) real-time 2015 data on intrusive and abusive IP addresses is available on the Cyber Security Results page.
Using pattern processing to detect bad intent from IP addresses. (Un)fortunately most attempts are brute force with clear intent, making them easy to spot and eradicate simply and effectively. Below are examples of basic, primitive methods (still) deployed on a large scale to gain access to and abuse your Internet-facing infrastructure.
Looking for ways to gain access to potential email addresses and more, spanning multiple platforms. Goal = SPAM.
An example of one of the many signatures used to detect and attack WordPress websites. Goal = HACK, STEAL (DB content) and ABUSE.
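The pattern processing itself can be as simple as matching request paths against known probe signatures and flagging the source IP for the blocker. A minimal sketch, using a few well-known probe patterns as illustration (the actual signature lists are larger and curated, and are not published here):

```python
import re

# Illustrative example signatures; real-world lists are larger and curated.
BAD_PATTERNS = [
    re.compile(r"/wp-login\.php", re.IGNORECASE),   # WordPress login probe
    re.compile(r"/xmlrpc\.php", re.IGNORECASE),     # WordPress XML-RPC abuse
    re.compile(r"formmail", re.IGNORECASE),         # classic mail-relay probe
]

def looks_malicious(request_path: str) -> bool:
    """True if the request path matches a known bad signature."""
    return any(p.search(request_path) for p in BAD_PATTERNS)
```

A request matching any of these on a site that doesn't even run WordPress or a mail form is about as clear a statement of intent as you will get, and its source IP is a safe candidate for the block list.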
Has your website been hacked, once or on more occasions? Does your ISP brush it off as if they cannot do anything about it and it has nothing to do with them, preying on your ignorance and suggesting your website itself is not secure? Then next time you are considering moving to another Internet provider, maybe you should ask them about their security policy at the same time you discuss the price. Remember: cheap hosting means your website is one of hundreds on the same server. If one website owner is slack about security, you are at risk, and so are all the other sites. TIP: Make sure you have an alert and active webmaster who knows your website well and keeps it up to date. Otherwise you may find your website blacklisted and visitor numbers dropping by the day.