It's a "one strike and you're out" approach, any ip address that grabs forbidden files from my robots.txt file, or engages in comment, wiki, or referrer spam will probably be banned; it's my site, I can be as irrational as I like! My inertia and can't-be-arsed factor will affect the results here...
- Open .htaccess in an editor
- Add something like the following lines to block ip address 18.104.22.168:

```
order allow,deny
allow from all
deny from 18.104.22.168
```
- Save .htaccess file
For testing, try putting your own ip address in the deny line; you should then get 403 errors when you try to view your pages. To block more ip addresses, just add extra "deny from n.n.n.n" lines.
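If you'd rather script that 403 check than refresh a browser, here's a minimal sketch in Python, with www.example.com standing in for your own site:

```python
# fetch the front page and report whether the deny rule kicked in
import urllib.error
import urllib.request

try:
    urllib.request.urlopen("http://www.example.com/")
    print("not blocked: page loaded normally")
except urllib.error.HTTPError as e:
    # a 403 here means the deny line is doing its job
    print("blocked" if e.code == 403 else f"unexpected status {e.code}")
```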
For added hilarity and possible unwanted consequences I've also added "deny from sbl-xbl.spamhaus.org", which should block all ip addresses in Spamhaus' sbl and xbl blacklists. I have no idea whether this really works yet; I'll have to track down all the 403s in my access logs and see if I can establish why they appeared. I'm not a huge fan of blacklists, but a number of my regular unwanted customers already appear in this one, so it seems worth trying out.
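For reference, the relevant block of my .htaccess now looks something like the following. One caveat on the mechanism, and it's only my reading of the Apache docs: a hostname argument to "deny from" is matched against the client's reverse dns name rather than queried in the blacklist itself, so this may not behave like a true dnsbl lookup:

```
order allow,deny
allow from all
deny from 18.104.22.168
# hostname arguments are matched against the client's reverse dns name,
# not queried in the dnsbl itself; effectiveness unconfirmed
deny from sbl-xbl.spamhaus.org
```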
In summary, what I've done so far is very trivial to implement, but it requires some manual updating. I'm going to watch how well it works in practice for the next few weeks before automating the process further or abandoning the experiment. I'd like to think it'll prove effective. I don't think it'll be too resource intensive either: although it looks as if every hit will now cause a dns lookup, dns does a lot of caching, and I suspect my local dns server will handle 90% or more of the requests.
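Should the experiment survive, the automation will probably look something like this rough sketch. To be clear, this isn't what I run today; the log location, the combined log format, and the honeypot paths are all assumptions:

```python
# scan an Apache "combined" format access log for ip addresses that
# requested paths forbidden in robots.txt, then append deny lines
import re

LOG = "/var/log/apache/access.log"     # assumed log location
FORBIDDEN = ("/secret/", "/trap/")     # assumed robots.txt Disallow paths

# combined format: ip - - [date] "METHOD path HTTP/x.x" status size ...
line_re = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

banned = set()
with open(LOG) as log:
    for line in log:
        m = line_re.match(line)
        if m and m.group(2).startswith(FORBIDDEN):
            banned.add(m.group(1))

# naively append one "deny from" line per offender; a real version
# would skip addresses that are already present in the file
with open(".htaccess", "a") as htaccess:
    for ip in sorted(banned):
        htaccess.write(f"deny from {ip}\n")
```

Run from cron every so often, something along those lines would take care of the manual updating.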