
How to prevent Spammers from accessing your Linux Server

I run an article directory on a dedicated Linux server and provide full-article RSS feeds for any query or tag. When spammers find the site, they usually pull the feeds and repost the articles on their WordPress or Blogger blogs.

Occasionally, for whatever reason, they download the feeds like there is no tomorrow. I don't mind them accessing the feeds for whatever use -- as long as they publish the articles intact, with links back to the authors' sites -- but I get really pissed off when they hit the server thousands of times per hour. That kind of load can bring even the most powerful dedicated Linux server to its knees.
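
Before you can block anyone, you need to know which IP addresses are doing the hammering. Here is a quick sketch, assuming a standard combined-format Apache access log at /var/log/apache2/access.log (the path varies by distribution; Red Hat-based systems usually use /var/log/httpd/access_log):

# List the 10 IP addresses with the most requests in the access log
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -10

The addresses at the top of that list, cross-checked against the request paths, are the candidates for the blocks described below.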

A simple trick to keep them off your site is to add the following to your .htaccess file (203.0.113.45 is just a placeholder; replace it with the spammer's actual IP address):

# Block the offending IP (placeholder address) but allow everyone else
order allow,deny
deny from 203.0.113.45
allow from all
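
If your server runs Apache 2.4 or later, the old Order/Allow/Deny directives only work through mod_access_compat; the native equivalent uses the Require directives from mod_authz_core. A sketch with the same placeholder address:

# Apache 2.4+ equivalent: block one IP, allow everyone else
<RequireAll>
    Require all granted
    Require not ip 203.0.113.45
</RequireAll>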


This returns a 403 (Forbidden) error to the blocked client. If the script they are running to grab content from your Linux server errors out on that response, it will stop hitting your site. More often than not, though, the script keeps scraping anyway, and Apache still has to handle every request, eating up valuable dedicated resources.
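
You can see whether the block is actually deterring the scraper by watching the access log. A minimal check, again assuming the log lives at /var/log/apache2/access.log and using the placeholder address from above:

# Count the requests from the blocked IP, grouped by HTTP status code
grep '^203\.0\.113\.45 ' /var/log/apache2/access.log | awk '{print $9}' | sort | uniq -c

If 403s keep piling up, the .htaccess block is working but the bot has not given up, and Apache is still doing the work of answering it. That is where the firewall comes in.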

A more practical approach is to keep them away from the server entirely by using iptables, the Linux firewall.

This rejects the connection at the packet-filtering level, before it is ever passed up the network stack to Apache. To do this, enter the following command, again substituting the real IP address for the placeholder:

# Reject all traffic from the offending IP (placeholder address)
iptables -A INPUT -s 203.0.113.45 -j REJECT


The iptables command must be run as root, over SSH or from your control panel, but with a dedicated or virtual Linux server you should have root access anyway.
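
One caveat: rules added with iptables -A live only in memory and disappear when the server reboots. A short sketch of checking and persisting them, assuming a Debian/Ubuntu-style system with the iptables-persistent package installed (Red Hat-based distributions typically use "service iptables save" and /etc/sysconfig/iptables instead):

# List the INPUT chain with rule numbers, in case a block needs to be removed later
iptables -L INPUT -n --line-numbers

# Save the current rules so they are restored at boot (Debian/Ubuntu with iptables-persistent)
iptables-save > /etc/iptables/rules.v4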

Mirela
