Why would you ever want to start blocking unwanted traffic to your website? Read on and discover the many reasons why you should.
So, here you are. You just launched your brand-new website or blog, set up various ways to let people know about it through social networks like Twitter, Facebook, Digg, and others, and now you are ready to rock on the net with all of your great content and/or products.
In the first six weeks, your traffic numbers are through the roof, your logs are filled with tons of great referrals, and you can barely contain your excitement over it all.

Ninety days in, your traffic numbers are still great, and you brag to all of your buddies, or anybody who will listen, about all of the people visiting your website.

But wait: if so many visit so often, why are they not commenting on your articles or buying any of your products?

You've jumped through all of the hoops, submitted your articles, offered sales and other perks, and still ... nothing.
Truth be told, nearly 90% of the traffic you are seeing in your logs is quite bogus. Automated and worthless, this logged traffic is usually nothing more than website scrapers, attempted hacks, comment spammers, and other such nefarious things.

Part of website ownership is becoming aware that, when it comes down to the numbers, most of that traffic isn't worth your time: an automated parsing utility will never, ever buy anything from you, or actually read any of your articles.
Just because you get a flurry of activity in your site logs after a Twitter post doesn't mean that hordes of people are visiting because they are interested in what you have to say; most people on Twitter couldn't care less about your post.

What you witness in your site logs after a Twitter post is nothing more than a swarm of automated parsing agents, each just as worthless as the next, racing to load their own databases with as much info as they can before the other worthless site parsing agents do.
Another form of traffic you'll see in your site logs is what we commonly refer to as website scrapers.

These so-called website scrapers cruise the internet every day, looking to copy anything you've written so that:

1) they can sell the content to others for profit, or
2) they can post your content to their own website in order to gain higher listings in Google.
At any rate, if you were to block, or otherwise disallow, all of the useless traffic that visits your website, you would see your traffic numbers drop drastically, and you would no longer be paying for all of the extra bandwidth that these worthless parsing agents consume.

Getting your traffic numbers down to a realistic level should be the order of the day for any webmaster interested in an honest view of how they are really doing online.
Seeing the real traffic, you know, those humans who actually are interested in you, will help in determining exactly which direction you need to take with your blog or e-commerce website.

Unless you know the true numbers, I'm afraid you'll never amount to much of an internet presence.
Having an effective website management plan can, at various times, have everything to do with disallowing worthless site parsing agents. Blocking, or otherwise disallowing, certain forms of traffic to your website can often make way for more legitimate traffic in the future. Attracting legitimate, page-viewing traffic should be the primary goal of any webmaster.
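To make that concrete, here is a minimal sketch of what blocking can look like, assuming an Apache server with .htaccess overrides enabled. It uses the classic Apache 2.2 Order/Allow/Deny syntax (newer Apache 2.4 servers use the Require directive instead), and the IP addresses shown are placeholders, not known offenders; pull the real ones from your own logs.

```apache
# .htaccess -- turn away requests from addresses you have
# identified as abusive in your own logs
# (192.0.2.15 and 198.51.100.0/24 are placeholder examples)
Order Allow,Deny
Allow from all
Deny from 192.0.2.15
Deny from 198.51.100.0/24
```

Denied requests get a short 403 Forbidden response instead of your full pages, which is where the bandwidth savings come from.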
The next question you, the new website owner, should be asking is, "Well, how do I go about identifying these so-called worthless website parsing agents?"

That question is easily answered. There are quite literally hundreds of quality websites online that can help you get the information you need with regard to blocking worthless website parsing agents, otherwise known as "Bad Bots".

Here are a few great resources to get you started, along with a quick sketch of the basic technique after the list:
AskApache: How To Block Bots, Ban IP Addresses With .htaccess. The ultimate guide to writing advanced .htaccess files!

Comprehensive guide to .htaccess: Blocking bad bots and site rippers (aka offline browsers).

Webmasterworld: Search Engine Spider and User Agent Identification.
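As promised, here is the basic idea those guides elaborate on: matching the User-Agent header against known bad bots and denying the request. This is only a hedged sketch; the bot names shown are a handful of commonly cited offenders, not a complete or current list, and the guides above maintain far more thorough ones.

```apache
# Flag requests whose User-Agent matches a known bad bot
# (these names are illustrative; see the guides above for fuller lists)
SetEnvIfNoCase User-Agent "HTTrack" bad_bot
SetEnvIfNoCase User-Agent "EmailCollector" bad_bot
SetEnvIfNoCase User-Agent "WebZIP" bad_bot

# Deny anything that was flagged, allow everyone else
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

Keep in mind that user-agent strings are trivially forged, so treat this as a first filter rather than a guarantee; the worst repeat offenders are better caught by IP address, as in the earlier example.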
—
If you have any resources of your own that deal with blocking unwanted traffic, bad bots, or other worthless parsing agents, you are welcome to post them here in the comment section.