how to detect spiders/web crawlers

In previous posts, I’ve written about the techniques one could use to perform web scraping. I feel it’s equally important that developers know how to detect spiders and how to restrict them.

I think the StackOverflow question “How do you stop scripters from slamming your website hundreds of times a second?” compiles the best information on this topic. You can read the whole thing here.
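Two of the simplest detection techniques discussed in answers like that one are inspecting the User-Agent header for known crawler signatures and rate-limiting requests per IP. The sketch below illustrates both ideas in plain Python; the signature list and the thresholds are illustrative assumptions, not a definitive rule set, and real crawlers can trivially spoof their User-Agent, so rate limiting is the more robust of the two.

```python
import time
from collections import defaultdict, deque

# Illustrative, non-exhaustive substrings seen in the User-Agent
# header of well-known crawlers (an assumption for this sketch).
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "crawler", "spider", "bot")

def looks_like_bot(user_agent: str) -> bool:
    """Heuristic: flag requests whose User-Agent contains a known bot string."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

class RateLimiter:
    """Sliding-window limiter: allow at most max_requests per
    window_seconds for each client IP."""

    def __init__(self, max_requests: int = 10, window_seconds: float = 1.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Evict timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # too many requests in the window: likely a script
        q.append(now)
        return True
```

A request handler would typically combine the two: block or challenge (e.g. with a CAPTCHA) any client that fails `allow()`, and serve `robots.txt`-respecting crawlers identified by `looks_like_bot()` from a cache.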
