Is it possible to force login based on IP address?
Have you considered using a robots.txt file to reduce unwanted traffic from automated spidering tools?
You can have multiple Disallow lines for each user agent (i.e. for each spider). Here is an example of a longer robots.txt file:
User-agent: *
Disallow: /images/
Disallow: /cgi-bin/
User-agent: Googlebot-Image
Disallow: /
Here is an example that disallows everything except Google:
User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /
Warning: this method does not guarantee that disallowed agents will stay off your site. It simply asks them to, in a standardized way that most well-behaved crawlers understand and honor; a malicious bot is free to ignore it.
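If you want to check how a compliant crawler would interpret rules like the ones above, a minimal sketch using Python's standard-library urllib.robotparser (the rules string here is just the second example from this answer):

```python
from urllib.robotparser import RobotFileParser

# The "everything disallowed except Googlebot" rules from above.
rules = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches its own group and is allowed everywhere.
print(rp.can_fetch("Googlebot", "/some/page"))

# Any other agent falls back to the "*" group and is disallowed.
print(rp.can_fetch("SomeOtherBot", "/some/page"))
```

This only models what a cooperating crawler does with the file; it does not enforce anything server-side.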