How do you dynamically change the robots.txt file in a load balanced environment?
It looks like we will need to start load balancing our web servers soon.
We have a feature that lets the client dynamically edit robots.txt, which is not a problem on a single host. Once we're behind a load balancer, though, it sounds like I will need to copy the file out to the other host(s) whenever it changes.
This sounds extremely fragile. How would you handle this situation?
I've already allowed the client to edit the "robots" meta tags, which (IMO) should effectively accomplish what he wants from editing robots.txt, but I don't know SEO very well.
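For reference, the page-level equivalent the client can already edit looks like this (a standard robots meta tag; `noindex, nofollow` is just one example directive):

```html
<!-- Per-page alternative to a robots.txt Disallow: asks crawlers
     not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

One thing the meta tag can't replicate is preventing the crawler from fetching the page at all, since it has to download the page before it can see the tag.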
Maybe there is a completely different way to handle this?
UPDATE
Looks like we'll save it to S3 for now, with memcached sitting in front of it...
WHERE WE ARE NOW
We use Merb. I mapped the route to our robots.txt like this:
match('/robots.txt').to(:controller => 'welcome', :action => 'robots')
then the relevant code looks like this:
def robots
  @cache = MMCACHE.clone
  begin
    # try memcached first
    robot = @cache.get("/robots/robots.txt")
  rescue
    # cache miss (or cache error): fall back to S3 and repopulate the cache
    robot = S3.get('robots', "robots.txt")
    @cache.set("/robots/robots.txt", robot, 0)
  end
  @cache.quit
  return robot
end
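The read-through pattern above (and the write side, which isn't shown) can be sketched in a self-contained way. In this sketch plain hashes stand in for memcached and S3, and `update_robots` is a hypothetical name for the edit action; the real code would call `@cache.get`/`@cache.set` and the S3 client instead:

```ruby
CACHE = {}  # stands in for memcached (per-host, fast)
STORE = { "robots.txt" => "User-agent: *\nDisallow:\n" }  # stands in for S3 (shared)

def robots
  key = "/robots/robots.txt"
  CACHE.fetch(key) do
    # cache miss: fall back to the shared store and repopulate the cache
    robot = STORE.fetch("robots.txt")
    CACHE[key] = robot
    robot
  end
end

def update_robots(content)
  # The edit action: write to the shared store first, then refresh the
  # local cache; other hosts pick up the change on their next cache miss.
  STORE["robots.txt"] = content
  CACHE["/robots/robots.txt"] = content
end
```

Because every host falls back to the same S3 object, no file needs to be copied between hosts; the cache only needs an expiry (or explicit invalidation) so stale copies age out.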