- A robot may not index staging servers
- A robot must obey the sitemap
- A robot may not injure SEO or, through inaction, cause SEO to come to harm.
Add this line to your application's Gemfile:

    gem 'laws_of_robots_txt'

And then execute:

    $ bundle
Be sure to remove `public/robots.txt`, if it exists.
LawsOfRobotsTxt installs a Rack middleware into your application which renders
a different `/robots.txt` based on the request's domain. It looks in
`config/robots/` for a file named `<DOMAIN>.txt`, for example,
`config/robots/www.example.com.txt`. A server restart is required to pick up
changes.
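For example, to serve a permissive robots.txt for a domain (here the hypothetical `www.example.com`), you could create a domain file like this:

```shell
# Hypothetical setup: create a per-domain robots.txt for www.example.com.
# An empty Disallow value permits all crawling on this domain.
mkdir -p config/robots
cat > config/robots/www.example.com.txt <<'EOF'
User-Agent: *
Disallow:
EOF
```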
If no file exists for the request's domain, it renders a default:

    # Robots not allowed on this domain
    User-Agent: *
    Disallow: /
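The lookup-and-fallback behavior described above can be sketched as a small Rack-style middleware. This is an illustrative approximation, not the gem's actual source; the class name `DomainRobotsTxt` and the `robots_dir` option are made up for the example.

```ruby
# Sketch of a Rack middleware that serves a per-domain robots.txt
# from config/robots/, falling back to a deny-all default.
class DomainRobotsTxt
  DEFAULT = "# Robots not allowed on this domain\nUser-Agent: *\nDisallow: /\n"

  def initialize(app, robots_dir: "config/robots")
    @app = app
    @robots_dir = robots_dir
  end

  def call(env)
    # Only intercept requests for /robots.txt; pass everything else through.
    return @app.call(env) unless env["PATH_INFO"] == "/robots.txt"

    # Look up <DOMAIN>.txt by the request's Host header.
    path = File.join(@robots_dir, "#{env['HTTP_HOST']}.txt")
    body = File.exist?(path) ? File.read(path) : DEFAULT
    [200, { "Content-Type" => "text/plain" }, [body]]
  end
end
```

With plain Rack you would wire such a middleware up with `use DomainRobotsTxt` in `config.ru`; the gem presumably installs its own middleware for you.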
- Fork it ( http://github.com/freerunningtech/laws_of_robots_txt/fork )
- Create your feature branch (`git checkout -b my-new-feature`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create a new Pull Request