Update robots.txt.

* Generate /robots.txt dynamically.
* Include link to sitemap.
* Update list of allowed URLs.
* Disallow crawling of non-canonical subdomains.
app/controllers/robots_controller.rb (new file, +6)
@@ -0,0 +1,6 @@
+class RobotsController < ApplicationController
+  respond_to :text
+
+  def index
+  end
+end
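The `index` action added above is empty; in a Rails app the response body would presumably come from an `index.text.erb` view. As a plain-Ruby sketch of the behavior the commit message describes (dynamic generation, sitemap link, disallowing non-canonical subdomains), one might write something like the following. The host name, helper name, and sitemap path are illustrative assumptions, not part of the actual commit:

```ruby
# Hypothetical sketch of the dynamic robots.txt logic described in the
# commit message. CANONICAL_HOST and the sitemap URL are assumptions.
CANONICAL_HOST = "www.example.com"

def robots_txt(host)
  if host == CANONICAL_HOST
    # Canonical host: allow crawling and point crawlers at the sitemap.
    <<~ROBOTS
      User-agent: *
      Allow: /
      Sitemap: https://#{CANONICAL_HOST}/sitemap.xml
    ROBOTS
  else
    # Non-canonical subdomains: disallow all crawling.
    <<~ROBOTS
      User-agent: *
      Disallow: /
    ROBOTS
  end
end

puts robots_txt("www.example.com")
puts robots_txt("staging.example.com")
```

In the real controller, the equivalent branch would likely inspect `request.host` and render the appropriate text response.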