* Generate /robots.txt dynamically.
* Include link to sitemap.
* Update list of allowed URLs.
* Disallow crawling of non-canonical subdomains.
class RobotsController < ApplicationController
  respond_to :text

  def index
    # Implicit render of app/views/robots/index.text.erb: the
    # robots.txt body (sitemap link, allow/disallow rules) lives in
    # the view, so it can vary with the request host.
  end
end
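For context, a minimal sketch of the wiring this controller relies on. The route and the text template below are assumptions based on standard Rails conventions, not part of this commit; in particular, canonical_host? is a hypothetical helper standing in for whatever check distinguishes the canonical domain from non-canonical subdomains, and the sitemap path is assumed to be /sitemap.xml.

# config/routes.rb -- assumed routing (not shown in this commit):
# serve /robots.txt through the controller, defaulting to text format.
get "/robots.txt" => "robots#index", defaults: { format: "text" }

<%# app/views/robots/index.text.erb -- hypothetical template contents. %>
<%# canonical_host? is an assumed helper; replace with your own check. %>
<% if canonical_host?(request.host) %>
User-agent: *
Allow: /
Sitemap: <%= root_url %>sitemap.xml
<% else %>
<%# Non-canonical subdomains get a blanket Disallow, so crawlers only index the canonical host. %>
User-agent: *
Disallow: /
<% end %>

Branching on the request host inside the view is what makes the dynamic approach pay off: a single deploy serves a permissive robots.txt on the canonical domain and a fully disallowed one on every other subdomain, which a static public/robots.txt cannot do.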