Cache /robots.txt at the HTTP level: it rarely changes, but bots request it frequently.
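As a minimal sketch, assuming the endpoint is served by a Flask app (any framework with response-header control works the same way), the caching can be expressed as a `Cache-Control` header so bots and CDNs reuse the response instead of hitting the application; the max-age value is illustrative:

```python
from flask import Flask, Response

app = Flask(__name__)

ROBOTS_TXT = "User-agent: *\nAllow: /\n"

@app.route("/robots.txt")
def robots():
    resp = Response(ROBOTS_TXT, mimetype="text/plain")
    # Allow shared caches (CDN, proxies) to hold the response for a day,
    # so repeated bot requests rarely reach the application.
    resp.headers["Cache-Control"] = "public, max-age=86400"
    return resp
```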
* Generate /robots.txt dynamically (see the sketch after this list).
* Include a link to the sitemap.
* Update the list of allowed URLs.
* Disallow crawling of non-canonical subdomains.
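Extending the sketch above, a host check covers the remaining items: serve the allow rules and the sitemap link only on the canonical host, and disallow everything on other subdomains. `CANONICAL_HOST` and the sitemap path are placeholders, not values from this site:

```python
from flask import Flask, Response, request

app = Flask(__name__)

# Placeholder: replace with the site's real canonical host.
CANONICAL_HOST = "www.example.com"

@app.route("/robots.txt")
def robots():
    if request.host == CANONICAL_HOST:
        # Canonical host: allow crawling and point bots at the sitemap.
        body = (
            "User-agent: *\n"
            "Allow: /\n"
            f"Sitemap: https://{CANONICAL_HOST}/sitemap.xml\n"
        )
    else:
        # Non-canonical subdomains (staging, previews, etc.): block crawling.
        body = "User-agent: *\nDisallow: /\n"
    resp = Response(body, mimetype="text/plain")
    resp.headers["Cache-Control"] = "public, max-age=86400"
    return resp
```

Because the body is built per request, updating the allowed URLs or the sitemap location only requires changing the generation logic, not redeploying a static file.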