danbooru/app/views
evazion 91587aeb6b robots.txt: block Googlebot from crawling certain useless URLs.
Block Googlebot from crawling certain slow, useless URLs. Sometimes
Googlebot tries to crawl old source:<url>, approver:<name>, and
ordfav:<name> searches in bulk, which tends to slow the site down:
searches like source:<url> are inherently slow, and Google spends
hours at a time crawling them in parallel. This happens even though
these links are already marked nofollow and noindex, and even though
source:<url> links were removed from posts long ago to try to stop
Google from crawling them.
2021-11-12 16:55:37 -06:00
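
As a rough illustration only (not the actual file contents of this commit),
rules of this kind in robots.txt would look roughly like the following. The
user-agent targeting and URL patterns are assumptions, based on these searches
being reached through /posts?tags=... query strings and on Googlebot honoring
"*" wildcards in Disallow rules:

    # Hypothetical sketch, not the real danbooru robots.txt: block the slow
    # search types for Googlebot while leaving ordinary tag searches crawlable.
    # In real crawled URLs the ":" may appear percent-encoded as %3A.
    User-agent: Googlebot
    Disallow: /posts?tags=*source:*
    Disallow: /posts?tags=*approver:*
    Disallow: /posts?tags=*ordfav:*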