robots.txt: disallow crawling version pages.

Some of these pages were accidentally crawled because rules like
`Allow: /artist` also allowed `/artist_versions` to be crawled
(Allow rules are prefix matches).
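The prefix-match pitfall can be demonstrated with Python's standard-library `urllib.robotparser`. This is a minimal sketch with a hypothetical two-rule robots.txt, not the site's actual file; note that Python's parser applies rules in file order, while longest-match parsers (e.g. Googlebot, per RFC 9309) pick the most specific rule regardless of order, so placing the explicit `Disallow` before the broad `Allow` satisfies both semantics.

```python
from urllib import robotparser

# Buggy version: a bare Allow whose path is a prefix of a page
# we do NOT want crawled.
buggy = """
User-agent: *
Allow: /artist
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(buggy)
# "/artist_versions" starts with "/artist", so the Allow rule matches it too.
print(rp.can_fetch("*", "/artist_versions"))  # True

# Fixed version: an explicit Disallow for the version pages, listed
# before the broad Allow so order-based parsers see it first.
fixed = """
User-agent: *
Disallow: /artist_versions
Allow: /artist
""".splitlines()

rp2 = robotparser.RobotFileParser()
rp2.parse(fixed)
print(rp2.can_fetch("*", "/artist_versions"))  # False
print(rp2.can_fetch("*", "/artist"))           # True
```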
commit 30b7345900
parent 3f6e7ff6b5
Author: evazion
Date:   2021-02-07 22:30:10 -06:00


@@ -6,6 +6,14 @@ Allow: /$
 Disallow: /*.atom
 Disallow: /*.json
 Disallow: <%= artist_urls_path %>
+Disallow: <%= artist_versions_path %>
+Disallow: <%= artist_commentary_versions_path %>
+Disallow: <%= note_versions_path %>
+Disallow: <%= post_versions_path %>
+Disallow: <%= pool_versions_path %>
+Disallow: <%= wiki_page_versions_path %>
 Allow: <%= artists_path %>
 Allow: <%= artist_commentaries_path %>
 Allow: <%= comments_path %>