Fix bug reported in forum #182766:
The Download button on the posts page does not respect the "Disable
tagged filenames" user setting: tags are included in the filename when
clicking Download, even when the setting is set to Yes. Right click ->
Save As on the image still respects the setting.
* Add a `DiscordSlashCommand.register_slash_commands!` method to register
all slash commands with the Discord API.
* Allow registering global commands.
* Refactor slash commands to use class attributes for the command
name, description, and options (see the sketch below).
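Roughly, the new pattern looks like the following sketch. The subclass,
option shapes, and endpoint handling here are illustrative, not the
exact Danbooru code:

    require "active_support/core_ext/class/attribute"
    require "active_support/core_ext/class/subclasses"
    require "net/http"
    require "json"
    require "uri"

    class DiscordSlashCommand
      # Each subclass declares its metadata as class attributes. Note that
      # :name shadows Class#name here; the real implementation may handle
      # that differently.
      class_attribute :name, :description, :options

      # Register every subclass as a global command with the Discord API.
      def self.register_slash_commands!(application_id:, bot_token:)
        url = URI("https://discord.com/api/v10/applications/#{application_id}/commands")
        headers = { "Authorization" => "Bot #{bot_token}", "Content-Type" => "application/json" }

        subclasses.each do |command|
          body = { name: command.name, description: command.description, options: command.options }
          Net::HTTP.post(url, body.to_json, headers)
        end
      end
    end

    # A hypothetical command, for illustration.
    class CountCommand < DiscordSlashCommand
      self.name = "count"
      self.description = "Count posts matching a tag search"
      self.options = [{ name: "tags", description: "The tags to search", type: 3 }] # type 3 = string
    end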
Always store original files in `public/data/original` instead of directly in
`public/data`. Previously this was optional and defaulted to off.
Downstream boorus will need to either move all images in the
`public/data` directory to `public/data/original`, or symlink the
`public/data/original` directory to the toplevel `public/data` directory:
    ln -s . /path/to/danbooru/public/data/original
This is to simplify the file layout. The option existed because, in the
past, we stored original files in different locations on different
servers (for no particular reason).
Generate image URLs relative to the site's canonical URL instead of
relative to the domain of the current request.
This means that all subdomains of Danbooru - safebooru.donmai.us,
shima.donmai.us, saitou.donmai.us, and kagamihara.donmai.us - will use
image URLs from https://danbooru.donmai.us, instead of from the current
domain.
The main reason we did this before was so that we could generate either
http:// or https:// image URLs, depending on whether the current request
was HTTP or HTTPS, back when we tried to support both at the same time.
Now we support only HTTPS in production, so there's no need for this. It
was also pretty hacky, since it required storing the URL of the current
request in a per-request global variable in `CurrentUser`.
This also improves caching slightly, since users of safebooru.donmai.us
will receive cached images from danbooru.donmai.us.
Downstream boorus should make sure that the `canonical_url` and
`storage_manager` config options are set correctly. If you don't support
https:// in development, make sure the `canonical_url` option uses
http:// instead of https://.
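For example, a sketch of a local config override, assuming the usual
`config/danbooru_local_config.rb` pattern (the exact `StorageManager::Local`
constructor options may differ between Danbooru versions):

    # config/danbooru_local_config.rb
    module Danbooru
      class CustomConfiguration < Configuration
        def canonical_url
          # Use an http:// URL if your development setup doesn't support https://.
          "http://localhost:3000"
        end

        def storage_manager
          # Store files locally and generate image URLs from the canonical URL.
          StorageManager::Local.new(
            base_url: "#{canonical_url}/data",
            base_dir: Rails.root.join("public/data").to_s
          )
        end
      end
    end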
Remove a monkey patch that added a `to_i` method to `FalseClass` so that
`false.to_i` returned 0. This is legacy code that shouldn't still be in
use anywhere. It doesn't really work anyway, because `true.to_i` isn't
defined.
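For reference, the removed patch amounted to this, reconstructed from
the description above:

    # FalseClass gained #to_i, but TrueClass never got a matching method,
    # so `true.to_i` still raised NoMethodError -- half a solution at best.
    class FalseClass
      def to_i
        0
      end
    end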
Add ngrok.io (plus a few more domains) to the hostname whitelist so that
it can be used as a hostname in development. Useful for testing
webhooks.
* https://ngrok.com
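In a standard Rails app this kind of whitelisting is a one-line config
change; a sketch (the exact set of domains added is illustrative):

    # config/environments/development.rb
    Rails.application.configure do
      # A leading dot allows ngrok.io and all of its subdomains, e.g. the
      # abcd1234.ngrok.io hostnames that ngrok assigns to each tunnel.
      config.hosts << ".ngrok.io"
    end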
Changes to bans:
* Replace the `expires_at` field with a `duration` field.
* Make moderators choose from a fixed set of standard ban lengths,
instead of allowing arbitrary ban lengths.
* List `duration` in seconds in the /bans.json API.
* Dump bans to BigQuery.
Note that some old bans have a negative duration. Their expiration dates
predate their creation dates because bans were migrated to Danbooru 2 in
2013 and the original ban creation dates were lost.
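A sketch of the fixed-duration approach (the set of lengths and the
class shape are illustrative):

    require "active_support/time"

    class Ban
      # A hypothetical fixed set of ban lengths for moderators to choose from.
      DURATIONS = [1.day, 3.days, 1.week, 1.month, 6.months, 100.years]

      attr_reader :duration, :created_at

      def initialize(duration:, created_at: Time.now)
        raise ArgumentError, "invalid ban length" unless DURATIONS.include?(duration)
        @duration = duration
        @created_at = created_at
      end

      # /bans.json reports the duration in seconds.
      def duration_in_seconds
        duration.to_i
      end

      def expires_at
        created_at + duration
      end
    end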
Remove the `category_name` field from the `/wiki_pages.json` API. This
field was originally added only because our autocomplete Javascript
needed it. It was also misnamed: it wasn't the tag's category name, it
was the category's ID.
Users should use `https://danbooru.donmai.us/wiki_pages.json?only=title,tag`
instead if they need this.
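For example, a rough sketch of looking up a tag's category through that
endpoint. The nested `tag` object and its integer `category` field are
assumed from the current API shape; verify against a live response:

    require "net/http"
    require "json"
    require "uri"

    # search[title]=touhou, with the square brackets percent-encoded.
    url = URI("https://danbooru.donmai.us/wiki_pages.json?search%5Btitle%5D=touhou&only=title,tag")
    pages = JSON.parse(Net::HTTP.get(url))

    pages.each do |page|
      # The category is an integer ID (e.g. 0 = general, 1 = artist).
      puts "#{page["title"]}: category #{page.dig("tag", "category")}"
    end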
The `category_name` field triggered an N+1 query pattern when dumping
wiki pages to BigQuery, which made the dump very slow. It also meant the
field was included in the database dump, even though it wasn't a real
database column.
Prevent saved searches, news updates, and ip bans from being publicly
dumped to BigQuery. These models didn't override the `visible` method to
restrict what anonymous users could see.
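The fix, sketched below; the `visible` method name follows the existing
convention, while the permission check is illustrative:

    class IpBan < ApplicationRecord
      # Restrict visibility so the public BigQuery dump, which runs as the
      # anonymous user, sees an empty relation instead of every row.
      def self.visible(user)
        user.is_moderator? ? all : none
      end
    end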
* Export daily public database dumps to BigQuery and Google Cloud Storage.
* Only data visible to anonymous users is exported. Some tables have
null or missing fields because of this.
* The bans table is excluded because some bans have an `expires_at`
timestamp set beyond the year 9999, which BigQuery doesn't support.
* The favorites table is excluded because it's too slow to dump (it
doesn't have an id index, which `find_each` needs; see the sketch after
the links).
* Version tables are excluded because dumping them every day is
inefficient; streaming inserts should be used instead.
Links:
* https://console.cloud.google.com/bigquery?project=danbooru1
* https://console.cloud.google.com/storage/browser/danbooru_public
* https://storage.googleapis.com/danbooru_public/data/posts.json
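The dump loop itself is roughly the following, run inside the Danbooru
app (the model, scope, and file names are illustrative, and the upload
to GCS/BigQuery is elided):

    require "json"

    # Page through the table in primary-key order with find_each -- this is
    # why tables without an id index, like favorites, are too slow to dump --
    # and write newline-delimited JSON, the format BigQuery loads.
    File.open("posts.json", "w") do |file|
      Post.visible(User.anonymous).find_each(batch_size: 5_000) do |post|
        file.puts(post.as_json.to_json)
      end
    end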
Set maximum lengths for user-submitted text:
* Max comment length: 15,000 characters.
* Max forum post length: 200,000 characters.
* Max forum topic title length: 200 characters.
* Max dmail length: 50,000 characters.
* Max dmail title length: 200 characters.
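These limits correspond to standard Rails length validations, roughly as
follows (the model and column names are assumed):

    class Comment < ApplicationRecord
      validates :body, length: { maximum: 15_000 }
    end

    class ForumPost < ApplicationRecord
      validates :body, length: { maximum: 200_000 }
    end

    class ForumTopic < ApplicationRecord
      validates :title, length: { maximum: 200 }
    end

    class Dmail < ApplicationRecord
      validates :body, length: { maximum: 50_000 }
      validates :title, length: { maximum: 200 }
    end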