Fix StatementInvalid exception when uploading https://files.catbox.moe/vxoe2p.mp4.
This was a result of multiple bugs:
* First, generating thumbnails for the video failed. The video uses the
AV1 codec, which FFmpeg couldn't decode: our build of FFmpeg was missing
the `--enable-libdav1d` flag, so it falls back to the builtin AV1
decoder, which apparently can't handle this particular video (it spews
errors about "Failed to get pixel format", "missing sequence header",
and "failed to get reference frame").
* Because generating the thumbnails failed, an exception was raised. We
tried to save the error message in the `upload_media_assets.error`
field. However, this also failed: the error message was 77KB long (it
contained the entire output of the ffmpeg command), but the
`upload_media_assets` table had a btree index on the `error` column,
which limited the maximum length of the column to ~2.7KB. This led to
a StatementInvalid exception being raised.
* Because the StatementInvalid exception was raised while we were trying
to set the upload media asset's status to `failed`, the upload was
left stuck in the `processing` state rather than being set to the
`failed` state.
* Because the upload was stuck in the `processing` state, the upload
page would hang forever waiting for the upload to complete.
The fixes are to:
* Build FFmpeg with `--enable-libdav1d` to use libdav1d for decoding AV1
videos instead of the builtin AV1 decoder.
* Remove the index on the `upload_media_assets.error` column so that
setting overly long error messages won't fail.
* Catch unexpected exceptions in ProcessUploadMediaAssetJob so we can
mark uploads as failed even when `process_upload!` itself raises an
unexpected exception inside its own exception handler.
* Check that the video is playable with `MediaFile::Video#is_corrupt?`
before allowing it to be uploaded. This way we can return a better error
message when we can't generate thumbnails because the video isn't
playable. This requires decoding the entire video, so uploads of long
videos may take several seconds longer. It's also a security risk in
case ffmpeg has any bugs.
* Define `MediaAsset#preview!` as raising an exception on error, so
it's clear that generating thumbnails can fail. Define `MediaAsset#preview`
as returning nil on error, for when we don't care about the cause of
the error. Both this and the job-level rescue are sketched below.
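As a rough sketch of how the last two fixes fit together (class skeletons only; `generate_thumbnail` and the method bodies here are hypothetical, not the actual implementation):

```ruby
class MediaAsset < ApplicationRecord
  class ThumbnailError < StandardError; end

  # Raises on failure, making it explicit that generating thumbnails can fail.
  def preview!(width, height)
    generate_thumbnail(width, height) or raise ThumbnailError, "couldn't generate #{width}x#{height} preview"
  end

  # Returns nil on failure, for callers that don't care why it failed.
  def preview(width, height)
    preview!(width, height)
  rescue ThumbnailError
    nil
  end
end

class ProcessUploadMediaAssetJob < ApplicationJob
  def perform(upload_media_asset)
    upload_media_asset.process_upload!
  rescue Exception => error
    # Last-resort handler: even if process_upload! blows up inside its own
    # exception handler, mark the asset as failed so the upload doesn't sit
    # in the `processing` state forever. Truncate the message so a huge
    # ffmpeg log can't break this update too.
    upload_media_asset.update_columns(status: "failed", error: error.message.truncate(2000))
    raise
  end
end
```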
Add ability to upload .webp images.
Animated WebP images aren't supported. FFmpeg can't decode them yet[1],
so generating thumbnails and samples for them would be more complicated
than for other formats.
[1]: https://trac.ffmpeg.org/ticket/4907
Add ability to upload .avif images.
Features of AVIF include:
* Lossless and lossy compression.
* High dynamic range (HDR) images.
* Wide color gamut images (i.e. 10- and 12-bit color depths).
* Transparency (through alpha planes).
* Animations (with an optional cover image).
* Auxiliary image sequences, where the file contains a single primary
image and a short secondary video, like Apple's Live Photos.
* Metadata rotation, mirroring, and cropping.
The AVIF format is still relatively new and some of these features aren't well
supported by browsers or other software:
* Animated AVIFs aren't supported by Firefox or by libvips.
* HDR images aren't supported by Firefox.
* Rotated, mirrored, and cropped AVIFs aren't supported by Firefox or Chrome.
* Image grids, where the file contains multiple images that are tiled
together into one big image, aren't supported by Firefox.
* AVIF as a whole has only been supported for a year or two by Chrome
and Firefox, and less than a year by Safari.
For these reasons, only basic AVIFs that don't use animation, rotation,
cropping, or image grids can be uploaded.
Add bar charts for non-timeseries data. For example, a bar chart of the
top 10 uploaders overall in the last month, rather than a timeseries
chart of the number of uploads per day for the last month.
Add ability to group reports by various columns. For example, you can see
the posts by the top 10 uploaders over time, or posts grouped by rating
over time.
Split requests made by Danbooru::Http into either internal or external
requests. Internal requests are API calls to internal services run by
Danbooru. External requests are requests to external websites, for
example fetching sources or downloading files. External requests may use
an HTTP proxy if one is configured; internal requests don't.
This fixes a few source extractors that weren't using the HTTP proxy for certain API calls.
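A minimal sketch of the split, using the http.rb gem; the class structure and the `http_proxy` setting here are illustrative assumptions, not the actual `Danbooru::Http` interface:

```ruby
require "http"

module Danbooru
  class Http
    # External requests (fetching sources, downloading files) go through the
    # proxy if one is configured; internal API calls never do.
    def self.external
      new(proxy: Danbooru.config.http_proxy) # hypothetical setting, e.g. ["proxy.example.com", 8080]
    end

    def self.internal
      new(proxy: nil)
    end

    def initialize(proxy: nil)
      @proxy = proxy
    end

    def get(url)
      client = HTTP.headers("User-Agent" => "Danbooru")
      client = client.via(*@proxy) if @proxy # http.rb's proxy support
      client.get(url)
    end
  end
end
```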
Update email validation rules to disallow the percent character (e.g.
`foo%bar@gmail.com`) and names ending with a period (e.g. `foo.@gmail.com`).
Names ending with a period are invalid according to the RFCs and cause
`Mail::Address.new` to raise an exception.
The percent character is technically legal, but only one email used it
and it was probably a typo.
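For illustration, the two new rules amount to something like this check on the local part (a simplified sketch, not the actual validator):

```ruby
def acceptable_local_part?(address)
  local = address.to_s.split("@", 2).first.to_s

  return false if local.include?("%")   # technically legal, but almost always a typo
  return false if local.end_with?(".")  # invalid per the RFCs; breaks Mail::Address.new

  true
end

acceptable_local_part?("foo%bar@gmail.com") # => false
acceptable_local_part?("foo.@gmail.com")    # => false
acceptable_local_part?("foo.bar@gmail.com") # => true
```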
* Make it an error to supply empty API credentials, like this:
`https://danbooru.donmai.us/posts.json?login=&api_key=`. Some clients
did this for some reason.
* Make it so that the `login` and `api_key` params are only allowed as
URL params, not as POST or PUT body params. Allowing them as body
params could interfere with the `PUT /api_keys/:id` endpoint, which
takes an `api_key` param. Both rules are sketched below.
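A hedged sketch of both rules as they might look in a controller helper (the method name and error class are assumptions, not the actual code):

```ruby
def api_credentials
  # Read login/api_key only from the query string, never from the request
  # body, so they can't collide with endpoints like PUT /api_keys/:id that
  # take an api_key param of their own.
  login = request.query_parameters["login"]
  api_key = request.query_parameters["api_key"]

  # Supplying empty credentials (?login=&api_key=) is an error rather than
  # being silently treated as an anonymous request.
  raise ArgumentError, "login and api_key must not be blank" if login == "" || api_key == ""

  [login, api_key]
end
```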
When choosing the Open Graph image (the preview image shown when a
Danbooru link is posted on Discord or social media), choose the safest
image with the highest score, rather than the image with the highest
favcount.
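In effect, the selection rule is roughly the following (a sketch with illustrative names and rating codes, not the actual implementation):

```ruby
# Order ratings from safest to most explicit, then break ties by score.
RATING_ORDER = { "g" => 0, "s" => 1, "q" => 2, "e" => 3 }

def open_graph_image(posts)
  posts.min_by { |post| [RATING_ORDER.fetch(post.rating, RATING_ORDER.size), -post.score] }
end
```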
Try to automatically fix various kinds of typos and common mistakes in
email addresses when a user creates a new account. It's common for users
to sign up with addresses like `name@gmai.com`, which leads to bounces
when we try to send the welcome email.
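The autocorrection boils down to mapping commonly misspelled domains to their canonical form, roughly like this (the actual list and plumbing may differ):

```ruby
# Illustrative only; the real list of corrections is larger.
DOMAIN_TYPOS = {
  "gmai.com"    => "gmail.com",
  "gmial.com"   => "gmail.com",
  "gamil.com"   => "gmail.com",
  "hotmial.com" => "hotmail.com",
}

def correct_email_typos(address)
  name, domain = address.split("@", 2)
  "#{name}@#{DOMAIN_TYPOS.fetch(domain, domain)}"
end

correct_email_typos("name@gmai.com") # => "name@gmail.com"
```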
Add support for uploading posts from Gelbooru. Note that the translated
tags will include both the Gelbooru tags and the tags from the Gelbooru
post's source. The commentary and artist information will also be taken
from the Gelbooru post's source. The source of the Danbooru post,
however, will be left as the Gelbooru post itself, not as the Gelbooru post's source.
Remove the last remaining uses of the PixivUgoiraFrameData model. As of
32bfb8407, Ugoira frame data is now stored in the MediaMetadata model,
under the `Ugoira:FrameDelays` EXIF field.
The pixiv_ugoira_frame_data table still exists, but it can be removed
after this commit is deployed.
Fixes #5264: Error when replacing with ugoira.
Automatically add the AI-generated tag to posts that have the
`PNG:Software=NovelAI` EXIF attribute.
This is not foolproof because this metadata may get removed if an
AI-generated post is resaved or uploaded to a site that strips EXIF
metadata. It also only works for NovelAI. Currently it detects 29 out of
177 AI-generated uploads on Danbooru.
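The detection itself is a simple metadata check, something like this (the metadata accessors and the tag-adding helper are assumptions):

```ruby
def novel_ai_upload?(media_asset)
  media_asset.media_metadata.metadata["PNG:Software"] == "NovelAI"
end

# post.add_tag!("ai-generated") if novel_ai_upload?(post.media_asset)
```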
Store Ugoira frame delays in the MediaMetadata model as a fake EXIF
field instead of in the PixivUgoiraFrameData model. This way we can get
rid of the PixivUgoiraFrameData model completely. This is a step towards
fixing #5264.
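With this change, reading the frame delays back out looks roughly like this (the accessor names are assumptions):

```ruby
# Frame delays are stored under a fake EXIF field, e.g. [100, 100, 150] (in milliseconds).
def ugoira_frame_delays(media_asset)
  media_asset.media_metadata.metadata["Ugoira:FrameDelays"]
end
```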
Whenever the email address normalization procedure changes, the
`normalized_address` column of the email address table must be updated.
This normally happens when the list of canonical domain mappings changes.
Renormalizing addresses may also require deleting duplicates.
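The renormalization itself is a straightforward backfill, roughly like this (the `EmailAddress.normalize` helper is an assumption):

```ruby
# Backfill normalized_address; duplicates produced by the new normalization
# rules have to be resolved (e.g. deleted) separately.
EmailAddress.find_each do |email|
  email.update!(normalized_address: EmailAddress.normalize(email.address))
end
```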
* Fixed a bug where manga posts with a single tag would raise an error
* Fixed a bug where dic.nicovideo.jp/oekaki posts weren't uploadable due
to SSL issues
* Added support for more manga corner cases
Fix not being able to use the full set of search operators on polymorphic `model_id` and
`model_type` attributes. Previously, things like `search[model_type]=Post` worked, but
`search[model_type_not_eq]=Post` and other `model_type_*` operators didn't.
Log the following information in email headers:
* X-Danbooru-User: the user's name and ID.
* X-Danbooru-IP: the user's IP.
* X-Danbooru-Session: the user's session ID.
* X-Danbooru-URL: the page that triggered the email.
* X-Danbooru-Job-Id: the ID of the background job that sent the email.
* X-Danbooru-Enqueued-At: when the email was queued as a background job.
* X-Danbooru-Dmail: for Dmail notifications, the link to the Dmail.
* X-Request-Id: the request ID of the HTTP request that triggered the email.
Also make it so we log an event in the APM when we send an email.
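For illustration, headers like these are attached to the outgoing mail object, along the lines of the following (mailer, accessor, and helper names are assumptions, not the actual code):

```ruby
class UserMailer < ApplicationMailer
  def dmail_notice(dmail)
    headers["X-Danbooru-User"]  = "#{dmail.to.name} (##{dmail.to.id})"
    headers["X-Danbooru-Dmail"] = dmail_url(dmail) # assumed route helper

    mail(to: dmail.to.email_address, subject: "#{dmail.from.name} sent you a message")
  end
end
```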