Fix an exception on the error page when a controller index action raised
a PG::AmbiguousColumn error because the model's `search` method generated
SQL with an ambiguous column reference. In this case the error page
tried to generate data attributes for the <body> tag, but this failed
because evaluating `current_item` raised the exception again.
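A minimal sketch of the kind of defensive handling this implies, assuming a
layout helper that builds the <body> data attributes from `current_item`;
the helper name and attribute keys here are hypothetical:

```ruby
# Hypothetical layout helper; names and attribute keys are illustrative.
def body_data_attributes
  item = begin
    current_item
  rescue StandardError # e.g. PG::AmbiguousColumn raised again by the search
    nil
  end

  item ? { "data-#{item.model_name.param_key}-id" => item.id } : {}
end
```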
Fix `Cannot write log file 'ffmpeg2pass-0.log' for pass-1 encoding: Permission denied` error
when uploading ugoira files. This happened because 2-pass encoding writes
its log file to the current working directory by default, which fails in
production because the Docker image's default working directory, /danbooru,
is read-only.
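A minimal sketch of redirecting the log file with ffmpeg's `-passlogfile`
option; the encoding flags are illustrative, not Danbooru's exact settings:

```ruby
require "tmpdir"
require "shellwords"

# Sketch: run 2-pass encoding with the pass log written to a temp directory
# instead of the read-only working directory.
def encode_webm(input, output)
  Dir.mktmpdir do |dir|
    passlog = Shellwords.escape(File.join(dir, "ffmpeg2pass"))
    common = "-i #{Shellwords.escape(input)} -c:v libvpx-vp9 -b:v 5M -passlogfile #{passlog}"

    system("ffmpeg #{common} -pass 1 -an -f null /dev/null") or raise "pass 1 failed"
    system("ffmpeg #{common} -pass 2 #{Shellwords.escape(output)}") or raise "pass 2 failed"
  end
end
```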
Add a limit so that users can't create new uploads if they already have
more than 250 images queued for upload.
For example, if you upload a Pixiv post that has 200 images, then you'll
have 200 queued images for upload. This will go down as the images are
processed. If you exceed the limit, then trying to create new uploads
will return an error.
This prevents a single user from overwhelming the site by uploading too
many images at once, which would back up the job queue and block other
users' uploads until the existing ones finished.
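A minimal sketch of how such a limit could be enforced as a create-time
validation; the model and association names below are assumptions, and only
the 250 limit comes from the change itself:

```ruby
class Upload < ApplicationRecord
  MAX_QUEUED_IMAGES = 250

  validate :validate_queued_image_limit, on: :create

  # Sketch: count the uploader's images that are still waiting to be
  # processed and reject the new upload if there are too many.
  def validate_queued_image_limit
    queued = UploadMediaAsset.joins(:upload)
                             .where(uploads: { uploader_id: uploader_id })
                             .where(status: "pending")
                             .count

    if queued > MAX_QUEUED_IMAGES
      errors.add(:base, "You already have #{queued} images queued for upload; wait for them to finish")
    end
  end
end
```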
Fix it so that upvoting or downvoting a revealed thresholded comment
doesn't hide it again.
The fix is to explicitly store a `data-show-thresholded` flag on the
comment, instead of manually hiding elements with jQuery, and to morph
the comment HTML instead of replacing it so that the state isn't lost
after voting. Alpine.js is used for this, which isn't strictly necessary,
but is useful to test the library before adopting it on a wider scale.
https://alpinejs.dev/start-here
Also fixes the uploader uploading all images when trying to upload only a
single image from a multi-image work. This was caused by `image_urls`
incorrectly returning every image in the work when the source strategy was
given the URL of a single image.
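A rough sketch of the kind of check this implies, with hypothetical method
names; the idea is to return just the referenced image when the given URL
points at a single image rather than the whole work:

```ruby
# Hypothetical sketch inside a source strategy; method names are illustrative.
def image_urls
  if single_image_url?(url)
    [url]              # the user gave us one specific image
  else
    work_image_urls    # otherwise return every image in the work
  end
end
```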
Add `#basename`, `#filename`, and `#file_ext` utility methods to
Danbooru::URL and change a few places to use them. Simplifies parsing
filenames in source URLs in various places.
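A sketch of what these helpers might look like, assuming the class exposes
the URL path; the real implementation may differ in details:

```ruby
module Danbooru
  class URL
    # "https://example.com/images/foo.jpg" => "foo.jpg"
    def basename
      path&.split("/")&.last
    end

    # "https://example.com/images/foo.jpg" => "foo"
    def filename
      basename&.sub(/\.[[:alnum:]]+\z/, "")
    end

    # "https://example.com/images/foo.jpg" => "jpg"
    def file_ext
      basename[/\.([[:alnum:]]+)\z/, 1] if basename
    end
  end
end
```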
Introduce a Source::URL class for parsing URLs from source sites. Refactor the Twitter
source strategy to use it.
This is the first step towards factoring all the URL parsing logic out of source
strategies and moving it to subclasses of Source::URL. Each site will have a subclass
of Source::URL dedicated to parsing URLs from that site. Source strategies will use
these classes to extract information from URLs.
This is to simplify source strategies. Most sites have many different URL formats we have
to parse or rewrite, and handling all these different cases tends to make source
strategies very complex. Isolating the URL parsing logic from the site scraping logic
should make source strategies easier to maintain.
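A simplified sketch of what a per-site subclass could look like, using
Twitter as the example; the regex and method names are illustrative rather
than the actual API:

```ruby
# Illustrative sketch, not the real Source::URL interface.
module Source
  class URL
    class Twitter < Source::URL
      attr_reader :username, :status_id

      # Does this URL belong to Twitter?
      def self.match?(url)
        %w[twitter.com www.twitter.com mobile.twitter.com pbs.twimg.com].include?(url.host)
      end

      # Extract the username and tweet ID from URLs like
      # https://twitter.com/username/status/1234567890
      def parse
        if path =~ %r{\A/(?<username>\w+)/status/(?<status_id>\d+)}
          @username = $~[:username]
          @status_id = $~[:status_id]
        end
      end
    end
  end
end
```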
Fix certain ugoiras having very low quality webm samples. This was
because we had a target bitrate of 5 Mbps, but this wasn't enough for
videos that were high resolution or that had choppy, hard-to-compress
motion, such as post 5081776 (nsfw).
NicoSeiga changed it so that on every login, you must enter a 2FA code
sent by email. This broke the NicoSeiga strategy. The fix is to just use
a static session cookie instead (and hope it doesn't expire, and isn't
tied to an IP).
The `nico_seiga_login` and `nico_seiga_password` config settings have
been removed from config/danbooru_default_config.rb and replaced by
`nico_seiga_user_session`. If you run your own Danbooru instance, you
will have to update your config file manually.
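For self-hosted instances, the change amounts to something like this in the
local config; the class and file follow the sample local config, and the
cookie value is a placeholder you take from a logged-in NicoSeiga browser
session:

```ruby
# config/danbooru_local_config.rb (sketch)
module Danbooru
  class CustomConfiguration < Configuration
    # Removed settings:
    # def nico_seiga_login; ...; end
    # def nico_seiga_password; ...; end

    # New setting: the value of the session cookie from a logged-in
    # NicoSeiga account.
    def nico_seiga_user_session
      "your-session-cookie-value"
    end
  end
end
```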
* Remove unnecessary trailing slashes when artist URLs are saved.
* Automatically add `http://` to new artist URLs if it's missing (before
this was an error; now it's fixed automatically). A sketch of both
normalizations follows below.
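A minimal sketch of the normalization (hypothetical method, not the exact
code):

```ruby
# Hypothetical normalization applied when an artist URL is saved.
def normalize_artist_url(url)
  url = url.strip
  url = "http://#{url}" unless url =~ %r{\Ahttps?://}i  # add a missing scheme
  url.sub(%r{/+\z}, "")                                 # drop trailing slashes
end

normalize_artist_url("example.com/artist/")  # => "http://example.com/artist"
```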
Introduce a Danbooru::URL class for dealing with URLs. This is a wrapper
around Addressable::URI that adds some additional helper methods. Most
significantly, the `parse` method only allows valid http/https URLs, and
it returns nil instead of raising an exception when the URL is invalid.
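A sketch of the parse behavior, assuming Addressable is available; the real
class carries more helpers than shown here:

```ruby
require "addressable/uri"

module Danbooru
  class URL
    attr_reader :url

    # Sketch: return a Danbooru::URL for valid http(s) URLs, nil otherwise.
    def self.parse(string)
      uri = Addressable::URI.parse(string)
      return nil unless uri&.scheme&.match?(/\Ahttps?\z/i)
      new(uri)
    rescue Addressable::URI::InvalidURIError
      nil
    end

    def initialize(uri)
      @url = uri
    end
  end
end

Danbooru::URL.parse("https://example.com/a.jpg") # => #<Danbooru::URL ...>
Danbooru::URL.parse("not a URL")                 # => nil
```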
Fixes a bug where the Foundation source strategy failed because http.rb
automatically sent a `Content-Length: 0` header with all GET requests,
which caused Foundation to return a 400 Bad Request error. This behavior
was fixed in http.rb 5.x.
http.rb 5.x has a breaking change where it now includes the request object
inside the response object, which we have to handle in a few places.
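A small sketch of the new shape, under the assumption that the request is
exposed as a `request` reader on the response:

```ruby
require "http"

# Sketch of the http.rb 5.x change described above (reader name assumed).
response = HTTP.get("https://example.com/image.jpg")
response.request       # the request that produced this response
response.request.uri   # handy when following redirects or logging
```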
Allow uploading multiple files from your computer at once.
The maximum is 100 files per upload. There is still a 50MB size limit
that applies to the whole upload, enforced at the Nginx level.
The upload widget no longer shows a thumbnail preview of the uploaded
file. This is because there isn't room for it in a multi-file upload,
and because the next page will show a preview anyway after the files are
uploaded.
Direct file uploads are processed synchronously, so they may be slow.
API change: the `POST /uploads` endpoint now expects the param to be
`upload[files][]`, not `upload[file]`.
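On the server side, the new parameter shape looks roughly like this; a
sketch using Rails strong parameters, with any fields other than `files`
being assumptions:

```ruby
# POST /uploads now takes upload[files][] instead of upload[file].
def upload_params
  params.require(:upload).permit(:source, :referer_url, files: [])
end
```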
Use a spinner icon instead of the word "Loading" for thumbnails that are
being processed in the background in a batch upload.
Also use morphdom to update thumbnails so we only update the parts of
the DOM that actually changed.
Add data attributes to thumbnails on the /uploads, /upload_media_assets,
and /media_assets pages. Add a `data-is-posted` attribute for styling
thumbnails based on whether they've already been posted.
Followup to 093a808a3. Using a NOT EXISTS clause is much faster than the
`LEFT OUTER JOIN posts WHERE posts.id IS NULL` clause generated by
`.where.missing(:post)`.
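For comparison, a sketch of the two query shapes; the model name and join
condition are assumptions, so the real SQL may differ:

```ruby
# .where.missing(:post) generates roughly:
#   SELECT media_assets.* FROM media_assets
#   LEFT OUTER JOIN posts ON posts.md5 = media_assets.md5
#   WHERE posts.id IS NULL
MediaAsset.where.missing(:post)

# The NOT EXISTS form used here instead:
MediaAsset.where("NOT EXISTS (SELECT 1 FROM posts WHERE posts.md5 = media_assets.md5)")
```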
Change the loading indicator from a progress bar to a spinner. Fixes
issue with the <progress> element having a different appearance on
different browsers.