* Switch CloudflareService from HttpartyCache to Danbooru::Http.
* Purge cached urls from Cloudflare when a post is replaced and the md5
doesn't change. This happens when a corrupted image is replaced or
thumbnails are regenerated. Previously, we purged urls when a post was
expunged, which was unneeded because those urls can expire naturally.
It was also wrong because the subdomains were hardcoded, the urls used
http:// instead of https://, and we didn't account for tagged urls.
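The purge itself is a POST to Cloudflare's purge_cache endpoint. A minimal
sketch, assuming a configured zone id and API token (credential handling
and url expansion here are illustrative, not the actual CloudflareService
code):

    require "json"
    require "net/http"

    # Purge a replaced post's cached urls from Cloudflare. The url list is
    # built dynamically (all subdomains, https://, tagged urls) rather than
    # hardcoded.
    def purge_cached_urls!(urls, zone_id:, api_token:)
      uri = URI("https://api.cloudflare.com/client/v4/zones/#{zone_id}/purge_cache")

      request = Net::HTTP::Post.new(uri)
      request["Authorization"] = "Bearer #{api_token}"
      request["Content-Type"] = "application/json"
      request.body = { files: urls }.to_json

      Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
        http.request(request)
      end
    end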
The twitter gem had several problems:
* It's been unmaintained for over a year.
* It pulled in a lot of dependencies, many of which were outdated. In
particular, it locked the `http` gem to version 3.3, preventing us
from upgrading to 4.2.
* It raised exceptions on normal error conditions, such as deleted
tweets or suspended users, which we really don't want.
* We had to wrap it to provide caching.
Changes:
* Fixes #4226 (Exception when creating new artist entries for suspended
Twitter accounts)
* Drop support for scraping images from summary cards. Summary cards
are the previews you get when you link to a website in a tweet. These
preview images aren't always the best version of the image.
* B2 doesn't allow the path to start with a '/' character.
* When storing the file, we have to rewind the file pointer to make sure
we get the whole file.
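Together, the store step amounts to something like this sketch (the
`bucket.upload` call is an illustrative stand-in for the real storage
backend interface):

    # Store a file to B2. Two gotchas: B2 rejects paths with a leading '/',
    # and the file pointer may not be at the start after earlier processing.
    def store_file(file, path, bucket)
      path = path.delete_prefix("/") # B2 paths can't start with '/'
      file.rewind                    # reset the pointer so the whole file is uploaded
      bucket.upload(path, file)
    end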
Fix an error in operator precedence:
> sum + Tag.find_by_name(token[1]).try(:post_count) || 0
This was treated as `(sum + X) || 0` not `sum + (X || 0)` as intended.
This raised a TypeError when X was nil, since nil can't be added to an
integer.
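The fix is to parenthesize so the `|| 0` fallback applies to the
possibly-nil post count instead of the whole sum:

    # `sum + count || 0` parses as `(sum + count) || 0`, so a nil count
    # blows up before the fallback is ever reached. Parenthesizing fixes it:
    sum + (Tag.find_by_name(token[1]).try(:post_count) || 0)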
Filter out the user's own uploads and favorites from their
recommendations.
Note that in most cases a user's top-N recommendations will be things
they've already favorited. If a user has 10,000 favorites, most of their
top 10,000 recommendations will be their own favorites, so we have to
generate a little more than 10,000 recommendations to be sure they won't
all be filtered out.
In other words, the more favorites a user has, the more recommendations
we have to generate. The upper bound is clamped to 50,000 for
performance reasons. If a user has more favorites than this we may not
be able to find any recommendations for them.
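A sketch of the sizing and filtering logic, with assumed names
(`recommender`, `favorited?`) standing in for the real recommender
interface:

    MAX_RECOMMENDATIONS = 50_000

    # Over-generate candidates so the user's own uploads and favorites
    # can't exhaust the list, clamped to 50,000 for performance.
    def recommendations_for(user, n, recommender)
      limit = [n + user.favorite_count, MAX_RECOMMENDATIONS].min
      candidates = recommender.recommend(user, limit)

      candidates.reject { |post| post.uploader_id == user.id || user.favorited?(post) }
                .take(n)
    end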
* Open recommendations to all users (not just gold).
* Show recommendations on all posts (not just posts after 2017).
* Allow users to browse recommendations for other users.
* Increase number of recommended posts returned.
* Change endpoints to /recommended_posts?user_id=1234 and
/recommended_posts?post_id=1234 and add json/xml support.
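For example, fetching recommendations as JSON (the host is illustrative):

    require "json"
    require "net/http"

    # Recommendations for user 1234, via the new endpoint.
    json = Net::HTTP.get(URI("https://danbooru.donmai.us/recommended_posts.json?user_id=1234"))
    recommendations = JSON.parse(json)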
* Move the Curated pool updater from Reportbooru to Danbooru.
* Change the process for selecting curated posts. Previously it was
every post from the last week with at least three supervotes. This was
flawed because it included both super-upvotes and super-downvotes. Now
it's the top 100 posts from the last week, ordered from most super-upvoted
to least.
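A sketch of the new selection, with assumed model and column names (the
actual super-upvote threshold and vote schema may differ):

    # Top 100 posts from the last week by super-upvote count.
    def curated_post_ids
      PostVote.where("created_at >= ?", 1.week.ago)
              .where("score > 1")                  # super-upvotes only, not super-downvotes
              .group(:post_id)
              .order(Arel.sql("COUNT(*) DESC"))
              .limit(100)
              .pluck(:post_id)
    end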
Remove the "Remember" checkbox from the login page. Make session cookies
permanent instead. Phase out legacy `user_name` and `password_hash` cookies.
Previously a user's session cookies would be cleared whenever they
closed their browser window, which would log them out of the site. To
work around this, when the "Remember" box was checked on the login page
(which it was by default), the user's name and password hash (!) would
be stored in separate permanent cookies, which would be used to
automatically log the user back in when their session cookies were
cleared. We can avoid all of this just by making the session cookies
themselves permanent.
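With a Rails cookie store this comes down to giving the session cookie an
explicit expiry; a minimal sketch (key name and duration are
illustrative):

    # config/initializers/session_store.rb
    # An explicit expire_after makes the session cookie permanent instead
    # of being discarded when the browser closes.
    Rails.application.config.session_store :cookie_store,
      key: "_danbooru_session",
      expire_after: 20.years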
Don't track IP addresses for post appeals, post flags, tag aliases, tag
implications, or user feedbacks. These things are already tightly
limited. We don't need IPs from them to detect sockpuppets.
Add a new IP address search page at /ip_addresses. Replaces the old
search page at /moderator/ip_addrs.
On user profile pages, show the user's last known IP to mods. Also add
search links for finding other IPs or accounts associated with the user.
IP address search uses a big UNION ALL statement to merge IP addresses
across various tables into a single view. This makes searching easier,
but is known to time out in certain cases.
Fixes #4207 (the new IP search page supports searching by subnet).
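Schematically, the union behind the view looks like this sketch (only two
tables shown; column names are assumptions). Assuming the ip columns are
Postgres inet, subnet searches can use inet operators like `<<=`:

    # A trimmed-down version of the UNION ALL behind /ip_addresses.
    IP_ADDRESSES_SQL = <<~SQL
      SELECT 'Comment' AS model, id AS model_id, creator_id AS user_id,
             creator_ip_addr AS ip_addr, created_at
      FROM comments
      UNION ALL
      SELECT 'Post', id, uploader_id, uploader_ip_addr, created_at
      FROM posts
    SQL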
This gem uses a native extension that requires a C++ compiler to build.
Dropping it means a C++ toolchain is no longer needed to install Danbooru.
Bug: sending a dmail containing a wiki link (e.g. [[tagme]]) failed when
the recipient had email notifications turned on.
Cause: wiki links inside email notifications use absolute urls, which
the dtext postprocessor didn't parse correctly.
Move the parsing for the [bur:<id>], [ta:<id>], [ti:<id>] pseudo tags to
the main parser in `DText.format_text`. This fixes a bug where wiki
links inside bulk update requests on the forum weren't properly
colorized because the text of the BUR was embedded after we scanned for
wiki links, not before.
This also ensures that tags inside bulk update requests will be recorded
in the dtext_links table, meaning that forum posts can be properly
searched by tags.
This incidentally means that these request pseudo tags can now be used
outside the forum.
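Conceptually the pass inside `DText.format_text` looks like this sketch
(the embed helpers are assumed names); the key point is that expansion
happens before wiki links are scanned:

    # Expand [bur:<id>], [ta:<id>], [ti:<id>] into their embedded text so a
    # later pass sees any wiki links inside them and records dtext_links.
    def expand_pseudo_tags(text)
      text.gsub(/\[(bur|ta|ti):(\d+)\]/) do
        type, id = $1, $2.to_i
        case type
        when "bur" then embed_bulk_update_request(id)
        when "ta"  then embed_tag_alias(id)
        when "ti"  then embed_tag_implication(id)
        end
      end
    end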
* Let gold users use upvote:self, downvote:self metatags to search for
their own votes.
* Don't let mods use upvote:<user>, downvote:<user> metatags to see
votes by other users. Only let admins see other users' votes (see the
sketch after this list).
* Add vote count to profile page.
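The visibility rule from the first two bullets boils down to a check like
this sketch (method names assumed):

    # Gold users may search their own votes; only admins may search others'.
    def can_search_votes?(searcher, target_user)
      if searcher.id == target_user.id
        searcher.is_gold?
      else
        searcher.is_admin?
      end
    end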
* Remove the single alias and implication request forms. From now
on, bulk update requests are the only way to request aliases or
implications.
* Remove the forum topic ID field from the bulk update request form.
Instead, to attach a BUR to an existing topic, go to the topic and
click "Request alias/implication" at the top of the page.
* Update the bulk update request form to give better examples for the
script format and to explain the difference between aliases and
implications.
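For reference, a request script looks something like this (illustrative
lines; an alias merges the left tag into the right one, while an
implication makes tagging the left tag automatically add the right one):

    create alias misspelled_tag -> correct_tag
    create implication cat_ears -> animal_ears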
Instead of sending IQDB the image url and letting it download the file,
download the file on Danbooru's end and send IQDB the downloaded file.
This fixes several issues:
* Some sites need the referer header set to avoid hotlink protection
when downloading the file. Danbooru knows how to deal with this but
IQDB doesn't.
* We need to enforce certain restrictions when downloading files,
including setting max filesize limits, setting max timeouts, and not
allowing downloads from forbidden IPs (to avoid SSRF attacks).
Danbooru knows how to handle these things but IQDB doesn't.
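The new flow, as a sketch (the downloader helper and IQDB endpoint are
assumed names, not the actual implementation):

    require "http"

    IQDB_SERVER = "http://localhost:4000" # assumed endpoint

    # Download on Danbooru's side so referer handling, max filesize,
    # timeouts, and forbidden-IP checks all apply, then send the file
    # itself to IQDB instead of the url.
    def iqdb_query(url)
      file = download_with_limits(url) # assumed helper enforcing the restrictions above
      HTTP.post("#{IQDB_SERVER}/query", form: { file: HTTP::FormData::File.new(file.path) })
    ensure
      file&.close
    end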