Replace the mocked services in scripts/mocked_services with Rails-level
mocked services.
The scripts in scripts/mocked_services were a set of stub Sinatra
servers used to mock the Reportbooru, Recommender, and IQDBs services
during development. They return fake data so you can test pages that use
these services.
Implementing these services in Rails makes them easier to run. It also
lets us drop the dependency on Sinatra and one use of HTTParty.
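As a sketch, a Rails-level mock is just a route plus a controller that
returns canned data. The endpoint and the response format below are
illustrative, not the actual implementation:
# config/routes.rb (inside Rails.application.routes.draw)
namespace :mock do
  get "recommender/recommend/:user_id", to: "recommender#recommend"
end

# app/controllers/mock/recommender_controller.rb
module Mock
  class RecommenderController < ApplicationController
    def recommend
      # Fake [post_id, score] pairs so recommendation pages have data to show.
      recommendations = Post.order(id: :desc).limit(50).map { |post| [post.id, rand.round(3)] }
      render json: recommendations
    end
  end
end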
To use these services, set the following configuration in danbooru_local_config.rb
or .env.local:
* reportbooru_server: http://localhost:3000/mock/reportbooru
* recommender_server: http://localhost:3000/mock/recommender
* iqdbs_server: http://localhost:3000/mock/iqdb
where `http://localhost:3000` is the url of your local Danbooru server
(this may need to be changed depending on your configuration).
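In danbooru_local_config.rb this could look roughly like the following,
assuming the usual Danbooru::CustomConfiguration override pattern (adjust
to your setup):
module Danbooru
  class CustomConfiguration < Configuration
    def reportbooru_server
      "http://localhost:3000/mock/reportbooru"
    end

    def recommender_server
      "http://localhost:3000/mock/recommender"
    end

    def iqdbs_server
      "http://localhost:3000/mock/iqdb"
    end
  end
end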
The Nijie login process works like this:
* First we submit our `email` and `password` to `https://nijie.info/login_int.php`.
* Then we save the NIJIEIEID session cookie from the response.
* We optionally retry if login failed. Nijie returns 429 errors with a
`Retry-After: 5` header if we send too many login requests. This can
happen during parallel testing.
* We cache the login cookies for only 1 hour so we don't have to worry
about them becoming invalid if we cache them too long.
Cookies and retrying errors on failure are handled transparently by Danbooru::Http.
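Written out by hand, the flow looks roughly like this. In practice the
cookie saving and the 429 retry happen inside Danbooru::Http; the
`post(url, form: ...)` call and the Rails.cache usage here are
illustrative:
def nijie_login(email, password)
  Rails.cache.fetch("nijie-session-cookies", expires_in: 1.hour) do
    http = Danbooru::Http.use(:session)
    response = http.post("https://nijie.info/login_int.php", form: { email: email, password: password })

    # Nijie rate-limits logins; back off and retry once if asked to.
    if response.code == 429
      sleep(response.headers["Retry-After"].to_i)
      response = http.post("https://nijie.info/login_int.php", form: { email: email, password: password })
    end

    response.cookies.to_a # the session cookies to reuse on later requests
  end
end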
Replace http.rb's builtin redirect following option with our own
redirect follower. This fixes an issue with http.rb losing cookies after
following a redirect.
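A simplified sketch of this kind of redirect follower, assuming requests
go through a cookie-saving session client like the one shown below (this
is not the exact Danbooru::Http implementation):
def get_following_redirects(http, url, max_redirects: 5)
  max_redirects.times do
    response = http.get(url)
    return response unless (300..399).cover?(response.code) && response.headers["Location"]

    # Each hop reuses the same client, so cookies set by earlier responses
    # are sent along with the next request.
    url = response.headers["Location"]
  end

  raise "Too many redirects"
end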
Allow cookies to be saved and sent back when making several requests in
a row. Usage:
http = Danbooru::Http.use(:session)
# saves the foo=42 cookie sent by the response.
http.get("https://httpbin.org/cookies/set/foo/42")
# sends back the foo=42 cookie from the previous request.
http.get("https://httpbin.org/cookies")
Remove the Downloads::File class. Move download methods to
Danbooru::Http instead. This means that:
* HTTParty has been replaced with http.rb for downloading files.
* Downloading is no longer tightly coupled to source strategies. Before,
Downloads::File tried to automatically look up the source and download
the full size image if we gave it a sample url. Now we can do plain
downloads without source strategies altering the url (see the sketch
after this list).
* The Cloudflare Polish check has been changed from checking for a
Cloudflare IP to checking for the CF-Polished header. Looking up the
list of Cloudflare IPs was slow and flaky during testing.
* The SSRF protection code has been factored out so it can be used for
normal http requests, not just for downloads.
* The Webmock gem can be removed, since it was only used for stubbing
out certain HTTParty requests in the download tests. The Webmock gem
is buggy and caused certain tests to fail during CI.
* The retriable gem can be removed, since we no longer autoretry failed
downloads. We assume that if a download fails once then retrying
probably won't help.
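As a sketch, a plain download now looks something like this (the method
name and the details are hypothetical, not Danbooru::Http's actual
download API):
require "tempfile"

def download_file(http, url, max_size: 100.megabytes)
  response = http.get(url)
  raise "Download failed (HTTP #{response.code})" unless response.code == 200

  # Reject files recompressed by Cloudflare Polish (the CF-Polished check above).
  raise "File was Cloudflare Polished" if response.headers["CF-Polished"].present?

  file = Tempfile.new
  file.binmode
  size = 0

  # Stream the body to disk, bailing out if the file grows too large.
  response.body.each do |chunk|
    size += chunk.bytesize
    raise "File too large" if size > max_size
    file.write(chunk)
  end

  file.tap(&:rewind)
end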
Revert to the previous workaround of fetching the previous day if the
current day returns no result. This is a terrible hack; really we should
convert dates to Reportbooru's timezone, but that has other complications.
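Roughly what the workaround amounts to (hypothetical method names):
def popular_posts(date)
  posts = fetch_popular_posts(date)
  posts = fetch_popular_posts(date - 1.day) if posts.blank? # fall back to the previous day
  posts
end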
Fix gem version conflicts described in 20abd8a5f. Nokogiri couldn't be
upgraded past 1.10.9 because 1.11.0 causes a build failure in Nokogumbo
2.0.2, but we couldn't stay on 1.10.9 either because it has a hard
requirement on Ruby <2.7 and we require Ruby >=2.7. This made `bundle
update` fail with a Gemfile conflict.
The fix is to disable libxml2 support when building Nokogumbo. Nokogumbo
wants to use the same version of libxml2 as Nokogiri, but Nokogiri
1.11.0 changed how it reports which version of libxml2 it's using, which
causes Nokogumbo's build to fail. Disabling libxml2 may reduce
performance of Nokogumbo ([1]).
While we're at it, we also make Nokogiri use the system version of
libxml2 instead of its own bundled version. Nokogiri really wants
us to use its own patched version of libxml2 instead of the system
version, but the patches it applies look relatively minor and don't seem
relevant to us ([2]). Using the system version reduces build time during CI.
This adds libxml2 and libxslt as OS-level dependencies of Danbooru. You
may need to do `sudo apt-get install libxml2-dev libxslt-dev` to install
these libraries after this commit.
[1]: https://github.com/rubys/nokogumbo#flavors-of-nokogumbo
[2]: https://github.com/sparklemotion/nokogiri/tree/master/patches/libxml2
Increase the timeout to 30 seconds when uploading files to IQDB.
Previously we used the default timeout of 3 seconds, which sometimes
caused 599 timeout errors if the upload took too long.
Fix "cannot determine size of body" errors on upload page. Caused by
exception during IQDB lookup. We were posting the form data wrong, we
need to wrap the file with HTTP::FormData::File and pass it through the
`form` parameter.
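Putting both IQDB fixes together, the request looks roughly like this,
assuming Danbooru::Http passes the `timeout` and `form` options straight
through to http.rb (the endpoint path is illustrative):
http = Danbooru::Http.timeout(30) # up from the 3 second default
response = http.post(
  "#{Danbooru.config.iqdbs_server}/similar",
  form: { file: HTTP::FormData::File.new("image.jpg") }
)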
Fix the sidebar on the /posts index page sometimes being blank. This
could happen when either the related tag calculation was too slow and
timed out, or Reportbooru was unavailable and we couldn't fetch the list
of popular tags.
If the tag list would otherwise be blank, we now fall back to frequent
tags (the most common tags on the current page of results).
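A simplified sketch of the fallback order (the helper names are
hypothetical, and it glosses over when each tag source is actually used):
def sidebar_tags(query, posts)
  tags = related_tags(query)                  # may come back blank if the calculation times out
  tags = popular_tags if tags.blank?          # may come back blank if Reportbooru is unavailable
  tags = frequent_tags(posts) if tags.blank?  # most common tags on the current page of results
  tags
end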
Also change it so that if Reportbooru is unconfigured, we fail
gracefully by returning blank results instead of failing with an
exception. This is so we can still view the popular searches and missed
searches pages during testing (even though they'll be blank).
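Roughly, for each Reportbooru-backed service (the method names are
hypothetical):
def popular_searches(date)
  # Unconfigured: return blank results instead of raising, so the page still renders.
  return [] if Danbooru.config.reportbooru_server.blank?

  fetch_popular_searches(date) # helper that queries Reportbooru
end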
3cdf67920 changed it so that Danbooru::Http follows redirects by
default. This broke some things in the Nico Seiga strategy, so disable
following redirects in the Nico Seiga API client for now.
Also change it so that Danbooru::Http follows redirects after a POST
request (by setting `strict: false`). Nico Seiga needs this because it
sends a redirect after we POST the login form.