Allow promo codes to be used during checkout if a secret promo=true URL
param is passed. This lets promo codes be offered without the promo code
option always appearing at checkout, even when there aren't any active
promos.
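A minimal sketch of how the flag might be passed through to Stripe
(`allow_promotion_codes` is a real Checkout Session option; the
surrounding method name, price ID, and URL helpers are placeholders,
not the actual implementation):

    # Sketch only: build the Checkout Session params, showing Stripe's promo
    # code box only when ?promo=true was passed to the checkout page.
    def checkout_session_params
      {
        mode: "payment",
        line_items: [{ price: "price_placeholder", quantity: 1 }],
        success_url: success_url,
        cancel_url: cancel_url,
        allow_promotion_codes: params[:promo] == "true",
      }
    end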
Add the following bank redirect payment methods:
* https://stripe.com/docs/payments/bancontact
* https://stripe.com/docs/payments/eps
* https://stripe.com/docs/payments/giropay
* https://stripe.com/docs/payments/ideal
* https://stripe.com/docs/payments/p24
These methods are used in Austria, Belgium, Germany, the Netherlands,
and Poland.
These methods require payments to be denominated in EUR, which means we
have to set prices in both USD and EUR and automatically detect which
currency to use based on the user's country. We also have to
automatically detect which payment methods to offer based on the user's
country. We do both by using Cloudflare's CF-IPCountry header to
geolocate the user.
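A sketch of how the country detection might drive those two decisions
(the mapping is simplified to the five countries above, and the helper
names are assumptions, not the real code):

    # Sketch only: pick the currency and bank redirect methods from the
    # CF-IPCountry header set by Cloudflare.
    COUNTRY_PAYMENT_METHODS = {
      "AT" => ["card", "eps"],
      "BE" => ["card", "bancontact"],
      "DE" => ["card", "giropay"],
      "NL" => ["card", "ideal"],
      "PL" => ["card", "p24"],
    }

    def country_code(request)
      request.headers["CF-IPCountry"]
    end

    def currency_for(country)
      COUNTRY_PAYMENT_METHODS.key?(country) ? "eur" : "usd"
    end

    def payment_method_types_for(country)
      COUNTRY_PAYMENT_METHODS.fetch(country, ["card"])
    end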
This also switches to using prices and products defined in Stripe
instead of generating them on the fly when creating the checkout.
Add links to the Stripe payment page and the Stripe receipt page on
completed user upgrades.
The Stripe payment link is a link to the payment details on the Stripe
dashboard and is only visible to the owner.
Require new accounts to verify their email address if any of the
following conditions are true:
* Their IP is a proxy.
* Their IP is under a partial IP ban.
* They're creating a new account while logged in to another account.
* Somebody else created an account from the same IP within the last
week (see the sketch after the list of changes below).
Changes from before:
* Allow logged in users to view the signup page and create new accounts.
Creating a new account while logged in to your old account is now
allowed, but it requires email verification. This is a honeypot.
* Creating multiple accounts from the same IP is now allowed, but they
require email verification. Previously the same IP check was only for
the last day (now it's the last week), and only for an exact IP match
(now it's a subnet match, /24 for IPv4 or /64 for IPv6).
* New account verification is disabled for private IPs (e.g. 127.0.0.1,
192.168.0.1), to make development or running personal boorus easier
(fixes #4618).
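A sketch of how these checks might fit together (the helper methods,
column name, and query are assumptions, not the actual implementation;
the subnet match uses Postgres inet operators):

    # Sketch only.
    require "ipaddr"

    def subnet(addr)
      prefix = addr.ipv4? ? 24 : 64
      "#{addr.mask(prefix)}/#{prefix}"               # e.g. "203.0.113.0/24"
    end

    def requires_email_verification?(request, current_user)
      addr = IPAddr.new(request.remote_ip)
      return false if addr.private? || addr.loopback?  # local/dev boorus

      current_user.present? ||                         # signup while logged in (honeypot)
        ip_is_proxy?(addr) ||                          # placeholder proxy/VPN check
        ip_partially_banned?(addr) ||                  # placeholder partial ban check
        User.where("created_at >= ?", 1.week.ago)
            .where("last_ip_addr <<= ?", subnet(addr)) # /24 (IPv4) or /64 (IPv6) match
            .exists?
    end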
* Add a frequently asked questions section.
* Add nicer-looking upgrade buttons.
* Improve the page formatting.
* Prevent users from attempting invalid upgrades on users that are
already Platinum or above.
Add a model to store the status of user upgrades (a schema sketch
follows this list).
* Store the upgrade purchaser and the upgrade receiver (these are
different for a gifted upgrade, the same for a self upgrade).
* Store the upgrade type: gold, platinum, or gold-to-platinum upgrades.
* Store the upgrade status:
** pending: User is still on the Stripe checkout page, no payment
received yet.
** processing: User has completed checkout, but the checkout status in
Stripe is still 'unpaid'.
** complete: We've received notification from Stripe that the payment
has gone through and the user has been upgraded.
* Store the Stripe checkout ID, to cross-reference the upgrade record on
Danbooru with the checkout record on Stripe.
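A possible shape for the table, assuming a standard Rails migration
(the column names are guesses based on the description above):

    # Sketch only.
    create_table :user_upgrades do |t|
      t.references :purchaser, null: false   # who paid for the upgrade
      t.references :recipient, null: false   # who gets upgraded (same as purchaser for self upgrades)
      t.string :upgrade_type, null: false    # "gold", "platinum", or "gold_to_platinum"
      t.string :status, null: false, default: "pending"  # "pending", "processing", "complete"
      t.string :stripe_id                    # Stripe checkout ID, for cross-referencing with Stripe
      t.timestamps
    end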
This is the upgrade flow (the webhook step is sketched in code after
the list):
* When the user clicks the upgrade button on the upgrade page, we call
POST /user_upgrades and create a pending UserUpgrade.
* We redirect the user to the checkout page on Stripe.
* When the user completes checkout on Stripe, Stripe sends us a webhook
notification at POST /webhooks/receive.
* When we receive the webhook, we check the payment status, and if it's
paid we mark the UserUpgrade as complete and upgrade the user.
* After Stripe sees that we have successfully processed the webhook,
they redirect the user to the /user_upgrades/:id page, where we show
the user their upgrade receipt.
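A sketch of the webhook step, assuming the standard
checkout.session.completed event from Stripe (the controller, model,
and promotion method names are placeholders):

    # Sketch only: POST /webhooks/receive
    def receive
      event = Stripe::Webhook.construct_event(
        request.body.read,
        request.headers["Stripe-Signature"],
        webhook_signing_secret,                 # placeholder for the configured secret
      )

      if event.type == "checkout.session.completed"
        session = event.data.object
        upgrade = UserUpgrade.find_by!(stripe_id: session.id)

        if session.payment_status == "paid"
          upgrade.update!(status: "complete")   # mark complete and...
          upgrade.recipient.promote!            # ...upgrade the user (placeholder)
        else
          upgrade.update!(status: "processing") # checkout done, payment still "unpaid"
        end
      end

      head 200
    end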
This upgrades from the legacy version of Stripe's checkout system to the
new version:
> The legacy version of Checkout presented customers with a modal dialog
> that collected card information, and returned a token or a source to
> your website. In contrast, the new version of Checkout is a smart
> payment page hosted by Stripe that creates payments or subscriptions. It
> supports Apple Pay, Dynamic 3D Secure, and many other features.
Basic overview of the new system (a checkout sketch follows the docs
below):
* We send the user to a checkout page on Stripe.
* Stripe collects payment and sends us a webhook notification when the
order is complete.
* We receive the webhook notification and upgrade the user.
Docs:
* https://stripe.com/docs/payments/checkout
* https://stripe.com/docs/payments/checkout/migration#client-products
* https://stripe.com/docs/payments/handling-payment-events
* https://stripe.com/docs/payments/checkout/fulfill-orders
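A sketch of the checkout step under the new system, using a price
defined in Stripe rather than one generated on the fly (the price ID
and URL helpers are placeholders; payment_method_types_for is the
placeholder helper from the earlier sketch):

    # Sketch only: create the hosted checkout session and remember its ID.
    session = Stripe::Checkout::Session.create(
      mode: "payment",
      payment_method_types: payment_method_types_for(country),   # e.g. ["card", "ideal"]
      line_items: [{ price: "price_placeholder", quantity: 1 }],
      success_url: user_upgrade_url(user_upgrade),  # the /user_upgrades/:id receipt page
      cancel_url: upgrade_page_url,
    )

    # Cross-reference the pending UserUpgrade with the checkout record on
    # Stripe, then send the user to Stripe's hosted page for this session.
    user_upgrade.update!(stripe_id: session.id)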
* Fix a bug where non-GET 404 requests weren't handled.
* Fix a bug where non-HTML 404 requests weren't handled.
* Show a random image from a specified pool on the 404 page.
Add a debug mode option. This is useful when debugging failed tests.
Debug mode disables parallel testing so you can set breakpoints in tests
with binding.pry (normally parallel testing makes it hard to set
breakpoints).
Debug mode also disables global exception handling for controllers. This
lets exceptions bubble up to the console during controller tests
(normally exceptions are swallowed by the controller, which prevents you
from seeing backtraces in failed controller tests).
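A sketch of what the debug flag might toggle (the DEBUG env var, the
handler name, and the exact placement are assumptions):

    # Sketch only.

    # test/test_helper.rb: run tests serially in debug mode so binding.pry
    # breakpoints work.
    class ActiveSupport::TestCase
      parallelize(workers: :number_of_processors) unless ENV["DEBUG"]
    end

    # app/controllers/application_controller.rb: only rescue exceptions
    # globally outside debug mode, so backtraces reach the console during
    # controller tests.
    class ApplicationController < ActionController::Base
      rescue_from StandardError, with: :render_error_page unless ENV["DEBUG"]
    end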
Fix session cookies being sent in publicly cached /autocomplete.json
responses. We can't set any cookies in a response that is being publicly
cached, otherwise they'll be visible to other users. If a user's session
cookies were to be cached, then it would allow their account to be stolen.
In reality, well-behaved caches like Cloudflare will simply refuse to
cache responses that contain cookies to avoid this scenario.
https://support.cloudflare.com/hc/en-us/articles/200172516-Understanding-Cloudflare-s-CDN:
> BYPASS is returned when enabling Origin Cache-Control. Cloudflare also
> sets BYPASS when your origin web server sends cookies in the response
> header.
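A sketch of how this might look in the controller action
(`request.session_options[:skip]` is a real Rack session option; the
action body is a placeholder):

    # Sketch only: never set a session cookie on a publicly cached response.
    def autocomplete
      request.session_options[:skip] = true  # don't write a Set-Cookie header
      expires_in 1.hour, public: true        # Cache-Control: public, max-age=3600
      # ... render the autocomplete results
    end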
* Test that the user upgrade process integrates with Stripe correctly.
* Replace a deprecated `card` param with `source` in `Stripe::Charge.create`.
* Rescue Stripe::StripeError instead of Stripe::CardError so that we
handle failures other than card failures, such as network errors.
New rules for user promotions:
* Moderators can no longer promote other users to Moderator level. Only
Admins can promote users to Moderator level. Moderators can only
promote up to Builder level.
* Admins can no longer promote other users to Admin level. Only Owners
can promote users to Admin level. Admins can only promote up to
Moderator level.
* Admins can no longer demote themselves or other admins.
These rules are being changed to account for the new Owner user level
(a sketch of the checks follows).
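A possible shape for the checks (the level names, predicates, and
numeric level comparison are placeholders, not the real permission
code):

    # Sketch only.
    def max_promotable_level(promoter)
      case promoter.level_name
      when "Owner"     then "Admin"      # only the owner can create admins
      when "Admin"     then "Moderator"  # admins can promote up to moderator
      when "Moderator" then "Builder"    # moderators can promote up to builder
      end
    end

    # Admins can't demote themselves or other admins; only the owner can.
    def can_demote?(promoter, user)
      promoter.is_owner? || user.level < promoter.level
    end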
Also change it so that when a user upgrades their account, the promotion
is done by DanbooruBot. This means that the inviter and the mod action
will show DanbooruBot as the promoter instead of the user themselves.
The previous cache policy was that all autocomplete results were cached
for a fixed 7 days. The new policy is that if autocomplete returns more
than 10 results they're cached for 24 hours, and if it returns fewer
than 10 results they're cached for 1 hour.
The rationale is that if autocomplete returns a lot of results, then the
top 10 results are relatively stable and unlikely to change, but if it
returns fewer than 10 results, then the results are unstable and can be
easily changed.
We also change it so that autocomplete calls can be cached publicly.
Public caching means that HTTP requests are cached by Cloudflare. This
will ideally reduce load on the server and reduce latency for end users.
This is only safe for calls that return the same results for all users
(i.e. the results don't depend on the current user), since the cache is
publicly shared by all users. Currently username, favgroup, and saved
search autocomplete results depend on the current user, so they can't be
publicly cached.
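A sketch of the resulting caching logic in the autocomplete action (the
helper names are placeholders):

    # Sketch only.
    results = autocomplete_results(params[:search])

    # Large result sets are stable; small ones churn easily.
    duration = results.size > 10 ? 24.hours : 1.hour

    # Username, favgroup, and saved search queries depend on the current
    # user, so only user-independent queries may be cached publicly (by
    # Cloudflare); everything else stays private.
    expires_in duration, public: user_independent_query?(params[:search])

    render json: results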
This refactors the autocomplete JavaScript to use a single dedicated
/autocomplete.json endpoint instead of a bunch of separate endpoints.
This simplifies the autocomplete JavaScript: instead of calling a
different endpoint for each type of query (users, wiki pages, pools,
artists, etc.) and then parsing the results of each call to get the
data we need, we call a single endpoint that returns exactly what we
need.
This also means we don't have to parse searches clientside in order to
autocomplete metatags. Instead we can just pass the search term to the
server and let it parse the search, which is easy to do serverside.
Finally, this makes autocomplete easier to test, and it makes it easier
to add more sophisticated autocomplete behavior, since most of the logic
lives serverside.
* Add index on posts.is_deleted. The modqueue was slow because the
appeal condition wasn't constrained to deleted posts, so it degraded to
a full table scan.
* Avoid extra queries for calculating the page count and disapproval counts.
* Include appealed posts in the modqueue.
* Add `status` field to appeals (sketched after this list). Appeals
start out as `pending`, then become `rejected` if the post isn't
approved within three days. If the post is approved, the appeal's
status becomes `succeeded`.
* Add `status` field to flags. Flags start out as `pending` then become
`rejected` if the post is approved within three days. If the post
isn't approved, the flag's status becomes `succeeded`.
* Leave behind an "Unapproved in three days" dummy flag when an appeal
goes unapproved, just like when a pending post is unapproved.
* Only allow deleted posts to be appealed. Don't allow flagged posts to be appealed.
* Add `status:appealed` metatag. `status:appealed` is separate from `status:pending`.
* Include appealed posts in `status:modqueue`. Search `status:modqueue order:modqueue`
to view the modqueue as a normal search.
* Retroactively set old flags and appeals as succeeded or rejected. This
may not be correct for posts that were appealed or flagged multiple
times. This is difficult to set correctly because we don't have
approval records for old posts, so we can't tell the actual outcome of
old flags and appeals.
* Deprecate the `is_resolved` field on post flags. A resolved flag is a
flag that isn't pending.
* Known bug: appealed posts have a black border instead of a blue
border. Checking whether a post has been appealed would require either
an extra query on the posts/index page, or an is_appealed flag on
posts, neither of which are very desirable.
* Known bug: you can't use `status:appealed` in blacklists, for the same
reason as above.
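A sketch of the new status fields as Rails enums (the column type and
integer values are assumptions):

    # Sketch only.
    class PostAppeal < ApplicationRecord
      # pending -> succeeded (post approved within 3 days) or rejected
      enum status: { pending: 0, succeeded: 1, rejected: 2 }
    end

    class PostFlag < ApplicationRecord
      # pending -> rejected (post approved within 3 days) or succeeded
      enum status: { pending: 0, succeeded: 1, rejected: 2 }
    end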
* Remove unused `ban` and `without_mod_action` options.
* Don't try to set the `is_banned` flag during deletion.
* Don't create modactions for automatic "unapproved in 3 days"
deletions, only to delete them after the fact.
Rework post deletion from using a separate page to using a dialog box,
like flagging.
* Add `DELETE /posts/:id` endpoint.
* Remove `POST /moderator/post/posts/:id/delete` endpoint.
* Show a banner if the user is restricted because they signed up from a
proxy or VPN.
* Add an option to resend the confirmation email if your account has an
unverified email address.
Rework sitemaps to provide more coverage of the site. We want every
important page on the site - including every post, tag, and wiki page -
to be indexed by Google. We do this by generating sitemaps and sitemap
indexes that contain links to every important page on the site.
Bug: when a search timed out we got the generic failbooru page instead
of the search timeout error page.
Cause: when rendering the <link rel="next"> / <link rel="prev"> tags in
the header, we may need to evaluate the search to determine the next or
previous page. If the search times out, this fails, which causes Rails
to throw an ActionView::Template::Error because an exception was thrown
while rendering the template.
Likewise, rendering the attributes for the <body> tag could fail with an
ActionView::Template::Error because the call to `current_item.present?`
forced evaluation of the search.
Replace the mocked services in scripts/mocked_services with Rails-level
mocked services.
The scripts in scripts/mocked_services were a set of stub Sinatra
servers used to mock the Reportbooru, Recommender, and IQDBs services
during development. They returned fake data so you could test pages
that use these services.
Implementing these services in Rails makes it easier to run them. It
also lets us drop a dependency on Sinatra and drop a use of HTTParty.
To use these services, set the following configuration in danbooru_local_config.rb
or .env.local:
* reportbooru_server: http://localhost:3000/mock/reportbooru
* recommender_server: http://localhost:3000/mock/recommender
* iqdbs_server: http://localhost:3000/mock/iqdb
where `http://localhost:3000` is the URL of your local Danbooru server
(it may need to be changed depending on your configuration).
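A sketch of how a Rails-level mock might be mounted (the route and
controller names are assumptions based on the URLs above):

    # Sketch only.
    # config/routes.rb
    namespace :mock do
      get "recommender(/*path)", to: "recommender#show"
    end

    # app/controllers/mock/recommender_controller.rb
    class Mock::RecommenderController < ApplicationController
      def show
        render json: []  # fake "no recommendations" response
      end
    end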
Revert to the previous workaround of fetching the previous day if the
current day returns no result. This is a terrible hack; really we
should convert dates to Reportbooru's timezone, but that has other
complications.