Rework the rate limit implementation to make it more flexible:

* Allow setting different rate limits for different actions. Before, we had a single rate limit for all write actions; now different controller endpoints can have different limits.
* Allow actions to be rate limited by user ID, by IP address, or both. Before, actions were only limited by user ID, which meant non-logged-in actions like creating new accounts or attempting to log in couldn't be rate limited. It also meant you could use multiple accounts with the same IP to get around limits.

Other changes:

* Remove the API Limit field from user profile pages.
* Remove the `remaining_api_limit` field from the `/profile.json` endpoint.
* Rename the `X-Api-Limit` header to `X-Rate-Limit` and change it from a number to a JSON object containing all the rate limit info (including the refill rate, the burst factor, the cost of the call, and the current limits).
* Fix a potential race condition where, if you flooded requests fast enough, you could exceed the rate limit. This happened because we checked and updated the rate limit in two separate steps, so simultaneous requests could pass the check before the update happened. The new code uses some tricky SQL to check and update multiple limits in a single statement (see the sketch below).
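For illustration, the single-statement check-and-update idea can be sketched roughly as follows. This is a hypothetical sketch, not the actual Danbooru code: the `rate_limits` table, its columns (`key`, `points`, `burst`, `refill_rate`, `updated_at`), and the `RateLimitCheck.allowed?` helper are all assumed names.

```ruby
# Hypothetical sketch of an atomic rate limit check. Assumes a
# rate_limits(key, points, burst, refill_rate, updated_at) table where `points`
# is the remaining budget, `burst` is the maximum budget, and `refill_rate` is
# in points per second. Not the actual Danbooru schema.
class RateLimitCheck
  # keys - e.g. ["comment:user:123", "comment:ip:1.2.3.4"]
  # cost - how many points this call consumes.
  # Refill, check, and deduction happen in a single UPDATE, so concurrent
  # requests can't all pass the check before any of them record their usage.
  def self.allowed?(keys, cost)
    conn = ActiveRecord::Base.connection
    quoted_keys = keys.map { |key| conn.quote(key) }.join(", ")
    quoted_cost = conn.quote(cost)

    result = conn.exec_query(<<~SQL)
      UPDATE rate_limits
      SET points = LEAST(burst, points + refill_rate * EXTRACT(EPOCH FROM (now() - updated_at))) - #{quoted_cost},
          updated_at = now()
      WHERE key IN (#{quoted_keys})
        AND LEAST(burst, points + refill_rate * EXTRACT(EPOCH FROM (now() - updated_at))) >= #{quoted_cost}
      RETURNING key
    SQL

    # Allow the request only if every limit (user and IP) had enough points.
    result.rows.size == keys.size
  end
end
```

In this simplified form, if one key has enough points and another doesn't, the passing key is still charged even though the request is rejected; a production version would wrap the statement in a transaction and roll back on partial failure, which is part of what makes the real SQL tricky.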
Installation
It is recommended that you install Danbooru on a Debian-based system since most of the required packages are available via APT. Danbooru has been successfully installed on Fedora, CentOS, FreeBSD, and OS X. The INSTALL.debian script is straightforward and should be simple to adapt to other platforms.
For best performance, you will need at least 256MB of RAM for PostgreSQL and Rails. The memory requirement will grow as your database gets bigger.
In production, Danbooru uses PostgreSQL 9.4, but any 9.x release should work.
Use your operating system's package management system whenever possible. This simplifies installing init scripts, which doesn't always happen when you compile from source.
Troubleshooting
These instructions won't work for everyone. If your setup is not working, here are the steps I usually recommend to people:
- Test the database. Make sure you can connect to it using psql. Make sure the tables exist. If this fails, you need to work on correctly installing PostgreSQL, importing the initial schema, and running the migrations.
- Test the Rails database connection by using rails console. Run Post.count to make sure Rails can connect to the database (see the sketch after this list). If this fails, you need to make sure your Danbooru configuration files are correct.
- Test Nginx to make sure it's working correctly. You may need to debug your Nginx configuration file.
- Check all log files.
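As a quick illustration of the second step, you can run something like the following inside rails console. Post is a real Danbooru model; the rest is plain ActiveRecord, shown here only as a sketch:

```ruby
# Quick sanity checks from `rails console`.
ActiveRecord::Base.connection.active?  # true if Rails can reach PostgreSQL at all
Post.count                             # fails if the schema or migrations are missing
```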
Services
Danbooru employs numerous external services to delegate some functionality. For development purposes, you can run mocked versions of these services. They're available in scripts/mock_services and can be started automatically using Foreman and the provided Procfile (typically with `foreman start`).
Amazon Web Services
In order to enable the following features, you will need an AWS SQS account:
- Pool versions
- Post versions
- IQDB
- Saved searches
- Related tags
Google APIs
The following features require a Google API account:
- Bulk revert
- Post versions report
IQDB Service
IQDB integration is delegated to the IQDBS service.
Archive Service
In order to access versioned data for pools and posts, you will need to install and configure the Archives service.
Reportbooru Service
The following features are delegated to the Reportbooru service:
- Related tags
- Missed searches report
- Popular searches report
- Favorite searches
- Upload trend graphs
Recommender Service
Post recommendations require the Recommender service.
Cropped Thumbnails
There's optional support for cropped thumbnails. This relies on installing libvips 8.6 or higher and setting Danbooru.config.enable_image_cropping to true.
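As a rough example, assuming your install overrides settings through a custom configuration class (the config/danbooru_local_config.rb file name and class layout below are assumptions; adjust them to however your install manages local configuration), enabling cropping might look like this:

```ruby
# config/danbooru_local_config.rb -- hypothetical local override; the file name
# and class layout are assumptions, not guaranteed to match your install.
module Danbooru
  class CustomConfiguration < Configuration
    # Requires libvips 8.6 or higher to be installed on the host.
    def enable_image_cropping
      true
    end
  end
end
```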