Installation
It is recommended that you install Danbooru on a Debian-based system, since most of the required packages are available via APT. Danbooru has also been successfully installed on Fedora, CentOS, FreeBSD, and OS X. The INSTALL.debian install script is straightforward and should be simple to adapt to other platforms.
For best performance, you will need at least 256MB of RAM for PostgreSQL and Rails. The memory requirement will grow as your database gets bigger.
In production, Danbooru uses PostgreSQL 9.4, but any 9.x release should work.
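If you are unsure which release you are actually connected to, one quick check from the Rails console is shown below. This is a sketch using standard ActiveRecord; the version string in the comment is illustrative and will differ on your machine.

```ruby
# Run inside `rails console`; select_value is standard ActiveRecord.
ActiveRecord::Base.connection.select_value("SELECT version()")
# => "PostgreSQL 9.4.x on x86_64-pc-linux-gnu, ..." (illustrative output)
```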
Use your operating system's package management system whenever possible. This simplifies installing init scripts, which does not always happen when compiling from source.
Troubleshooting
These instructions won't work for everyone. If your setup is not working, here are the steps I usually recommend:
- Test the database. Make sure you can connect to it using psql. Make sure the tables exist. If this fails, you need to work on correctly installing PostgreSQL, importing the initial schema, and running the migrations.
- Test the Rails database connection by using rails console. Run Post.count to make sure Rails can connect to the database (see the sketch after this list). If this fails, you need to make sure your Danbooru configuration files are correct.
- Test Nginx to make sure it's working correctly. You may need to debug your Nginx configuration file.
- Check all log files.
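To make the database checks above concrete, here is roughly what a healthy session looks like in the Rails console. Post is a real Danbooru model; the rest is standard ActiveRecord, and the outputs shown are illustrative.

```ruby
# Run inside `rails console` from the Danbooru root directory.
# (For the raw connection test, use psql from a shell instead.)

ActiveRecord::Base.connection.active?
# => true when Rails can reach PostgreSQL

ActiveRecord::Base.connection.tables.include?("posts")
# => true once the schema has been imported and the migrations have run

Post.count
# => 0 on a fresh install; an exception here usually means the Danbooru
#    configuration files point at the wrong database
```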
Amazon Web Services
In order to enable the following features, you will need an Amazon AWS account with SQS (Simple Queue Service) access (see the configuration sketch after this list):
- Pool versions
- Post versions
- IQDB
- Saved searches
- Related tags
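Site-specific settings live in config/danbooru_local_config.rb, which overrides the defaults in config/danbooru_default_config.rb. A minimal sketch of supplying AWS credentials follows; the setting names (aws_access_key_id, aws_secret_access_key) and the CustomConfiguration subclass pattern are my assumptions, so confirm the exact names against danbooru_default_config.rb in your checkout.

```ruby
# config/danbooru_local_config.rb -- sketch only; the setting names below are
# assumptions, verify them against config/danbooru_default_config.rb.
module Danbooru
  class CustomConfiguration < Configuration
    def aws_access_key_id
      "YOUR_ACCESS_KEY_ID"
    end

    def aws_secret_access_key
      "YOUR_SECRET_ACCESS_KEY"
    end
  end
end
```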
Google APIs
The following features require a Google API account (see the configuration sketch after this list):
- Bulk revert
- Post versions report
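Following the same local-config pattern, the Google integration needs to be pointed at your API project and service-account key. Both setting names below are assumptions rather than confirmed API:

```ruby
# config/danbooru_local_config.rb -- hypothetical setting names; check
# config/danbooru_default_config.rb for the names your version expects.
module Danbooru
  class CustomConfiguration < Configuration
    def google_api_project
      "your-google-project-id"
    end

    def google_api_json_key_path
      "/var/www/danbooru/google-service-account.json"
    end
  end
end
```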
IQDB Integration
IQDB integration is now delegated to the IQDBS service.
You will need to install your own copy and enable the appropriate configuration settings.
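As a sketch, assuming a setting named iqdbs_server exists (an unverified name; check danbooru_default_config.rb), pointing Danbooru at your IQDBS instance would look something like:

```ruby
# config/danbooru_local_config.rb -- iqdbs_server is a hypothetical name.
module Danbooru
  class CustomConfiguration < Configuration
    def iqdbs_server
      "http://localhost:3002" # wherever your IQDBS instance listens
    end
  end
end
```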
Listbooru Service
In order to access saved search functionality you will need to install and configure the Listbooru service.
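A similar sketch for Listbooru, again with hypothetical setting names; a server URL plus a shared secret is the usual shape for these sibling services:

```ruby
# config/danbooru_local_config.rb -- both setting names are assumptions.
module Danbooru
  class CustomConfiguration < Configuration
    def listbooru_server
      "http://localhost:3001"
    end

    def listbooru_auth_key
      "shared-secret-matching-your-listbooru-install"
    end
  end
end
```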
Archive Service
In order to access versioned data for pools and posts you will need to install and configure the Archives service.
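And a corresponding sketch for the Archives service; archive_server is a guess at the setting name, so verify it against danbooru_default_config.rb before relying on it:

```ruby
# config/danbooru_local_config.rb -- archive_server is a hypothetical name.
module Danbooru
  class CustomConfiguration < Configuration
    def archive_server
      "http://localhost:3003" # wherever your Archives instance listens
    end
  end
end
```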
Reportbooru Service
The following features are delegated to the Reportbooru service (see the configuration sketch after this list):
- Related tags
- Missed searches report
- Popular searches report
- Favorite searches
- Upload trend graphs
- Similar users (via favorites and post votes)
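As with the other services, Reportbooru is wired in through the local config. A sketch, assuming reportbooru_server and reportbooru_key settings (unverified names):

```ruby
# config/danbooru_local_config.rb -- both setting names are assumptions; check
# config/danbooru_default_config.rb for the exact ones.
module Danbooru
  class CustomConfiguration < Configuration
    def reportbooru_server
      "http://localhost:3004"
    end

    def reportbooru_key
      "shared-secret-matching-your-reportbooru-install"
    end
  end
end
```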