## Quickstart
Run this to start a basic Danbooru instance:
```
curl -sSL https://raw.githubusercontent.com/danbooru/danbooru/master/bin/danbooru | sh
```
This will install Docker Compose and use it to start Danbooru. When it's done, Danbooru will be running at http://localhost:3000.
Alternatively, if you already have Docker Compose installed, you can just do:
```
wget https://raw.githubusercontent.com/danbooru/danbooru/master/docker-compose.yaml
docker-compose up
```
## Manual Installation
Follow the `INSTALL.debian` script to install Danbooru.
The `INSTALL.debian` script is written for Debian, but can be adapted for other distributions. Danbooru has been successfully installed on Debian, Ubuntu, Fedora, Arch, and OS X. It is recommended that you use an Ubuntu-based system, since Ubuntu is what is used in development and production.
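If you want a rough idea of what a manual install involves before reading the script, the outline below is a sketch only: the package list, gem setup, and database tasks shown here are assumptions, and `INSTALL.debian` remains the authoritative reference.

```
# Illustrative outline of a manual install; package names and steps are
# assumptions, follow INSTALL.debian for the real procedure.
sudo apt-get install -y git build-essential postgresql libpq-dev nginx

# Fetch the Danbooru source.
git clone https://github.com/danbooru/danbooru.git
cd danbooru

# Install the application's Ruby dependencies.
bundle install

# Create the database, load the schema, and run migrations.
bin/rails db:create db:schema:load db:migrate

# Start the application server.
bin/rails server
```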
See here for a guide on how to set up Danbooru inside a virtual machine.
For best performance, you will need at least 256MB of RAM for PostgreSQL and Rails. The memory requirement will grow as your database gets bigger.
In production, Danbooru uses PostgreSQL 10.18, but any release later than this should work.
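To check which PostgreSQL release your server is actually running, you can query it directly. This is a generic PostgreSQL command, not anything Danbooru-specific:

```
psql -c 'SELECT version();'
```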
## Troubleshooting
If your setup is not working, here are the steps I usually recommend to people:
1. Test the database. Make sure you can connect to it using `psql`. Make sure the tables exist. If this fails, you need to work on correctly installing PostgreSQL, importing the initial schema, and running the migrations.
2. Test the Rails database connection by using `bin/rails console`. Run `Post.count` to make sure Rails can connect to the database. If this fails, you need to make sure your Danbooru configuration files are correct.
3. Test Nginx to make sure it's working correctly. You may need to debug your Nginx configuration file.
4. Check all log files.
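As a rough guide, the checks above correspond to commands like the following. The database name and log file paths are assumptions; substitute whatever your installation actually uses.

```
# 1. Can we reach PostgreSQL, and do the tables exist? ("danbooru" is a placeholder name)
psql danbooru -c '\dt'

# 2. Can Rails reach the database?
bin/rails runner 'puts Post.count'

# 3. Is the web server answering requests?
curl -I http://localhost

# 4. Check the application and web server logs (paths are assumptions).
tail -n 50 log/production.log
sudo tail -n 50 /var/log/nginx/error.log
```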
## Services
Danbooru depends on a couple of cloud services and several microservices to implement certain features.
### Amazon Web Services
The following features require an Amazon AWS account:
- Pool history
- Post history
### Google APIs
The following features require a Google Cloud account:
- BigQuery database export
### IQDB Service
IQDB integration is delegated to the IQDB service.
### Archive Service
In order to access pool and post histories, you will need to install and configure the Archives service.
### Reportbooru Service
The following features are delegated to the Reportbooru service:
- Post views
- Missed searches report
- Popular searches report
### Recommender Service
Post recommendations require the Recommender service.
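Each of these services is pointed at through Danbooru's configuration (typically `config/danbooru_local_config.rb`, overriding `config/danbooru_default_config.rb`). The exact setting names vary; the environment variables below are purely illustrative assumptions meant to show the shape of the wiring, not documented options.

```
# Hypothetical example: point Danbooru at locally running microservices.
# These variable names and ports are illustrative assumptions, not documented
# settings; check the default config file for the real option names.
export DANBOORU_IQDB_URL=http://localhost:5588
export DANBOORU_REPORTBOORU_SERVER=http://localhost:3001
export DANBOORU_RECOMMENDER_SERVER=http://localhost:3002
```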