Fix the approver:, parent:, and pixiv: metatags not working correctly when negated:
* Fix -approver:<name> not including posts that don't have an approver (the approver_id is NULL)
* Fix -parent:<id> not including posts that don't have a parent (the parent_id is NULL)
* Fix -pixiv:<id> not including posts that aren't from Pixiv (the pixiv_id is NULL)
The problem lies in how the equality operator is negated when the column contains NULL values:
`approver_id != 52664` doesn't match posts where the `approver_id` is NULL.
The search `approver:evazion` boils down to:
# Post.where(approver_id: 52664).to_sql
SELECT * FROM posts WHERE approver_id = 52664;
When that is negated with `-approver:evazion`, it becomes:
# Post.where(approver_id: 52664).invert_where.to_sql
SELECT * FROM posts WHERE approver_id != 52664;
But in SQL, `approver_id != 52664` doesn't match when the approver_id IS NULL, so the search doesn't
include posts without an approver.
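To see why, here is a minimal standalone SQL sketch of the three-valued logic involved (the literal is arbitrary and not tied to any real post):
-- Comparing NULL with != yields NULL rather than TRUE, and WHERE only keeps
-- rows whose condition evaluates to TRUE, so rows with a NULL approver_id are dropped.
SELECT NULL::integer != 52664;   -- => NULL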
We could use `a IS NOT DISTINCT FROM b` instead of `a = b`:
# Post.where(Post.arel_table[:approver_id].is_not_distinct_from(52664)).to_sql
SELECT * FROM posts WHERE approver_id IS NOT DISTINCT FROM 52664;
This way when it's inverted it becomes `IS DISTINCT FROM`:
# Post.where(Post.arel_table[:approver_id].is_not_distinct_from(52664)).invert_where.to_sql
SELECT * FROM posts WHERE approver_id IS DISTINCT FROM 52664;
`approver_id IS DISTINCT FROM 52664` is like `approver_id != 52664`, except it matches when
approver_id is NULL [1].
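A quick standalone comparison (plain SQL, not taken from Danbooru's code):
SELECT NULL::integer != 52664;                 -- => NULL, which WHERE treats as "no match"
SELECT NULL::integer IS DISTINCT FROM 52664;   -- => true, so NULL approvers are included
SELECT 52664 IS DISTINCT FROM 52664;           -- => false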
This works correctly. The problem, however, is that `IS NOT DISTINCT FROM` can't use indexes because
of a long-standing Postgres limitation [2], which makes searches too slow. So instead we do this:
# Post.where(approver_id: 52664).where.not(approver_id: nil).to_sql
SELECT * FROM posts WHERE approver_id = 52664 AND approver_id IS NOT NULL;
That way when negated it becomes:
# Post.where(approver_id: 52664).where.not(approver_id: nil).invert_where.to_sql
SELECT * FROM posts WHERE approver_id != 52664 OR approver_id IS NULL;
Which is the correct behavior.
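As an illustration only, the same pattern could be wrapped into a reusable scope; the scope name below is hypothetical, not Danbooru's actual code:
class Post < ApplicationRecord
  # Pairing the equality with an explicit IS NOT NULL check keeps the
  # index-friendly `=` operator while letting invert_where produce the
  # NULL-inclusive OR form shown above.
  scope :where_approver, ->(user_id) { where(approver_id: user_id).where.not(approver_id: nil) }
end

Post.where_approver(52664).to_sql
# SELECT * FROM posts WHERE approver_id = 52664 AND approver_id IS NOT NULL
Post.where_approver(52664).invert_where.to_sql
# SELECT * FROM posts WHERE approver_id != 52664 OR approver_id IS NULL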
[1] https://modern-sql.com/feature/is-distinct-from
[2] https://www.postgresql.org/message-id/6FC83909-5DB1-420F-9191-DBE533A3CEDE@excoventures.com
Quickstart
Run this to start a basic Danbooru instance:
curl -sSL https://raw.githubusercontent.com/danbooru/danbooru/master/bin/danbooru | sh
This will install Docker Compose and use it to start Danbooru. When it's done, Danbooru will be running at http://localhost:3000.
Alternatively, if you already have Docker Compose installed, you can just do:
wget https://raw.githubusercontent.com/danbooru/danbooru/master/docker-compose.yaml
docker-compose up
Manual Installation
Follow the INSTALL.debian script to install Danbooru.
The INSTALL.debian script is written for Debian, but can be adapted for other distributions. Danbooru has been successfully installed on Debian, Ubuntu, Fedora, Arch, and OS X. It is recommended that you use an Ubuntu-based system since Ubuntu is what is used in development and production.
See here for a guide on how to set up Danbooru inside a virtual machine.
For best performance, you will need at least 256MB of RAM for PostgreSQL and Rails. The memory requirement will grow as your database gets bigger.
In production, Danbooru uses PostgreSQL 10.18, but any release later than this should work.
Troubleshooting
If your setup is not working, here are the steps I usually recommend to people (example commands follow the list):
- Test the database. Make sure you can connect to it using `psql`. Make sure the tables exist. If this fails, you need to work on correctly installing PostgreSQL, importing the initial schema, and running the migrations.
- Test the Rails database connection by using `bin/rails console`. Run `Post.count` to make sure Rails can connect to the database. If this fails, you need to make sure your Danbooru configuration files are correct.
- Test Nginx to make sure it's working correctly. You may need to debug your Nginx configuration file.
- Check all log files.
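For example (a rough sketch; exact paths and environment names depend on your setup):
# Database: open psql through Rails' database settings and check that the tables exist
bin/rails dbconsole        # then run \dt inside psql

# Rails: confirm ActiveRecord can reach the database
bin/rails runner 'puts Post.count'

# Nginx: validate the configuration
sudo nginx -t

# Logs: tail the Rails log for your environment
tail -f log/production.log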
Services
Danbooru depends on a couple of cloud services and several microservices to implement certain features.
Amazon Web Services
The following features require an Amazon AWS account:
- Pool history
- Post history
Google APIs
The following features require a Google Cloud account:
- BigQuery database export
IQDB Service
IQDB integration is delegated to the IQDB service.
Archive Service
In order to access pool and post histories you will need to install and configure the Archives service.
Reportbooru Service
The following features are delegated to the Reportbooru service:
- Post views
- Missed searches report
- Popular searches report
Recommender Service
Post recommendations require the Recommender service.