evazion 9ba84efc07 BURs: process BURs sequentially in a single job.
Change the way BURs are processed. Before, we spawned a background job
for each line of the BUR, then processed each job sequentially. Now, we
process the entire BUR sequentially in a single background job.

This means that:

* BURs are truly sequential now. Before, certain operations, such as
  removing aliases, weren't actually performed in a background job, so
  they ran out of order, ahead of everything else in the BUR.

* Before, if an alias or implication line failed, then subsequent alias
  or implication lines would still be processed. This was because each
  alias or implication line was queued as a separate job, so a failure
  of one job didn't block another. Now, if any alias or implication
  fails, the entire BUR will fail and stop processing after that line.
  This may be good or bad, depending on whether we actually need the BUR
  to be processed in order or not.

* Before, BURs were processed inside a database transaction (except for the
  actual updating of posts). Now they're not. This is because we can't
  afford to hold transactions open while processing long-running aliases
  or implications. This means that if a BUR fails partway through when
  it is initially approved, it will be left in a half-complete state.
  Before, it would have been rolled back and left in a pending state
  with no changes applied.

* Before, only one BUR at a time could be processed. If multiple BURs
  were approved at the same time, then they would queue up and be
  processed one at a time. Now, multiple BURs can be processed at the
  same time. This may be undesirable when processing large BURs, or BURs
  that must be approved in a specific order.

* Before, large tag category changes could time out. This was because
  they weren't actually performed in a background job. Now they are, so
  they shouldn't time out.

Quickstart

Clone this repository and run bin/danbooru to start a basic Danbooru instance:

git clone https://github.com/danbooru/danbooru
cd danbooru
./bin/danbooru

This installs Docker Compose and uses it to start Danbooru. The first run takes several minutes and produces a lot of output. When it's done, Danbooru will be running at http://localhost.
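
To confirm the site is actually up, you can hit it with curl (assuming curl is installed):

curl -I http://localhost

An HTTP 200 response means Danbooru is serving pages.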

Alternatively, if you already have Docker Compose installed, you can just do:

docker-compose -f config/docker/docker-compose.simple.yaml up
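
If you prefer to run it in the background, the standard docker-compose flags apply; for example, -d detaches and logs -f follows the output:

docker-compose -f config/docker/docker-compose.simple.yaml up -d
docker-compose -f config/docker/docker-compose.simple.yaml logs -f

Run the same command with down instead of up to stop everything.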

Manual Installation

Follow the INSTALL.debian script to install Danbooru.

The INSTALL.debian script is written for Debian, but can be adapted for other distributions. Danbooru has been successfully installed on Debian, Ubuntu, Fedora, Arch, and OS X. An Ubuntu-based system is recommended, since Ubuntu is what is used in development and production.
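
If you'd rather execute the script than follow it by hand, something like the following should work on a fresh Debian-based machine (running it as root and from the repository root are assumptions; read the script before running it):

sudo bash INSTALL.debian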

See here for a guide on how to set up Danbooru inside a virtual machine.

For best performance, you will need at least 256MB of RAM for PostgreSQL and Rails. The memory requirement will grow as your database gets bigger.

In production, Danbooru uses PostgreSQL 10.18, but any release later than this should work.
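
To check which server version you're running (assuming psql can connect as the current user):

psql -c 'SHOW server_version;'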

Troubleshooting

If your setup is not working, here are the steps I usually recommend; example commands for each step follow the list:

  1. Test the database. Make sure you can connect to it using psql. Make sure the tables exist. If this fails, you need to work on correctly installing PostgreSQL, importing the initial schema, and running the migrations.

  2. Test the Rails database connection by using bin/rails console. Run Post.count to make sure Rails can connect to the database. If this fails, you need to make sure your Danbooru configuration files are correct.

  3. Test Nginx to make sure it's working correctly. You may need to debug your Nginx configuration file.

  4. Check all log files.
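
For example, a minimal smoke test for these steps might look like this. The database name danbooru2 and the log path are assumptions based on Rails conventions; substitute your own settings. bin/rails runner is used here as a non-interactive stand-in for bin/rails console:

# 1. Confirm PostgreSQL accepts connections and the tables exist
psql danbooru2 -c '\dt'

# 2. Confirm Rails can reach the database
bin/rails runner 'puts Post.count'

# 3. Confirm the Nginx configuration parses cleanly
sudo nginx -t

# 4. Watch the Rails log for errors while reproducing the problem
tail -f log/production.log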

Services

Danbooru depends on a couple of cloud services and several microservices to implement certain features.

Amazon Web Services

The following features require an Amazon AWS account:

  • Pool history
  • Post history

Google APIs

The following features require a Google Cloud account:

  • BigQuery database export

IQDB Service

IQDB integration is delegated to the IQDB service.

Archive Service

In order to access pool and post histories, you will need to install and configure the Archives service.

Reportbooru Service

The following features are delegated to the Reportbooru service:

  • Post views
  • Missed searches report
  • Popular searches report

Recommender Service

Post recommendations require the Recommender service.
