Files
danbooru/test/unit/bigquery_export_service_test.rb
evazion f235b72b3f Export public database dumps to BigQuery.
* Export daily public database dumps to BigQuery and Google Cloud Storage.
* Only data visible to anonymous users is exported. Some tables have
  null or missing fields because of this.
* The bans table is excluded because some bans have an expires_at
  timestamp set beyond year 9999, which BigQuery doesn't support.
* The favorites table is excluded because it's too slow to dump (it
  doesn't have an id index, which is needed by find_each).
* Version tables are excluded because dumping them in full every day is
  inefficient; streaming inserts should be used instead.
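For context on the favorites exclusion: `find_each` pages through a table by repeatedly selecting the next batch ordered by `id`, so a table without an `id` index forces a scan for every batch. A minimal plain-Ruby sketch of that keyset-pagination pattern, using an in-memory array in place of an ActiveRecord relation (the names here are illustrative, not Danbooru's actual code):

```ruby
# Keyset pagination, the pattern behind ActiveRecord's find_each:
# fetch rows WHERE id > last_seen ORDER BY id LIMIT batch_size,
# then repeat from the last id seen. Without an index on id, each
# "query" below would degrade to a full table scan.
ROWS = (1..10).map { |i| { id: i, tag: "tag#{i}" } }

def find_each_sketch(rows, batch_size: 4)
  last_id = 0
  loop do
    batch = rows.select { |r| r[:id] > last_id }
                .sort_by { |r| r[:id] }
                .first(batch_size)
    break if batch.empty?
    batch.each { |row| yield row }
    last_id = batch.last[:id]
  end
end

ids = []
find_each_sketch(ROWS) { |row| ids << row[:id] }
puts ids.inspect  # prints [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```

This is why the real `find_each` refuses to run without an ordered primary key: the `id > last_seen` lookup is only cheap when it can hit an index.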

Links:

* https://console.cloud.google.com/bigquery?project=danbooru1
* https://console.cloud.google.com/storage/browser/danbooru_public
* https://storage.googleapis.com/danbooru_public/data/posts.json
2021-03-10 02:52:16 -06:00

require 'test_helper'

class BigqueryExportServiceTest < ActiveSupport::TestCase
  context "BigqueryExportService: " do
    context "#async_export_all! method" do
      should "export all tables to BigQuery" do
        @post = create(:post, tag_string: "tagme")
        @bigquery = BigqueryExportService.new(dataset_name: "testbooru_export")
        skip unless @bigquery.enabled?

        BigqueryExportService.async_export_all!(dataset_name: "testbooru_export")
        perform_enqueued_jobs

        assert_equal(1, @bigquery.dataset.table("posts").rows_count)
        assert_equal(1, @bigquery.dataset.table("tags").rows_count)
      end
    end
  end
end