Export public database dumps to BigQuery.
* Export daily public database dumps to BigQuery and Google Cloud Storage.
* Only data visible to anonymous users is exported; because of this, some tables have null or missing fields.
* The bans table is excluded because some bans have an expires_at timestamp set beyond the year 9999, which BigQuery doesn't support.
* The favorites table is excluded because it's too slow to dump (it lacks an index on id, which find_each needs).
* Version tables are excluded because dumping them in full every day is inefficient; streaming inserts should be used instead.

Links:

* https://console.cloud.google.com/bigquery?project=danbooru1
* https://console.cloud.google.com/storage/browser/danbooru_public
* https://storage.googleapis.com/danbooru_public/data/posts.json
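The anonymous-visibility filtering described above could be sketched as follows. This is a hypothetical illustration, not the exporter in this commit: the field list and `to_ndjson` helper are invented names, and the output format shown is newline-delimited JSON, which BigQuery load jobs accept.

```ruby
require "json"

# Hypothetical whitelist of fields visible to anonymous users.
PUBLIC_FIELDS = %w[id created_at rating tag_string score].freeze

# Serialize rows to newline-delimited JSON. Fields outside the whitelist
# are dropped entirely; whitelisted fields missing from a row come
# through as JSON null, matching the "null or missing fields" note above.
def to_ndjson(rows, fields: PUBLIC_FIELDS)
  rows.map { |row|
    sanitized = fields.to_h { |f| [f, row[f]] }
    JSON.generate(sanitized)
  }.join("\n")
end

rows = [
  { "id" => 1, "rating" => "g", "uploader_ip_addr" => "10.0.0.1" }, # private field dropped
  { "id" => 2, "rating" => "s" }                                   # no score recorded
]
puts to_ndjson(rows)
```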
Gemfile | 2 ++
1 file changed, 2 insertions(+)
@@ -47,6 +47,8 @@ gem 'nokogiri'
 gem 'view_component', require: 'view_component/engine'
 gem 'tzinfo-data'
 gem 'hsluv'
+gem 'google-cloud-bigquery', require: "google/cloud/bigquery"
+gem 'google-cloud-storage', require: "google/cloud/storage"
 
 group :production, :staging do
   gem 'unicorn', :platforms => :ruby