Efficient way to export 10 datasets (each with 80 tables) from BigQuery to Google Cloud Storage?
I have 10 datasets in BigQuery, and each dataset contains 80 tables. I know I can export each table of each dataset to Google Cloud Storage one by one through the console or the web interface, for backup purposes. However, that would take a long time.
I wonder if there is a more convenient way to handle this?
renakre

1 answer
You can do this with the bq command-line tool (https://cloud.google.com/bigquery/bq-command-line-tool).
The following script exports every table in a dataset to Google Cloud Storage:
project="bigquery-project"
dataset="your-dataset"
bucket="your-bucket"
# get list of tables
tables=$(bq ls "$project:$dataset" | awk '{print $1}' | tail +3)
for table in $tables
do
bq extract --destination_format "CSV" --compression "GZIP" "$project:$dataset.$table" "gs://$bucket/$table/$table.csv.gz"
done
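One caveat: BigQuery cannot export more than 1 GB of table data into a single file, so for larger tables the destination URI needs a wildcard so the output can be sharded across several files. A variant of the same command (the file naming here is just one possible choice):

# for tables larger than 1 GB, shard the export across multiple files
bq extract --destination_format "CSV" --compression "GZIP" \
  "$project:$dataset.$table" "gs://$bucket/$table/$table-*.csv.gz"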
Also, if you want to run the extracts asynchronously (submit each extract job without waiting for it to finish), replace the extract command with the following:

bq --nosync extract --destination_format "CSV" --compression "GZIP" \
  "$project:$dataset.$table" "gs://$bucket/$table/$table.csv.gz"
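Since the question mentions 10 datasets, the same approach can be wrapped in an outer loop over every dataset in the project. This is only a sketch, assuming bq ls --project_id returns exactly the datasets you want to back up and that a gs://bucket/dataset/table/ layout is acceptable; adjust the names and paths to your setup:

# a sketch: export every table of every dataset in the project
project="bigquery-project"
bucket="your-bucket"

# list datasets in the project (skip the two header lines of bq ls)
datasets=$(bq ls --project_id="$project" | awk '{print $1}' | tail -n +3)

for dataset in $datasets
do
  tables=$(bq ls "$project:$dataset" | awk '{print $1}' | tail -n +3)
  for table in $tables
  do
    # --nosync submits the extract job and returns immediately
    bq --nosync extract --destination_format "CSV" --compression "GZIP" \
      "$project:$dataset.$table" "gs://$bucket/$dataset/$table/$table.csv.gz"
  done
done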
PS: Make sure the bq command-line tool is installed and available in your environment. It comes with the Google Cloud SDK. To check, you can run bq help.
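If you use --nosync, the extract jobs run in the background. One way to check on them afterwards (a sketch, assuming the default project is set) is to list the most recent jobs and their states:

# list recent jobs and their states (e.g. RUNNING, SUCCESS, FAILURE)
bq ls -j -n 100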
Shamshad alam