What's the best practice for "updating" tables in Google BigQuery?
Every hour I get massive batch files that need to be downloaded.
Some of these files contain records that must replace existing ones in the large target table.
If you have an ID for each record, you can combine the new table and the old table like this (in BigQuery legacy SQL, the comma between tables acts as UNION ALL):

SELECT *
FROM
  (SELECT * FROM [oldtable] WHERE id NOT IN (SELECT id FROM [newtable])),
  (SELECT * FROM [newtable])
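
In standard SQL, the same hourly upsert can also be done in place with a MERGE statement instead of rewriting the whole table into a new result. Here is a minimal sketch, assuming the target table is mydataset.target and each batch file is loaded into a staging table mydataset.batch, both keyed on id with a single payload column (all of these names are placeholders for illustration):

-- Update rows whose id already exists in the target,
-- and insert rows whose id is new.
MERGE mydataset.target T
USING mydataset.batch S
ON T.id = S.id
WHEN MATCHED THEN
  UPDATE SET payload = S.payload
WHEN NOT MATCHED THEN
  INSERT (id, payload) VALUES (S.id, S.payload)

This modifies the target table directly, so you don't have to write out a full copy of the old table on every batch the way the query-and-overwrite approach does.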