Uploading JSON files to BigQuery from local disk using Java

I'm new to Google BigQuery.
I'm looking for sample Java code that will read JSON files from my local drive and upload them to BigQuery. In the process, the code should:

  • read a file on local disk containing newline-delimited JSON data
  • create a new table in BigQuery
  • build the table schema by reading the JSON from the file at load time
  • load the data into the new BigQuery table
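A note on the "delimited JSON" in the first bullet: BigQuery's JSON load format is newline-delimited JSON (one object per line), not a single top-level JSON array. A minimal sketch of producing such a file, with made-up file name and records:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

public class NdjsonWriter {
  public static void main(String[] args) throws Exception {
    // Hypothetical records: one complete JSON object per line (NDJSON),
    // which is what BigQuery's JSON load jobs expect.
    List<String> rows = List.of(
        "{\"name\":\"alice\",\"age\":30}",
        "{\"name\":\"bob\",\"age\":25}");
    Path out = Paths.get("data.json");
    // Files.write with an Iterable writes each element on its own line
    Files.write(out, rows);
    System.out.println(Files.readAllLines(out).size());
  }
}
```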

Any help would be a big step forward for me. Please let me know if my requirement here isn't clear!




2 answers



public long writeJSONFileToTable(String datasetName, String tableName, Path jsonPath, String location)
      throws IOException, InterruptedException {
    TableId tableId = TableId.of(datasetName, tableName);
    // Load newline-delimited JSON (use FormatOptions.csv() for CSV sources instead)
    WriteChannelConfiguration writeChannelConfiguration =
        WriteChannelConfiguration.newBuilder(tableId)
            .setFormatOptions(FormatOptions.json())
            .build();
    // The location must be specified; other fields can be auto-detected.
    JobId jobId = JobId.newBuilder().setLocation(location).build();
    TableDataWriteChannel writer = bigquery.writer(jobId, writeChannelConfiguration);

    // Stream the file's contents into the write channel
    try (OutputStream stream = Channels.newOutputStream(writer)) {
      Files.copy(jsonPath, stream);
    }

    // Wait for the load job to finish and report its final state
    Job job = writer.getJob().waitFor();
    System.out.println("State: " + job.getStatus().getState());
    LoadStatistics stats = job.getStatistics();
    return stats.getOutputRows();
}
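For completeness, here is a self-contained sketch of how the method above might be wired up, including client creation and schema auto-detection (setAutodetect(true) asks BigQuery to infer the table schema from the JSON data, which covers the "create table schema at load time" requirement). The dataset, table, file path, and location values are placeholders; running this requires the google-cloud-bigquery library and valid credentials:

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobStatistics;
import com.google.cloud.bigquery.TableDataWriteChannel;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.WriteChannelConfiguration;
import java.io.OutputStream;
import java.nio.channels.Channels;
import java.nio.file.Files;
import java.nio.file.Paths;

public class LoadJsonExample {
  public static void main(String[] args) throws Exception {
    // Uses application default credentials; dataset/table/file names are made up
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    TableId tableId = TableId.of("my_dataset", "my_table");

    WriteChannelConfiguration config =
        WriteChannelConfiguration.newBuilder(tableId)
            .setFormatOptions(FormatOptions.json()) // newline-delimited JSON
            .setAutodetect(true)                    // infer the schema from the data
            .build();

    // Location must match the destination dataset's location
    JobId jobId = JobId.newBuilder().setLocation("US").build();
    TableDataWriteChannel writer = bigquery.writer(jobId, config);
    try (OutputStream stream = Channels.newOutputStream(writer)) {
      Files.copy(Paths.get("data.json"), stream);
    }

    Job job = writer.getJob().waitFor();
    JobStatistics.LoadStatistics stats = job.getStatistics();
    System.out.println("Loaded rows: " + stats.getOutputRows());
  }
}
```

If the destination table does not exist, the load job creates it with the auto-detected schema, so no separate table-creation call is needed.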



