File not found when running a Spark job with data from a Google Cloud Storage bucket

I am running a Spark job on a Google Cloud Dataproc cluster that takes one parameter: the path to the input file. The file is stored in a Google Cloud Storage bucket. I am getting a FileNotFoundException (trace below). Why would that be?

gcloud dataproc jobs submit spark --cluster cluster-1 --class MST.ComputeMST \
    --jars gs://dataproc-211700eb-83ed-456d-a67e-98af9e6fa02d-us/ComputeMST.jar \
    -- gs://dataproc-211700eb-83ed-456d-a67e-98af9e6fa02d-us/input.txt

Job [8b193fcd-1350-462b-ae11-373333e868fe] submitted.
Waiting for job output...
17/05/16 05:06:02 INFO com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase: GHFS version: 1.6.1-hadoop2
number of runs = 0
Exception in thread "main" java.io.FileNotFoundException: gs:/dataproc-211700eb-83ed-456d-a67e-98af9e6fa02d-us/input.txt (No such file or directory)
  at java.io.FileInputStream.open0(Native Method)
  at java.io.FileInputStream.open(FileInputStream.java:195)
  at java.io.FileInputStream.<init>(FileInputStream.java:138)
  at java.io.FileInputStream.<init>(FileInputStream.java:93)
  at java.io.FileReader.<init>(FileReader.java:58)
  at MST.ComputeMST.main(ComputeMST.java:670)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
ERROR: (gcloud.dataproc.jobs.submit.spark) Job [8b193fcd-1350-462b-ae11-373333e868fe] entered state [ERROR] while waiting for [DONE].
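
The trace points at ComputeMST.java:670, which opens the input with java.io.FileReader. Presumably the call looks something like this hypothetical reconstruction (not the actual source):

    // Hypothetical sketch of the failing call at ComputeMST.java:670.
    // java.io.FileReader resolves paths against the local filesystem only,
    // so the gs:// URI is normalized to "gs:/..." and fails to open.
    BufferedReader in = new BufferedReader(new FileReader(args[0]));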



1 answer


Although the GCS connector is installed by default on Cloud Dataproc clusters, you cannot use it from your job through the java.io.FileReader interface.

To access GCS objects through the GCS connector, you must use the Hadoop FileSystem interface.
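
A minimal sketch of that approach, assuming a plain Java main class (the class name, variable names, and the example bucket path are illustrative, not taken from the original job):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class GcsReadExample {
        public static void main(String[] args) throws Exception {
            // gs:// URI passed as the job argument, e.g.
            // gs://your-bucket/input.txt
            Path path = new Path(args[0]);

            // Resolve the FileSystem for the gs:// scheme; on Dataproc this
            // picks up the preinstalled GCS connector from the cluster's
            // Hadoop configuration.
            FileSystem fs = path.getFileSystem(new Configuration());

            // fs.open(path) returns an InputStream backed by the GCS object.
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(fs.open(path), StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line); // replace with your own processing
                }
            }
        }
    }

Since this runs as a Spark job, another option is to let Spark read the file itself, e.g. sc.textFile("gs://your-bucket/input.txt"), which goes through the same Hadoop FileSystem machinery.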
