Spark HBase RDD gives an exception

I am trying to read from an HBase table using the following code:

JavaPairRDD<ImmutableBytesWritable, Result> pairRdd = ctx
        .newAPIHadoopRDD(conf, TableInputFormat.class,
                ImmutableBytesWritable.class,
                org.apache.hadoop.hbase.client.Result.class).cache();

System.out.println(pairRdd.count());
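
For reference, this snippet compiles against the following imports (a sketch, assuming the Spark 1.x Java API and the HBase 0.98 client/mapreduce jars that the classes above come from):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;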

But the job fails with java.lang.IllegalStateException: unread block data.

The full setup code is below:

SparkConf sparkConf = new SparkConf().setAppName("JavaSparkSQL");
sparkConf.set("spark.master", "spark://192.168.50.247:7077");

/*
String[] stjars = {"/home/BreakDown/SparkDemo2/target/SparkDemo2-0.0.1-SNAPSHOT.jar"};
sparkConf.setJars(stjars);
*/
JavaSparkContext ctx = new JavaSparkContext(sparkConf);
JavaSQLContext sqlCtx = new JavaSQLContext(ctx);

Configuration conf = HBaseConfiguration.create();
conf.set("hbase.master", "192.168.50.73:60000");
conf.set("hbase.zookeeper.quorum", "192.168.50.73");
conf.set("hbase.zookeeper.property.clientPort", "2181");
conf.set("zookeeper.session.timeout", "6000");
conf.set("zookeeper.recovery.retry", "1");

conf.set("hbase.mapreduce.inputtable", "employee11");

Any pointers would be very helpful.

Versions: Spark 1.1.1 (hadoop2 build), Hadoop 2.2.0, HBase 0.98.8-hadoop2

PFB the stack trace:

14/12/17 21:18:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/12/17 21:18:46 INFO AppClient$ClientActor: Connecting to master spark://192.168.50.247:7077...
14/12/17 21:18:46 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
14/12/17 21:18:46 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20141217211846-0035
14/12/17 21:18:47 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.50.253, ANY, 1256 bytes)
14/12/17 21:18:47 INFO BlockManagerMasterActor: Registering block manager 192.168.50.253:41717 with 265.4 MB RAM, BlockManagerId(0, 192.168.50.253, 41717, 0)
14/12/17 21:18:48 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 192.168.50.253): java.lang.IllegalStateException: unread block data
        java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2420)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1380)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:160)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:724)
