Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

I am using Hadoop 1.0.3 and HBase 0.94.22. I am trying to run a mapper program that reads values from an HBase table and writes them to a file. I am getting the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:340)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)


The code looks like this:

import java.io.IOException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;


public class Test {

    static class TestMapper extends TableMapper<Text, IntWritable> {
        private static final IntWritable one = new IntWritable(1);

        public void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            ImmutableBytesWritable userkey = new ImmutableBytesWritable(row.get(), 0, Bytes.SIZEOF_INT);
            String key = Bytes.toString(userkey.get());
            context.write(new Text(key), one);
        }
    }

    public static void main(String[] args) throws Exception {

        HBaseConfiguration conf = new HBaseConfiguration();
        Job job = new Job(conf, "hbase_freqcounter");
        job.setJarByClass(Test.class);
        Scan scan = new Scan();

        FileOutputFormat.setOutputPath(job, new Path(args[0]));
        String columns = "data";
        scan.addFamily(Bytes.toBytes(columns));
        scan.setFilter(new FirstKeyOnlyFilter());
        TableMapReduceUtil.initTableMapperJob("test", scan, TestMapper.class, Text.class, IntWritable.class, job);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}


I exported the above code to a jar file, and on the command line I run it with the following command:

hadoop jar /home/testdb.jar test

where test is the output folder to which the results should be written.

I checked a few other links, like Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException, where it was suggested to include the zookeeper jar in the classpath, but when building the project in Eclipse I had already included the zookeeper jar from the hbase lib directory (zookeeper-3.4.5.jar). I also visited this HBase link - java.lang.NoClassDefFoundError in java - but I am using the mapper class to get the values from the HBase table, not the client API. I know I am going wrong somewhere, guys, can you please help me?

I noticed another strange thing: when I remove all the code in the main function except the first line, HBaseConfiguration conf = new HBaseConfiguration();, then export the code to a jar file and try to run it with hadoop jar test.jar, I still get the same error. It seems that I am either defining the conf configuration incorrectly, or there is some problem with my environment.

+3




5 answers


I have fixed the problem: I had not added the HBase classpath to the hadoop-env.sh file. Below is what I added to make it work.



$ export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
    $HBASE_HOME/hbase-0.94.22-test.jar:\
    $HBASE_HOME/conf:\
    ${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
    ${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
    ${HBASE_HOME}/lib/guava-11.0.2.jar

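As a quick sanity check (a minimal sketch, assuming the exports above are in hadoop-env.sh and that your Hadoop version supports the classpath subcommand), you can print the classpath hadoop actually resolves, confirm the HBase jars show up, and then re-run the job:

$ hadoop classpath | tr ':' '\n' | grep -i hbase
$ hadoop jar /home/testdb.jar test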

+6




I tried editing the hadoop-env.sh file as well, but the changes mentioned here did not work for me.

What worked:



export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"


I just added that at the end of my hadoop-env.sh. Don't forget to set the HBASE_HOME variable. You can also replace $HBASE_HOME with the actual HBase installation path.
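For example, the two relevant lines in hadoop-env.sh could look like the sketch below; the path /usr/local/hbase-0.94.22 is only a placeholder, so substitute wherever HBase actually lives on your machine:

export HBASE_HOME=/usr/local/hbase-0.94.22
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"

The wildcard pulls in every jar under lib, which is simpler than listing jars one by one but adds more to the classpath than the job strictly needs.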

+3




In case someone has different paths/configuration, here is what I added to hadoop-env.sh to make it work:

$ export HADOOP_CLASSPATH="$HBASE_HOME/lib/hbase-client-0.98.11-hadoop2.jar:\
    $HBASE_HOME/lib/hbase-common-0.98.11-hadoop2.jar:\
    $HBASE_HOME/lib/protobuf-java-2.5.0.jar:\
    $HBASE_HOME/lib/guava-12.0.1.jar:\
    $HBASE_HOME/lib/zookeeper-3.4.6.jar:\
    $HBASE_HOME/lib/hbase-protocol-0.98.11-hadoop2.jar"


NOTE: if you haven't set $HBASE_HOME, you have 2 options:

- export HBASE_HOME=[your hbase installation path]
- or just replace $HBASE_HOME with the full HBase path

+2




Here CreateTable is my java class file.

Use this command:

hadoop@osboxes:~/hbase/hbase-0.94.8/bin$ java -cp .:/home/hadoop/hbase/hbase-0.94.8/hbase-0.94.8.jar:/home/hadoop/hbase/hbase-0.94.8/lib/* CreateTable

0




HADOOP_USER_CLASSPATH_FIRST=true \
HADOOP_CLASSPATH=$($HBASE_HOME/bin/hbase mapredcp) \
hadoop jar /home/testdb.jar test
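hbase mapredcp asks HBase itself for the classpath its MapReduce jobs need, so nothing has to be hard-coded. If you would rather not prefix every command, a minimal sketch (assuming HBASE_HOME is set and your HBase release ships the mapredcp command) is to add this to hadoop-env.sh instead:

export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$($HBASE_HOME/bin/hbase mapredcp)"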

      

-1








