Failed to run Nutch 2.3-snapshot on Hadoop 2.4.0 using Gora 0.5 and MongoDB as backend datastore

I have been facing this problem for several days. When I use hadoop 1.2 everything works correctly, but when I move to hadoop 2.x (hadoop 2.4.0 or hadoop 2.5.2) I get this problem:

java.lang.Exception: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
    at org.apache.gora.mapreduce.GoraOutputFormat.getRecordWriter(GoraOutputFormat.java:83)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)

      

I found that when I specify hadoop 2.x in ivy.xml, hadoop-core-1.0.1.jar is still pulled in transitively by the Gora dependency. Even after excluding hadoop-core-*, the problem still occurs! I also manually replaced avro-mapred-1.7.6.jar with avro-mapred-1.7.6-hadoop2.jar, but unfortunately nothing changed! Any ideas would be appreciated, thanks!
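For reference, this is roughly the kind of exclusion I tried in ivy.xml. The revision numbers and the `conf` mapping are from my setup and may differ in yours, and the classifier attribute assumes the `ivy-module` root element declares `xmlns:m="http://ant.apache.org/ivy/maven"`:

```xml
<!-- Sketch of the ivy.xml changes described above; revisions/confs are assumptions from my setup. -->

<!-- Keep Gora from dragging in the Hadoop 1.x core jar transitively -->
<dependency org="org.apache.gora" name="gora-mongodb" rev="0.5" conf="*->default">
  <exclude org="org.apache.hadoop" name="hadoop-core"/>
</dependency>

<!-- Ask for the hadoop2 build of avro-mapred instead of the default (hadoop1) artifact -->
<dependency org="org.apache.avro" name="avro-mapred" rev="1.7.6" conf="*->default">
  <artifact name="avro-mapred" type="jar" m:classifier="hadoop2"/>
</dependency>
```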

Tags: hadoop, nutch, gora


