
I have upgraded to a recent Hadoop release from Hortonworks:

    Hadoop 2.4.0.2.1.2.1-471
    Subversion git@github.com:hortonworks/hadoop.git -r 9e5db004df1a751e93aa89b42956c5325f3a4482
    Compiled by jenkins on 2014-05-27T18:57Z
    Compiled with protoc 2.5.0
    From source with checksum 9e788148daa5dd7934eb468e57e037b5
    This command was run using /usr/lib/hadoop/hadoop-common-2.4.0.2.1.2.1-471.jar

Before upgrading, I had written a Java MapReduce driver program that uses Hive tables for both input and output. With the previous version of Hadoop it worked, although I got deprecation warnings at compile time for this code:

    Job job = new Job(conf, "Foo");  // deprecated constructor, hence the compile-time warning
    HCatInputFormat.setInput(job, InputJobInfo.create(dbName, inputTableName, null));
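For reference, the non-deprecated way to build the job looks like this; a minimal sketch with the imports the snippet relies on (the Tool-based getConf() call matches the ToolRunner setup in the stack trace below):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hcatalog.mapreduce.HCatInputFormat;
    import org.apache.hcatalog.mapreduce.InputJobInfo;

    // inside the Tool's run() method
    Configuration conf = getConf();
    // Job.getInstance() replaces the deprecated Job(Configuration, String) constructor
    Job job = Job.getInstance(conf, "Foo");
    // the null third argument means "no partition filter", i.e. read the whole table
    HCatInputFormat.setInput(job, InputJobInfo.create(dbName, inputTableName, null));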

Now, after updating the dependencies to the new jars in Hadoop 2.4.0.2.1.2.1-471 and running the same code, I get the following error:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo
        at com.bigdata.hadoop.Foo.run(Foo.java:240)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at com.bigdata.hadoop.Foo.main(Foo.java:272)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
    Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.InputJobInfo
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 9 more

To run my code I use the following settings:

    export LIBJARS=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar

    export HADOOP_CLASSPATH=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar

Any ideas why I get java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo?

  • What if you try appending the output of hadoop classpath to your HADOOP_CLASSPATH? Commented Jun 19, 2014 at 15:21
  • That does not help. I tried appending it to my HADOOP_CLASSPATH and also used it alone; still the same error. The interesting thing is that my code compiles fine against this new version of Hadoop, but at runtime it cannot resolve the class! Commented Jun 19, 2014 at 15:55

2 Answers


I think you should add the following dependency to your pom.xml:

    <dependency>
        <groupId>org.apache.hcatalog</groupId>
        <artifactId>hcatalog-core</artifactId>
        <version>0.11.0</version>
    </dependency>
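Note also that, as far as I know, the HCatalog classes moved from org.apache.hcatalog to org.apache.hive.hcatalog when HCatalog was folded into Hive (the old package was deprecated around Hive 0.12 and removed later). If you instead build against the newer hive-hcatalog-core artifact that ships with HDP 2.1, the imports and the setInput call change; a minimal sketch, assuming the Hive 0.13 API:

    import org.apache.hadoop.mapreduce.Job;
    // note the relocated package: org.apache.hive.hcatalog, not org.apache.hcatalog
    import org.apache.hive.hcatalog.mapreduce.HCatInputFormat;

    Job job = Job.getInstance(conf, "Foo");
    // the relocated API takes the database and table names directly
    // instead of an InputJobInfo object
    HCatInputFormat.setInput(job, dbName, inputTableName);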



I faced exactly the same issue. In your case, you will need to add the following jars to your classpath:

1. jdo2-api-2.3-eb.jar
2. libthrift-0.9.0.jar
3. datanucleus-rdbms-3.2.6.jar
4. hive-ant-0.13.0.jar

