How to resolve NoClassDefFoundError while using Hadoop?

I am getting
Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey
when running the command:
hadoop jar camus-etl-kafka-0.1.0-SNAPSHOT.jar com.linkedin.camus.etl.kafka.CamusJob -P camus.properties
The full output is below:
2016-04-27 11:34:04.622 java[13567:351959] Unable to load realm mapping info from SCDynamicStore
[NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:252)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:235)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:691)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:646)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.linkedin.camus.etl.IEtlKey
I have included camus-example-0.1.0-SNAPSHOT-shaded.jar in the classpath.
Please let me know if I am missing something.
Thanks in advance,
Soumyajit

You should try to include camus-api, which you can find on LinkedIn's previous generation Kafka to HDFS pipeline page, since the missing class is contained in that package, as you can see here.
Pay attention to other transitive dependencies that Camus may require.
In addition, to make sure the classes are found on the classpath when you use hadoop jar from the command line, you can add the -libjars command line option, as described in Using the libjars option with Hadoop:
$ export LIBJARS=/path/jar1,/path/jar2
$ hadoop jar my-example.jar com.example.MyTool -libjars ${LIBJARS} -mytoolopt value
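Applied to the command in the question, that might look like the following sketch; the camus-api jar name and location are assumptions here, so check your local build output or Maven repository for the actual path:
$ export LIBJARS=/path/to/camus-api-0.1.0-SNAPSHOT.jar
$ hadoop jar camus-etl-kafka-0.1.0-SNAPSHOT.jar com.linkedin.camus.etl.kafka.CamusJob -libjars ${LIBJARS} -P camus.properties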
It could be useful to know that Camus is going to be superseded by Gobblin:
Camus is being phased out and replaced by Gobblin. For those using or
interested in Camus, we suggest taking a look at Gobblin.
For instructions on Migrating from Camus to Gobblin, please take
a look at Camus Gobblin Migration.

Related

SonarQube Scanner error when run "sonar-scanner" command

I want to use SonarQube to analyze an Android project.
While trying to install the SonarQube Scanner, I faced some problems.
I received the following error when running the sonar-scanner command from the project directory.
* I have followed the installation steps from this link: https://docs.sonarqube.org/display/SCAN/Analyzing+with+SonarQube+Scanner
* I am using SonarQube 6.5 and SonarScanner 3.0.3
ERROR: Error during SonarQube Scanner execution
org.sonar.squidbridge.api.AnalysisException: Please provide compiled classes of your project with sonar.java.binaries property
at org.sonar.java.JavaClasspath.init(JavaClasspath.java:59)
at org.sonar.java.AbstractJavaClasspath.getElements(AbstractJavaClasspath.java:281)
at org.sonar.java.SonarComponents.getJavaClasspath(SonarComponents.java:141)
at org.sonar.java.JavaSquid.<init>(JavaSquid.java:83)
at org.sonar.plugins.java.JavaSquidSensor.execute(JavaSquidSensor.java:83)
at org.sonar.scanner.sensor.SensorWrapper.analyse(SensorWrapper.java:53)
at org.sonar.scanner.phases.SensorsExecutor.executeSensor(SensorsExecutor.java:88)
at org.sonar.scanner.phases.SensorsExecutor.execute(SensorsExecutor.java:82)
at org.sonar.scanner.phases.SensorsExecutor.execute(SensorsExecutor.java:68)
at org.sonar.scanner.phases.AbstractPhaseExecutor.execute(AbstractPhaseExecutor.java:78)
at org.sonar.scanner.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:179)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:144)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:129)
at org.sonar.scanner.scan.ProjectScanContainer.scan(ProjectScanContainer.java:261)
at org.sonar.scanner.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:256)
at org.sonar.scanner.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:245)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:144)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:129)
at org.sonar.scanner.task.ScanTask.execute(ScanTask.java:47)
at org.sonar.scanner.task.TaskContainer.doAfterStart(TaskContainer.java:84)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:144)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:129)
at org.sonar.scanner.bootstrap.GlobalContainer.executeTask(GlobalContainer.java:119)
at org.sonar.batch.bootstrapper.Batch.executeTask(Batch.java:116)
at org.sonarsource.scanner.api.internal.batch.BatchIsolatedLauncher.execute(BatchIsolatedLauncher.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.sonarsource.scanner.api.internal.IsolatedLauncherProxy.invoke(IsolatedLauncherProxy.java:60)
at com.sun.proxy.$Proxy0.execute(Unknown Source)
at org.sonarsource.scanner.api.EmbeddedScanner.doExecute(EmbeddedScanner.java:233)
at org.sonarsource.scanner.api.EmbeddedScanner.runAnalysis(EmbeddedScanner.java:151)
at org.sonarsource.scanner.cli.Main.runAnalysis(Main.java:123)
at org.sonarsource.scanner.cli.Main.execute(Main.java:77)
at org.sonarsource.scanner.cli.Main.main(Main.java:61)
ERROR:
ERROR: Re-run SonarQube Scanner using the -X switch to enable full debug logging
If you have any solution for this problem, please help me.
You need to build your project before the analysis. Let's say your project structure looks like this:
project
|- src
|- dependencies
|- classes
Then you should configure:
sonar.sources = src
sonar.java.libraries = dependencies/**/*.jar
sonar.java.binaries = classes
Read more at: Java Plugin and Bytecode
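For reference, you could create a minimal sonar-project.properties along those lines; this is a sketch only, and the projectKey and projectName values are hypothetical placeholders, not taken from the question:
$ cat > sonar-project.properties <<'EOF'
# projectKey and projectName are placeholders; use your own values
sonar.projectKey=my-android-project
sonar.projectName=My Android Project
sonar.sources=src
sonar.java.libraries=dependencies/**/*.jar
sonar.java.binaries=classes
EOF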

Hadoop: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I am trying to configure Hadoop and format the NameNode using this command:
$ hdfs namenode -format
However, I keep getting this error. How can I fix it?
2017-06-20 12:22:25,792 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-06-20 12:22:28,825 WARN ipc.Client: Failed to connect to server: localhost/127.0.0.1:9000: try once and fail.
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:681)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:777)
at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:408)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1542)
at org.apache.hadoop.ipc.Client.call(Client.java:1373)
at org.apache.hadoop.ipc.Client.call(Client.java:1337)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:115)
at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:812)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1638)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1367)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1364)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1379)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:64)
at org.apache.hadoop.fs.Globber.doGlob(Globber.java:269)
at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1960)
at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:239)
at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:222)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
at org.apache.hadoop.fs.shell.Command.run(Command.java:166)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:326)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:389)
ls: Call From ubuntu/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
ERROR: JAVA_HOME /opt/jdk1.8.0_91/ does not exist.
2017-06-20 12:22:25,792 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Apache Hadoop can integrate with an optional native library that contains extensions implementing deeper OS integration and performance enhancements for certain features. For more details, refer to the documentation page Native Libraries Guide.
In many cases, for client-side usage or developer setups, the native library extensions are optional. However, the log warning can be a nuisance. If you'd like to get it out of the way, there are a few options:
- Some Apache Hadoop releases include a pre-built native library, which you could potentially use. The bundled library was built for whatever OS/architecture ran the Apache Hadoop release build, so there is no guarantee that it will match your own OS/architecture. Do not attempt to mix and match different versions of Hadoop code and native code, or you may see unusual linkage errors.
- Build the native library yourself. This guarantees the build matches your actual runtime OS/architecture. In addition to the documentation page linked above, there are more specific build instructions in the BUILDING.txt file. Again, it's important to match the versions of the Hadoop code and native code.
- If you use a commercial vendor's distro, they have likely already taken care of setting up and deploying the native library correctly.
If all else fails, you can stifle the warning by adding the following to log4j.properties:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
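Separately, to verify what the native code loader actually finds on your machine, Hadoop ships a diagnostic command (available in Hadoop 2.x and later; output varies by platform and build):
$ hadoop checknative -a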
org.apache.hadoop.fs.FsShell.main(FsShell.java:389) ls: Call From ubuntu/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
This is a different error, unrelated to the native code loader warning. It indicates that the client failed to connect to the NameNode while executing the command, which can be caused by a lack of network connectivity or simply by the NameNode daemon not running. The wiki page linked in the error message has more details on troubleshooting this.
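A quick way to check whether the NameNode is up and listening on port 9000 (the port from the error above); jps and netstat are standard tools here, though the exact netstat flags may vary by OS:
$ jps                       # the output should include a NameNode process
$ netstat -tln | grep 9000  # check whether anything is listening on the NameNode port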
ERROR: JAVA_HOME /opt/jdk1.8.0_91/ does not exist.
I'm not sure exactly where this message came from, but it's self-explanatory: either check that you have Java deployed at that path, or change JAVA_HOME to the correct path.
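For example, something along these lines; the second path is a placeholder for wherever your JDK actually lives:
$ ls /opt/jdk1.8.0_91/bin/java      # verify Java really exists at the configured path
$ export JAVA_HOME=/path/to/your/jdk   # otherwise, point JAVA_HOME (or hadoop-env.sh) at the right JDK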

Maven1.x: javax.xml.parsers.FactoryConfigurationError: Provider org.apache.xerces.jaxp.SAXParserFactoryImpl not found

After an OS update from Windows 7 to Windows 10, I'm facing the following error:
Apache Maven
intelligent projects ~
v. 1.0.2
javax.xml.parsers.FactoryConfigurationError: Provider org.apache.xerces.jaxp.SAXParserFactoryImpl not found
at javax.xml.parsers.SAXParserFactory.newInstance(SAXParserFactory.java:93)
at org.apache.maven.jelly.JellyUtils.compileScript(JellyUtils.java:202)
at org.apache.maven.jelly.JellyUtils.compileScript(JellyUtils.java:180)
at org.apache.maven.jelly.JellyUtils.compileScript(JellyUtils.java:146)
at org.apache.maven.plugin.PluginManager.loadScript(PluginManager.java:1109)
at org.apache.maven.plugin.PluginManager.runScript(PluginManager.java:1135)
at org.apache.maven.plugin.PluginManager.initialiseHousingPluginContext(PluginManager.java:770)
at org.apache.maven.plugin.PluginManager.prepAttainGoal(PluginManager.java:725)
at org.apache.maven.plugin.PluginManager.attainGoals(PluginManager.java:656)
at org.apache.maven.MavenSession.attainGoals(MavenSession.java:263)
at org.apache.maven.cli.App.doMain(App.java:488)
at org.apache.maven.cli.App.main(App.java:1239)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:324)
at com.werken.forehead.Forehead.run(Forehead.java:551)
at com.werken.forehead.Forehead.main(Forehead.java:581)
You have encountered an unknown error running Maven. Please help us to correct
this problem by following these simple steps:
- read the Maven FAQ at http://maven.apache.org/faq.html
- run the same command again with the '-e' parameter, eg maven -e jar
As my application uses Maven 1 and JDK 1.4, I'm running the eclipse:generate-classpath command.
I also tried adding the xerces and xercesImpl jars via Maven, and also added these two jars to the C:\Dev\Jdks\jdk1.8.0_31\jre\lib\ext directory. I'm still getting the same error. Can anyone please help? I'll be grateful, thanks in advance.

Hive startup -[ERROR] Terminal initialization failed; falling back to unsupported

I have downloaded Hive and modified HADOOP_HOME to
HADOOP_HOME=${bin}/../../usr/local/hadoop
My actual Hadoop path is
/usr/local/hadoop
In .bashrc I have added the below environment variables:
export HIVE_HOME=/usr/lib/hive/apache-hive-1.1.0-bin
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:/usr/local/Hadoop/lib/*:.
export CLASSPATH=$CLASSPATH:/usr/local/hive/lib/*:.
Then I tried starting Hive using bin/hive and got the below error:
Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.console.ConsoleReader.<init>(ConsoleReader.java:230)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
I had the same problem and got it working from this link:
https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started
Hive has upgraded to JLine2, but jline 0.9x still exists in the Hadoop lib.
So you should follow these steps (sketched as commands below):
1. Delete jline from the Hadoop lib directory (it's only pulled in transitively from ZooKeeper); that is, remove the jline-0.9.94.jar file at the path $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar.
2. export HADOOP_USER_CLASSPATH_FIRST=true
Here's the link to the JIRA ticket:
https://issues.apache.org/jira/browse/HIVE-8609
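Putting those two steps together, a rough command sketch (the jar path matches the one mentioned above; adjust it for your own layout):
$ rm $HADOOP_HOME/share/hadoop/yarn/lib/jline-0.9.94.jar
$ export HADOOP_USER_CLASSPATH_FIRST=true
$ hive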
I just set HADOOP_USER_CLASSPATH_FIRST=true and it worked for me for this issue.
Try deleting one of these files:
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
Then I think it will bind to only one, so the multiple-bindings warning will no longer appear.
I had the same problem with Cloudera CDH 5.4. Removing jline-0.9.94.jar from the yarn/lib folder worked for me.
With Hadoop version 2.4.1 and Hive 1.2.0, I had the same issue. After setting HADOOP_USER_CLASSPATH_FIRST=true in .bashrc, it worked like a charm!
Run locate jline.
The file jline-0.9.94.jar is located in 3 places; delete it from all 3 locations and do the necessary export:
$ export HADOOP_USER_CLASSPATH_FIRST=true
With a newer version of Hive (1.2.1), I just had to replace jline-2.12.jar with jline-2.13.jar in the installation's lib folder.
Check this link, it might help you: Facing issue while running hive from CLI
You should set this in your shell init file so Hadoop picks up your classpath first:
$ vi ~/.bashrc
export HADOOP_USER_CLASSPATH_FIRST=true
$ source ~/.bashrc
$ hive
I renamed
addJava "-Djline.terminal=jline.UnixTerminal"
to
addJava "-Djline.terminal=jline2.UnixTerminal"
in the "activator" file

Wordcount example error

Does anyone know how to solve this error?
I have read that I can edit FileUtil.setPermission and remove the lines with checkReturnValue, but I don't know how.
13/03/10 13:04:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/03/10 13:04:14 ERROR security.UserGroupInformation: PriviledgedActionException as:Nesreen.Mamdouh cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Nesreen.Mamdouh\mapred\staging\Nesreen.Mamdouh64097525\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Nesreen.Mamdouh\mapred\staging\Nesreen.Mamdouh64097525\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
at BigDataPackage.WordCount.main(WordCount.java:55)
I believe you are running this program on Windows. I believe there is a similar problem solved here.
Check some information on Windows Hadoop installation here.
You can use the workaround below to suppress this error (a command sketch follows the steps):
1. Under your Hadoop source directory, go to src -> core -> org -> apache -> hadoop -> fs.
2. Open FileUtil.java.
3. Comment out the code inside the checkReturnValue function (at line 685).
4. Re-create hadoop-core-1.0.4.jar and include it in the build path of your Eclipse project.
This should solve the problem.
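As a rough sketch of those steps, assuming a Hadoop 1.0.4 source tree at a placeholder location $HADOOP_SRC (the ant target name is an assumption; check the project's build.xml for the exact target):
$ cd $HADOOP_SRC/src/core/org/apache/hadoop/fs
$ $EDITOR FileUtil.java   # comment out the body of checkReturnValue (around line 685)
$ cd $HADOOP_SRC
$ ant jar                 # rebuild hadoop-core-1.0.4.jar, then add it to the Eclipse build path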
