Does anyone know how to solve this error?
I have read that I can edit FileUtil.setPermission and remove the lines that call checkReturnValue, but I don't know how.
13/03/10 13:04:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/03/10 13:04:14 ERROR security.UserGroupInformation: PriviledgedActionException as:Nesreen.Mamdouh cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Nesreen.Mamdouh\mapred\staging\Nesreen.Mamdouh64097525\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Nesreen.Mamdouh\mapred\staging\Nesreen.Mamdouh64097525\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
at BigDataPackage.WordCount.main(WordCount.java:55)
I believe you are running this program on Windows. A similar problem is solved here.
You can also check some information on installing Hadoop on Windows here.
You can use the workaround below to suppress this error:
Under your Hadoop directory, go to src -> core -> org -> apache -> hadoop -> fs.
Open FileUtil.java.
Comment out the code inside the checkReturnValue function at line 685 (a sketch of the resulting method is shown after these steps).
Recreate hadoop-core-1.0.4.jar and include it in the build path of your Eclipse project.
This should solve the problem.
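For reference, here is a sketch of what the patched method would look like in the Hadoop 1.0.x source. The body shown is reconstructed from memory of that codebase, so verify it against your own copy before rebuilding the jar:

// org/apache/hadoop/fs/FileUtil.java -- workaround: make checkReturnValue a no-op
// so a failed chmod on the local Windows filesystem no longer aborts job submission.
private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission
                                     ) throws IOException {
  // if (!rv) {
  //   throw new IOException("Failed to set permissions of path: " + p +
  //                         " to " +
  //                         String.format("%04o", permission.toShort()));
  // }
}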
Related
I am trying to configure Hadoop and format the NameNode using this command:
$ hdfs namenode -format
However, I keep getting this error. How can I fix it?
2017-06-20 12:22:25,792 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-06-20 12:22:28,825 WARN ipc.Client: Failed to connect to server: localhost/127.0.0.1:9000: try once and fail.
java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:681)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:777)
    at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:408)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1542)
    at org.apache.hadoop.ipc.Client.call(Client.java:1373)
    at org.apache.hadoop.ipc.Client.call(Client.java:1337)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:115)
    at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:812)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1638)
    at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1367)
    at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1364)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1379)
    at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:64)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:269)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1960)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
    at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:239)
    at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:222)
    at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:166)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:326)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:389)
ls: Call From ubuntu/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
ERROR: JAVA_HOME /opt/jdk1.8.0_91/ does not exist.
2017-06-20 12:22:25,792 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Apache Hadoop can integrate with an optional native library that contains extensions implementing deeper OS integration and performance enhancements for certain features. For more details, refer to the documentation page Native Libraries Guide.
In many cases for client-side usage or developer setups, the native library extensions are optional. However, the log warning can be a nuisance. If you'd like to get it out of the way, there are a few options:
Some Apache Hadoop releases include a pre-built native library, and you could potentially use that. However, the bundled library is built for whatever OS/architecture ran the Apache Hadoop release build, so there is no guarantee that it will match your own OS/architecture. Do not attempt to mix and match different versions of Hadoop code and native code, or you may see unusual linkage errors.
Build the native library yourself. This guarantees the build matches your actual runtime OS/architecture. In addition to the documentation page linked above, there are more specific build instructions in the BUILDING.txt file (a typical invocation is shown after these options). Again, it's important to match the version of the Hadoop code and native code.
If you use a commercial vendor's distro, then they likely have taken care of guaranteeing the native library is already set up and deployed correctly.
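For the build-it-yourself option, the usual invocation from the top of the Hadoop source tree is the following (this is the command documented in BUILDING.txt; it assumes Maven and the native toolchain described there are installed):
$ mvn package -Pdist,native -DskipTests -Dtar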
If all else fails, you can stifle the warning by adding the following to log4j.properties:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
org.apache.hadoop.fs.FsShell.main(FsShell.java:389) ls: Call From ubuntu/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
This is a different error, unrelated to the native code loader warning. It indicates that the client failed to connect to the NameNode while executing the command. This can be caused by a lack of network connectivity, or simply because the NameNode daemon is not running. The wiki page linked in the error message has more details on troubleshooting this.
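A quick way to check, assuming a typical single-node setup (script locations vary slightly between Hadoop versions):
$ jps            # the output should list a NameNode process
$ start-dfs.sh   # if it does not, start the HDFS daemons (found under sbin/ in Hadoop 2.x and later)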
ERROR: JAVA_HOME /opt/jdk1.8.0_91/ does not exist.
I'm not sure exactly where this message came from, but it's self-explanatory. Either check that you have Java deployed at that path, or change JAVA_HOME to the correct path.
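For example, assuming a standard tarball install, you would set this in hadoop-env.sh (the file lives under etc/hadoop/ on recent releases, conf/ on older ones):
export JAVA_HOME=/opt/jdk1.8.0_91   # must point at a JDK directory that actually exists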
I am getting
Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey
when running the command:
hadoop jar camus-etl-kafka-0.1.0-SNAPSHOT.jar com.linkedin.camus.etl.kafka.CamusJob -P camus.properties
These are the exceptions I get:
2016-04-27 11:34:04.622 java[13567:351959] Unable to load realm mapping info from SCDynamicStore
[NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: com/linkedin/camus/etl/IEtlKey
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:252)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:235)
at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:691)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:646)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.linkedin.camus.etl.IEtlKey
I have included camus-example-0.1.0-SNAPSHOT-shaded.jar in the classpath.
Please let me know if I am missing something.
Thanks in advance,
Soumyajit
You should try to include camus-api, which you can find on LinkedIn's previous generation Kafka to HDFS pipeline page, since the missing class is contained in that package, as you can see here.
Pay attention to other transitive dependencies that may be required by Camus.
In addition, to be sure that the classes will be found on the classpath when you use hadoop jar from the command line, you can add the -libjars command line option, as described in Using the libjars option with Hadoop:
$ export LIBJARS=/path/jar1,/path/jar2
$ hadoop jar my-example.jar com.example.MyTool -libjars ${LIBJARS} -mytoolopt value
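Note that -libjars ships the jars to the cluster-side tasks; the submitting client may also need them at launch time. A common companion step (worth verifying for your Hadoop version) is to put the same jars on the client classpath:
$ export HADOOP_CLASSPATH=/path/jar1:/path/jar2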
It could be useful to know that Camus is going to be superseded by Gobblin:
Camus is being phased out and replaced by Gobblin. For those using or interested in Camus, we suggest taking a look at Gobblin.
For instructions on migrating from Camus to Gobblin, please take a look at Camus Gobblin Migration.
Hi, I have a problem with the JPL interface. I want to connect JPL to SWI-Prolog (installed with MacPorts) from Eclipse. I have jpl.jar, and I tried to import the jar file into Eclipse via the build path, but I got this error: "no jpl in java.library.path".
So I copied libjpl.dylib into /opt/local/lib/swipl-7.1.29/bin/, and when I execute the code I get this error: "Exception in thread "main" java.lang.UnsatisfiedLinkError: /opt/local/lib/swipl-7.1.29/bin/libjpl.dylib: dlopen(/opt/local/lib/swipl-7.1.29/bin/libjpl.dylib, 1): Library not loaded: /Users/janw/stable/lib/swipl/lib/x86_64-darwin13.0.0/libswipl.dylib
Referenced from: /opt/local/lib/swipl-7.1.29/bin/libjpl.dylib
Reason: image not found"
After an annoying waste of time I found the solution to this problem.
First of all, it is completely necessary to install SWI-Prolog via MacPorts. If not, as in my case, when you point java.library.path at the app bundle, e.g.
-Djava.library.path=/Users/rivax/Applications/SWI-Prolog.app/Contents/swipl/lib/x86_64-darwin13.0.0
this exception will appear:
Exception in thread "main" java.lang.UnsatisfiedLinkError: /Users/rivax/Applications/SWI-Prolog.app/Contents/swipl/lib/x86_64-darwin13.0.0/libjpl.dylib: dlopen(/Users/rivax/Applications/SWI-Prolog.app/Contents/swipl/lib/x86_64-darwin13.0.0/libjpl.dylib, 1): Library not loaded: /Users/janw/stable/lib/swipl/lib/x86_64-darwin13.0.0/libswipl.dylib
Referenced from: /Users/rivax/Applications/SWI-Prolog.app/Contents/swipl/lib/x86_64-darwin13.0.0/libjpl.dylib
Reason: image not found
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1965)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1880)
at java.lang.Runtime.loadLibrary0(Runtime.java:849)
at java.lang.System.loadLibrary(System.java:1088)
at jpl.JPL.loadNativeLibrary(JPL.java:100)
at jpl.fli.Prolog.<clinit>(Prolog.java:85)
at jpl.Query.open(Query.java:286)
at jpl.Util.textToTerm(Util.java:162)
at jpl.Query.<init>(Query.java:198)
at consultasProlog.Consultas.consultaFicheroProlog(Consultas.java:19)
at utilidades.RellenarModelo.ejecutarArchivo(RellenarModelo.java:30)
at javaprolog.JavaProlog.main(JavaProlog.java:30)
Java Result: 1
So follow these steps.
Run port install swi-prolog in the terminal. If you don't have MacPorts installed yet, "command not found" will appear; in that case, go to https://www.macports.org/install.php and install MacPorts.
Navigate to the path of the MacPorts SWI-Prolog installation, which for me is
/opt/local/lib/swipl-6.6.6/lib/x86_64-darwin14.0.0
Copy this path and set it as java.library.path in the Java VM options: -Djava.library.path=/opt/local/lib/swipl-6.6.6/lib/x86_64-darwin14.0.0 (an example launch command is shown below).
Now your .pl files will be able to execute with jpl.jar, and the queries will run.
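For illustration, a full launch command would look something like this. The main class javaprolog.JavaProlog is taken from the stack trace above; adjust the classpath and paths to your own project:
$ java -Djava.library.path=/opt/local/lib/swipl-6.6.6/lib/x86_64-darwin14.0.0 -cp .:jpl.jar javaprolog.JavaProlog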
Hope this helps you and every other person who runs into this hellish problem.
Cheers, Frank.
I'm using Windows 7, JDK 1.8.0 (64-bit), JRE 8 (64-bit), Unity Pro 4.3.4f1 and Android SDK 22.6.1.
All of this software is up to date.
Everything works fine, but when I use the StartApp SDK, this error appears. No extra plugin is used, only the SDK provided by StartApp.
Error building Player: CommandInvokationFailure: Unable to convert classes into dex format. See the Console for details.
C:\Program Files\Java\jdk1.8.0\bin\java.exe -Xmx1024M -Dcom.android.sdkmanager.toolsdir="D:/android-sdks\tools" -Dfile.encoding=UTF8 -jar "C:/Program Files (x86)/Unity/Editor/Data/BuildTargetTools/AndroidPlayer\sdktools.jar" -
stderr[
UNEXPECTED TOP-LEVEL EXCEPTION:
java.lang.IllegalArgumentException: already added: Lcom/unity3d/player/a$1;
at com.android.dx.dex.file.ClassDefsSection.add(ClassDefsSection.java:122)
at com.android.dx.dex.file.DexFile.add(DexFile.java:161)
at com.android.dx.command.dexer.Main.processClass(Main.java:685)
at com.android.dx.command.dexer.Main.processFileBytes(Main.java:634)
at com.android.dx.command.dexer.Main.access$600(Main.java:78)
at com.android.dx.command.dexer.Main$1.processFileBytes(Main.java:572)
at com.android.dx.cf.direct.ClassPathOpener.processArchive(ClassPathOpener.java:284)
at com.android.dx.cf.direct.ClassPathOpener.processOne(ClassPathOpener.java:166)
at com.android.dx.cf.direct.ClassPathOpener.processDirectory(ClassPathOpener.java:229)
at com.android.dx.cf.direct.ClassPathOpener.processOne(ClassPathOpener.java:158)
at com.android.dx.cf.direct.ClassPathOpener.process(ClassPathOpener.java:144)
at com.android.dx.command.dexer.Main.processOne(Main.java:596)
at com.android.dx.command.dexer.Main.processAllFiles(Main.java:498)
at com.android.dx.command.dexer.Main.runMonoDex(Main.java:264)
at com.android.dx.command.dexer.Main.run(Main.java:230)
at com.android.dx.command.dexer.Main.main(Main.java:199)
at com.android.dx.command.Main.main(Main.java:103)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at SDKMain.main(SDKMain.java:129)
1 error; aborting
]
I have already tried the JAVA_OPT and JAVA_HOME solutions, but still couldn't solve it.
I ran into this using IntelliJ 13.1.4 and JDK 1.6, but I would guess it is caused by the same thing in any other IDE: classes.jar is referenced/included more than once.
Check and make sure that the classes.jar you added to the project (from Unity\Editor\Data\PlaybackEngines\androidplayer\bin\classes.jar) isn't referenced/included more than once. If you go into File -> Project Structure:
There should only be one "classes" under libraries.
There should be NO reference to classes under Artifacts (Artifacts is used to create the plugin .jar file.)
Check out this related answer ->
https://stackoverflow.com/a/8437996/3464367
My issue was actually related to having duplicate files. Well, not really duplicates, but I had two AdMob SDK packages (different versions) that were conflicting. After deleting the older one, everything was fixed. Are you using any third-party SDKs or anything similar? If so, try removing them if possible and see if the issue persists. Try to find out which part of the code is causing this to happen. You could ultimately try to compile an empty project just for testing.
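If you want to hunt for the duplicate class rather than guess, here is one diagnostic sketch, assuming a Unix-like shell (e.g. Git Bash on Windows) and the standard Unity plugin folder: list every class packaged across your plugin jars and print the ones that appear more than once.
$ for f in Assets/Plugins/Android/*.jar; do unzip -l "$f" | awk '{print $4}'; done | grep '\.class$' | sort | uniq -d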
Today I solved a similar "Error building Player: CommandInvokationFailure: Unable to convert classes into dex format. See the Console for details." problem when adding a .jar file into Plugins\Android\.
I got different kind of errors:
Caused by: com.android.dx.cf.iface.ParseException: bad class file magic (cafebabe) or version (0034.0000 ...
Caused by: com.android.dx.cf.iface.ParseException: class name (ht/stringing/test) does not match path (src/ht/stringing/test.class)
My final working solution was similar to this:
Code lives here:
verysimple\src\ht\stringing\test.java
I do:
> javac -classpath "c:\Android\sdk\platforms\android-15" ht\stringing\test.java
> jar cvf ..\dist\verysimple.jar ht\stringing\test.class
> copy ..\dist\verysimple.jar c:\ExUnityApp\Assets\Plugins\Android
Don't forget to check your 'Minimum API Level' in Player Settings. For me it was Android 4.0.3 ... (15).
Please go easy on me. I just started Linux and Hadoop at the same time. I have almost zero experience with Linux and am a complete beginner with Hadoop.
I downloaded the file hadoop-1.1.1-bin.tar.gz from here:
http://www.motorlogy.com/apache/hadoop/common/hadoop-1.1.1/
I was able to unpack it.
I am following a tutorial that tells me to run:
bin/hadoop jar hadoop-*-examples.jar
I am getting this error:
agordon#Ubuntu32:/hadoop/hadoop-1.1.1$ bin/hadoop jar hadoop-*-examples-1.0.3.jar
Exception in thread "main" java.io.IOException: Error opening job jar: hadoop-*-examples-1.0.3.jar
at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.io.FileNotFoundException: hadoop-*-examples-1.0.3.jar (No such file or directory)
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:214)
at java.util.zip.ZipFile.<init>(ZipFile.java:144)
at java.util.jar.JarFile.<init>(JarFile.java:152)
at java.util.jar.JarFile.<init>(JarFile.java:89)
at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
What am I doing wrong? Thank you for your guidance.
Instead of hadoop-*-examples.jar, use the full name of the jar file. The wildcard is expanded by your shell, and only if a matching file actually exists; since no file matching hadoop-*-examples-1.0.3.jar is present in your directory, the pattern was passed through literally and Hadoop tried to open a file with that exact name, hence the FileNotFoundException.
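For example, in the 1.1.1 release the examples jar should follow the hadoop-examples-<version>.jar naming, but it's worth confirming the exact file name with ls before running it. The wordcount program and the input/output paths below are just placeholder arguments; running the jar with no arguments prints the list of bundled example programs:
$ ls *.jar
$ bin/hadoop jar hadoop-examples-1.1.1.jar wordcount input output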