Need to use Hadoop native library - Java

I am invoking a MapReduce job from my Java program.
Today, when I set the MapReduce job's input format to LzoTextInputFormat, the job fails:
Could not load native gpl library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1738)
at java.lang.Runtime.loadLibrary0(Runtime.java:823)
at java.lang.System.loadLibrary(System.java:1028)
at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:32)
at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:67)
at com.hadoop.mapreduce.LzoTextInputFormat.listStatus(LzoTextInputFormat.java:58)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:241)
at com.hadoop.mapreduce.LzoTextInputFormat.getSplits(LzoTextInputFormat.java:85)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:885)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
at company.Validation.run(Validation.java:99)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at company.mapreduceTest.main(mapreduceTest.java:18)
Apr 5, 2012 4:40:29 PM com.hadoop.compression.lzo.LzoCodec <clinit>
SEVERE: Cannot load native-lzo without native-hadoop
java.lang.IllegalArgumentException: Wrong FS: hdfs://D-SJC-00535164:9000/local/usecases/gbase014/outbound/seed_2012-03-12_06-34-39/1_1.lzo.index, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:357)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:648)
at com.hadoop.compression.lzo.LzoIndex.readIndex(LzoIndex.java:169)
at com.hadoop.mapreduce.LzoTextInputFormat.listStatus(LzoTextInputFormat.java:69)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:241)
at com.hadoop.mapreduce.LzoTextInputFormat.getSplits(LzoTextInputFormat.java:85)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:885)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
at company.Validation.run(Validation.java:99)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at company.stopTransfer.mapreduceTest.main(mapreduceTest.java:18)
Apr 5, 2012 4:40:29 PM company.Validation run
SEVERE: LinkExtractor: java.lang.IllegalArgumentException: Wrong FS: hdfs://D-SJC-00535164:9000/local/usecases/gbase014/outbound/seed_2012-03-12_06-34-39/1_1.lzo.index, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:357)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:648)
at com.hadoop.compression.lzo.LzoIndex.readIndex(LzoIndex.java:169)
at com.hadoop.mapreduce.LzoTextInputFormat.listStatus(LzoTextInputFormat.java:69)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:241)
at com.hadoop.mapreduce.LzoTextInputFormat.getSplits(LzoTextInputFormat.java:85)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:885)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
at company.Validation.run(Validation.java:99)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at company.stopTransfer.mapreduceTest.main(mapreduceTest.java:18)
But in lib/native there are some files with extensions like .a, .la, and .so.
I tried adding that directory to my PATH environment variable, but it still doesn't work.
Could anyone please give me a suggestion?
Thank you very much!

Your error relates to the shared library for LZO not being present in the Hadoop native library folder.
The code for GPLNativeCodeLoader is looking for a shared library called gplcompression. Java is actually looking for a file named libgplcompression.so. If this file doesn't exist in your lib/native/${arch} folder then you'll see this error.
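As a quick sanity check, the following standalone sketch mimics the System.loadLibrary call that GPLNativeCodeLoader makes (the class name here is just for illustration); running it on the submitting machine shows whether the JVM can resolve libgplcompression.so from java.library.path:

public class GplCompressionCheck {
    public static void main(String[] args) {
        // Show where the JVM will look for native libraries.
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        try {
            // On Linux this resolves to a file named libgplcompression.so
            // somewhere on java.library.path, just as GPLNativeCodeLoader does.
            System.loadLibrary("gplcompression");
            System.out.println("Loaded the native gpl library");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("Could not load the native gpl library: " + e.getMessage());
        }
    }
}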
In a terminal, navigate to your Hadoop base directory and execute the following to list the installed native libraries, then post the output back to your original question:
uname -a
find lib/native

If you are using Cloudera Hadoop, you can install LZO easily by following these instructions:
http://www.cloudera.com/content/cloudera/en/documentation/cloudera-impala/v1/v1-0-1/Installing-and-Using-Impala/ciiu_lzo.html

Related

HBase: calling new HTable() hangs

There is sample Java code for an HBase connectivity program, the well-known "HbaseTest" class sample, which has been available on the internet for a long time.
I compiled the code on my server and compilation was successful. When I run my Java class file, I can see that it hangs at this particular line: "HTable table = new HTable(conf, tableName);"
It prints the warning below while running:
Jun 18, 2015 12:16:14 PM org.apache.hadoop.util.NativeCodeLoader
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Jun 18, 2015 12:16:15 PM org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper
INFO: The identifier of this process is pid#servername
I identified that it is stuck at that particular line by adding print statements.
Please let me know what to do. I have checked that HBase is running properly.
Kindly share your thoughts and ideas.
#hive #hbase #hadoop
Thanks in advance, Sam
I had a similar problem before; it turned out to be network issues. Try setting retry and timeout parameters, e.g.:
hbase.client.retries.number=2
zookeeper.session.timeout=2000
zookeeper.recovery.retry=0
hbase.rpc.timeout=100
ipc.socket.timeout=100
hbase.client.pause=100
zookeeper.recovery.retry.intervalmill=100
timeout=100
You may need to modify your network settings according to the errors that are thrown.
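As a sketch of how those properties can be applied from the Java side (the class and table names are placeholders, and the values are only examples chosen so that failures surface quickly):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class HBaseConnectTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Fail fast instead of hanging; tune these values for your cluster.
        conf.setInt("hbase.client.retries.number", 2);
        conf.setInt("zookeeper.session.timeout", 2000);
        conf.setInt("zookeeper.recovery.retry", 0);
        conf.setInt("hbase.rpc.timeout", 100);
        conf.setInt("ipc.socket.timeout", 100);
        conf.setInt("hbase.client.pause", 100);

        // With these settings a connectivity problem surfaces as an exception
        // within seconds instead of an apparent hang at this line.
        HTable table = new HTable(conf, "testtable"); // "testtable" is a placeholder table name
        System.out.println("Connected");
        table.close();
    }
}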

Apache Derby - getting java.io.FileNotFoundException: derby.log (Access is denied) when creating a new database

I am new to the Apache Derby database.
When I try to create a new database using the following command, I get the problem below:
C:\>java org.apache.derby.tools.ij
ij version 10.10
ij> connect 'jdbc:derby:Mynewdb;create=true';
Mon Mar 03 20:17:32 IST 2014 Thread[main,5,main] java.io.FileNotFoundException: derby.log
(Access is denied)
----------------------------------------------------------------
Mon Mar 03 20:17:33 IST 2014:
Booting Derby version The Apache Software Foundation - Apache Derby - 10.10.1.1 - (1458268): instance a816c00e-0144-886a-02f2-000000b8d0b0
on database directory C:\Mynewdb with class loader sun.misc.Launcher$AppClassLoader#11b86e7
Loaded from file:/C:/db-derby-10.10.1.1-bin/db-derby-10.10.1.1-bin/lib/derby.jar
java.vendor=Sun Microsystems Inc.
java.runtime.version=1.6.0_23-b05
user.dir=C:\
os.name=Windows 7
os.arch=x86
os.version=6.1
derby.system.home=null
Database Class Loader started - derby.database.classpath=''
A file named derby.log will be created in the current working directory when you run ij (or attempt to use embedded Apache Derby in some other application). From the post, it appears you are executing this from C:\ and the user you are logged on as does not have write access to that directory: change to a directory where the user has permission to create a file and retry.
Note it is possible to suppress this log file (though I have not yet done this myself). See Getting rid of derby.log. However, suppressing the log file would just result in another failure in your case because the database will be created on the file system relative to the current directory. That is, an attempt to create the directory named Mynewdb in the current directory, C:\, would also fail for the same reason. It is possible to specify a path for the database to avoid creating in the current working directory:
ij> connect 'jdbc:derby:/tmp/test_db;create=true';
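If you are connecting from Java rather than ij, a minimal embedded-driver sketch along the same lines (the paths below are only examples, not from the original post) would be:

import java.sql.Connection;
import java.sql.DriverManager;

public class DerbyConnectTest {
    public static void main(String[] args) throws Exception {
        // Redirect derby.log to a writable location instead of the current directory
        // (the path is only an example; any directory you can write to will do).
        System.setProperty("derby.stream.error.file", "C:/Users/Public/derby.log");

        // Load the embedded driver explicitly (harmless on JDBC 4, required on older JREs).
        Class.forName("org.apache.derby.jdbc.EmbeddedDriver");

        // Use an absolute database path so nothing is created relative to a
        // possibly read-only working directory such as C:\.
        Connection conn = DriverManager.getConnection(
                "jdbc:derby:C:/Users/Public/Mynewdb;create=true");
        System.out.println("Connected: " + !conn.isClosed());
        conn.close();
    }
}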
user.dir=C:\
os.name=Windows 7
Windows 7 (and up?) doesn't let you write files to the root directory in most cases. You should cd to another directory before starting ij. e.g. cd \Users\YOUR_USER_NAME and you should be good to go.

What is the libmuxer library? I get this error whenever I run my server or application

If I run my program or server, I always get this error message. Can anyone tell me what it means and help me, please? I will be very thankful to you.
<Jul 29, 2013 3:01:55 AM ACT> <Error> <Socket> <BEA-000433> <Unable to load performance pack. Using Java I/O instead.
Please ensure that libmuxer library is in :'C:\j2sdk1.4.2_17\bin;C:\bea\weblogic81\bin;C:\bea\weblogic81\server\bin'
java.lang.UnsatisfiedLinkError: no muxer in java.library.path
java.lang.UnsatisfiedLinkError: no muxer in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1517)
at java.lang.Runtime.loadLibrary0(Runtime.java:788)
at java.lang.System.loadLibrary(System.java:834)
at weblogic.socket.PosixSocketMuxer.<init>(PosixSocketMuxer.java:30)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:274)
at java.lang.Class.newInstance0(Class.java:308)
at java.lang.Class.newInstance(Class.java:261)
at weblogic.socket.SocketMuxer.makeTheMuxer(SocketMuxer.java:82)
at weblogic.socket.SocketMuxer.getMuxer(SocketMuxer.java:49)
at weblogic.t3.srvr.ListenThread.initServerSocket(ListenThread.java:690)
at weblogic.t3.srvr.ListenThread.run(ListenThread.java:205)
"...this indicates that the Native Libraries are not properly being picked up the Weblogic server. This happens when the weblogic installed as 32 Bit on a 64 Bit Operating System or vice versa. In such scenario we need to explicitly specify the path to the Native Library."
Add the following to the setDomainEnv.sh
-Djava.library.path=/opt/bea/wlserver_10.3/server/native/solaris/sparc64/
Enable "Native IO" check box under each servers Tuning Tab.
Restart the servers.
http://weblogic.middlewarebase.com/2013/04/unable-to-load-performance-pack-using.html
I had the same problem, and thanks to sᴜʀᴇsʜ-ᴀᴛᴛᴀ for the hint! It looks like a 32-bit/64-bit issue. By the way, I'm using WebLogic 12 on 64-bit CentOS 5.
Note that libmuxer.so is provided with WebLogic for many architectures; in fact:
$ find $MW_HOME -name *muxer*
/application/weblogic/wlserver/server/native/macosx/libmuxer.jnilib
/application/weblogic/wlserver/server/native/linux/s390/libmuxer.so
/application/weblogic/wlserver/server/native/linux/ia64/libmuxer.so
/application/weblogic/wlserver/server/native/linux/x86_64/libmuxer.so
/application/weblogic/wlserver/server/native/linux/i686/libmuxer.so
/application/weblogic/wlserver/server/native/linux/s390x/libmuxer.so
I edited the file $MW_HOME/wlserver/common/bin/commEnv.sh, where I found
#JAVA_USE_64BIT, true if JVM uses 64 bit operations
JAVA_USE_64BIT=false
and updated with
#JAVA_USE_64BIT, true if JVM uses 64 bit operations
JAVA_USE_64BIT=true
This worked for me; the error you reported is gone.
Good luck!

Errors when running Mahout examples

I downloaded the latest version of the examples for chapter 09 of “Mahout in Action”. I can successfully run several examples, but three files fail: NewsKMeansClustering.java, ReutersToSparseVectors.java, and NewsFuzzyKMeansClustering.java. Running these three programs gives similar error messages:
Aug 3, 2011 2:03:54 PM org.apache.hadoop.metrics.jvm.JvmMetrics init
INFO: Initializing JVM Metrics with processName=JobTracker, sessionId=
Aug 3, 2011 2:03:54 PM org.apache.hadoop.mapred.JobClient configureCommandLineOptions
WARNING: Use GenericOptionsParser for parsing the arguments. Applications should
implement Tool for the same.
Aug 3, 2011 2:03:54 PM org.apache.hadoop.mapred.JobClient configureCommandLineOptions
WARNING: No job jar file set. User classes may not be found. See JobConf(Class) or
JobConf#setJar(String).
Exception in thread "main" org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/home/user1/workspaceMahout1/recommender/inputDir
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:224)
at org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:55)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:241)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:885)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
at org.apache.mahout.vectorizer.DocumentProcessor.tokenizeDocuments(DocumentProcessor.java:93)
at mia.clustering.ch09.NewsKMeansClustering.main(NewsKMeansClustering.java:54)
Regarding the above messages, I do not quite understand what those two warnings mean. Moreover, it looks like the “input path” should have been created; how can I create this type of input? Thanks.
You can ignore the warnings. The error is that the input directory you have specified does not exist. Does it exist? What is your command line?
I ran into a similar mismatch. The MiA files at https://github.com/tdunning/MiA have some cases where a .csv file is left in the same directory as the Java source. For example, https://github.com/tdunning/MiA/tree/master/src/main/java/mia/recommender/ch02 ... however, via Eclipse, loading it using DataModel model = new FileDataModel(new File("intro.csv")); doesn't find it.
Adding
System.out.println("CWD: "+System.getProperty("user.dir"));
...will reveal where Eclipse is looking (in my case, a couple levels up the filetree, but this might vary depending on how exactly you've set things up).
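For completeness, here is a small sketch along those lines (the relative path to intro.csv is only an example of where it might live in your checkout); it prints the working directory and then builds an absolute path so the lookup no longer depends on where Eclipse launches the JVM:

import java.io.File;
import org.apache.mahout.cf.taste.impl.model.file.FileDataModel;
import org.apache.mahout.cf.taste.model.DataModel;

public class IntroCsvPathCheck {
    public static void main(String[] args) throws Exception {
        // Print the working directory the JVM (e.g. Eclipse) is actually using.
        System.out.println("CWD: " + System.getProperty("user.dir"));

        // Build an absolute path to the CSV so the lookup no longer depends on
        // the working directory; the sub-path below is only an example location.
        File csv = new File(System.getProperty("user.dir"),
                "src/main/java/mia/recommender/ch02/intro.csv");
        System.out.println("Looking for: " + csv.getAbsolutePath() + " exists=" + csv.exists());

        DataModel model = new FileDataModel(csv);
        System.out.println("Users in model: " + model.getNumUsers());
    }
}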

What am I doing wrong when trying to run this Java test command for LWJGL?

I'm attempting to use the lwjgl library and I'm starting from scratch on a new Windows 7 install.
I downloaded the latest JDK 6 from the Oracle website. After installing it, I found that commands like "java" or "javac" weren't being recognized from a Windows cmd prompt, so I edited my path variable and appended the JDK's bin folder to it.
Now the java commands work.
So, I downloaded the latest LWJGL, extracted it, and read the installation instructions on their website:
Download the distribution. Unpack the archive; file contents (in sub folders) should include (amongst other things):
lwjgl.dll lwjglaudio.dll lwjgl.jar lwjgl_util.jar lwjgl_test.jar
Test LWJGL by opening a command prompt and navigating to the folder where the archive was extracted. Once navigated, issue the following command (all in one line, space before each -option):
java -cp .;res;jar\lwjgl.jar;jar\lwjgl_test.jar;jar\lwjgl_util.jar;jar\lwjgl_fmod3.jar;jar\lwjgl_devil.jar;jar\jinput.jar;-Djava.library.path=native\windows org.lwjgl.test.WindowCreationTest
A window should appear and you should see the following output:
Found display modes 240, 320, WindowCreationTest Display created Moving to 100, 100 Window created 600, 800, Game
So, I extracted it and navigated to the extracted folder in a cmd prompt.
I then executed the test command specified above and got the following error:
C:\Users\Nestor\Downloads\lwjgl-2.6\lwjgl-2.6>java -cp .;res;jar\lwjgl.jar;jar\lwjgl_test.jar;jar\lwjgl_util.jar;jar\lwjgl_fmod3.jar;jar\lwjgl_devil.jar;jar\jinput.jar;-Djava.library.path=native\windows org.lwjgl.test.WindowCreationTest
The following keys are available:
ESCAPE: Exit test
ARROW Keys: Move window when in non-fullscreen mode
L: List selectable display modes
0-8: Selection of display modes
F: Toggle fullscreen
SHIFT-F: Toggle fullscreen with Display.destroy()/create() cycle
Exception in thread "main" java.lang.UnsatisfiedLinkError: no lwjgl in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
at java.lang.Runtime.loadLibrary0(Runtime.java:823)
at java.lang.System.loadLibrary(System.java:1028)
at org.lwjgl.Sys$1.run(Sys.java:73)
at java.security.AccessController.doPrivileged(Native Method)
at org.lwjgl.Sys.doLoadLibrary(Sys.java:66)
at org.lwjgl.Sys.loadLibrary(Sys.java:82)
at org.lwjgl.Sys.<clinit>(Sys.java:99)
at org.lwjgl.opengl.Display.<clinit>(Display.java:130)
at org.lwjgl.test.WindowCreationTest.initialize(WindowCreationTest.java:82)
at org.lwjgl.test.WindowCreationTest.main(WindowCreationTest.java:286)
C:\Users\Nestor\Downloads\lwjgl-2.6\lwjgl-2.6>
Why am I getting that error? I don't understand why there should be linking errors. The command that I attempted to execute clearly spells out the path to the native DLLs it needs:
C:\Users\Nestor\Downloads\lwjgl-2.6\lwjgl-2.6>java -cp .;res;jar\lwjgl.jar;jar\lwjgl_test.jar;jar\lwjgl_util.jar;jar\lwjgl_fmod3.jar;jar\lwjgl_devil.jar;jar\jinput.jar;-Djava.library.path=native\windows org.lwjgl.test.WindowCreationTest
I've confirmed that the relative path "native\windows" contains those dependencies:
C:\Users\Nestor\Downloads\lwjgl-2.6\lwjgl-2.6\native\windows>dir
Volume in drive C has no label.
Volume Serial Number is 2061-75F6
Directory of C:\Users\Nestor\Downloads\lwjgl-2.6\lwjgl-2.6\native\windows
11/24/2010 12:35 AM .
11/24/2010 12:35 AM ..
10/18/2010 08:44 PM 31,232 jinput-dx8.dll
10/18/2010 08:44 PM 65,024 jinput-dx8_64.dll
10/18/2010 08:44 PM 29,696 jinput-raw.dll
10/18/2010 08:44 PM 62,464 jinput-raw_64.dll
10/18/2010 08:44 PM 197,120 lwjgl.dll
10/18/2010 08:44 PM 305,664 lwjgl64.dll
10/18/2010 08:44 PM 56,832 OpenAL32.dll
10/18/2010 08:44 PM 157,184 OpenAL64.dll
8 File(s) 905,216 bytes
2 Dir(s) 155,163,058,176 bytes free
Can anyone help point out what I'm doing wrong? Can anyone reproduce this by downloading the LWJGL library and attempting to run the test command given in the installation instructions?
It seems that you do not have a space between your classpath argument (-cp jar1.jar;jar2.jar) and your system property setting (-D..).
E.g. your classpath looks like this: -cp .;res;jar\lwjgl.jar;jar\lwjgl_test.jar;jar\lwr...;-Djava.library.path=native\windows. This way Java will interpret your native library path property setting as part of the classpath!
Just add a space between those arguments and you should be up and running, this is the corrected command (also tested on Windows 7):
java -cp jar\lwjgl.jar;jar\lwjgl_test.jar;jar\lwjgl_util.jar -Djava.library.path=native\windows org.lwjgl.test.WindowCreationTest
Note that I removed the unused jars from the classpath since you only want to run the WindowCreationTest example.
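As an alternative, if you launch the test from your own Java code (for example from an IDE), LWJGL 2 also reads the org.lwjgl.librarypath system property, so you can point it at the natives programmatically. A minimal sketch, assuming you run from the extracted lwjgl-2.6 folder from the question:

import java.io.File;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class WindowCreationCheck {
    public static void main(String[] args) throws Exception {
        // LWJGL 2 honours this property for locating its native libraries,
        // so you can set it in code instead of passing -Djava.library.path.
        // The relative path assumes the current directory is the extracted lwjgl-2.6 folder.
        System.setProperty("org.lwjgl.librarypath",
                new File("native/windows").getAbsolutePath());

        // If the natives cannot be found, an UnsatisfiedLinkError is raised
        // as soon as the Display class is initialised below.
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.create();
        System.out.println("Display created");
        Display.destroy();
    }
}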
Check again that the directory
C:\Users\Nestor\Downloads\lwjgl-2.6\lwjgl-2.6\native\windows
exists and contains lwjgl.dll and lwjglaudio.dll.
I believe that something is wrong with your installation, i.e. the directory does not exist or the files are not there.
Just throwing this out there, because I've had some issues related to this. Go to your Java/JRE/bin folder, right-click on Java, and go to Properties. Under Privilege Level, check the box to run as an administrator.
