HBase crashes after a few seconds - java

I have been following the HBase installation instructions as given on this page:
https://hbase.apache.org/book.html#quickstart
I am using HBase version 1.1.3 (stable release) and have configured it in standalone mode. I have installed OpenJDK 7.
When I try to start HBase, it crashes after a few seconds and I get the following error:
2016-01-29 17:37:04,136 ERROR [main] master.HMasterCommandLine: Master exiting
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:219)
at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:155)
at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:224)
at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:139)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:2355)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.util.Bytes$LexicographicalComparerHolder$UnsafeComparer
at org.apache.hadoop.hbase.util.Bytes.putInt(Bytes.java:899)
at org.apache.hadoop.hbase.KeyValue.createByteArray(KeyValue.java:1082)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:652)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:580)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:483)
at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:370)
at org.apache.hadoop.hbase.KeyValue.<clinit>(KeyValue.java:267)
at org.apache.hadoop.hbase.HConstants.<clinit>(HConstants.java:978)
at org.apache.hadoop.hbase.HTableDescriptor.<clinit>(HTableDescriptor.java:1488)
at org.apache.hadoop.hbase.util.FSTableDescriptors.<init>(FSTableDescriptors.java:124)
at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:570)
at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:365)
at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:307)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
... 7 more
Can anyone tell me the reason for this error? Please let me know if you need more info.

The exception causing this is a NoClassDefFoundError, which usually means something is missing from your classpath. In this case you're missing org.apache.hadoop.hbase.util.Bytes, which comes from hbase-common.jar. However, if one class is missing, there are probably others too, so this might mean you're starting HBase the wrong way. Have you tried the included start-up scripts?
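As a quick sanity check, here is a hypothetical standalone class (not part of HBase) that verifies, from the same JVM and classpath you launch HBase with, whether the class from the stack trace can be loaded and initialized; if it cannot, the classpath is incomplete or broken:

public class HBaseClasspathCheck {
    public static void main(String[] args) {
        // Print the classpath this JVM actually sees
        System.out.println(System.getProperty("java.class.path"));
        try {
            // Bytes ships in hbase-common.jar; loading it with initialization
            // also triggers the same static-init chain that fails in the trace
            Class.forName("org.apache.hadoop.hbase.util.Bytes");
            System.out.println("org.apache.hadoop.hbase.util.Bytes loaded fine");
        } catch (ClassNotFoundException | LinkageError e) {
            System.out.println("Bytes could not be loaded or initialized: " + e);
        }
    }
}

Run it with the classpath the standard scripts assemble (java -cp "$(hbase classpath)" HBaseClasspathCheck); if that works but your own launch command does not, the classpath difference is the culprit.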

Related

Hybris, ant clean all command fails with message "Error occurred during initialization of VM"

I'm getting this error while trying to build my Hybris project via ant clean all && ./hybrisserver.sh debug:
Error occurred during initialization of VM
java.lang.Error: Could not create SecurityManager
at java.lang.System.initPhase3(java.base@11.0.9.1/System.java:2065)
Caused by: java.lang.ClassNotFoundException: allow
at jdk.internal.loader.BuiltinClassLoader.loadClass(java.base@11.0.9.1/BuiltinClassLoader.java:581)
at jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(java.base@11.0.9.1/ClassLoaders.java:178)
at java.lang.ClassLoader.loadClass(java.base@11.0.9.1/ClassLoader.java:522)
at java.lang.Class.forName0(java.base@11.0.9.1/Native Method)
at java.lang.Class.forName(java.base@11.0.9.1/Class.java:398)
at java.lang.System.initPhase3(java.base@11.0.9.1/System.java:2050)
Maybe I made a typo somewhere (see Caused by: java.lang.ClassNotFoundException: allow), but I can't find anything, and my repo is up to date with master, which works correctly. Is there a way to find the problem?
Note: I am on Ubuntu 20.04.5 LTS.
Thank you
The solution was simple:
Somehow I had lost my setantenv.sh configuration; I had to re-run it in /platform.
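For context: on a JDK 11 build like the one in the trace, the VM treats the value of -Djava.security.manager as the name of a SecurityManager class to instantiate during startup, so a stray token such as allow surfaces as ClassNotFoundException: allow. A minimal standalone sketch of that lookup (a hypothetical demo class, not Hybris code):

public class SecurityManagerLookupDemo {
    public static void main(String[] args) throws Exception {
        // Simulates what the VM does with -Djava.security.manager=<value>:
        // the value is resolved as a class name. "allow" is not a class,
        // so this throws java.lang.ClassNotFoundException: allow.
        String value = "allow";
        Class<?> smClass = Class.forName(value);
        System.out.println(smClass);
    }
}

That is why a lost setantenv.sh configuration can break VM startup: the wrong JVM options end up on the java command line.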

Problem with Big Sur 11.0.1 and PC/SC library

I have a problem with the newest version of macOS (Big Sur 11.0.1) and the PC/SC library; before Big Sur, the program that uses the library worked fine, but after the update it isn't working anymore. I am using Java version 1.8.0_271.
In the code, I use the method TerminalFactory.getDefaultType() to get the default TerminalFactory type. Before the update I was receiving "PC/SC", but after the update I am receiving None.
If I try to force a connection to an instance with this line:
TerminalFactory factory = TerminalFactory.getInstance("PC/SC", null);
It will return the following error:
java.security.NoSuchAlgorithmException: Error constructing implementation (algorithm: PC/SC, provider: SunPCSC, class: sun.security.smartcardio.SunPCSC$Factory)
at java.security.Provider$Service.newInstance(Provider.java:1711)
at sun.security.jca.GetInstance.getInstance(GetInstance.java:243)
at sun.security.jca.GetInstance.getInstance(GetInstance.java:190)
at javax.smartcardio.TerminalFactory.getInstance(TerminalFactory.java:245)
at prueba.Prueba.isConnected(Prueba.java:165)
at prueba.Prueba.main(Prueba.java:63)
Caused by: java.lang.UnsupportedOperationException: PC/SC not available on this platform
at sun.security.smartcardio.PCSC.checkAvailable(PCSC.java:46)
at sun.security.smartcardio.SunPCSC$Factory.<init>(SunPCSC.java:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.security.Provider$Service.newInstance(Provider.java:1703)
... 5 more
Caused by: java.io.IOException: No PC/SC library found on this system
at sun.security.smartcardio.PlatformPCSC.getLibraryName(PlatformPCSC.java:122)
at sun.security.smartcardio.PlatformPCSC.access$000(PlatformPCSC.java:43)
at sun.security.smartcardio.PlatformPCSC$1.run(PlatformPCSC.java:64)
at sun.security.smartcardio.PlatformPCSC$1.run(PlatformPCSC.java:60)
at java.security.AccessController.doPrivileged(Native Method)
at sun.security.smartcardio.PlatformPCSC.<clinit>(PlatformPCSC.java:60)
at sun.security.smartcardio.SunPCSC$Factory.<init>(SunPCSC.java:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.security.Provider$Service.newInstance(Provider.java:1703)
at sun.security.jca.GetInstance.getInstance(GetInstance.java:243)
at sun.security.jca.GetInstance.getInstance(GetInstance.java:190)
at javax.smartcardio.TerminalFactory.getInstance(TerminalFactory.java:245)
at javax.smartcardio.TerminalFactory.<clinit>(TerminalFactory.java:106)
at prueba.Prueba.isConnected(Prueba.java:164)
... 1 more
entro isConnected--2
Exception in thread "main" java.lang.NullPointerException
at prueba.Prueba.isConnected(Prueba.java:173)
at prueba.Prueba.main(Prueba.java:63)
I found that Big Sur removes the PC/SC library and it is not possible to install it.
I don't know if there is someone with the same error or someone who has already fixed it.
Thanks for the help.
Because of changes in macOS Big Sur, the Java PC/SC implementation no longer works correctly:
https://bugs.openjdk.java.net/browse/JDK-8255877
The workaround is to set the system property:
sun.security.smartcardio.library=/System/Library/Frameworks/PCSC.framework/Versions/Current/PCSC
before trying to use TerminalFactory.
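A minimal sketch of applying that workaround in code (class name is illustrative; the property must be set before the smartcardio classes are first touched, because PlatformPCSC reads it in a static initializer, as the trace above shows):

public class PcscWorkaround {
    public static void main(String[] args) throws Exception {
        // Set the library path before TerminalFactory/PlatformPCSC initialize
        System.setProperty("sun.security.smartcardio.library",
            "/System/Library/Frameworks/PCSC.framework/Versions/Current/PCSC");
        javax.smartcardio.TerminalFactory factory =
            javax.smartcardio.TerminalFactory.getInstance("PC/SC", null);
        System.out.println(factory.terminals().list());
    }
}

Alternatively, pass it on the command line (java -Dsun.security.smartcardio.library=... YourApp) so it is guaranteed to be set before any class initialization runs.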

Cannot set up Apache Spark 2.1.1 on Windows 10

I have installed Apache Spark 2.1.1 on Windows 10, with Java 1.8 and Python 3.6 (Anaconda 4.3.1). I have also downloaded winutils.exe and set up environment variables for JAVA_HOME, HADOOP_HOME and SPARK_HOME, as well as updated the path variable. I have also run winutils.exe chmod -R 777 \tmp\hive. But I am getting the error below when running pyspark in the command prompt.
Please can someone help? Let me know if I missed any important detail.
Thanks in advance!
c:\Spark>bin\pyspark
Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
File "c:\Spark\python\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "c:\Spark\python\lib\py4j-0.10.4-src.zip\py4j\protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o22.sessionState.
: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
I still get errors when launching spark-shell, but it looks like Spark launches, since I get the "Welcome to Spark" banner. The error I get is:
C:\Spark>bin\spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar."
17/06/23 12:20:15 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/C:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar."
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
... 47 elided
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 58 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 63 more
Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 71 more
Caused by: java.lang.reflect.InvocationTargetException:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 76 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:305)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:639)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:561)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
... 84 more
14: error: not found: value spark
import spark.implicits._
^
14: error: not found: value spark
import spark.sql
^
Welcome to
The setup that worked for me is as follows (I didn't use winutils.exe):
Install pyspark and findspark using the "Anaconda Command Prompt":
pip3 install pyspark
and
pip3 install findspark
Since you have already downloaded the Spark distribution, unzip it and keep it on the C drive, i.e. "C:\spark-2.2.0-bin-hadoop2.7". Create a new environment variable "SPARK_HOME", set it to "C:\spark-2.2.0-bin-hadoop2.7\bin", and open the "path" variable in the system variables and add the same there as well.
Now open your command prompt, go from "C:\User*" to "C:\" by running cd.. twice, and run the following command:
set SPARK_HOME='spark-2.2.0-bin-hadoop2.7'
and you are good to go.
Now you just have to point findspark at the Spark location before importing pyspark in your Jupyter notebook. Use the following code:
import findspark
findspark.init(r'C:\spark-2.2.0-bin-hadoop2.7')  # raw string so backslashes are not treated as escapes
import pyspark
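Separately, if you do want to keep the winutils route: the UnsatisfiedLinkError on NativeIO$Windows in the trace usually indicates that the native hadoop.dll matching your winutils.exe is missing from the PATH, or was built for a different Hadoop version. A small standalone check (hypothetical class name) along the lines of what Hadoop's NativeCodeLoader does:

public class HadoopNativeCheck {
    public static void main(String[] args) {
        System.out.println("java.library.path=" + System.getProperty("java.library.path"));
        try {
            // Hadoop loads its native bindings the same way; on Windows this
            // searches java.library.path (derived from PATH) for hadoop.dll
            System.loadLibrary("hadoop");
            System.out.println("hadoop native library loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("hadoop native library missing or incompatible: " + e);
        }
    }
}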

Neo4j store upgrade error

I have created a large graph using Neo4j's 2.2M02 import tool.
Now I want to use the same database with the embedded version of 2.2RC01. I get the following error in Java when I initialize the database:
Exception in thread "main" java.lang.RuntimeException: Error starting org.neo4j.kernel.EmbeddedGraphDatabase, D:\Neo4j\data\test3.db
at org.neo4j.kernel.InternalAbstractGraphDatabase.run(InternalAbstractGraphDatabase.java:331)
at org.neo4j.kernel.EmbeddedGraphDatabase.<init>(EmbeddedGraphDatabase.java:59)
at org.neo4j.graphdb.factory.GraphDatabaseFactory.newDatabase(GraphDatabaseFactory.java:103)
at org.neo4j.graphdb.factory.GraphDatabaseFactory$1.newDatabase(GraphDatabaseFactory.java:90)
at org.neo4j.graphdb.factory.GraphDatabaseBuilder.newGraphDatabase(GraphDatabaseBuilder.java:176)
at RCNeo4j.initDB(RCNeo4j.java:419)
at RCNeo4j.main(RCNeo4j.java:46)
Caused by: org.neo4j.kernel.lifecycle.LifecycleException: Component 'org.neo4j.kernel.impl.transaction.state.DataSourceManager#2c7e0aa0' was successfully initialized, but failed to start. Please see attached cause exception.
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:513)
at org.neo4j.kernel.lifecycle.LifeSupport.start(LifeSupport.java:115)
at org.neo4j.kernel.InternalAbstractGraphDatabase.run(InternalAbstractGraphDatabase.java:326)
... 6 more
Caused by: org.neo4j.kernel.lifecycle.LifecycleException: Component 'org.neo4j.kernel.NeoStoreDataSource#37b86b14' was successfully initialized, but failed to start. Please see attached cause exception.
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:513)
at org.neo4j.kernel.lifecycle.LifeSupport.start(LifeSupport.java:115)
at org.neo4j.kernel.impl.transaction.state.DataSourceManager.start(DataSourceManager.java:117)
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:507)
... 8 more
Caused by: org.neo4j.kernel.impl.storemigration.StoreUpgrader$UnexpectedUpgradingStoreVersionException: 'neostore.nodestore.db' has a store version number that we cannot upgrade from. Expected 'v0.A.3' but file is version 'NodeStore v0.A.4'.
at org.neo4j.kernel.impl.storemigration.UpgradableDatabase.checkUpgradeable(UpgradableDatabase.java:88)
at org.neo4j.kernel.impl.storemigration.StoreMigrator.needsMigration(StoreMigrator.java:157)
at org.neo4j.kernel.impl.storemigration.StoreUpgrader.getParticipantsEagerToMigrate(StoreUpgrader.java:259)
at org.neo4j.kernel.impl.storemigration.StoreUpgrader.migrateIfNeeded(StoreUpgrader.java:134)
at org.neo4j.kernel.NeoStoreDataSource.upgradeStore(NeoStoreDataSource.java:562)
at org.neo4j.kernel.NeoStoreDataSource.start(NeoStoreDataSource.java:471)
at org.neo4j.kernel.lifecycle.LifeSupport$LifecycleInstance.start(LifeSupport.java:507)
... 11 more
message.log inside the database directory doesn't seem to show any exception either. I get the same error when I try to move from 2.1.7 to RC01.
Also, on a different note, I would like to know if it's possible to use the database generated with 2.2M02 in 2.1.7 (kind of like a downgrade), because I would prefer a more stable version for doing some analysis.
Neo4j does not provide an upgrade path between milestone releases, so there is no direct way to upgrade 2.2.0-M02 to 2.2.0-RC1. Upgrades are only supported from one stable version to another stable version. Downgrades are not supported at all in the product.
However, there is a potential way to do it: take Michael's store-utils (https://github.com/jexp/store-utils) and change the code to use classloader separation, so that the store you're reading from and the one you're writing to use separate classloaders with different Neo4j versions, as sketched below.
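A rough sketch of that classloader-separation idea (all jar paths are illustrative, and a real implementation would put each version's full dependency set on its loader):

import java.net.URL;
import java.net.URLClassLoader;

public class DualVersionLoaders {
    public static void main(String[] args) throws Exception {
        // One isolated loader per Neo4j version; the null parent keeps them
        // from sharing (and clashing over) classes on the application classpath.
        URLClassLoader readerSide = new URLClassLoader(
            new URL[] { new URL("file:/libs/neo4j-2.2.0-M02/neo4j-kernel.jar") }, null);
        URLClassLoader writerSide = new URLClassLoader(
            new URL[] { new URL("file:/libs/neo4j-2.2.0-RC1/neo4j-kernel.jar") }, null);
        Class<?> readerFactory = Class.forName(
            "org.neo4j.graphdb.factory.GraphDatabaseFactory", true, readerSide);
        Class<?> writerFactory = Class.forName(
            "org.neo4j.graphdb.factory.GraphDatabaseFactory", true, writerSide);
        // Same class name, two independent versions loaded side by side
        System.out.println(readerFactory.getClassLoader() != writerFactory.getClassLoader());
    }
}

All interaction between the two sides then has to go through reflection or through types from a shared parent loader, because classes loaded by the two loaders are not assignment-compatible with each other.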

Trouble running Mahout 0.9 text-processing examples

TL/DR: are the Mahout 0.9 examples compatible with Hadoop 2.4?
My problem:
I would like to classify a bunch of documents using Mahout 0.9. To do so, I'm following the example described here.
I'm on Windows and trying to go fully native (i.e., no Cygwin). I already have a local Hadoop 2.4.1 cluster.
I downloaded the Mahout sources and compiled them according to the wiki:
mvn "-Dhadoop2.version=2.4.1" -DskipTests clean install
I then tried to execute the example with the following command:
hadoop jar $Env:mahout_home/examples/target/mahout-examples-0.9-job.jar org.apache.mahout.driver.MahoutDriver seqdirectory -i Decomposition -o output
At first it all seems to work: I get logs showing the MapReduce job beginning to run. However, I quickly get the following errors:
Error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.mapreduce.lib.input.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:166)
at org.apache.hadoop.mapreduce.lib.input.CombineFileRecordReader.<init>(CombineFileRecordReader.java:126)
at org.apache.mahout.text.MultipleTextFileInputFormat.createRecordReader(MultipleTextFileInputFormat.java:43)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:492)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:735)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.mapreduce.lib.input.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:157)
... 10 more
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
at org.apache.mahout.text.WholeFileRecordReader.<init>(WholeFileRecordReader.java:59)
... 15 more
According to the various links I found, it seems to come from code intended for Hadoop 1.0.
Am I missing something, or are the Mahout-provided examples not suited for a Hadoop 2.4 cluster?
The problem is that TaskAttemptContext is an interface in Hadoop 2.4, while the job was compiled expecting a class (as in Hadoop 1.1.2); TaskAttemptContext was changed to an interface in Hadoop 2.0.
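You can confirm which flavor is on your cluster's classpath with a one-off check (a hypothetical standalone class, run with the Hadoop jars on the classpath):

public class TaskAttemptContextCheck {
    public static void main(String[] args) throws Exception {
        Class<?> c = Class.forName("org.apache.hadoop.mapreduce.TaskAttemptContext");
        // Prints isInterface=true on Hadoop 2.x, false on Hadoop 1.x; code
        // compiled against one flavor fails on the other at runtime with
        // IncompatibleClassChangeError, as in the trace above.
        System.out.println(c.getName() + " isInterface=" + c.isInterface());
    }
}

If it prints isInterface=true while the job jar was built against the Hadoop 1 API, you get exactly the error above.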
