I have set up ISRA in my local WebSphere and am trying to upload a document to FileNet through ISRA. The ISRA connection is established fine, but I get the following exception while trying to upload a document through ISRAUtil.jar and ISRA.jar:
at com.citigroup.rel.citip8.pe.PEUtils.uploadDocument(PEUtils.java:434)
at com.citigroup.rel.citip8.pe.PEUtils.uploadDocumentNewImage(PEUtils.java:334)
at com.citigroup.rel.defaultmail.action.WorkItemDetailOpsAction.delete(WorkItemDetailOpsAction.java:418)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:276)
at org.apache.struts.actions.DispatchAction.execute(DispatchAction.java:196)
at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:421)
at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:226)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1164)
at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:415)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:595)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:668)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1147)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:722)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:449)
at com.ibm.ws.webcontainer.servlet.ServletWrapperImpl.handleRequest(ServletWrapperImpl.java:178)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1020)
Caused by:
com.citigroup.rel.adg.israutil.v1.ISRAUtilException: Error running AddDoc
at com.citigroup.rel.adg.israutil.v1.ISRAUtilDao.addDocument(ISRAUtilDao.java:271)
at com.citigroup.rel.adg.israutil.v1.ISUploadHelper.addDocument(ISUploadHelper.java:105)
at com.citigroup.rel.citip8.pe.PEUtils.uploadDocument(PEUtils.java:423)
... 36 more
Caused by:
javax.resource.ResourceException: : Error in getting folder attributes.
at com.filenet.is.ra.cci.FN_IS_Document_Interactions.addDoc(Unknown Source)
at com.filenet.is.ra.cci.FN_IS_CciInteraction.executeInteractions(Unknown Source)
at com.filenet.is.ra.cci.FN_IS_CciInteraction.execute(Unknown Source)
at com.citigroup.rel.adg.israutil.v1.ISRAUtilDao.addDocument(ISRAUtilDao.java:266)
Can someone please help me with this?
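For context, ISRA is driven through the standard J2EE Connector Architecture CCI API, which is what the FN_IS_CciInteraction frames in the trace correspond to. Below is a minimal sketch of that call pattern; the JNDI name "ISCF" is a hypothetical placeholder, and the ISRA-specific InteractionSpec and record layout for AddDoc (left as a comment) come from the ISRA documentation, not from this trace. The "Error in getting folder attributes" suggests the folder/index values supplied in that input record are where to look.
import javax.naming.InitialContext;
import javax.resource.cci.Connection;
import javax.resource.cci.ConnectionFactory;
import javax.resource.cci.Interaction;

public class IsraAddDocSketch {
    public static void main(String[] args) throws Exception {
        // "ISCF" is a hypothetical JNDI name for the deployed ISRA connection factory.
        ConnectionFactory factory = (ConnectionFactory) new InitialContext().lookup("ISCF");
        Connection connection = factory.getConnection();
        try {
            Interaction interaction = connection.createInteraction();
            // The real AddDoc call passes an ISRA InteractionSpec plus input/output
            // records carrying the document content and its folder/index attributes:
            // interaction.execute(addDocSpec, inputRecord, outputRecord);
            interaction.close();
        } finally {
            connection.close();
        }
    }
}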
I am trying to use the ALS API in PySpark for a recommendation model, but I get a java.lang.StackOverflowError. After looking it up, I found instructions on how to fix it by using checkpoints.
However, when I try to set the checkpoint directory, I get a Hadoop-not-found error.
sc.setCheckpointDir('checkpoint')
This is the error response:
Py4JJavaError: An error occurred while calling o83.setCheckpointDir.
: java.lang.RuntimeException: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:736)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:271)
at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:287)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:865)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:547)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:587)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:559)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:586)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:559)
at org.apache.hadoop.fs.ChecksumFileSystem.mkdirs(ChecksumFileSystem.java:705)
at org.apache.spark.SparkContext.$anonfun$setCheckpointDir$2(SparkContext.scala:2483)
at scala.Option.map(Option.scala:230)
at org.apache.spark.SparkContext.setCheckpointDir(SparkContext.scala:2480)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Unknown Source)
Caused by: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:548)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:569)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:592)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:689)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814)
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791)
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516)
... 21 more
I would appreciate a response on how to fix this. I am working in a Windows environment.
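For what it's worth, the Hadoop wiki page linked in the error describes exactly this failure: on Windows, Hadoop's local-filesystem code needs winutils.exe, so HADOOP_HOME (or hadoop.home.dir) must point to a directory whose bin folder contains it. A minimal PySpark sketch, assuming winutils.exe has been downloaded to the hypothetical location C:\hadoop\bin and that the SparkContext has not been created yet (the variable must be set before the JVM starts):
import os

# C:\hadoop is a hypothetical install location; its bin folder must contain winutils.exe.
os.environ["HADOOP_HOME"] = "C:\\hadoop"

from pyspark import SparkContext

sc = SparkContext.getOrCreate()
sc.setCheckpointDir('checkpoint')  # the failing call from the question
Setting HADOOP_HOME system-wide (System Properties > Environment Variables) achieves the same thing more permanently.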
I have a file named dog.jpg that resides in C:\Temp, so its full Windows path is C:\Temp\dog.jpg.
Every answer on this site suggests replacing the \ with a /, but none of the statements I tried seems to work:
Image image = new Image("C:\\Temp\\dog.jpg");
Image image = new Image("C://Temp//dog.jpg");
Image image = new Image("C:/Temp/dog.jpg");
(btw, it does work if I put dog.jpg in the current working directory and use:
Image image = new Image("dog.jpg");
)
I am getting the following exception report:
Exception in Application start method
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at sun.launcher.LauncherHelper$FXHelper.main(Unknown Source)
Caused by: java.lang.RuntimeException: Exception in Application start method
at com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:917)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication$155(LauncherImpl.java:182)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.IllegalArgumentException: Invalid URL: unknown protocol: c
at javafx.scene.image.Image.validateUrl(Image.java:1121)
at javafx.scene.image.Image.<init>(Image.java:620)
at Inclass_week7_session_1_4.start(Inclass_week7_session_1_4.java:21)
at com.sun.javafx.application.LauncherImpl.lambda$launchApplication1$162(LauncherImpl.java:863)
at com.sun.javafx.application.PlatformImpl.lambda$runAndWait$175(PlatformImpl.java:326)
at com.sun.javafx.application.PlatformImpl.lambda$null$173(PlatformImpl.java:295)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.javafx.application.PlatformImpl.lambda$runLater$174(PlatformImpl.java:294)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:95)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.lambda$null$148(WinApplication.java:191)
... 1 more
Caused by: java.net.MalformedURLException: unknown protocol: c
at java.net.URL.<init>(Unknown Source)
at java.net.URL.<init>(Unknown Source)
at java.net.URL.<init>(Unknown Source)
at javafx.scene.image.Image.validateUrl(Image.java:1115)
C:\Java\Tutorial\JavaFX 2>javac -version
javac 1.8.0_102
What am I doing wrong? And how can I get this Windows absolute path to work?
Thanks for your help.
The Image constructor requires a URL rather than a plain file path, so add the file: protocol to your URL:
Image img = new Image("file:///C:/Temp/dog.jpg");
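If you would rather not hand-write the URL, java.io.File can build a well-formed one from the Windows path; this is just an alternative sketch using the path from the question:
import java.io.File;
import javafx.scene.image.Image;

// Let File turn the Windows path into a proper file: URL.
File file = new File("C:/Temp/dog.jpg");
Image image = new Image(file.toURI().toString()); // e.g. file:/C:/Temp/dog.jpg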
We are using RSA for user authentication.
We are facing the same issue as described above; sharing the stack trace for your reference below.
Could not complete request: org.springframework.web.util.NestedServletException: Handler processing failed; nested exception is java.lang.Error: Problem loading Module.4
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [rt.jar:1.8.0_131]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) [rt.jar:1.8.0_131]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_131]
at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_131]
... 79 more
Caused by: java.lang.SecurityException: Toolkit not encapsulated by a jar.
at com.rsa.jcm.f.hz.a(Unknown Source) [jcmFIPS-6.1.3.3.jar:6.1]
at com.rsa.jcm.f.js.b(Unknown Source) [jcmFIPS-6.1.3.3.jar:6.1]
at com.rsa.crypto.jcm.ModuleLoader.a(Unknown Source) [jcmFIPS-6.1.3.3.jar:6.1]
at com.rsa.crypto.jcm.ModuleLoader.load(Unknown Source) [jcmFIPS-6.1.3.3.jar:6.1]
20:24:06,871 ERROR [com.utxfrmwk.util.ExceptionUtils] (http-CDCUDRIFDAPP1/10.40.18.113:3010-4) Exception raised: : java.lang.Error: Problem loading Module.4
at com.rsa.cryptoj.o.by.f(Unknown Source) [cryptojcommon-6.1.3.3.jar:6.1.3.3]
at com.rsa.cryptoj.o.by.c(Unknown Source) [cryptojcommon-6.1.3.3.jar:6.1.3.3]
at com.rsa.cryptoj.o.cf.a(Unknown Source) [cryptojcommon-6.1.3.3.jar:6.1.3.3]
at com.rsa.cryptoj.o.ce.b(Unknown Source) [cryptojcommon-6.1.3.3.jar:6.1.3.3]
Looking forward to your prompt support and cooperation.
Regards,
Ashish
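For what it's worth, "Toolkit not encapsulated by a jar" typically means the RSA FIPS crypto module is not being loaded from its original, intact jar, for example when jcmFIPS-*.jar has been unpacked or repackaged inside the application archive. A small diagnostic sketch using only standard reflection APIs can show where the module classes are actually coming from; the class name is taken from the stack trace above:
public class JcmLocationCheck {
    public static void main(String[] args) throws Exception {
        // If this prints a directory instead of a .jar path, the FIPS module has
        // been unpacked, which is what the "not encapsulated by a jar" check rejects.
        Class<?> moduleLoader = Class.forName("com.rsa.crypto.jcm.ModuleLoader");
        System.out.println(moduleLoader.getProtectionDomain().getCodeSource().getLocation());
    }
}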
I am trying to connect Pentaho to Hive, so that I can run Hive queries through Pentaho.
I have installed Pentaho 6.0 on my Windows 7 (Professional 64bit).
I have configured Hadoop on a bare-metal server. The details of the Hadoop system are as follows:
Apache Hadoop V 2.6
Hive v 1.1
YARN
I tried connecting from Pentaho using connection type: Hadoop-Hive 2, database name: default, and port number: 10000. I added the Hive JDBC jar file from this link: http://mvnrepository.com/artifact/org.apache.hive/hive-jdbc/1.1.0. When I try to connect, I get the following errors:
Error connecting to database [Hive_connect] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
org/apache/hive/service/cli/thrift/TCLIService$Iface
org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
org/apache/hive/service/cli/thrift/TCLIService$Iface
at org.pentaho.di.core.database.Database.normalConnect(Database.java:459)
at org.pentaho.di.core.database.Database.connect(Database.java:357)
at org.pentaho.di.core.database.Database.connect(Database.java:328)
at org.pentaho.di.core.database.Database.connect(Database.java:318)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2734)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:588)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:136)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:389)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:318)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:60)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:470)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:457)
at org.pentaho.di.ui.spoon.Spoon.newConnection(Spoon.java:8750)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1339)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7939)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9214)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:653)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
org/apache/hive/service/cli/thrift/TCLIService$Iface
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:574)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:443)
... 55 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hive/service/cli/thrift/TCLIService$Iface
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:554)
... 56 more
Caused by: java.lang.ClassNotFoundException: org.apache.hive.service.cli.thrift.TCLIService$Iface
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 60 more
There is a JDBC driver for Hive, and you can use it to connect Pentaho to your Hive storage. It should be as simple as registering a new database instance.
Have a look at this link: HiveClient
Get the JDBC Driver for Hive if it isn't available
Follow the Pentaho instructions to install the JDBC driver
Configure a new Hive Database Connection
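Once the driver and its dependencies are in place, a bare JDBC test outside of Spoon can confirm the setup. Note that the NoClassDefFoundError above is for TCLIService$Iface, which lives in the hive-service/libthrift jars rather than in hive-jdbc itself, so those jars (or a hive-jdbc-*-standalone.jar) must also be on the classpath. A minimal sketch, where "hiveserver" is a placeholder host and the port and database come from the question:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcTest {
    public static void main(String[] args) throws Exception {
        // Fails with the same NoClassDefFoundError if hive-service/libthrift are missing.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hiveserver:10000/default", "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}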
I am running a MapReduce program and encountered the error below.
14/04/22 07:44:02 INFO mapred.JobClient: Cleaning up the staging area cfs://XX.XXX.XXX.XXX/tmp/hadoop-cassandra/mapred/staging/psadmin/.staging/job_201404180932_0063
14/04/22 07:44:02 ERROR security.UserGroupInformation: PriviledgedActionException as:psadmin cause:java.io.IOException: Could not get input splits
Exception in thread "main" java.io.IOException: Could not get input splits
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.getSplits(AbstractColumnFamilyInputFormat.java:193)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:962)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:979)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at MultiOutMR.run(MultiOutMR.java:95)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at MultiOutMR.main(MultiOutMR.java:36)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.thrift.transport.TTransportException
at java.util.concurrent.FutureTask.report(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.getSplits(AbstractColumnFamilyInputFormat.java:189)
... 19 more
Caused by: java.lang.RuntimeException: org.apache.thrift.transport.TTransportException
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.getSubSplits(AbstractColumnFamilyInputFormat.java:304)
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.access$200(AbstractColumnFamilyInputFormat.java:60)
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat$SplitCallable.call(AbstractColumnFamilyInputFormat.java:226)
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat$SplitCallable.call(AbstractColumnFamilyInputFormat.java:211)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
at org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:129)
at org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:297)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:204)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at org.apache.cassandra.thrift.Cassandra$Client.recv_describe_splits_ex(Cassandra.java:1359)
at org.apache.cassandra.thrift.Cassandra$Client.describe_splits_ex(Cassandra.java:1343)
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.getSubSplits(AbstractColumnFamilyInputFormat.java:281)
... 7 more
Note:
Prerequisites used:
DataStax Enterprise (DSE 3.2.5) with Cassandra 1.2.15.1 and Hadoop 1.0.4.9
We have configured a data center with 4 nodes. The output of nodetool status is as follows:
XXXXXX#XXXXXXXXX:~$ nodetool status
Datacenter: XXXXXX
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
-- Address Load Owns Host ID Token Rack
UN XX.XXX.XXX.XXX 14.65 MB 25.0% XX.XXX.XXX.XXX vm01
UN XX.XXX.XXX.XXX 34.25 MB 25.0% XX.XXX.XXX.XXX vm01
UN XX.XXX.XXX.XXX 57.45 MB 25.0% XX.XXX.XXX.XXX vm01
UN XX.XXX.XXX.XXX 57.08 MB 25.0% XX.XXX.XXX.XXX vm01
Could anyone please provide help in resolving this issue? Thanks in advance.
You need to provide more information about how you set up the Hadoop job; it's more of a configuration issue. A TTransportException is more of a server-internal issue.
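For reference, the split computation that fails here depends on the Thrift connection settings in the job configuration, so those are the first things to double-check against the cluster. A rough sketch of the input-side setup for a DSE/Cassandra 1.2-era job follows; the keyspace, column family, and address are hypothetical placeholders:
import org.apache.cassandra.hadoop.ConfigHelper;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class CassandraInputSetup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "multi-out-mr");
        // These values must match the cluster, or getSplits fails with a TTransportException.
        ConfigHelper.setInputColumnFamily(job.getConfiguration(), "MyKeyspace", "MyColumnFamily");
        ConfigHelper.setInputInitialAddress(job.getConfiguration(), "XX.XXX.XXX.XXX"); // a reachable node's rpc_address
        ConfigHelper.setInputRpcPort(job.getConfiguration(), "9160"); // Thrift rpc_port from cassandra.yaml
        ConfigHelper.setInputPartitioner(job.getConfiguration(), "org.apache.cassandra.dht.Murmur3Partitioner"); // must match cassandra.yaml
    }
}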