WebSphere 8.5 Spring job with remote EJB XA transaction error - java

I'm working with Spring Batch on WebSphere 8.5. The batch job uses a remote EJB in an XA distributed transaction, and fails with:
org.springframework.transaction.TransactionSystemException: UOWManager transaction processing failed; nested exception is com.ibm.wsspi.uow.UOWException: javax.transaction.SystemException
at org.springframework.transaction.jta.WebSphereUowTransactionManager.execute(WebSphereUowTransactionManager.java:297) ~[spring-tx-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:127) ~[spring-tx-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:271) ~[spring-batch-core-3.0.3.RELEASE.jar:3.0.3.RELEASE]
at org.spr....
Caused by: com.ibm.wsspi.uow.UOWException: javax.transaction.SystemException
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.runUnderNewUOW(EmbeddableUOWManagerImpl.java:823) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.runUnderUOW(EmbeddableUOWManagerImpl.java:370) ~[com.ibm.ws.runtime.jar:na]
at org.springframework.transaction.jta.WebSphereUowTransactionManager.execute(WebSphereUowTransactionManager.java:290) ~[spring-tx-4.1.6.RELEASE.jar:4.1.6.RELEASE]
... 18 common frames omitted
Caused by: javax.transaction.SystemException: null
at com.ibm.tx.jta.impl.TransactionImpl.stage3CommitProcessing(TransactionImpl.java:1251) ~[com.ibm.tx.jta.jar:na]
at com.ibm.tx.jta.impl.TransactionImpl.processCommit(TransactionImpl.java:1042) ~[com.ibm.tx.jta.jar:na]
at com.ibm.tx.jta.impl.TransactionImpl.commit(TransactionImpl.java:963) ~[com.ibm.tx.jta.jar:na]
at com.ibm.ws.tx.jta.TranManagerImpl.commit(TranManagerImpl.java:439) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.tx.jta.impl.TranManagerSet.commit(TranManagerSet.java:191) ~[com.ibm.tx.jta.jar:na]
at com.ibm.ws.uow.UOWManagerImpl.uowCommit(UOWManagerImpl.java:807) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.uowEnd(EmbeddableUOWManagerImpl.java:881) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.UOWManagerImpl.uowEnd(UOWManagerImpl.java:782) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.runUnderNewUOW(EmbeddableUOWManagerImpl.java:818) ~[com.ibm.ws.runtime.jar:na]
Any ideas?
Thanks
Giancarlo

After Greycon's help, I resolved this by disabling security on the transaction service.
For WAS 8.5, the procedure is the same as that of WAS 7 described in the following link:
https://www-01.ibm.com/support/knowledgecenter/SSWSR9_11.4.0/com.ibm.mdmhs.bil_install.doc/t_disable_protocol_security.html
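For context, the job is driven by Spring's WebSphereUowTransactionManager, as the trace shows. A minimal sketch of that wiring, with the class and bean names assumed, in case it helps anyone reproducing this setup:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.jta.WebSphereUowTransactionManager;

@Configuration
public class JtaConfig {

    // Delegates transaction demarcation to WebSphere's UOWManager so that the
    // remote EJB call and the step's local resources enlist in one XA transaction.
    @Bean
    public PlatformTransactionManager transactionManager() {
        return new WebSphereUowTransactionManager();
    }
}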

Related

Drools ClassCastException when Tomcat upgraded from Tomcat 7 to Tomcat 9

I am facing issues with the Drools jars when upgrading Tomcat to Tomcat 9.
I had a Drools-based rule engine deployed on Tomcat 7, using Drools 7.28.0.Final.
When I upgraded to Tomcat 9, it started throwing the following exception:
Caused by: java.lang.RuntimeException: Unable to load dialect 'org.drools.compiler.rule.builder.dialect.java.JavaDialectConfiguration:java:null'
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.addDialect(KnowledgeBuilderConfigurationImpl.java:394)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.buildDialectConfigurationMap(KnowledgeBuilderConfigurationImpl.java:380)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.init(KnowledgeBuilderConfigurationImpl.java:235)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.init(KnowledgeBuilderConfigurationImpl.java:187)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.<init>(KnowledgeBuilderConfigurationImpl.java:177)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.<init>(KnowledgeBuilderImpl.java:293)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.<init>(KnowledgeBuilderImpl.java:229)
at org.drools.compiler.builder.impl.KnowledgeBuilderFactoryServiceImpl.newKnowledgeBuilder(KnowledgeBuilderFactoryServiceImpl.java:54)
at org.kie.internal.builder.KnowledgeBuilderFactory.newKnowledgeBuilder(KnowledgeBuilderFactory.java:48)
at com.rbccm.fic.mrd.ruleEngine.evaluation.util.evaluator.impl.ModelMappingFileRulesKnowldegeBase.createKnowledgeBase(ModelMappingFileRulesKnowldegeBase.java:44)
at com.rbccm.fic.mrd.ruleEngine.evaluation.util.ModelMappingEvaluatorUtil.evaluateAndCreateOutputs(ModelMappingEvaluatorUtil.java:76)
at com.rbccm.fic.productcode.service.evaluation.impl.processor.ModelMappingRequestProcessorImpl.getResponseForASource(ModelMappingRequestProcessorImpl.java:85)
at com.rbccm.fic.productcode.service.evaluation.impl.processor.ModelMappingRequestProcessorImpl.get(ModelMappingRequestProcessorImpl.java:62)
at com.rbccm.fic.productcode.service.evaluation.impl.processor.ModelMappingRequestProcessorImpl.get(ModelMappingRequestProcessorImpl.java:21)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.lang.ClassNotFoundException: org.drools.compiler.rule.builder.dialect.java.JavaDialectConfiguration
at org.drools.core.common.ProjectClassLoader.tryDefineType(ProjectClassLoader.java:197)
at org.drools.core.common.ProjectClassLoader.loadType(ProjectClassLoader.java:187)
at org.drools.core.common.ProjectClassLoader.loadClass(ProjectClassLoader.java:154)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.addDialect(KnowledgeBuilderConfigurationImpl.java:388)
I then upgraded Drools to the much higher version 7.49.0.Final, and now I am getting the same error for a different class:
Caused by: java.lang.RuntimeException: Error loading accumulate function for identifier sumBD. Class org.drools.core.base.accumulators.BigDecimalSumAccumulateFunction not found
at org.drools.compiler.rule.builder.util.AccumulateUtil.loadAccumulateFunction(AccumulateUtil.java:63)
at org.drools.compiler.rule.builder.util.AccumulateUtil.buildAccumulateFunctionsMap(AccumulateUtil.java:84)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.init(KnowledgeBuilderConfigurationImpl.java:245)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.init(KnowledgeBuilderConfigurationImpl.java:197)
at org.drools.compiler.builder.impl.KnowledgeBuilderConfigurationImpl.<init>(KnowledgeBuilderConfigurationImpl.java:187)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.<init>(KnowledgeBuilderImpl.java:291)
at org.drools.compiler.builder.impl.KnowledgeBuilderImpl.<init>(KnowledgeBuilderImpl.java:227)
at org.drools.compiler.builder.impl.KnowledgeBuilderFactoryServiceImpl.newKnowledgeBuilder(KnowledgeBuilderFactoryServiceImpl.java:54)
at org.kie.internal.builder.KnowledgeBuilderFactory.newKnowledgeBuilder(KnowledgeBuilderFactory.java:52)
at com.rbccm.fic.mrd.ruleEngine.evaluation.util.evaluator.impl.ModelMappingFileRulesKnowldegeBase.createKnowledgeBase(ModelMappingFileRulesKnowldegeBase.java:44)
at com.rbccm.fic.mrd.ruleEngine.evaluation.util.ModelMappingEvaluatorUtil.evaluateAndCreateOutputs(ModelMappingEvaluatorUtil.java:88)
at com.rbccm.fic.productcode.service.evaluation.impl.processor.ModelMappingRequestProcessorImpl.getResponseForASource(ModelMappingRequestProcessorImpl.java:85)
at com.rbccm.fic.productcode.service.evaluation.impl.processor.ModelMappingRequestProcessorImpl.get(ModelMappingRequestProcessorImpl.java:62)
at com.rbccm.fic.productcode.service.evaluation.impl.processor.ModelMappingRequestProcessorImpl.get(ModelMappingRequestProcessorImpl.java:21)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1582)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.lang.ClassNotFoundException: org.drools.core.base.accumulators.BigDecimalSumAccumulateFunction
at org.drools.reflective.classloader.ProjectClassLoader.tryDefineType(ProjectClassLoader.java:171)
at org.drools.reflective.classloader.ProjectClassLoader.loadType(ProjectClassLoader.java:161)
at org.drools.reflective.classloader.ProjectClassLoader.loadClass(ProjectClassLoader.java:128)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.drools.compiler.rule.builder.util.AccumulateUtil.loadAccumulateFunction(AccumulateUtil.java:60)
... 19 more
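For reference, the top of both traces corresponds to a knowledge builder being created roughly like this (a sketch; the wrapper class name is hypothetical). Running such a snippet in isolation against the deployed classpath is a quick way to reproduce the failure outside Tomcat:

import org.kie.internal.builder.KnowledgeBuilder;
import org.kie.internal.builder.KnowledgeBuilderFactory;

public class KnowledgeBuilderSmokeTest {
    public static void main(String[] args) {
        // newKnowledgeBuilder() constructs the builder configuration; its init()
        // loads dialects and accumulate functions by class name through Drools'
        // ProjectClassLoader, and those by-name lookups are what fail with
        // ClassNotFoundException in the traces above.
        KnowledgeBuilder builder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        System.out.println("KnowledgeBuilder created: " + builder);
    }
}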

Integrating Spark and Spring Boot

After fighting with logger dependencies, I finally managed to start the Spring Boot application with the usual "java -jar" command.
The application exposes a REST service that uses Spark to extract data from Oracle and MongoDB.
When I call this REST service I get this exception:
Driver stacktrace:
Job 0 failed: treeAggregate at MongoInferSchema.scala:80, took 0.233175 s
Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 172.16.212.49, executor 0): java.lang.ClassNotFoundException: com.mongodb.spark.rdd.partitioner.MongoPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1866)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1749)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2040)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:313)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:] with root cause
java.lang.ClassNotFoundException: com.mongodb.spark.rdd.partitioner.MongoPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1866)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1749)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2040)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:313)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Closing MongoClient: [127.0.0.1:27017]
The pom.xml contains the MongoDB dependency:
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
And the compiled jar contains the MongoDB libraries:
....
825351 Mon Jul 30 14:42:22 CEST 2018 BOOT-INF/lib/mongo-spark-connector_2.11-2.3.0.jar
1897919 Mon May 28 23:33:28 CEST 2018 BOOT-INF/lib/mongo-java-driver-3.6.4.jar
....
I tried adding the libraries to the classpath too, but with no result.
Does anyone have an idea how to get Spark to see the jars it needs?
EDIT:
Following @Ramdev's suggestion, I added this code:
// Register the connector and driver jars with the Spark context so that the
// executors can fetch them (paths point into the local Maven repository).
JavaSparkContext context = new JavaSparkContext(sparkSession.sparkContext());
context.addJar("/home/user/.m3/repository/org/mongodb/spark/mongo-spark-connector_2.11/2.3.0/mongo-spark-connector_2.11-2.3.0.jar");
context.addJar("/home/user/.m3/repository/org/mongodb/mongo-java-driver/3.8.1/mongo-java-driver-3.8.1.jar");
The result is that Spark now sees the jars, but they seem to conflict with the ones in the application jar:
2018-09-25 11:39:51 ERROR [dispatcherServlet]:182 - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoSuchMethodError: com.mongodb.client.MongoCollection.countDocuments(Lorg/bson/conversions/Bson;)J] with root cause
java.lang.NoSuchMethodError: com.mongodb.client.MongoCollection.countDocuments(Lorg/bson/conversions/Bson;)J
at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner$$anonfun$7.apply(MongoSamplePartitioner.scala:88)
at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner$$anonfun$7.apply(MongoSamplePartitioner.scala:88)
at com.mongodb.spark.MongoConnector$$anonfun$withCollectionDo$1.apply(MongoConnector.scala:186)
at com.mongodb.spark.MongoConnector$$anonfun$withCollectionDo$1.apply(MongoConnector.scala:184)
at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
at com.mongodb.spark.MongoConnector.withMongoClientDo(MongoConnector.scala:154)
at com.mongodb.spark.MongoConnector.withDatabaseDo(MongoConnector.scala:171)
at com.mongodb.spark.MongoConnector.withCollectionDo(MongoConnector.scala:184)
at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner.partitions(MongoSamplePartitioner.scala:88)
at com.mongodb.spark.rdd.partitioner.DefaultMongoPartitioner.partitions(DefaultMongoPartitioner.scala:34)
at com.mongodb.spark.rdd.MongoRDD.getPartitions(MongoRDD.scala:139)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:91)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$.prepareShuffleDependency(ShuffleExchangeExec.scala:318)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:91)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.doExecute(WholeStageCodegenExec.scala:363)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.joins.SortMergeJoinExec.inputRDDs(SortMergeJoinExec.scala:386)
at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:150)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:150)
at org.apache.spark.sql.execution.BaseLimitExec$class.inputRDDs(limit.scala:62)
at org.apache.spark.sql.execution.LocalLimitExec.inputRDDs(limit.scala:97)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:337)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3273)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3254)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3253)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2484)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2698)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:254)
at org.apache.spark.sql.Dataset.show(Dataset.scala:723)
at org.apache.spark.sql.Dataset.show(Dataset.scala:682)
at org.apache.spark.sql.Dataset.show(Dataset.scala:691)
at my.app.common.spark.SparkSpringBootHandler.querySpark(SparkSpringBootHandler.java:92)
SparkSpringBootHandler.java, lines 85-92:
String queryJ = "select count(s.idlocalizator) "
        + "from installationOnBoard i join storicpos s on s.installation_uuid = i.uuid ";
result += sdf.format(new Date()) + " - ***************** QUERY ***************** Start...\n";
Dataset<Long> counter = sparkSession.sql(queryJ).as(Encoders.LONG());
counter.show(); // show() is the action that triggers the failing Spark job (Dataset.show in the trace above)
I am not sure how you are integrating Spark jobs with Spring Boot; I am sharing my views based on what I did in one project.
We kept the Spark/Scala code in a separate project and built a fat jar with all dependencies using sbt assembly.
On the Spring Boot side, we called the Spark job through the Apache Livy API and tracked its status using the batch id that Livy generates.
Apache Livy is available for both Spark 1.x and Spark 2.x:
https://livy.incubator.apache.org/docs/latest/rest-api.html
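As an illustration (host, jar path, and class name are assumptions; 8998 is Livy's default port), submitting a batch from Java could look like this:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class LivySubmit {
    public static void main(String[] args) throws Exception {
        // POST /batches submits the fat jar as a Spark batch; Livy responds with
        // a JSON body containing the batch id, which can then be polled via
        // GET /batches/{id}/state to track the job.
        URL url = new URL("http://livy-host:8998/batches");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        String body = "{\"file\": \"hdfs:///jobs/spark-job-assembly.jar\","
                    + " \"className\": \"com.example.SparkJob\"}";
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Livy responded with HTTP " + conn.getResponseCode());
    }
}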
I hope this helps point you in the right direction.

Spring resource not found - 3.2

We have a project running on Tomcat 7 that uses Spring with the 3.0 XSD. I tried to change the XSD to Spring 3.2, since we have been using the Spring 3.2.4 jars and 'exclude-mapping' cannot be used with the Spring 3.0 XSD.
After this change the application stops working, though it throws no errors while starting/stopping the server or adding and removing the application.
After the change, any request to the application fails with a 404 error.
Here are the Jars we are using:
beandoc-0.7.0.jar
commons-fileupload-1.2.2.jar
commons-io-1.3.2.jar
commons-logging-1.2-javadoc.jar
commons-logging-1.2.jar
jstl.jar
log4j-1.2.14.jar
ojdbc6.jar
slf4j-api-1.5.6.jar
slf4j-log4j12-1.5.6.jar
spring-beandoc-0.7.1.jar
spring-beans-3.2.4.RELEASE.jar
spring-context-3.2.4.RELEASE.jar
spring-context-support-3.2.4.RELEASE.jar
spring-core-3.2.4.RELEASE.jar
spring-dao-2.0.8.jar
spring-expression-3.2.4.RELEASE.jar
spring-jdbc-3.2.4.RELEASE.jar
spring-web-3.2.4.RELEASE.jar
spring-webmvc-3.2.4.RELEASE.jar
spring-webmvc-portlet-3.2.4.RELEASE.jar
standard.jar
When we try to clean the server, it results in the below exception:
SEVERE: Exception starting Context with name [/COETOOLSACCL]
org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/COETOOLSACCL]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:153)
at org.apache.catalina.core.StandardContext.reload(StandardContext.java:4119)
at org.apache.catalina.loader.WebappLoader.backgroundProcess(WebappLoader.java:425)
at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1341)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1542)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1552)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1552)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1520)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.NoClassDefFoundError: org/springframework/web/context/support/ServletRequestHandledEvent
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Unknown Source)
at java.lang.Class.getDeclaredFields(Unknown Source)
at org.apache.catalina.util.Introspection.getDeclaredFields(Introspection.java:106)
at org.apache.catalina.startup.WebAnnotationSet.loadFieldsAnnotation(WebAnnotationSet.java:270)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationServletAnnotations(WebAnnotationSet.java:139)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationAnnotations(WebAnnotationSet.java:65)
at org.apache.catalina.startup.ContextConfig.applicationAnnotationsConfig(ContextConfig.java:416)
at org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:890)
at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:387)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5503)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
... 8 more
Caused by: java.lang.ClassNotFoundException: org.springframework.web.context.support.ServletRequestHandledEvent
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1891)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1734)
... 22 more
This exception occurs regardless of the XSD version, and with 3.0 the application runs normally even though we still get this error while cleaning the server.
Please help.

Getting AbstractMethodError while creating a connection to Oracle9 database with Tomcat 8 server

I am getting the below error when I start the Tomcat 8 server.
I am using ojdbc14.jar and I have tried ojdbc6.jar as well, but it is not working. This happens only with Tomcat 8; with Tomcat 7 no exception is thrown. The JRE version is 7.
Caused by: java.lang.AbstractMethodError: oracle.jdbc.driver.T4CConnection.isValid(I)Z
at org.apache.tomcat.dbcp.dbcp2.DelegatingConnection.isValid(DelegatingConnection.java:917)
at org.apache.tomcat.dbcp.dbcp2.PoolableConnection.validate(PoolableConnection.java:282)
at org.apache.tomcat.dbcp.dbcp2.PoolableConnectionFactory.validateConnection(PoolableConnectionFactory.java:356)
at org.apache.tomcat.dbcp.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2306)
at org.apache.tomcat.dbcp.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2289)
at org.apache.tomcat.dbcp.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2038)
at org.apache.tomcat.dbcp.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1532)
at org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider.getConnection(InjectedDataSourceConnectionProvider.java:70)
at org.hibernate.engine.jdbc.internal.JdbcServicesImpl$ConnectionProviderJdbcConnectionAccess.obtainConnection(JdbcServicesImpl.java:242)
at org.hibernate.engine.jdbc.internal.JdbcServicesImpl.configure(JdbcServicesImpl.java:117)
at org.hibernate.service.internal.StandardServiceRegistryImpl.configureService(StandardServiceRegistryImpl.java:75)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:159)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:131)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:78)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2283)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2279)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1748)
at org.hibernate.ejb.EntityManagerFactoryImpl.<init>(EntityManagerFactoryImpl.java:94)
at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:920)
at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:904)
at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:92)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:290)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1571)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1509)
... 21 more
The AbstractMethodError happens because Tomcat 8's DBCP 2 pool validates connections with the JDBC 4 method Connection.isValid(), which the old ojdbc14 driver (JDBC 3) does not implement. Use ojdbc7.jar with Java 7; it should work.
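If in doubt about which driver jar is actually being loaded, a small standalone check (connection details are placeholders) tells the two apart, since a JDBC 3 driver such as ojdbc14 fails on isValid() exactly as in the trace:

import java.sql.Connection;
import java.sql.DriverManager;

public class DriverValidityCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials; requires the Oracle driver jar on the classpath.
        try (Connection c = DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:ORCL", "user", "password")) {
            // isValid() is a JDBC 4 method: ojdbc6/ojdbc7 implement it, while a
            // JDBC 3 driver like ojdbc14 throws the same AbstractMethodError here.
            System.out.println("isValid(5) = " + c.isValid(5));
        }
    }
}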

Error in setting Hibernate Configuration

I am getting an error while trying to configure Hibernate for a PostgreSQL database.
Error log details:
org.hibernate.HibernateException: could not instantiate RegionFactory [org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge]
at org.hibernate.cfg.SettingsFactory.createRegionFactory(SettingsFactory.java:402)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:270)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2163)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2159)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1383)
at org.hibernate.console.ConsoleConfiguration$5.execute(ConsoleConfiguration.java:278)
at org.hibernate.console.execution.DefaultExecutionContext.execute(DefaultExecutionContext.java:63)
at org.hibernate.console.ConsoleConfiguration.execute(ConsoleConfiguration.java:107)
at org.hibernate.console.ConsoleConfiguration.buildSessionFactory(ConsoleConfiguration.java:273)
at org.hibernate.eclipse.console.workbench.LazySessionFactoryAdapter.getChildren(LazySessionFactoryAdapter.java:43)
at org.hibernate.eclipse.console.workbench.BasicWorkbenchAdapter.getChildren(BasicWorkbenchAdapter.java:100)
at org.hibernate.eclipse.console.workbench.BasicWorkbenchAdapter.fetchDeferredChildren(BasicWorkbenchAdapter.java:106)
at org.eclipse.ui.progress.DeferredTreeContentManager$1.run(DeferredTreeContentManager.java:235)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.hibernate.cfg.SettingsFactory.createRegionFactory(SettingsFactory.java:397)
... 13 more
Caused by: org.hibernate.cache.CacheException: could not instantiate CacheProvider [org.hibernate.cache.internal.NoCacheProvider]
at org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge.<init>(RegionFactoryCacheProviderBridge.java:66)
... 18 more
Caused by: java.lang.ClassNotFoundException: org.hibernate.cache.internal.NoCacheProvider cannot be found by org.hibernate.eclipse.libs_3.7.1.Final-v20131205-0918-B107
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:501)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:191)
at org.hibernate.util.ReflectHelper.classForName(ReflectHelper.java:192)
at org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge.<init>(RegionFactoryCacheProviderBridge.java:63)
... 18 more
UPDATE:
After the changes mentioned by @ConMan, I am getting a new error at the same place:
org.hibernate.HibernateException: Could not instantiate dialect class
at org.hibernate.dialect.resolver.DialectFactory.constructDialect(DialectFactory.java:163)
at org.hibernate.dialect.resolver.DialectFactory.buildDialect(DialectFactory.java:109)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:146)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2163)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2155)
at org.hibernate.console.ConsoleConfiguration$6.execute(ConsoleConfiguration.java:430)
at org.hibernate.console.execution.DefaultExecutionContext.execute(DefaultExecutionContext.java:63)
at org.hibernate.console.ConsoleConfiguration.execute(ConsoleConfiguration.java:107)
at org.hibernate.console.ConsoleConfiguration.getSettings(ConsoleConfiguration.java:428)
at org.hibernate.eclipse.console.workbench.LazyDatabaseSchemaWorkbenchAdapter$2.execute(LazyDatabaseSchemaWorkbenchAdapter.java:119)
at org.hibernate.console.execution.DefaultExecutionContext.execute(DefaultExecutionContext.java:63)
at org.hibernate.console.ConsoleConfiguration.execute(ConsoleConfiguration.java:107)
at org.hibernate.eclipse.console.workbench.LazyDatabaseSchemaWorkbenchAdapter.readDatabaseSchema(LazyDatabaseSchemaWorkbenchAdapter.java:115)
at org.hibernate.eclipse.console.workbench.LazyDatabaseSchemaWorkbenchAdapter.getChildren(LazyDatabaseSchemaWorkbenchAdapter.java:65)
at org.hibernate.eclipse.console.workbench.BasicWorkbenchAdapter.fetchDeferredChildren(BasicWorkbenchAdapter.java:106)
at org.eclipse.ui.progress.DeferredTreeContentManager$1.run(DeferredTreeContentManager.java:235)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
Caused by: java.lang.ClassCastException: org.hibernate.dialect.PostgreSQLDialect cannot be cast to org.hibernate.dialect.Dialect
at org.hibernate.dialect.resolver.DialectFactory.constructDialect(DialectFactory.java:157)
... 16 more
The problem is here:
Caused by: java.lang.ClassNotFoundException: org.hibernate.cache.internal.NoCacheProvider cannot be found by org.hibernate.eclipse.libs_3.7.1.Final-v20131205-0918-B107
This basically means that the Hibernate implementation you are using (org.hibernate.eclipse.libs...) does not contain the NoCacheProvider class.
An implementation of NoCacheProvider can be found in the following dependencies:
hibernate-core - 3.6.0.Final, 3.5.0-Final, 3.3.0.SP1, 3.3.0.GA
com.springsource.org.hibernate - 3.3.2, 3.3.1, 3.2.6
hibernate-core - 3.6.10.Final-patched-play-1.2.5, 3.6.1.Final-patched-play-1.2, 3.5.6-Final-patched-play-1.1.1, 3.5.6-Final-patched-play-1.1
hibernate - 3.2.7.ga, 3.2.6.ga, 3.2.5.ga, 3.2.4.sp1, 3.2.4.ga,
3.2.3.ga, 3.2.2.ga, 3.2.1.ga, 3.2.0.ga, 3.2.0.cr3, 3.2.0.cr2, 3.2.0.cr1, 3.1.3, 3.1.2, 3.1.1, 3.1
hibernate - 3.1beta3, 3.1beta2, 3.1beta1, 3.0.5, 3.0.3
hibernate-all - beta3.SP15
Source: Grep Code
EDIT:
I have just seen that you are using hibernate-core 4.3.0.Final. It would appear that the NoCacheProvider class no longer exists in this version of Hibernate. The recommended alternative is to use the following class instead:
org.hibernate.cache.internal.NoCachingRegionFactory
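In a hibernate.cfg.xml-driven setup this is controlled by the hibernate.cache.region.factory_class property; set from code, it would look roughly like this (a sketch against Hibernate 4.3):

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateBootstrap {
    public static void main(String[] args) {
        // Replaces the removed NoCacheProvider with the no-op region factory
        // that Hibernate 4.3 ships for cache-less configurations.
        Configuration cfg = new Configuration()
                .setProperty("hibernate.cache.region.factory_class",
                             "org.hibernate.cache.internal.NoCachingRegionFactory")
                .configure(); // reads hibernate.cfg.xml from the classpath
        SessionFactory sessionFactory = cfg.buildSessionFactory();
        System.out.println("SessionFactory built: " + sessionFactory);
    }
}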
