Integrating Spark and Spring Boot - Java

After fighting with logger dependencies, I finally managed to start the Spring Boot application with the usual "java -jar" command.
The application exposes a REST service that uses Spark to extract data from Oracle and MongoDB.
When I call this REST service I get this exception:
Driver stacktrace:
Job 0 failed: treeAggregate at MongoInferSchema.scala:80, took 0.233175 s
Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 172.16.212.49, executor 0): java.lang.ClassNotFoundException: com.mongodb.spark.rdd.partitioner.MongoPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1866)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1749)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2040)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:313)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:] with root cause
java.lang.ClassNotFoundException: com.mongodb.spark.rdd.partitioner.MongoPartition
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1866)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1749)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2040)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:313)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Closing MongoClient: [127.0.0.1:27017]
The pom.xml contains the MongoDB dependencies:
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
And the compiled jar contains the MongoDB libraries:
....
825351 Mon Jul 30 14:42:22 CEST 2018 BOOT-INF/lib/mongo-spark-connector_2.11-2.3.0.jar
1897919 Mon May 28 23:33:28 CEST 2018 BOOT-INF/lib/mongo-java-driver-3.6.4.jar
....
I also tried adding the libraries to the classpath, but with no result.
Does anyone have an idea how to get Spark to see the jars it needs?
EDIT:
Following @Ramdev's suggestion, I added this code:
JavaSparkContext context = new JavaSparkContext(sparkSession.sparkContext());
context.addJar("/home/user/.m3/repository/org/mongodb/spark/mongo-spark-connector_2.11/2.3.0/mongo-spark-connector_2.11-2.3.0.jar");
context.addJar("/home/user/.m3/repository/org/mongodb/mongo-java-driver/3.8.1/mongo-java-driver-3.8.1.jar");
The result is that Spark now sees the jars, but they seem to conflict with the ones in the application jar:
2018-09-25 11:39:51 ERROR [dispatcherServlet]:182 - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoSuchMethodError: com.mongodb.client.MongoCollection.countDocuments(Lorg/bson/conversions/Bson;)J] with root cause
java.lang.NoSuchMethodError: com.mongodb.client.MongoCollection.countDocuments(Lorg/bson/conversions/Bson;)J
at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner$$anonfun$7.apply(MongoSamplePartitioner.scala:88)
at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner$$anonfun$7.apply(MongoSamplePartitioner.scala:88)
at com.mongodb.spark.MongoConnector$$anonfun$withCollectionDo$1.apply(MongoConnector.scala:186)
at com.mongodb.spark.MongoConnector$$anonfun$withCollectionDo$1.apply(MongoConnector.scala:184)
at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
at com.mongodb.spark.MongoConnector$$anonfun$withDatabaseDo$1.apply(MongoConnector.scala:171)
at com.mongodb.spark.MongoConnector.withMongoClientDo(MongoConnector.scala:154)
at com.mongodb.spark.MongoConnector.withDatabaseDo(MongoConnector.scala:171)
at com.mongodb.spark.MongoConnector.withCollectionDo(MongoConnector.scala:184)
at com.mongodb.spark.rdd.partitioner.MongoSamplePartitioner.partitions(MongoSamplePartitioner.scala:88)
at com.mongodb.spark.rdd.partitioner.DefaultMongoPartitioner.partitions(DefaultMongoPartitioner.scala:34)
at com.mongodb.spark.rdd.MongoRDD.getPartitions(MongoRDD.scala:139)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:91)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$.prepareShuffleDependency(ShuffleExchangeExec.scala:318)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:91)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.doExecute(WholeStageCodegenExec.scala:363)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.joins.SortMergeJoinExec.inputRDDs(SortMergeJoinExec.scala:386)
at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:150)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:371)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:150)
at org.apache.spark.sql.execution.BaseLimitExec$class.inputRDDs(limit.scala:62)
at org.apache.spark.sql.execution.LocalLimitExec.inputRDDs(limit.scala:97)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:605)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:337)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3273)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3254)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3253)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2484)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2698)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:254)
at org.apache.spark.sql.Dataset.show(Dataset.scala:723)
at org.apache.spark.sql.Dataset.show(Dataset.scala:682)
at org.apache.spark.sql.Dataset.show(Dataset.scala:691)
at my.app.common.spark.SparkSpringBootHandler.querySpark(SparkSpringBootHandler.java:92)
Lines 85-92 of SparkSpringBootHandler.java:
String queryJ = "select count(s.idlocalizator) "
        + "from installationOnBoard i join storicpos s on s.installation_uuid = i.uuid ";
result += sdf.format(new Date()) + " - ***************** QUERY ***************** Start...\n";
Dataset<Long> counter = sparkSession.sql(queryJ).as(Encoders.LONG());
counter.show();

I am not sure how you are integrating Spark jobs with Spring Boot, so I am sharing my views based on what I did in one project.
We had a separate Spark/Scala project and built a fat jar with all dependencies using sbt assembly.
On the Spring Boot side, we called the Spark job through the Apache Livy API and tracked the job's status using the Livy-generated batch id.
Apache Livy is available for both Spark 1.x and Spark 2.x:
https://livy.incubator.apache.org/docs/latest/rest-api.html
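To make that concrete, here is a minimal sketch of submitting a batch through Livy's POST /batches endpoint from plain Java; the Livy host, jar path and main class below are placeholders, not taken from the question:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class LivyBatchSubmitter {

    public static void main(String[] args) throws Exception {
        // Hypothetical Livy endpoint -- adjust host/port to your environment.
        URL url = new URL("http://livy-host:8998/batches");

        // Minimal batch request: the assembled Spark fat jar and its main class
        // (both names are made up for illustration).
        String payload = "{"
                + "\"file\": \"hdfs:///jobs/my-spark-job-assembly.jar\","
                + "\"className\": \"com.example.MySparkJob\""
                + "}";

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(payload.getBytes(StandardCharsets.UTF_8));
        }

        // Livy responds with a JSON description of the batch, including its id,
        // which can later be polled via GET /batches/{id} to track the job state.
        try (Scanner scanner = new Scanner(conn.getInputStream(), StandardCharsets.UTF_8.name())) {
            System.out.println(scanner.useDelimiter("\\A").next());
        }
    }
}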
I hope this points you in a useful direction.

Related

Liferay 7.3 - org.hibernate.MappingException: entity class not found

I'm pretty new to Liferay development. I'm working on migrating a project from 6.2 to 7.3.
The plugin I'm working on has the -api / -service / -portlet structure that I've seen in the tutorials.
When I try to deploy the service JAR to my server, I'm getting the following error:
2021-07-07 10:30:11.759 ERROR [pipe-start 1356][LiferayServiceExtender:88] org.hibernate.MappingException: entity class not found: PhishRodLab.db.model.impl.Customer_PreferencesImpl
org.hibernate.MappingException: entity class not found: PhishRodLab.db.model.impl.Customer_PreferencesImpl
at org.hibernate.mapping.PersistentClass.getMappedClass(PersistentClass.java:125)
at org.hibernate.tuple.PropertyFactory.getGetter(PropertyFactory.java:191)
at org.hibernate.tuple.PropertyFactory.buildIdentifierProperty(PropertyFactory.java:67)
at org.hibernate.tuple.entity.EntityMetamodel.<init>(EntityMetamodel.java:135)
at org.hibernate.persister.entity.AbstractEntityPersister.<init>(AbstractEntityPersister.java:485)
at org.hibernate.persister.entity.SingleTableEntityPersister.<init>(SingleTableEntityPersister.java:133)
at org.hibernate.persister.PersisterFactory.createClassPersister(PersisterFactory.java:84)
at org.hibernate.impl.SessionFactoryImpl.<init>(SessionFactoryImpl.java:286)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1872)
at org.springframework.orm.hibernate3.LocalSessionFactoryBean.newSessionFactory(LocalSessionFactoryBean.java:795)
at com.liferay.portal.spring.hibernate.PortalHibernateConfiguration.newSessionFactory(PortalHibernateConfiguration.java:298)
at org.springframework.orm.hibernate3.LocalSessionFactoryBean.buildSessionFactory(LocalSessionFactoryBean.java:717)
at com.liferay.portal.spring.hibernate.PortalHibernateConfiguration.buildSessionFactory(PortalHibernateConfiguration.java:99)
at com.liferay.portal.spring.extender.internal.LiferayServiceExtender$LiferayServiceExtension.start(LiferayServiceExtender.java:148)
at com.liferay.portal.spring.extender.internal.LiferayServiceExtender.addingBundle(LiferayServiceExtender.java:83)
at com.liferay.portal.spring.extender.internal.LiferayServiceExtender.addingBundle(LiferayServiceExtender.java:61)
at org.osgi.util.tracker.BundleTracker$Tracked.customizerAdding(BundleTracker.java:475)
at org.osgi.util.tracker.BundleTracker$Tracked.customizerAdding(BundleTracker.java:1)
at org.osgi.util.tracker.AbstractTracked.trackAdding(AbstractTracked.java:256)
at org.osgi.util.tracker.AbstractTracked.track(AbstractTracked.java:229)
at org.osgi.util.tracker.BundleTracker$Tracked.bundleChanged(BundleTracker.java:450)
at org.eclipse.osgi.internal.framework.BundleContextImpl.dispatchEvent(BundleContextImpl.java:908)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:148)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEventPrivileged(EquinoxEventPublisher.java:230)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEvent(EquinoxEventPublisher.java:137)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEvent(EquinoxEventPublisher.java:129)
at org.eclipse.osgi.internal.framework.EquinoxContainerAdaptor.publishModuleEvent(EquinoxContainerAdaptor.java:191)
at org.eclipse.osgi.container.Module.publishEvent(Module.java:476)
at org.eclipse.osgi.container.Module.doStart(Module.java:578)
at org.eclipse.osgi.container.Module.start(Module.java:449)
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:428)
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:447)
at org.eclipse.equinox.console.commands.EquinoxCommandProvider.start(EquinoxCommandProvider.java:243)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.felix.gogo.runtime.Reflective.invoke(Reflective.java:139)
at org.apache.felix.gogo.runtime.CommandProxy.execute(CommandProxy.java:91)
at org.apache.felix.gogo.runtime.Closure.executeCmd(Closure.java:599)
at org.apache.felix.gogo.runtime.Closure.executeStatement(Closure.java:526)
at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:415)
at org.apache.felix.gogo.runtime.Pipe.doCall(Pipe.java:416)
at org.apache.felix.gogo.runtime.Pipe.call(Pipe.java:229)
at org.apache.felix.gogo.runtime.Pipe.call(Pipe.java:59)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: PhishRodLab.db.model.impl.Customer_PreferencesImpl
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1365)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1188)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.hibernate.util.ReflectHelper.classForName(ReflectHelper.java:200)
at org.hibernate.mapping.PersistentClass.getMappedClass(PersistentClass.java:122)
... 49 more
I've encountered that error when migrating from previous versions and the DTD of the service.xml file was not the right one. Make sure you are using the latest DTD available for the Liferay version you are targeting (7.3 in this case).
For example, the header of the service.xml file should start like this:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE service-builder PUBLIC "-//Liferay//DTD Service Builder 7.3.0//EN" "http://www.liferay.com/dtd/liferay-service-builder_7_3_0.dtd">
<service-builder...>
The best option is to generate a test Service Builder project following the tutorial, and then compare the generated code with yours, noting the differences.
More info: https://help.liferay.com/hc/es/articles/360029028211-Introduction-to-Service-Builder

Multithreaded Spark SQL jobs always fail

I have a Spark application that runs multiple tests against a dataset. The tests are functions that contain Spark SQL operations such as groupBy, filter, ...
Dataset<Row> dataset = loadDataset();
test1(dataset);
test2(dataset);
test3(dataset);
At this point everything works fine, but I can see that only about 30% of the cluster is used. To optimize this I thought about parallelizing the tests so they run at the same time, so I launched each test in its own thread:
Dataset<Row> dataset = loadDataset();
Thread thread1 = new Thread(()-> test3(dataset));
thread1.start();
Thread thread2 = new Thread(()-> test2(dataset));
thread2.start();
Thread thread3 = new Thread(()-> test1(dataset));
thread3.start();
However, this doesn't seem to work, because I get some strange errors:
The currently active SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:914)
com.test.spark.Loader.loadDataset(Loader.java:96)
com.test.spark.Loader.run(Loader.java:29)
com.test.spark.Main.main(Main.java:15)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:646)
at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:100)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1485)
at org.apache.spark.sql.execution.datasources.csv.CSVFileFormat.buildReader(CSVFileFormat.scala:96)
at org.apache.spark.sql.execution.datasources.FileFormat$class.buildReaderWithPartitionValues(FileFormat.scala:117)
at org.apache.spark.sql.execution.datasources.TextBasedFileFormat.buildReaderWithPartitionValues(FileFormat.scala:148)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD$lzycompute(DataSourceScanExec.scala:291)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD(DataSourceScanExec.scala:289)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDDs(DataSourceScanExec.scala:309)
at org.apache.spark.sql.execution.FilterExec.inputRDDs(basicPhysicalOperators.scala:124)
at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:42)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:386)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.columnar.InMemoryRelation.buildBuffers(InMemoryRelation.scala:91)
at org.apache.spark.sql.execution.columnar.InMemoryRelation.<init>(InMemoryRelation.scala:86)
at org.apache.spark.sql.execution.columnar.InMemoryRelation$.apply(InMemoryRelation.scala:42)
at org.apache.spark.sql.execution.CacheManager$$anonfun$cacheQuery$1.apply(CacheManager.scala:100)
at org.apache.spark.sql.execution.CacheManager.writeLock(CacheManager.scala:68)
at org.apache.spark.sql.execution.CacheManager.cacheQuery(CacheManager.scala:92)
at org.apache.spark.sql.Dataset.persist(Dataset.scala:2514)
at com.test.spark.Loader.test3(Loader.java:45)
at com.test.spark.Loader.lambda$run$0(Loader.java:32)
at java.lang.Thread.run(Thread.java:748)
19/07/13 22:05:08 INFO FileSourceStrategy: Pruning directories with:
19/07/13 22:05:08 INFO FileSourceStrategy: Post-Scan Filters: isnotnull(Sens#29),(Sens#29 = C)
19/07/13 22:05:08 INFO FileSourceStrategy: Output Data Schema: struct<JournalCode: string, JournalLib: string, EcritureNum: string, EcritureDate: string, CompteNum: string ... 16 more fields>
19/07/13 22:05:08 INFO FileSourceScanExec: Pushed Filters: IsNotNull(Sens),EqualTo(Sens,C)
19/07/13 22:05:08 INFO CodeGenerator: Code generated in 21.213109 ms
Exception in thread "Thread-29" java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:914)
com.test.spark.Loader.loadDataset(Loader.java:96)
com.test.spark.Loader.run(Loader.java:29)
com.test.spark.Main.main(Main.java:15)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:646)
The currently active SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:914)
com.test.spark.Loader.loadDataset(Loader.java:96)
com.test.spark.Loader.run(Loader.java:29)
com.test.spark.Main.main(Main.java:15)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:646)
at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:100)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1485)
at org.apache.spark.sql.execution.datasources.csv.CSVFileFormat.buildReader(CSVFileFormat.scala:96)
at org.apache.spark.sql.execution.datasources.FileFormat$class.buildReaderWithPartitionValues(FileFormat.scala:117)
at org.apache.spark.sql.execution.datasources.TextBasedFileFormat.buildReaderWithPartitionValues(FileFormat.scala:148)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD$lzycompute(DataSourceScanExec.scala:291)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD(DataSourceScanExec.scala:289)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDDs(DataSourceScanExec.scala:309)
at org.apache.spark.sql.execution.FilterExec.inputRDDs(basicPhysicalOperators.scala:124)
at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:42)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:386)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.columnar.InMemoryRelation.buildBuffers(InMemoryRelation.scala:91)
at org.apache.spark.sql.execution.columnar.InMemoryRelation.<init>(InMemoryRelation.scala:86)
at org.apache.spark.sql.execution.columnar.InMemoryRelation$.apply(InMemoryRelation.scala:42)
at org.apache.spark.sql.execution.CacheManager$$anonfun$cacheQuery$1.apply(CacheManager.scala:100)
at org.apache.spark.sql.execution.CacheManager.writeLock(CacheManager.scala:68)
at org.apache.spark.sql.execution.CacheManager.cacheQuery(CacheManager.scala:92)
at org.apache.spark.sql.Dataset.persist(Dataset.scala:2514)
at com.test.spark.Loader.test1(Loader.java:67)
at com.test.spark.Loader.lambda$run$2(Loader.java:36)
at java.lang.Thread.run(Thread.java:748)
19/07/13 22:05:08 INFO FileSourceStrategy: Pruning directories with:
19/07/13 22:05:08 INFO FileSourceStrategy: Post-Scan Filters:
19/07/13 22:05:08 INFO FileSourceStrategy: Output Data Schema: struct<CompteNum: string>
19/07/13 22:05:08 INFO FileSourceScanExec: Pushed Filters:
19/07/13 22:05:08 INFO HashAggregateExec: spark.sql.codegen.aggregate.map.twolevel.enable is set to true, but current version of codegened fast hashmap does not support this aggregate.
19/07/13 22:05:08 INFO CodeGenerator: Code generated in 29.090949 ms
19/07/13 22:05:08 INFO HashAggregateExec: spark.sql.codegen.aggregate.map.twolevel.enable is set to true, but current version of codegened fast hashmap does not support this aggregate.
19/07/13 22:05:08 INFO CodeGenerator: Code generated in 20.861207 ms
Exception in thread "Thread-28" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange SinglePartition
+- *HashAggregate(keys=[], functions=[partial_count(1)], output=[count#297L])
+- *HashAggregate(keys=[CompteNum#21], functions=[], output=[])
+- Exchange hashpartitioning(CompteNum#21, 10)
+- *HashAggregate(keys=[CompteNum#21], functions=[], output=[CompteNum#21])
+- *FileScan csv [CompteNum#21] Batched: false, Format: CSV, Location: InMemoryFileIndex[adl://home/home/azhdipaasssh/fecs/Abdennacer/9-5Gb/2019-01-07/FEC/2019-01-07-16..., PartitionFilters: [], PushedFilters: [], ReadSchema: struct<CompteNum:string>
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.doExecute(ShuffleExchange.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:252)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:141)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:386)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:228)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:275)
at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2431)
at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2430)
at org.apache.spark.sql.Dataset$$anonfun$55.apply(Dataset.scala:2838)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:65)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:2837)
at org.apache.spark.sql.Dataset.count(Dataset.scala:2430)
at com.test.spark.Loader.test2(Loader.java:60)
at com.test.spark.Loader.lambda$run$1(Loader.java:34)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange hashpartitioning(CompteNum#21, 10)
+- *HashAggregate(keys=[CompteNum#21], functions=[], output=[CompteNum#21])
+- *FileScan csv [CompteNum#21] Batched: false, Format: CSV, Location: InMemoryFileIndex[adl://home/home/azhdipaasssh/fecs/Abdennacer/9-5Gb/2019-01-07/FEC/2019-01-07-16..., PartitionFilters: [], PushedFilters: [], ReadSchema: struct<CompteNum:string>
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.doExecute(ShuffleExchange.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:252)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:141)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:141)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:386)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.prepareShuffleDependency(ShuffleExchange.scala:88)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:124)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:115)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 27 more
Caused by: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:914)
com.test.spark.Loader.loadDataset(Loader.java:96)
com.test.spark.Loader.run(Loader.java:29)
com.test.spark.Main.main(Main.java:15)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:646)
The currently active SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:914)
com.test.spark.Loader.loadDataset(Loader.java:96)
com.test.spark.Loader.run(Loader.java:29)
com.test.spark.Main.main(Main.java:15)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:646)
at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:100)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1485)
at org.apache.spark.sql.execution.datasources.csv.CSVFileFormat.buildReader(CSVFileFormat.scala:96)
at org.apache.spark.sql.execution.datasources.FileFormat$class.buildReaderWithPartitionValues(FileFormat.scala:117)
at org.apache.spark.sql.execution.datasources.TextBasedFileFormat.buildReaderWithPartitionValues(FileFormat.scala:148)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD$lzycompute(DataSourceScanExec.scala:291)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDD(DataSourceScanExec.scala:289)
at org.apache.spark.sql.execution.FileSourceScanExec.inputRDDs(DataSourceScanExec.scala:309)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:141)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:386)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.prepareShuffleDependency(ShuffleExchange.scala:88)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:124)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:115)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 48 more
19/07/13 22:05:09 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
The logs don't give much information. Has anybody had the same error?
UPDATE
Here's loadDataset(), though I don't think it adds much:
private Dataset<Row> loadDataset() {
    SparkSession session = SparkSession.builder().getOrCreate();
    String path = "/home/user/files/file.txt";
    return session.read().option("header", "true").option("delimiter", "|").csv(path);
}
One thing I found here that seems to be required is this:
SparkEnv.set(SparkEnv.get)
This code should be executed in each thread that uses the Spark context/session.
Please try it and share your results.
The problem was that after launching the threads, the main thread reaches the end of the application and the Spark context shuts down. To prevent this I added join() for each thread, so the main thread waits until all the test threads have finished:
Thread thread1 = new Thread(()-> test3(dataset));
thread1.start();
Thread thread2 = new Thread(()-> test2(dataset));
thread2.start();
Thread thread3 = new Thread(()-> test1(dataset));
thread3.start();
thread1.join();
thread2.join();
thread3.join();

Spring resource not found - 3.2

We have a project using the Spring 3.0 XSD, running on Tomcat 7. I tried to change the XSD to Spring 3.2, since we have been using the Spring 3.2.4 jars and 'exclude-mapping' cannot be used with the Spring 3.0 XSD.
After this change the application stops working, though it does not throw any errors while starting/stopping the server or adding and removing the application.
After the change, any request to the application fails with a 404 error.
Here are the Jars we are using:
beandoc-0.7.0.jar
commons-fileupload-1.2.2.jar
commons-io-1.3.2.jar
commons-logging-1.2-javadoc.jar
commons-logging-1.2.jar
jstl.jar
log4j-1.2.14.jar
ojdbc6.jar
slf4j-api-1.5.6.jar
slf4j-log4j12-1.5.6.jar
spring-beandoc-0.7.1.jar
spring-beans-3.2.4.RELEASE.jar
spring-context-3.2.4.RELEASE.jar
spring-context-support-3.2.4.RELEASE.jar
spring-core-3.2.4.RELEASE.jar
spring-dao-2.0.8.jar
spring-expression-3.2.4.RELEASE.jar
spring-jdbc-3.2.4.RELEASE.jar
spring-web-3.2.4.RELEASE.jar
spring-webmvc-3.2.4.RELEASE.jar
spring-webmvc-portlet-3.2.4.RELEASE.jar
standard.jar
When we try to clean the server, it results in the below exception:
SEVERE: Exception starting Context with name [/COETOOLSACCL]
org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/COETOOLSACCL]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:153)
at org.apache.catalina.core.StandardContext.reload(StandardContext.java:4119)
at org.apache.catalina.loader.WebappLoader.backgroundProcess(WebappLoader.java:425)
at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1341)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1542)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1552)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1552)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1520)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.NoClassDefFoundError: org/springframework/web/context/support/ServletRequestHandledEvent
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Unknown Source)
at java.lang.Class.getDeclaredFields(Unknown Source)
at org.apache.catalina.util.Introspection.getDeclaredFields(Introspection.java:106)
at org.apache.catalina.startup.WebAnnotationSet.loadFieldsAnnotation(WebAnnotationSet.java:270)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationServletAnnotations(WebAnnotationSet.java:139)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationAnnotations(WebAnnotationSet.java:65)
at org.apache.catalina.startup.ContextConfig.applicationAnnotationsConfig(ContextConfig.java:416)
at org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:890)
at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:387)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5503)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
... 8 more
Caused by: java.lang.ClassNotFoundException: org.springframework.web.context.support.ServletRequestHandledEvent
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1891)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1734)
... 22 more
This exception occurs regardless of the XSD version, and with 3.0 the application runs normally, even though we still get this error while cleaning the server.
Please help.

WebSphere 8.5 Spring job with remote EJB XA transaction error

I'm working with Spring Batch and WebSphere 8.5. The batch uses a remote EJB in an XA distributed transaction:
org.springframework.transaction.TransactionSystemException: UOWManager transaction processing failed; nested exception is com.ibm.wsspi.uow.UOWException: javax.transaction.SystemException
at org.springframework.transaction.jta.WebSphereUowTransactionManager.execute(WebSphereUowTransactionManager.java:297) ~[spring-tx-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:127) ~[spring-tx-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:271) ~[spring-batch-core-3.0.3.RELEASE.jar:3.0.3.RELEASE]
at org.spr....
Caused by: com.ibm.wsspi.uow.UOWException: javax.transaction.SystemException
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.runUnderNewUOW(EmbeddableUOWManagerImpl.java:823) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.runUnderUOW(EmbeddableUOWManagerImpl.java:370) ~[com.ibm.ws.runtime.jar:na]
at org.springframework.transaction.jta.WebSphereUowTransactionManager.execute(WebSphereUowTransactionManager.java:290) ~[spring-tx-4.1.6.RELEASE.jar:4.1.6.RELEASE]
... 18 common frames omitted
Caused by: javax.transaction.SystemException: null
at com.ibm.tx.jta.impl.TransactionImpl.stage3CommitProcessing(TransactionImpl.java:1251) ~[com.ibm.tx.jta.jar:na]
at com.ibm.tx.jta.impl.TransactionImpl.processCommit(TransactionImpl.java:1042) ~[com.ibm.tx.jta.jar:na]
at com.ibm.tx.jta.impl.TransactionImpl.commit(TransactionImpl.java:963) ~[com.ibm.tx.jta.jar:na]
at com.ibm.ws.tx.jta.TranManagerImpl.commit(TranManagerImpl.java:439) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.tx.jta.impl.TranManagerSet.commit(TranManagerSet.java:191) ~[com.ibm.tx.jta.jar:na]
at com.ibm.ws.uow.UOWManagerImpl.uowCommit(UOWManagerImpl.java:807) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.uowEnd(EmbeddableUOWManagerImpl.java:881) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.UOWManagerImpl.uowEnd(UOWManagerImpl.java:782) ~[com.ibm.ws.runtime.jar:na]
at com.ibm.ws.uow.embeddable.EmbeddableUOWManagerImpl.runUnderNewUOW(EmbeddableUOWManagerImpl.java:818) ~[com.ibm.ws.runtime.jar:na]
Any ideas?
Thanks
Giancarlo
After Greycon's help, I resolved this by disabling security on the transaction service.
For WAS 8.5, the procedure is the same as that of WAS 7 described in the following link:
https://www-01.ibm.com/support/knowledgecenter/SSWSR9_11.4.0/com.ibm.mdmhs.bil_install.doc/t_disable_protocol_security.html

Error in setting Hibernate Configuration

I am getting an error when configuring Hibernate for a PostgreSQL database.
Error log details:
org.hibernate.HibernateException: could not instantiate RegionFactory [org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge]
at org.hibernate.cfg.SettingsFactory.createRegionFactory(SettingsFactory.java:402)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:270)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2163)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2159)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1383)
at org.hibernate.console.ConsoleConfiguration$5.execute(ConsoleConfiguration.java:278)
at org.hibernate.console.execution.DefaultExecutionContext.execute(DefaultExecutionContext.java:63)
at org.hibernate.console.ConsoleConfiguration.execute(ConsoleConfiguration.java:107)
at org.hibernate.console.ConsoleConfiguration.buildSessionFactory(ConsoleConfiguration.java:273)
at org.hibernate.eclipse.console.workbench.LazySessionFactoryAdapter.getChildren(LazySessionFactoryAdapter.java:43)
at org.hibernate.eclipse.console.workbench.BasicWorkbenchAdapter.getChildren(BasicWorkbenchAdapter.java:100)
at org.hibernate.eclipse.console.workbench.BasicWorkbenchAdapter.fetchDeferredChildren(BasicWorkbenchAdapter.java:106)
at org.eclipse.ui.progress.DeferredTreeContentManager$1.run(DeferredTreeContentManager.java:235)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.hibernate.cfg.SettingsFactory.createRegionFactory(SettingsFactory.java:397)
... 13 more
Caused by: org.hibernate.cache.CacheException: could not instantiate CacheProvider [org.hibernate.cache.internal.NoCacheProvider]
at org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge.<init>(RegionFactoryCacheProviderBridge.java:66)
... 18 more
Caused by: java.lang.ClassNotFoundException: org.hibernate.cache.internal.NoCacheProvider cannot be found by org.hibernate.eclipse.libs_3.7.1.Final-v20131205-0918-B107
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:501)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:191)
at org.hibernate.util.ReflectHelper.classForName(ReflectHelper.java:192)
at org.hibernate.cache.impl.bridge.RegionFactoryCacheProviderBridge.<init>(RegionFactoryCacheProviderBridge.java:63)
... 18 more
UPDATE:
After the changes mentioned by @ConMan, I am getting a new error at the same place:
org.hibernate.HibernateException: Could not instantiate dialect class
at org.hibernate.dialect.resolver.DialectFactory.constructDialect(DialectFactory.java:163)
at org.hibernate.dialect.resolver.DialectFactory.buildDialect(DialectFactory.java:109)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:146)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2163)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2155)
at org.hibernate.console.ConsoleConfiguration$6.execute(ConsoleConfiguration.java:430)
at org.hibernate.console.execution.DefaultExecutionContext.execute(DefaultExecutionContext.java:63)
at org.hibernate.console.ConsoleConfiguration.execute(ConsoleConfiguration.java:107)
at org.hibernate.console.ConsoleConfiguration.getSettings(ConsoleConfiguration.java:428)
at org.hibernate.eclipse.console.workbench.LazyDatabaseSchemaWorkbenchAdapter$2.execute(LazyDatabaseSchemaWorkbenchAdapter.java:119)
at org.hibernate.console.execution.DefaultExecutionContext.execute(DefaultExecutionContext.java:63)
at org.hibernate.console.ConsoleConfiguration.execute(ConsoleConfiguration.java:107)
at org.hibernate.eclipse.console.workbench.LazyDatabaseSchemaWorkbenchAdapter.readDatabaseSchema(LazyDatabaseSchemaWorkbenchAdapter.java:115)
at org.hibernate.eclipse.console.workbench.LazyDatabaseSchemaWorkbenchAdapter.getChildren(LazyDatabaseSchemaWorkbenchAdapter.java:65)
at org.hibernate.eclipse.console.workbench.BasicWorkbenchAdapter.fetchDeferredChildren(BasicWorkbenchAdapter.java:106)
at org.eclipse.ui.progress.DeferredTreeContentManager$1.run(DeferredTreeContentManager.java:235)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
Caused by: java.lang.ClassCastException: org.hibernate.dialect.PostgreSQLDialect cannot be cast to org.hibernate.dialect.Dialect
at org.hibernate.dialect.resolver.DialectFactory.constructDialect(DialectFactory.java:157)
... 16 more
The problem is here:
Caused by: java.lang.ClassNotFoundException: org.hibernate.cache.internal.NoCacheProvider cannot be found by org.hibernate.eclipse.libs_3.7.1.Final-v20131205-0918-B107
This basically means that the Hibernate implementation you are using (org.hibernate.eclipse.libs...) does not contain the class NoCacheProvider.
An implementation of NoCacheProvider can be found in the following dependencies:
hibernate-core - 3.6.0.Final, 3.5.0-Final, 3.3.0.SP1, 3.3.0.GA
com.springsource.org.hibernate - 3.3.2, 3.3.1, 3.2.6
hibernate-core - 3.6.10.Final-patched-play-1.2.5, 3.6.1.Final-patched-play-1.2, 3.5.6-Final-patched-play-1.1.1, 3.5.6-Final-patched-play-1.1
hibernate - 3.2.7.ga, 3.2.6.ga, 3.2.5.ga, 3.2.4.sp1, 3.2.4.ga, 3.2.3.ga, 3.2.2.ga, 3.2.1.ga, 3.2.0.ga, 3.2.0.cr3, 3.2.0.cr2, 3.2.0.cr1, 3.1.3, 3.1.2, 3.1.1, 3.1
hibernate - 3.1beta3, 3.1beta2, 3.1beta1, 3.0.5, 3.0.3
hibernate-all - beta3.SP15
Source: Grep Code
EDIT:
I have just seen that you are using hibernate-core 4.3.0.Final. It appears that the NoCacheProvider class no longer exists in this version of Hibernate. The recommended alternative is to use the following class instead:
org.hibernate.cache.internal.NoCachingRegionFactory
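For illustration, here is a minimal sketch of switching to that class when bootstrapping Hibernate programmatically; it uses the standard Hibernate 4.x property names, and whether you set them here or in hibernate.cfg.xml depends on your setup:

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateBootstrap {

    public static SessionFactory buildSessionFactory() {
        // Reads the usual hibernate.cfg.xml from the classpath.
        Configuration cfg = new Configuration().configure();

        // Explicitly disable second-level caching with the Hibernate 4.x region factory
        // that replaces the removed NoCacheProvider.
        cfg.setProperty("hibernate.cache.region.factory_class",
                "org.hibernate.cache.internal.NoCachingRegionFactory");
        cfg.setProperty("hibernate.cache.use_second_level_cache", "false");

        return cfg.buildSessionFactory();
    }
}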
