java.lang.ClassCastException for TDBFactory - java

I made a TDB index with Jena for my data.
To refer to the data while querying, I tried both TDBFactory and Model (statements below). I get the same exception for both, so it seems to be independent of which statement I write.
Dataset dataset = TDBFactory.createDataset(directory) ;
and
Model model = FileManager.get().loadModel(directory);
The runtime exception is:
Exception in thread "main" java.lang.ExceptionInInitializerError
at com.hp.hpl.jena.rdf.model.impl.RDFReaderFImpl.reset(RDFReaderFImpl.java:81)
at com.hp.hpl.jena.rdf.model.impl.RDFReaderFImpl.<clinit>(RDFReaderFImpl.java:74)
at com.hp.hpl.jena.rdf.model.impl.ModelCom.<clinit>(ModelCom.java:54)
at com.hp.hpl.jena.rdf.model.ModelFactory.createDefaultModel(ModelFactory.java:114)
at com.hp.hpl.jena.vocabulary.OWL.<clinit>(OWL.java:36)
at com.hp.hpl.jena.sparql.graph.NodeConst.<clinit>(NodeConst.java:29)
at com.hp.hpl.jena.sparql.engine.optimizer.reorder.ReorderFixed.<clinit>(ReorderFixed.java:23)
at com.hp.hpl.jena.sparql.engine.optimizer.reorder.ReorderLib.fixed(ReorderLib.java:53)
at com.hp.hpl.jena.tdb.sys.SystemTDB.<clinit>(SystemTDB.java:187)
at com.hp.hpl.jena.tdb.TDB.<clinit>(TDB.java:90)
at com.hp.hpl.jena.tdb.setup.DatasetBuilderStd.<clinit>(DatasetBuilderStd.java:64)
at com.hp.hpl.jena.tdb.StoreConnection.make(StoreConnection.java:227)
at com.hp.hpl.jena.tdb.transaction.DatasetGraphTransaction.<init>(DatasetGraphTransaction.java:75)
at com.hp.hpl.jena.tdb.sys.TDBMaker._create(TDBMaker.java:57)
at com.hp.hpl.jena.tdb.sys.TDBMaker.createDatasetGraphTransaction(TDBMaker.java:45)
at com.hp.hpl.jena.tdb.TDBFactory._createDatasetGraph(TDBFactory.java:104)
at com.hp.hpl.jena.tdb.TDBFactory.createDatasetGraph(TDBFactory.java:73)
at com.hp.hpl.jena.tdb.TDBFactory.createDataset(TDBFactory.java:52)
at com.hp.hpl.jena.tdb.TDBFactory.createDataset(TDBFactory.java:48)
Caused by: java.lang.ClassCastException: org.apache.xerces.dom.DeferredTextImpl cannot be cast to org.w3c.dom.Element
at sun.util.xml.PlatformXmlPropertiesProvider.importProperties(PlatformXmlPropertiesProvider.java:118)
at sun.util.xml.PlatformXmlPropertiesProvider.load(PlatformXmlPropertiesProvider.java:90)
at java.util.Properties$XmlSupport.load(Properties.java:1201)
at java.util.Properties.loadFromXML(Properties.java:881)
at com.hp.hpl.jena.util.Metadata.read(Metadata.java:76)
at com.hp.hpl.jena.util.Metadata.addMetadata(Metadata.java:54)
at com.hp.hpl.jena.util.Metadata.<init>(Metadata.java:48)
at com.hp.hpl.jena.JenaRuntime.<clinit>(JenaRuntime.java:34)
The jar files that I am using are:
arq-2.8.7.jar,
commons-cli-1.2.jar,
commons-codec-1.6.jar,
commons-collections-3.2.1.jar,
commons-csv-1.0.jar,
commons-io-2.4.jar,
commons-lang3-3.1.jar,
commons-math3-3.0.jar,
httpclient-4.2.6.jar,
httpclient-cache-4.2.6.jar,
httpcore-4.2.5.jar,
jackson-annotations-2.3.0.jar,
jackson-core-2.3.3.jar,
jackson-databind-2.3.3.jar,
jcl-over-slf4j-1.7.6.jar,
jena-arq-2.12.1.jar,
jena-core-2.12.1.jar,
jena-iri-1.1.1.jar,
jena-sdb-1.5.1.jar,
jena-tdb-1.1.1.jar,
jgraph.jar,
jsonld-java-0.5.0.jar,
libthrift-0.9.1.jar,
log4j-1.2.17.jar,
slf4j-api-1.7.6.jar,
slf4j-log4j12-1.7.6.jar,
xercesImpl-2.11.0.jar,
xml-apis-1.4.01.jar
How do I fix this?

Related

How to fix UnsupportedOperationException while using spark joinWith to create Tuple2

I am using Java with Spark. I need to create a Tuple2 Dataset by combining two separate Datasets. I am using joinWith because I want the individual objects to remain intact (so I cannot use join). However, this fails with:
Exception in thread "main" java.lang.UnsupportedOperationException: Cannot evaluate expression: NamePlaceholder
I tried it with and without an alias but still get the same error. What am I doing wrong?
Dataset<MyObject1> dsOfMyObject1;
Dataset<MyObject2> dsOfMyObject2;
Dataset<Tuple2<MyObject1, MyObject2>> tuple2Dataset =
    dsOfMyObject1.as("A")
        .joinWith(dsOfMyObject2.as("B"),
                  col("A.keyfield").equalTo(col("B.keyfield")));
Exception in thread "main" java.lang.UnsupportedOperationException: Cannot evaluate expression: NamePlaceholder
at org.apache.spark.sql.catalyst.expressions.Unevaluable$class.eval(Expression.scala:255)
at org.apache.spark.sql.catalyst.expressions.NamePlaceholder$.eval(complexTypeCreator.scala:243)
at org.apache.spark.sql.catalyst.expressions.CreateNamedStructLike$$anonfun$names$1.apply(complexTypeCreator.scala:289)
at org.apache.spark.sql.catalyst.expressions.CreateNamedStructLike$$anonfun$names$1.apply(complexTypeCreator.scala:289)
at scala.collection.immutable.List.map(List.scala:274)

How to load saved model created by (Weka GUI) into my java application and browse the predicted result

My model was built using the "FilteredClassifier" algorithm, with SMO ("weka.classifiers.functions.SMO") as its "classifier" parameter.
I tried to load my model into Java using this code, but it does not work:
SupportVector SOM = (SupportVector) SerializationHelper.read(new FileInputStream("C:\\Users\\HP\\Desktop\\SOM.model"));
and this code:
FilteredClassifier SOM = (FilteredClassifier) SerializationHelper.read(new FileInputStream("C:\\Users\\HP\\Desktop\\SOM.model"));
Neither works.
I also want to browse the data used to build this model (actual and predicted values).
How can I do that? Once I have created the model, do I need to load the dataset again?
This is the error
Exception in thread "main" java.lang.ClassCastException: weka.classifiers.meta.FilteredClassifier cannot be cast to weka.core.pmml.jaxbbindings.SupportVector
at weka.api.Model.main(Model.java:28)
weka.classifiers.meta.FilteredClassifier cannot be cast to weka.core.pmml.jaxbbindings.SupportVector
The pmml and jaxbbindings classes are XML-related; you appear to have imported the wrong package. Cast to weka.classifiers.meta.FilteredClassifier, the class named in the error message, instead.
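Weka's SerializationHelper.read is plain Java deserialization under the hood, so the cast on the returned object must match the class that was actually serialized. A minimal stdlib sketch of that mechanism (it uses an ordinary serializable object in place of the Weka model, since the weka classes above come from the question):

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class ModelLoad {
    // Equivalent of SerializationHelper.read: deserialize whatever object
    // was written to the file. The caller must then cast it to the class
    // that was saved (in the question, weka.classifiers.meta.FilteredClassifier,
    // not weka.core.pmml.jaxbbindings.SupportVector).
    public static Object read(String path) throws Exception {
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(path))) {
            return in.readObject();
        }
    }

    // Counterpart of SerializationHelper.write, shown for completeness.
    public static void write(String path, Object model) throws Exception {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(path))) {
            out.writeObject(model);
        }
    }
}
```

With the Weka jars on the classpath, the working version of the question's code would be the second variant, with `import weka.classifiers.meta.FilteredClassifier;` at the top of the file.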

BaseX-Exception (Interrupted) while loading large XML File

I am trying to query a large XML file like this:
ClientSession session = DatabaseConnection.getConnection();
session.execute(new XQuery("doc('path/dataset.xml')")).getBytes();
I get the following exception:
Exception in thread "main" org.basex.core.BaseXException: Interrupted.
at org.basex.api.client.ClientSession.receive(ClientSession.java:191)
at org.basex.api.client.ClientSession.execute(ClientSession.java:160)
at org.basex.api.client.ClientSession.execute(ClientSession.java:165)
at org.basex.api.client.Session.execute(Session.java:36)
at testing.Main.main(Main.java:124)
I tried increasing the Java heap space as well as the -Xmx value in the basexserver script, but it did not help.
What else could be causing this exception?
Files with the same structure can be loaded; it seems the dataset is just too big.

JAXB moxy Error: Invalid parameter type encountered while processing external metadata via properties Map

I followed this example to implement MOXy in my project. However, I am getting the following error and have not been able to find a solution yet.
Exception in thread "main" Local Exception Stack:
Exception [EclipseLink-50019] (Eclipse Persistence Services - 2.0.2.v20100323-r6872): org.eclipse.persistence.exceptions.JAXBException
Exception Description: Invalid parameter type encountered while processing external metadata via properties Map. The Value associated with Key [eclipselink-oxm-xml] is required to be one of [Map<String, Source>], where String = package, Source = handle to metadata file.
at org.eclipse.persistence.exceptions.JAXBException.incorrectValueParameterTypeForOxmXmlKey(JAXBException.java:245)
at org.eclipse.persistence.jaxb.JAXBContextFactory.getXmlBindingsFromProperties(JAXBContextFactory.java:330)
at org.eclipse.persistence.jaxb.JAXBContextFactory.createContext(JAXBContextFactory.java:115)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:128)
at javax.xml.bind.ContextFinder.find(ContextFinder.java:249)
at javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:372)
at blog.bindingfile.Demo.main(Demo.java:13)
I am new to JAXB and MOXy; any help towards getting my MOXy setup working is appreciated.
Use an instance of Source, as the error message says:
Value associated with Key [eclipselink-oxm-xml] is required to be one of [Map<String, Source>], where String = package, Source = handle to metadata file.
example:
...
javax.xml.transform.Source source =
    new javax.xml.transform.stream.StreamSource("mypackage-metadata.xml");
map.put("example.mypackage", source);
...
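Concretely, the value stored under the "eclipselink-oxm-xml" key must itself be a Map from package name to Source, not a bare String path. A minimal sketch of building that properties map (the package and file names are placeholders, not values from the question):

```java
import java.util.HashMap;
import java.util.Map;
import javax.xml.transform.Source;
import javax.xml.transform.stream.StreamSource;

public class MoxyProperties {
    // Builds the properties map MOXy expects: under the "eclipselink-oxm-xml"
    // key, a Map<String, Source> from package name to a handle on the
    // external metadata file. Passing a plain String path here is exactly
    // what triggers EclipseLink-50019.
    public static Map<String, Object> bindingProperties(String pkg, String metadataFile) {
        Map<String, Source> metadata = new HashMap<>();
        metadata.put(pkg, new StreamSource(metadataFile));
        Map<String, Object> properties = new HashMap<>();
        properties.put("eclipselink-oxm-xml", metadata);
        return properties;
    }
}
```

The returned map is what you would pass as the properties argument of JAXBContext.newInstance(classes, properties).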

Trying to catch ehcache Element ClassCastException

I'm having trouble with ehcache caching a table that is bigger than the storage I allotted to it. Although the application does not fail (it ends up performing the query against the database and returning the data), my log is filling up with ClassCastExceptions.
I don't want to change the settings, because it will just happen again; I'd like to catch the ClassCastException, but that did not work. (I tried filtering the exception with a Seam component and at the specific point where the exception is thrown.)
Versions are: Seam 2.2.2, Hibernate 3.3.3, and Ehcache 1.5.
ERROR [user:] [DiskStore.java/get] – com.milestone.model.PersonItemCache:
Could not read disk store element for key com.milestone.model.PersonItem#438480.
Error was net.sf.ehcache.Element cannot be cast to net.sf.ehcache.Element
java.lang.ClassCastException: net.sf.ehcache.Element cannot be cast to net.sf.ehcache.Element
at net.sf.ehcache.store.DiskStore.loadElementFromDiskElement(DiskStore.java:302)
at net.sf.ehcache.store.DiskStore.get(DiskStore.java:257)
at net.sf.ehcache.Cache.searchInDiskStore(Cache.java:1202)
at net.sf.ehcache.Cache.get(Cache.java:803)
at org.hibernate.cache.EhCache.get(EhCache.java:80)
at org.hibernate.cache.ReadWriteCache.put(ReadWriteCache.java:178)
at org.hibernate.cache.impl.bridge.EntityAccessStrategyAdapter.putFromLoad(EntityAccessStrategyAdapter.java:68)
at org.hibernate.engine.TwoPhaseLoad.initializeEntity(TwoPhaseLoad.java:179)
at org.hibernate.loader.Loader.initializeEntitiesAndCollections(Loader.java:877)
at org.hibernate.loader.Loader.doQuery(Loader.java:752)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:259)
at org.hibernate.loader.Loader.loadCollection(Loader.java:2019)
at org.hibernate.loader.collection.CollectionLoader.initialize(CollectionLoader.java:59)
at org.hibernate.persister.collection.AbstractCollectionPersister.initialize(AbstractCollectionPersister.java:587)
at org.hibernate.event.def.DefaultInitializeCollectionEventListener.onInitializeCollection(DefaultInitializeCollectionEventListener.java:83)
at org.hibernate.impl.SessionImpl.initializeCollection(SessionImpl.java:1744)
at org.hibernate.collection.AbstractPersistentCollection.initialize(AbstractPersistentCollection.java:366)
at org.hibernate.collection.AbstractPersistentCollection.read(AbstractPersistentCollection.java:108)
at org.hibernate.collection.AbstractPersistentCollection.readElementExistence(AbstractPersistentCollection.java:164)
at org.hibernate.collection.PersistentBag.contains(PersistentBag.java:262)
at com.milestone.person.PersonItemController.addAsFans(PersonItemController.java:141)
Your exception is ClassCastException: net.sf.ehcache.Element cannot be cast to net.sf.ehcache.Element. This is an indication of a bigger problem: you most probably have two ehcache core (or related) jars on the classpath, so the same class is loaded by two different class loaders and instances from one cannot be cast to the other. Catching the exception is not a solution; the exception should not happen in the first place.
