Jena TDB hangs/freezes on named model access - java

I have a problem with Apache Jena TDB. Basically I create a new Dataset, load data from an RDF/XML file into a named model with the name "http://example.com/model/filename" where filename is the name of the XML/RDF file. After loading the data, all statements from the named model are inserted into the default model. The named model is kept in the dataset for backup reasons.
When I then try to query the named models in the Dataset, TDB freezes and the application appears to run in an infinite loop: it neither terminates nor throws an exception.
What is causing that freeze and how can I prevent it?
Example code:
Dataset ds = TDBFactory.createDataset("tdb");
Model mod = ds.getDefaultModel();
File f = new File("example.rdf");
FileInputStream fis = new FileInputStream(f);
ds.begin(ReadWrite.WRITE);
// Get a new named model to load the data into
Model nm = ds.getNamedModel("http://example.com/model/example.rdf");
nm.read(fis, null);
// Do some queries on the Model using the utility methods of Model, no SPARQL used
// Add all statements from the named model to the default model
mod.add(nm);
ds.commit();
ds.end();
// So far everything works as expected, but the following line causes the freeze
Iterator<String> it = ds.listNames();
Any method call that accesses the existing named models causes the same freeze; the same happens with getNamedModel("http://example.com/model/example.rdf"), for example. Adding new named models by calling getNamedModel("http://example.com/model/example123.rdf") works fine, so only access to existing models is broken.
Used environment: Linux 64bit, Oracle Java 1.7.0_09, Jena 2.7.4 (incl. TDB 0.9.4)
Thanks in advance for any help!
Edit: Fixed mistake in code fragment
Edit 2: Solution (my last comment on AndyS's answer)
Ok, I went through the whole program and added all the missing transactions. Now it works as expected. I suspect Jena threw an exception during the shutdown sequence of my program, but that exception was not reported properly, and the "freeze" was caused by other threads not terminating correctly. Thanks for pointing out the faulty transaction usage.

Could you turn this into a test case and send it to the jena users mailing list please?
You should get the default model inside the transaction - you got it outside.
Also, if you have used a dataset transactionally, you can't use it untransactionally as you do at ds.listNames. It shouldn't freeze - you should get some kind of warning.
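A minimal sketch of the transactional pattern described here, reusing the names from the question (exception handling omitted; every model handle is obtained inside a transaction, and listNames() runs inside a read transaction):
Dataset ds = TDBFactory.createDataset("tdb");
// Write transaction: obtain the default model and the named model inside it
ds.begin(ReadWrite.WRITE);
try {
    Model mod = ds.getDefaultModel();
    Model nm = ds.getNamedModel("http://example.com/model/example.rdf");
    nm.read(new FileInputStream("example.rdf"), null);
    mod.add(nm);
    ds.commit();
} finally {
    ds.end();
}
// Read transaction: listNames() must also run inside a transaction
ds.begin(ReadWrite.READ);
try {
    Iterator<String> it = ds.listNames();
    while (it.hasNext()) {
        System.out.println(it.next());
    }
} finally {
    ds.end();
}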

Related

WrongClassException: Object with id 22 was not of the specified subclass, only for new entries; old entries are working fine

I am using EJB and Spring + Hibernate in my application.
My application fetches a row from the database and, based on its discriminator column value (entry1, entry2, entry3, ...), dispatches to the Java class mapped in the corresponding tag of my tablename.hbm.xml file, and that code gets executed.
All my old code is working fine.
I have added a new .java file, and when I tried to add a new entry to the tablename.hbm.xml file I ran into the error below.
org.springframework.orm.hibernate.HibernateObjectRetrievalFailureException:
object with id 22 was not of the specified subclass: (path of table1 related class) (Discriminator: entry1);
nested exception is net.sf.hibernate.WrongClassException: Object with id 22 was not of the specified subclass: (path of table1 related class) (Discriminator: entry1)
There are no duplicates in my table, nor any whitespace issues.
None of my new entries are getting executed. Is it because my .hbm.xml file is not being refreshed every time? Please let me know if you have any suggestions.
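For context, a discriminator-based mapping of the shape described above typically looks something like this sketch (class, table, and column names are hypothetical, not taken from the question):
<class name="com.example.BaseEntry" table="tablename" discriminator-value="base">
    <id name="id" column="id">
        <generator class="native"/>
    </id>
    <discriminator column="entry_type" type="string"/>
    <!-- every new discriminator value needs a matching subclass entry here -->
    <subclass name="com.example.Entry1" discriminator-value="entry1"/>
    <subclass name="com.example.Entry2" discriminator-value="entry2"/>
</class>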
Simply add the following to your project's properties file:
spring.jpa.properties.hibernate.discriminator.ignore_explicit_for_joined=true
It works for me.

Java method that writes to file does nothing when invoked from a JSP

Hey, all! I have a class method whose primary function is to get a Map object, which works fine; however, it's an expensive operation that doesn't need to be done every time, so I'd like to have the results stored in an XML file using JAXB, to be read from for the majority of calls and updated infrequently.
When I run a class that calls it from NetBeans, the file is created, no problem, with exactly what I want -- but when I have my JSP call the method, nothing happens whatsoever, even though the rest of the information is passed normally. I have the feeling it's somehow lacking write privileges, but the file is just in the root directory, so I'm not sure what I'm missing. Thanks for the help!
The code looks roughly like this:
public class DataHandler {
    ...
    public void config() {
        MapHolder bucket = new MapHolder();
        MapExporter exp = new MapExporter();
        Map map = makeMap();
        bucket.setMap(map);
        exp.exportMap(bucket);
    }
}
And then the JSP has a JavaBean of DataHandler and this line:
databean.config();
It's probably a tad more fragmented than it needs to be; the whole bucket rigamarole was because I was stumbling while learning how to write a map to an XML file. MapHolder is just a class that I wrap around the map, MapExporter just uses a JAXB marshaller, and it all does work properly when run from NetBeans.
OK turns out I'm just dumb; everything was working fine, the file was just being stored in a folder at the localhost location. Whoops! That'd be my inexperience with web development at work.
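For anyone tripping over the same thing: a relative path such as new File("example.xml") resolves against the working directory of whichever JVM runs the code, which differs between NetBeans and a servlet container. A quick way to see where the file actually lands (the file name here is hypothetical):
File out = new File("mapdata.xml");
// Prints the directory the relative path actually resolves to
System.out.println("Writing to: " + out.getAbsolutePath());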

How to do multiple add operations in Apache Jena TDB

I have to serialize some specific properties (about ten film properties) for a set of 1,500 entities from DBpedia. So for each entity I run a SPARQL query to retrieve them, and then, for each ResultSet, I store all the data in the TDB dataset using the default Apache Jena TDB API. I create a single statement for each property and add it using this code:
public void addSolution(QuerySolution currSolution, String subjectURI) {
if(isWriteMode) {
Resource currResource = datasetModel.createResource(subjectURI);
Property prop = datasetModel.createProperty(currSolution.getResource("?prop").toString());
Statement stat = datasetModel.createStatement(currResource, prop, currSolution.get("?value").toString());
datasetModel.add(stat);
}
}
What can I do to execute multiple add operations on a single dataset? What strategy should I use?
EDIT:
I'm able to execute all the code without errors, but no files are created by the TDBFactory. Why does this happen?
I think that I need Joshua Taylor's help
It sounds like the query is running against the remote DBpedia endpoint. Assuming that's correct, you can do a couple of things.
Firstly wrap the update in a transaction:
dataset.begin(ReadWrite.WRITE);
try {
    for (QuerySolution currSolution : results) {
        addSolution(...);
    }
    dataset.commit();
} finally {
    dataset.end();
}
Secondly, you might be able to save yourself some work by using CONSTRUCT to get a model back, rather than having to loop through the results. I'm not clear what's going on with subjectURI, but it might be as simple as:
CONSTRUCT { <subjectURI> ?prop ?value }
WHERE
... existing query body ...
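A sketch of that CONSTRUCT approach against the public DBpedia endpoint; the endpoint URL and the query body are assumptions, not the asker's actual query:
// Hypothetical query: copy all direct properties of the entity
String query =
    "CONSTRUCT { <" + subjectURI + "> ?prop ?value } " +
    "WHERE { <" + subjectURI + "> ?prop ?value }";
QueryExecution qe = QueryExecutionFactory.sparqlService("http://dbpedia.org/sparql", query);
try {
    Model fetched = qe.execConstruct();
    // Store the whole model in one write transaction
    dataset.begin(ReadWrite.WRITE);
    try {
        dataset.getDefaultModel().add(fetched);
        dataset.commit();
    } finally {
        dataset.end();
    }
} finally {
    qe.close();
}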
I've solved my problem, and I want to record it here for anyone who runs into the same one.
For each transaction that you do, you need to re-obtain the dataset model; don't reuse the same model for all the transactions.
So for each transaction that you start, obtain the dataset model just after the call to begin().
I hope that will be helpful.
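In other words, something like this sketch (the loop and the subjectURIs collection are illustrative):
for (String uri : subjectURIs) {
    dataset.begin(ReadWrite.WRITE);
    try {
        // Re-obtain the model inside each transaction instead of
        // reusing one handle across transactions
        Model datasetModel = dataset.getDefaultModel();
        // ... create and add statements against datasetModel here ...
        dataset.commit();
    } finally {
        dataset.end();
    }
}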

Can't use Row Data Resulting from Scripted DataSet

Performing a test with BIRT I was able to create a report and render it to PDF, but unfortunately I'm not getting the expected result.
For my DataSource I created a Scripted DataSource; no code was needed in there (as far as I could tell from the documentation for what I'm trying to do).
For my DataSet I created a Scripted DataSet using my Scripted DataSource as its source. In there I defined the script for open like:
importPackage(Packages.org.springframework.context);
importPackage(Packages.org.springframework.web.context.support);
var sc = reportContext.getHttpServletRequest().getSession().getServletContext();
var spring = WebApplicationContextUtils.getWebApplicationContext(sc);
myPojo = spring.getBean("myDao").findById(params["pojoId"]);
And script for fetch like:
if (myPojo != null) {
    row["title"] = myPojo.getTitle();
    myPojo = null;
    return true;
}
return false;
As the population of row is done at runtime, I wasn't able to automatically get the DataSet columns, so I created one with the following configuration: name: columnTitle (as this is the name used to populate the row object in the fetch code).
Afterwards I edited the layout of my report and added the column to my layout.
I was able to confirm that spring.getBean("myDao").findById(params["pojoId"]); is executed, but my rendered report is not showing the title. If I double-click on my column label in the report layout, I can see that the expression is dataSetRow["columnTitle"] -- is that right, even though I'm using row in my fetch script? What am I missing here?
Well, what is contractVersion?
It is obviously not initialized in the open event.
Should this read myPojo.contractVersion or perhaps myPojo.getContractVersion()?
Another point: Is the DS with the column "columnTitle" bound to the layout?
You should also run your report as HTML or in the previewer to check for script errors.
Unfortunately, these are silently ignored when generating the report as PDF...
The problem was the use of Batik twice (two different versions): one as a dependency of BIRT and the other of DOCX4J.
The issue is quite difficult to identify because there is no log output when rendering PDF files.
Rendering to HTML, I could see an error message which I could investigate to find the problem.
In my case I could remove DOCX4J from the Maven POM.
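Removing the dependency worked in this case; if both libraries had been needed, the usual alternative is a Maven exclusion so only one Batik version stays on the classpath. A sketch with illustrative coordinates (run mvn dependency:tree first to find the real duplicates):
<dependency>
    <groupId>org.docx4j</groupId>
    <artifactId>docx4j</artifactId>
    <version>3.0.1</version> <!-- illustrative version -->
    <exclusions>
        <!-- let BIRT's Batik version win; artifact chosen for illustration -->
        <exclusion>
            <groupId>org.apache.xmlgraphics</groupId>
            <artifactId>batik-transcoder</artifactId>
        </exclusion>
    </exclusions>
</dependency>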

How to read and write the records in EhCache?

Hi All,
My current requirement is to store and read records using EhCache. I am new to EhCache. I have read the EhCache documentation and started implementing. I have done the record insert part and also the read part. While the records are inserted, *.data and *.index files are created. Following is the code:
public class Driver {
    public static void main(String[] args) {
        CacheManager cm = CacheManager.create("ehcache.xml");
        Cache cache = cm.getCache("test");
        // I do a couple of puts, one entry per key to match the read loop below
        for (int i = 0; i < 10; i++) {
            cache.put(new Element("key" + i, "val" + i));
            cache.flush();
        }
        System.out.println(cache.getKeys());
        for (int i = 0; i < 10; i++) {
            Element el = cache.get("key" + i);
            System.out.println(el.getObjectValue());
        }
        cm.shutdown();
    }
}
Now the issue is with cm.shutdown(). If I comment out this line, then comment out the insert part and run the program again, I am not able to retrieve the records, and the *.index file is deleted as well. So in a real scenario, if the program is stopped abruptly, we can't read the records after startup. I want to know why the file is deleted and why I can't read the records in this situation. The exception coming in the console is:
net.sf.ehcache.util.SetAsList#b66cc
Exception in thread "main" java.lang.NullPointerException
at Driver.main(Driver.java:29)...
Any input is appreciated, please.
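As an aside, Cache.get() returns null when a key is absent, which is exactly what produces the NullPointerException on el.getObjectValue(); a minimal guard looks like this:
Element el = cache.get("key" + i);
if (el != null) {
    System.out.println(el.getObjectValue());
} else {
    System.out.println("key" + i + " is not in the cache");
}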
What you are doing is correct, and the expected behaviour is correct too. Caches are typically used to enhance application performance by serving frequently used data quickly while avoiding costly trips to the datastore.
Not all applications need to persist the cache after the system is shut down, and that's the default behaviour you are seeing (most applications build the cache on application startup or as requests start coming in). The data you are caching lives on the heap, and as soon as your JVM dies, the cache is gone. Now you want to persist it beyond a restart? There are options available. Look them up here.
And I am copying the code snippet right from the same page:
DiskStoreConfiguration diskStoreConfiguration = new DiskStoreConfiguration();
diskStoreConfiguration.setPath("/my/path/dir");
// Already created a configuration object ...
configuration.addDiskStore(diskStoreConfiguration);
// By adding configuration for storing the cache in a file - you are not using default cache manager
CacheManager mgr = new CacheManager(configuration);
In addition, you will also have to configure the persistence options, as explained here.
Again, copying a code snippet from the link:
<cache>
    <persistence strategy="localRestartable" synchronousWrites="true"/>
</cache>
Hope this helps!
