Can JPA entities be serialized to disk? - java

I have two machines:
A) Windows XP, JDK 1.7.45
B) Windows Server 2003, JDK 1.7.45
On machine A I can successfully serialize an object and its children to the file system and deserialize them back.
On machine B, when I deserialize, the child objects are missing. No exception is thrown at any stage.
If I copy the serialized file from A to B then deserialization in B creates the child objects just fine.
This points to a problem in serialization in B.
The problem does not happen with very simple objects, but it does happen with objects annotated for JPA.
@Entity
@Table(name="...")
@NamedQuery(name="Category.findAll", query="SELECT c FROM Category c")
public class Category implements Serializable {
    private static final long serialVersionUID = 1L;

    @Id
    @Column(name="...")
    private long id;

    @Transient
    private List<Category> subCategories; // These go missing
    ...
}
This problem happens for certain JPA entity classes, but not all of them.
Any idea what might be causing this? Can JPA entities be serialized without issues? My eventual goal is to cache JPA entities in Couchbase. This works on A but fails on B, and even simple disk-based serialization has the problem on B as described above.

There is nothing that prevents you from serializing JPA entities; after all, they are POJOs. What could be happening is that deserialization fails because the other JVM does not have the JPA annotations on its classpath. In any case, that should throw an exception, so recheck your logs.
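To narrow this down on machine B, it can help to run the round trip in isolation and inspect the result. Below is a minimal sketch; it assumes the Category class from the question plus hypothetical setSubCategories/getSubCategories accessors:

    import java.io.*;
    import java.util.Arrays;

    public class SerializationCheck {
        public static void main(String[] args) throws Exception {
            Category parent = new Category();
            parent.setSubCategories(Arrays.asList(new Category(), new Category()));

            File file = new File("category.ser");

            // Write the object graph to disk
            try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file))) {
                out.writeObject(parent);
            }

            // Read it back and check whether the children survived
            try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
                Category copy = (Category) in.readObject();
                System.out.println("children after round trip: " + copy.getSubCategories());
            }
        }
    }

Also worth checking: JPA's @Transient only tells the persistence provider to ignore a field; it is Java's transient keyword that makes Java serialization skip it, so make sure the field declaration is identical on both machines.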

Here is the tool for you:
EclipseLink MOXy is an implementation of the JAXB (JSR-222) specification. Since EclipseLink also provides a JPA implementation, many of its extensions are aimed at mapping JPA entities:
@XmlInverseReference for supporting bidirectional relationships (see: http://blog.bdoughan.com/2010/07/jpa-entities-to-xml-bidirectional.html)
@XmlPath for mapping embedded IDs (see: http://blog.bdoughan.com/2010/07/xpath-based-mapping.html)
@XmlJoinNodes (similar to JPA's @JoinColumns) when you need to map by key/foreign key.
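As a rough sketch of the first extension, @XmlInverseReference marks the back pointer of a bidirectional relationship so MOXy can restore it during unmarshalling instead of marshalling it again and cycling. The Department/Worker classes here are purely illustrative, not from the question:

    // Department.java (illustrative)
    import java.util.List;
    import javax.xml.bind.annotation.XmlRootElement;

    @XmlRootElement
    public class Department {
        public String name;
        public List<Worker> workers;
    }

    // Worker.java (illustrative)
    import org.eclipse.persistence.oxm.annotations.XmlInverseReference;

    public class Worker {
        public String name;

        // Back pointer: populated by MOXy on unmarshal, skipped on marshal,
        // which avoids the cycle a plain JAXB mapping would run into
        @XmlInverseReference(mappedBy = "workers")
        public Department department;
    }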

Related

Multiple levels of cascading persists in Ebean

I have a model class which defines a list of children that are models of the same class. Persisting a new object with some initial children works fine, but when I have two or more levels of children, Ebean does not seem to handle it well. This seemed unexpected, so I'm worried I made a mistake. At the same time I couldn't find any examples or mentions of multi-level persist cascades, so my questions are:
Is there an error in my code? Is this even a supported feature, or did I find a bug?
My model class:
@Entity
public class TestEntity extends Model {
    @Id
    private int id;

    private String text;

    @ManyToOne
    private TestEntity parentEntity;

    @OneToMany(cascade = CascadeType.ALL)
    private List<TestEntity> childEntities;
    ...
}
My program:
TestEntity grandparent = new TestEntity();
grandparent.setText("grandparent");
TestEntity parent = new TestEntity();
parent.setText("parent");
TestEntity child = new TestEntity();
child.setText("child");
grandparent.setChildEntities(Collections.singletonList(parent));
parent.setChildEntities(Collections.singletonList(child));
grandparent.save();
I added logging for the SQL statements, and it is evident that the third insert did not get the correct value for parent_entity_id. That row fails because 0 is not a valid foreign key, and the batch is rolled back.
insert into test_entity (text, parent_entity_id) values ('grandparent',null);
insert into test_entity (text, parent_entity_id) values ('parent',1);
insert into test_entity (text, parent_entity_id) values ('child',0);
I'm using Play Framework 2.7.3 with the Ebean plugin version 5.0.2 and Ebean version 11.39.
This is indeed a supported feature and the code snippet above is expected to persist all three entities.
A unit test was added to verify that this works correctly in the latest version of Ebean.
In Ebean 11.39, which is currently the latest version supported by Play Framework, the test fails. An easy workaround when using that version is to use Long instead of the primitive int as the ID type for the models.
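A minimal sketch of that workaround, assuming the TestEntity model from the question (imports shown for Ebean 11.x):

    import io.ebean.Model;
    import javax.persistence.Entity;
    import javax.persistence.Id;

    @Entity
    public class TestEntity extends Model {

        // Boxed Long instead of primitive int: an unsaved entity's key stays null
        // rather than defaulting to 0, so the cascade can fill in the real
        // parent_entity_id instead of writing the invalid 0 seen in the SQL log
        @Id
        private Long id;

        // remaining fields and relationships unchanged
    }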
While not an answer to this specific question, it is good to be aware that these same symptoms also appear if the collections are set without using setters enhanced by Ebean. I had some trouble using public fields and the Play enhancer.

How to avoid loading lazy bidirectional relationships with MOXy?

My question is a follow-up to this comment.
I'm mixing JPA and JAXB (MOXy) annotations on the same class, which works fine most of the time. As described in the linked thread, #XmlInverseReference prevents cycle exceptions when bidirectional relationships are marshalled. But in order to detect the cycle, MOXy has to inspect the back reference of the linked entity, which leads to extra SQL SELECTs if a lazy relation needs to be populated.
To illustrate the problem in detail, consider this made-up example:
@Entity
@Access( AccessType.FIELD )
@XmlRootElement
@XmlAccessorType( XmlAccessType.FIELD )
public class Phone {

    @ManyToOne
    @JoinColumn( name = "employeeID" )
    @XmlElement( name = "employee" )
    @XmlInverseReference( mappedBy = "phones" )
    private Employee employee;

    private String number;
    [...]
}

@Entity
@Access( AccessType.FIELD )
@XmlRootElement
@XmlAccessorType( XmlAccessType.FIELD )
public class Employee {

    @OneToMany( mappedBy = "employee" )
    @XmlElementWrapper( name = "phones" )
    @XmlElement( name = "phone" )
    @XmlInverseReference( mappedBy = "employee" )
    private List<Phone> phones;

    private String name;
    [...]
}
Now I'd run queries on Phones with a JAX-RS method like this (using an underlying EJB):
@Inject
private PhoneService phoneService;

@GET
@Path( "/phones" )
public List<Phone> getPhonesByNumber( @QueryParam( "number" ) String number ) {
    List<Phone> result = phoneService.getPhonesByNumber( number );
    return result;
}
What happens is this: The JPQL query within the PhoneService EJB triggers an SQL SELECT on the Phone table (filtered by the number), and if I use a JOIN FETCH query, I can get the associated Employee with the same single SELECT statement.
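For reference, the JOIN FETCH variant could look roughly like this; the exact query and parameter name are illustrative, not taken from my actual code:

    // Inside the hypothetical PhoneService EJB; em is an injected EntityManager
    public List<Phone> getPhonesByNumber(String number) {
        return em.createQuery(
                "SELECT p FROM Phone p JOIN FETCH p.employee WHERE p.number = :number",
                Phone.class)
            .setParameter("number", number)
            .getResultList();
    }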
When the JAX-RS method returns, the JAXB marshalling kicks in, which leads to an additional SQL SELECT: this one selects all Phones whose employeeID points to the Employee who is associated with the originally requested Phones. So the lazy relationship from Employee to Phone is resolved now, presumably because MOXy must be able to determine if the original Phone is contained in the collection.
I've tried using JPA property access and JAXB field access for the phones field, as suggested in the other thread, to no avail. I've also tried nulling out the phones field in the linked Employee instance after retrieving the result from the EJB, i.e. when my entities are detached already, but this led to an immediate SQL SELECT again (it seems like EclipseLink will do this whenever any manipulation is done to an IndirectList?). The only workaround solution I could find is to use MOXy #XmlNamedObjectGraphs with a subgraph that excludes the phones field. But that's not practical, especially if the involved entities have many attributes.
As I may need to query in the other direction too, e.g. employees by name with their associated phones, I can't just mark phones as @XmlTransient.
Does anyone have an elegant solution to suppress those extra SQL statements?
From my experience, the easiest way to accomplish what you are trying to do is to detach all the entity instances before you pass them to a presentation layer like a JAX-RS REST API. You can use @OneToMany(mappedBy = "employee", cascade = CascadeType.DETACH) together with EntityManager.detach() to detach your Phone instances and subsequently your Employee instances, or vice versa. This ensures that during marshalling of your entity, JAX-RS doesn't trigger any SELECT statements that you wouldn't normally want.
I always detach model entities before I pass them to the presentation layer so that they can interact with the model classes how they please without affecting performance or the database.
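A minimal sketch of that approach, assuming the Phone/Employee entities above and an injected EntityManager em (the query and names are assumptions):

    public List<Phone> getPhonesByNumber(String number) {
        List<Phone> phones = em.createQuery(
                "SELECT p FROM Phone p WHERE p.number = :number", Phone.class)
            .setParameter("number", number)
            .getResultList();

        // Detach before handing the entities to the presentation layer, as
        // suggested above; with CascadeType.DETACH on the relationship the
        // linked Employee instances are detached as well
        for (Phone phone : phones) {
            em.detach(phone);
        }
        return phones;
    }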
I collected some information about EclipseLink from these three threads. Important bits:
Detached objects get the connection they need to traverse the LAZY relationship from the EntityManagerFactory and will be able to use it as long as the EntityManagerFactory is open. The connection used is not the transactional one, and when you want to use the entity in a transaction it will have to be properly merged.
 
This is a special feature of TopLink's implementation where the detached instances created from non-tx reads still have access in their proxies to retrieve additional detached instances. If the object was detached through serialization this would not be possible.
 
If you would like TopLink Essentials to not process lazy relationships after the EM has closed I would recommend filing an enhancement request in GlassFish.
I couldn't find such an enhancement request though, let alone an implemented possibility to disable this feature (on a case-by-case basis).
There are five possible workarounds I could think of, each with its own drawbacks:
1. Just don't mix JAXB and JPA annotations on the same class: use a different set of additionally instantiated JAXB classes instead and perform explicit mapping between the two views. This could be a little expensive if lots of entities are returned from a query.
2. Like I mentioned in my question, use MOXy's (named) object graph feature to exclude (relationship) fields from being traversed.
3. Use a JAXB Marshaller.Listener to exclude all uninstantiated IndirectContainers (see the sketch after the sample code below).
4. Since serialization is supposed to break this EclipseLink feature for detached entities, serialize them before marshalling them. Seems awkward and even more expensive though.
5. This comes closest to emulating turning off the feature, but also looks hackish: access the wrapping IndirectContainer and its contained ValueHolderInterface and set them to null. Sample code:
(...)
import org.eclipse.persistence.indirection.IndirectContainer;
// entities must already be detached here, otherwise SQL UPDATEs will be triggered!
Employee e = phone.getEmployee();
IndirectContainer container = (IndirectContainer) e.getPhones();
container.setValueHolder( null );
e.setPhones( null );
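For workaround 3, a rough sketch of such a Marshaller.Listener might look like the following. The reflective field scan is my own assumption, not an established recipe; it only clears containers that were never loaded:

    import java.lang.reflect.Field;
    import javax.xml.bind.Marshaller;
    import org.eclipse.persistence.indirection.IndirectContainer;

    public class LazyContainerListener extends Marshaller.Listener {

        @Override
        public void beforeMarshal(Object source) {
            for (Field field : source.getClass().getDeclaredFields()) {
                field.setAccessible(true);
                try {
                    Object value = field.get(source);
                    // Null out lazy containers that were never instantiated;
                    // loaded ones are marshalled normally
                    if (value instanceof IndirectContainer
                            && !((IndirectContainer) value).isInstantiated()) {
                        field.set(source, null);
                    }
                } catch (IllegalAccessException e) {
                    throw new IllegalStateException(e);
                }
            }
        }
    }

The listener would be registered with marshaller.setListener(new LazyContainerListener()); as with workaround 5, the entities should already be detached so that clearing the fields does not trigger updates.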

JPA fetch collection of subclass

I have a case where we have an inheritance strategy like this (this is an example from the JPA wiki; our real case is a different business case :))
@Entity
@Inheritance
@DiscriminatorColumn(name="PROJ_TYPE")
@Table(name="PROJECT")
public abstract class Project {
    @Id
    private long id;
    ...
}

@Entity
@DiscriminatorValue("L")
public class LargeProject extends Project {
    @OneToMany
    private Set<Member> members;
}

@Entity
@DiscriminatorValue("S")
public class SmallProject extends Project {
}
I have a bunch of projects in my database and want to fetch all the projects at once, but in doing so I also want to fetch the list of members at once.
Is there a way to do this with JPQL? I know the TYPE operator lets me check the type, but can I combine this with a JOIN FETCH?
I'm using Hibernate, but don't want to fall back to the Hibernate API if I don't need to.
JPA and Hibernate don't support fetching associations of subclasses unless the property is also present in the topmost class of the hierarchy. But according to this post (https://thorben-janssen.com/fetch-association-of-subclass/) you can work around this limitation by exploiting Hibernate's first-level cache.
In your case you would fetch all instances of Member first, in a separate query, and then perform your query on Project, letting LargeProject.members be lazily loaded. Instead of performing N + 1 SELECTs, Hibernate will serve those from the cache.
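A rough sketch of that idea, assuming an EntityManager em and the entities above; the exact queries are an assumption based on the linked post:

    // First query: load the subclass association so the LargeProjects and their
    // members end up in the persistence context (first-level cache)
    List<LargeProject> largeProjects = em.createQuery(
            "SELECT DISTINCT lp FROM LargeProject lp LEFT JOIN FETCH lp.members",
            LargeProject.class)
        .getResultList();

    // Second query: fetch all projects; the LargeProject instances returned here
    // are the same managed objects as above, so their members are already loaded
    List<Project> projects = em.createQuery(
            "SELECT p FROM Project p", Project.class)
        .getResultList();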
A bit late but I found a way by using only JPQL.
In your case:
SELECT p FROM Project p JOIN FETCH ((TREAT(p as LargeProject)).members)
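For completeness, a sketch of executing that query through an EntityManager em; the query string is kept verbatim from the answer above:

    List<Project> projects = em.createQuery(
            "SELECT p FROM Project p JOIN FETCH ((TREAT(p as LargeProject)).members)",
            Project.class)
        .getResultList();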

Why is a field with a JAXB object not recognized as persistent state by OpenJPA?

I want to persist the XML of a JAXB object in a CLOB column in the table of the owning entity. OpenJPA ships with support for such constructs using its XMLValueHandler.
I followed this tutorial from IBM.
My sample code is:
@Entity
@Access(AccessType.FIELD)
public class EntityContainingXml {
    @Id
    private Long id;

    @Persistent
    @Strategy("org.apache.openjpa.jdbc.meta.strats.XMLValueHandler")
    @Column(name = "xml")
    @Lob
    private SomeJaxbType xmlStuff;
    //...
}
However, the field xmlStuff is not recognized as persistent state by the OpenJPA enhancer. It makes no difference whether SomeJaxbType is contained in the persistence unit.
What do I need to do so that the OpenJPA enhancer recognizes the field xmlStuff as persistent state?
The problem was a classpath issue due to Maven dependencies. I can't say exactly what caused it. The original classpath contained some combination of the org.apache.openjpa artifacts openjpa-persistence-jdbc, openjpa-persistence, openjpa and openjpa-all. That caused org.apache.openjpa.persistence.PersistenceMetaDataFactory to be used during build-time enhancement, but the parser org.apache.openjpa.persistence.AnnotationPersistenceMetaDataParser created by this factory is not able to recognize @Strategy.
Now only openjpa-persistence-jdbc is used as a compile-time dependency, and the build-time enhancement depends on the artifact openjpa.
I debugged through the Maven build to find that out. In my case it was not possible to use the configuration property openjpa.MetaDataFactory to set the appropriate metadata factory, because it caused a ClassCastException. I then started to remove OpenJPA; while doing that I built and ran the application once more and it suddenly worked.

Can Hibernate be used to store HashMaps of data without classes to represent their structure?

I have pretty much zero experience with Hibernate, though I've used similar persistence libraries in other languages before. I'm working on a Java project that will require a way to define "models" (in the MVC sense) in text configuration files, generate the database tables automatically, and (ideally) be database-backend-agnostic. As far as I can tell from some quick Googling, Hibernate is the only widely-used backend-agnostic Java database library; while I could write my own compatibility layer between my model system and multiple DB backends, that's a debugging endeavor that I'd like to avoid if possible.
My question is: Can Hibernate be used to store data whose structure is represented in some other way than an annotated Java class file, such as a HashMap with some configuration object that describes its structure? And, if not, are there any other relatively-stable Java database libraries that could provide this functionality?
EDIT: Here's a clearer description of what I'm trying to accomplish:
I am writing a data-modeling library. When a certain method of the library is called, and passed a configuration object (loaded from a text file), it should create a new "model," and create a database table for that model if necessary. The library can be queried for items of this model, which should return HashMaps containing the models' fields. It should also allow HashMaps to be saved to the database, provided their structure matches the configuration files. None of these models should be represented by actual compiled Java classes.
I think you could try the @MapKey annotation provided by JPA (not the Hibernate @MapKey annotation, which is quite different!).
@javax.persistence.OneToMany(cascade = CascadeType.ALL)
@javax.persistence.MapKey(name = "name")
private Map<String, Configuration> configurationMap = new HashMap<String, Configuration>();
I don't believe Hibernate will let you have a Map as an @Entity, but it will let you have a custom class that contains a map field:
@Entity
public class Customer {
    @Id @GeneratedValue
    public Integer getId() { return id; }
    public void setId(Integer id) { this.id = id; }
    private Integer id;

    @OneToMany
    @JoinTable(name="Cust_Order")
    @MapKeyColumn(name="orders_number")
    public Map<String,Order> getOrders() { return orders; }
    public void setOrders(Map<String,Order> orders) { this.orders = orders; }
    private Map<String,Order> orders;
}
(example from Hibernate docs)
Additionally, you don't have to use annotations (if that is what you're trying to avoid): Hibernate relationships can be described via XML files, and there are utilities (Maven plugins, for example) which can automatically generate the necessary Java POJOs from the XML.
Does your model require a relational database? You might consider a database like Mongo that stores any object you can represent with JSON.
You can configure Hibernate to work without any entity classes (beans linked to tables); a rough sketch follows the steps below.
1. You need to use XML configuration for this. In place of class use entity-name, and in place of <property name="something" use <property node="something".
2. Create a Hibernate session with entity-mode set to map.
3. You can then use a map to store and retrieve information from the DB. Remember, since you are using a map there will be difficulties in two-way mapping (this was as of 3.3, not sure if it has been fixed in later releases).
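A minimal sketch of that dynamic-map approach, assuming a Hibernate 3.x-style hbm.xml mapping with entity-name="Book" already configured; the entity name and properties are illustrative:

    import java.util.HashMap;
    import java.util.Map;

    import org.hibernate.EntityMode;
    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;

    public class DynamicMapExample {

        public static void saveBook(SessionFactory sessionFactory) {
            Session session = sessionFactory.openSession();
            // Switch to the dynamic-map entity mode (Hibernate 3.x API)
            Session mapSession = session.getSession(EntityMode.MAP);

            Transaction tx = mapSession.beginTransaction();

            // The "entity" is just a map whose keys match the mapped properties
            Map<String, Object> book = new HashMap<String, Object>();
            book.put("title", "Hibernate in Action");
            book.put("isbn", "9781932394153");

            // "Book" is the entity-name from the hbm.xml mapping, not a Java class
            mapSession.save("Book", book);

            tx.commit();
            session.close();
        }
    }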
