I have a table with a generated id, but in some cases I would like to set it myself. Can I somehow force Hibernate to ignore the @GeneratedValue?
It may be overkill, but have you thought about writing your own CustomIDGenerator? It could subclass, say, Hibernate's AutoGenerator and expose a couple of methods that let you set the id of the next object to be generated, for example:
class MyGenerator extends .... {

    public void setIdForObject(Class clazz, Long id) {
        // once you use this API, the next time an object of
        // type clazz is saved, this id is used
    }

    public void setIdForObject(Class clazz, Long id, Matcher matcher) {
        // once you use this API, the next time an object of
        // type clazz is saved and the matcher matches it, the id will be
        // assigned. Your matcher can match properties like name, age etc.
        // of the matched object
    }
}
This could get complicated, but at the very least it is possible according to the Hibernate documentation.
Create your own IdentifierGenerator/SequenceGenerator:
public class FilterIdentifierGenerator extends IdentityGenerator {

    @Override
    public Serializable generate(SessionImplementor session, Object object)
            throws HibernateException {
        // use the id already assigned to the entity, if any;
        // otherwise fall back to the default identity generation
        Serializable id = session.getEntityPersister(null, object)
                .getClassMetadata().getIdentifier(object, session);
        return id != null ? id : super.generate(session, object);
    }
}
modify your entity as follows:

@Id
@GeneratedValue(generator="myGenerator")
@GenericGenerator(name="myGenerator", strategy="package.FilterIdentifierGenerator")
@Column(unique=true, nullable=false)
private int id;
...
and while saving, use merge() or update() instead of persist().
Although this question was asked quite a while ago, I found the perfect answer for it in this post by @lOranger, and wanted to share it.
The proposal checks whether the object's current id is set to something other than null. If so, it uses that id; otherwise, it generates one using the default (or configured) generation strategy.
It's simple and straightforward, and it addresses the issue brought up by @Jens, that one cannot retrieve the object's current id.
I just implemented it (by extending the UUIDGenerator), and it works like a charm :-D
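The core of that strategy can be illustrated in isolation. This is a hypothetical, self-contained stand-in (class and method names are mine, not from the post): the real version would extend Hibernate's UUIDGenerator and read the current id via the session's ClassMetadata, as shown in the other answer.

```java
import java.io.Serializable;
import java.util.UUID;

// Stand-in sketch of "use the assigned id if present, otherwise generate one".
class AssignedOrGeneratedId {
    static Serializable generate(Serializable currentId) {
        // honor a manually assigned id; fall back to generation only when unset
        return currentId != null ? currentId : UUID.randomUUID().toString();
    }
}
```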
For your use case, you can add this new user manually.
One way to do it is to put the insert statement in a file named "./import.sql" (on your classpath).
Hibernate will execute these statements when the SessionFactory is started.
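As a minimal illustration, such a file might contain something like the following (table and column names here are hypothetical, since the original schema isn't shown):

```sql
-- ./import.sql, placed on the classpath; Hibernate runs these
-- statements when the SessionFactory starts.
INSERT INTO users (id, name) VALUES (1, 'admin');
```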
Imagine a situation:
@javax.persistence.Inheritance(strategy=javax.persistence.InheritanceType.JOINED)
@javax.persistence.DiscriminatorColumn
@javax.persistence.Entity
@javax.persistence.Table(name="PARENT")
public abstract class Parent{
...
}
@javax.persistence.Entity
@javax.persistence.Table(name="A")
public class A extends Parent{
...
}
@javax.persistence.Entity
@javax.persistence.Table(name="B")
public class B extends Parent{
...
}
Parent p = new A();
Now we call this:
p instanceof A
It always returns false!
It works fine on OpenJPA!
Should I file a bug? Hibernate 4.3.10
This is most likely because Hibernate is returning a proxy.
Why does it do this? To implement lazy loading, the framework needs to intercept method calls that return a lazy-loaded object or list of objects, so that it can first load the object from the DB and then allow your method to run. Hibernate does this by creating a proxy class. If you check the type in a debugger, you should see that the actual type is a generated proxy class extending the declared (parent) type rather than your concrete subclass.
How to get around it? I had this problem once and successfully used the visitor pattern instead of instanceof. It does add extra complication, so it's not everyone's favorite pattern, but IMHO it is a much cleaner approach than using instanceof.
If you use instanceof, you typically end up with if...else blocks checking for the different types; as you add more types, you have to revisit each of these blocks. The advantage of the visitor pattern is that the conditional logic is built into your class hierarchy, so adding more types makes it less likely you need to change every place that uses these classes.
I found this article useful when implementing the visitor pattern.
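A minimal, self-contained sketch of the pattern (the class names here are illustrative, not from the question). Because dispatch happens via a method call on the entity itself, a Hibernate proxy simply delegates accept() to the real underlying object, so no instanceof check is needed:

```java
// Visitor interface: one visit method per concrete type.
interface AnimalVisitor<R> {
    R visit(Cat cat);
    R visit(Dog dog);
}

abstract class Animal {
    abstract <R> R accept(AnimalVisitor<R> visitor);
}

class Cat extends Animal {
    @Override
    <R> R accept(AnimalVisitor<R> visitor) {
        // double dispatch: the concrete type selects the right visit() overload
        return visitor.visit(this);
    }
}

class Dog extends Animal {
    @Override
    <R> R accept(AnimalVisitor<R> visitor) {
        return visitor.visit(this);
    }
}
```

Adding a new subtype then forces a compile error in every visitor until it is handled, instead of silently falling through an if...else chain.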
Not sure, but I think this will work.
public static boolean instanceOf(Object object, Class<?> superclass) {
    // Hibernate.getClass() returns the entity's real class, unwrapping any proxy
    return superclass.isAssignableFrom(Hibernate.getClass(object));
}
You can try to unproxy your object:
/**
 * @param <T>
 * @param entity
 * @return the initialized, unproxied entity
 */
@SuppressWarnings("unchecked")
public static <T> T initializeAndUnproxy(T entity) {
    if (entity == null) {
        return null;
    }
    Hibernate.initialize(entity);
    if (entity instanceof HibernateProxy) {
        entity = (T) ((HibernateProxy) entity).getHibernateLazyInitializer()
                .getImplementation();
    }
    return entity;
}
That is because Hibernate uses runtime proxies, while OpenJPA, although it supports the proxy approach, prefers compile-time or runtime bytecode enhancement.
See:
http://openjpa.apache.org/entity-enhancement.html
// Hibernate
Entity e = repository.load(entityId); // may return a proxy

// OpenJPA
Entity e = repository.load(entityId); // will return an (enhanced) actual instance of E
Hibernate returns a proxied object. Rather than implementing the Visitor pattern (as described here), you can use the isAssignableFrom() method on the class you want to test (https://docs.oracle.com/javase/8/docs/api/java/lang/Class.html#isAssignableFrom-java.lang.Class-).
I have the following situation: my application needs to save all strings typed by the user to the database in capitalized form. No matter how the user types them, the application must capitalize everything before saving.
I know I could just call toUpperCase() on every string before saving it, or in every setter method, but I really don't want to do that; I'm looking for a more automatic approach that doesn't require changing too much application code.
I'm using JSF, JPA2, Eclipselink and EJB3.
Does anyone have any suggestion?
You can use lifecycle event listeners for this. There are several ways to implement it:
1: default listeners:
public class StringCapListener {

    @PrePersist
    @PreUpdate
    public void capitalize(Object o) {
        // capitalize string attributes
    }
    ...
For the capitalizing, you will either need to use reflection (extracting all string fields and changing their values) or let your entities implement an interface.
If you are using the listener on only a few entities, prefer the @EntityListeners annotation on the entity classes. To use the listener on all entities, use default listeners. Unfortunately, you can only define those in XML:
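The reflection variant could look something like this hedged sketch (the class name StringCapitalizer is mine; a real listener would call it from a @PrePersist/@PreUpdate method):

```java
import java.lang.reflect.Field;

// Upper-cases every String field of the given object in place via reflection.
class StringCapitalizer {
    static void capitalize(Object o) {
        for (Field f : o.getClass().getDeclaredFields()) {
            if (f.getType() == String.class) {
                f.setAccessible(true); // allow access to private fields
                try {
                    String value = (String) f.get(o);
                    if (value != null) {
                        f.set(o, value.toUpperCase());
                    }
                } catch (IllegalAccessException e) {
                    throw new IllegalStateException(e);
                }
            }
        }
    }
}
```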
<entity-mappings ...>
    <persistence-unit-metadata>
        <persistence-unit-defaults>
            <entity-listeners>
                <entity-listener class="com.example.StringCapListener"/>
            </entity-listeners>
        </persistence-unit-defaults>
    </persistence-unit-metadata>
</entity-mappings>
2: inherited listener method
Let your entities derive from a BaseEntity of sorts. This base class can implement a listener method that is triggered on persist and update:
@PrePersist
@PreUpdate
public void capitalize(BaseEntity o) {
    // capitalize string attributes
}
You will need to employ the same reflection magic to get and change all string attributes.
I'm thinking of an interface:
public interface Processor<TResult, TInput> {
    TResult process(TInput input);
}

public class StringProcessor implements Processor<String, String> {
    public String process(String input) {
        return input.toUpperCase();
    }
}
Then you'd call the interface on every string before persisting it:

//...
public void persistString(String input) {
    input = processor.process(input);
    // Persistence logic
}
I have a class which derives from org.ektorp.support.CouchDbDocument:
@TypeDiscriminator(value="doc.type == 'TYPE_PRODUCT'")
public class Product extends CouchDbDocument {
....
There is also a repository class:
public class ProductRepo extends CouchDbRepositorySupport<Product> { ....
The repository class has a method:
public List<DocumentOperationResult> executeBulk(Set<Product> bulk) {
    return db.executeBulk(bulk);
}
The method is used for creating and updating items. Creation goes well, but on update Ektorp throws this exception:
Caused by: java.lang.IllegalStateException: cannot set id, id already set
at org.ektorp.support.CouchDbDocument.setId(CouchDbDocument.java:39)
... 18 more
What I'm doing is sending a set of objects that were initially fetched by a view in the same repository, so of course the objects have non-null ids. This should not happen: since the objects have ids, they must be updated, not created. According to the Ektorp documentation, db.executeBulk should handle both creating and updating documents.
The exception is being thrown in CouchDbDocument.setId:
@JsonProperty("_id")
public void setId(String s) {
    Assert.hasText(s, "id must have a value");
    if (id != null && id.equals(s)) {
        return;
    }
    if (id != null) {
        throw new IllegalStateException("cannot set id, id already set");
    }
    id = s;
}
But why? The objects sent do indeed have their ids set (revisions are set too), so Ektorp should detect that these are existing objects and not try to generate new ids for them. Does anyone know how this can be fixed, or is the solution to ditch Ektorp and go for pure JSON over HTTP in this case?
(Project running on Jboss 7.1.1.Final, CouchDB 1.2.0, Ektorp 1.2.2)
The bulk operation response handler in Ektorp does not check whether the id is already set. This causes problems when the bulked object extends CouchDbDocument, which does not allow the id to be set more than once. I think this is a bug, and it will be fixed in Ektorp 1.3.0.
A workaround until 1.3.0 is to override setId in your Product class and relax the assertion a little bit.
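The shape of that workaround can be sketched in isolation. This is a self-contained illustration with a stand-in base class (the real class to subclass is org.ektorp.support.CouchDbDocument, and the names BaseDocument/RelaxedProduct are hypothetical):

```java
// Stand-in mimicking CouchDbDocument's strict setId behavior.
class BaseDocument {
    protected String id;

    public void setId(String s) {
        // strict: refuses to overwrite an id with a different value
        if (id != null && !id.equals(s)) {
            throw new IllegalStateException("cannot set id, id already set");
        }
        id = s;
    }

    public String getId() { return id; }
}

// Subclass relaxing the assertion so bulk response handling can re-set the id.
class RelaxedProduct extends BaseDocument {
    @Override
    public void setId(String s) {
        this.id = s; // relaxed: silently accept re-setting the id
    }
}
```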
Is there a nice and elegant way to set a bean value (column) before Hibernate persists an entity? Basically I have a field called "modification_date". It's on a bunch of entities. Whenever one of these entities is updated/modified, I'd basically like that field set automatically.
I could write the code in the service layer to set the date every time the object is saved/updated manually...
I also have a DAO layer. Every DAO extends from a support class that contains a save() method. I could just use reflection and set the value inside this method: check whether the class has a field named "modificationDate", and if it does, set it to new Date().
Is there a better way than this? Or is using my generic save() method the best approach? This is something I'd like to be robust and not have to worry about it ever again. I will be happy knowing that by simply making a "modificationDate" property that this will be taken care of for me automatically from this point on. Using the save() method seems like the best place, but if there's a better way, I'd like to become aware of it.
Check out event listeners:
@Entity
@EntityListeners(LastUpdateListener.class)
public class Cat {

    @Id private Integer id;
    private String name;
    private Calendar dateOfBirth;
    @Transient private int age;
    private Date lastUpdate;

    @PostLoad
    public void calculateAge() {
        ...
    }
}
public class LastUpdateListener {

    /**
     * Automatically sets the property before any database persistence.
     */
    @PreUpdate
    @PrePersist
    public void setLastUpdate(Cat o) {
        o.setLastUpdate(new Date());
    }
}
I'm working on a GAE-based application, which uses JDO to access the datastore. I need to implement a polymorphic relationship between persisted objects.
There's an abstract parent class:
@PersistenceCapable
@Inheritance(strategy = InheritanceStrategy.SUBCLASS_TABLE)
public abstract class Parent {

    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    @Extension(vendorName = "datanucleus", key = "gae.encoded-pk", value = "true")
    String id;
// ....
And several child classes:
@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class Child extends Parent {
// ....
Also, there's one more class, which should have a reference to one of the child classes. According to the "Polymorphic Relationships" section of the "Entity Relationships in JDO" article, the best way to implement such a relationship is to store the key of the object, so this class looks like the following:
@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class OtherClass {

    @Persistent
    private String reference;
// ....
I retrieve the string key of the referenced object from an instance of OtherClass. Then I would like to obtain the referenced object itself: it's an instance of one of the Parent subclasses. BUT:
If I do it with the pm.getObjectById(oid) method:
Object object = pm.getObjectById(reference);
a JDOObjectNotFoundException is thrown (javax.jdo.JDOObjectNotFoundException: No such object FailedObject:...).
If I do it with the getObjectById(class, key) method:
Parent object = pm.getObjectById(Parent.class, reference);
a FatalNucleusUserException is thrown (org.datanucleus.store.appengine.FatalNucleusUserException: Received a request to find an object of kind Parent but the provided identifier is the String representation of a Key for kind Child).
What is the correct way to retrieve an instance of one of the subclasses referenced by another entity?
UPDATE: I found this thread in the GAE Google group, but frankly it did not help me much.
I ran into the same problem with JDO and App Engine, so I started a project that implements a workaround for it: https://code.google.com/p/datanucleus-appengine-patch/
My first test with the current code looks okay; feel free to try it out and give me some feedback.
Actually, my workaround may solve your problem in two ways.
I implemented a getObjectById(class, id) that also looks for kinds that are instances of the provided class.
I implemented a getObjectById(oid) that does some special handling of the lookup: if oid is of type com.google.appengine.api.datastore.Key, it will figure out the correct class to return.
I also added a new annotation, @PolymorphicRelationship, that makes it easy to handle the workaround App Engine describes, storing the keys. Sample shown below:
@Persistent
public Collection<Key> myChildKeys;

@NotPersistent
@PolymorphicRelationship(keyField = "myChildKeys")
public Collection<TestChild> myChildren;
I'm using this rather cancerous and smelly anti-pattern to get around this limitation of JDO/App Engine:
@JsonIgnore
@Persistent(mappedBy = "account")
private List<XProvider> xProviders;

@JsonIgnore
@Persistent(mappedBy = "account")
private List<YProvider> yProviders;

// TODO: add extra providers here and in getProviders() below...
And then to get the collection:
public List<XProvider> getXProviders() {
    if (xProviders == null) {
        xProviders = new ArrayList<XProvider>();
    }
    return xProviders;
}
// etc., with other getters and setters for each collection.

public List<Provider> getProviders() {
    List<Provider> allProviders = new ArrayList<Provider>();
    // TODO: add extra providers here...
    allProviders.addAll(getXProviders());
    allProviders.addAll(getYProviders());
    return allProviders;
}
It's a bad solution, but any port in a storm...
(Also relates a little to this bug, using interfaces as the collection type http://code.google.com/p/datanucleus-appengine/issues/detail?id=207)
App Engine's JDO layer doesn't currently support polymorphism. In fact, I'm not sure whether JDO supports it in general.