I've added Hibernate Validator to my project and annotated my class with the relevant constraints. These are the relevant dependencies in my pom.xml:
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-hibernate-orm-panache</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-hibernate-validator</artifactId>
</dependency>
and this is the annotated class:
public class EmployeeProject extends PanacheEntity {

    @NotNull
    @ManyToOne
    private Employee employee;

    @NotNull
    @ManyToOne
    private Project project;

    @Column
    @Min(value = 1)
    private int quantity;
}
When I try to persist an invalid bean - null employee/project and/or non-positive quantity - through Java code, a validation exception is raised, as expected.
However, when I try to insert an invalid tuple directly into the database, surprisingly no exceptions or DB errors are raised. The reason is that the employee_project table auto-generated by Hibernate has no validation constraints - just the two foreign key constraints. I know there is no simple way to translate @Min into SQL, but boy, I was at least expecting a NOT NULL on employee_id and project_id!
Is this the regular behavior, or am I missing something?
This is explained in the @Min javadoc:
null elements are considered valid.
A null value is an undefined value; with @Min you are saying that, if the value is defined, it must be at least the given minimum.
The other constraints are missing because @ManyToOne defaults to optional=true, and the JPA annotation takes precedence over the Bean Validation one when it comes to generating the DB schema.
This makes sense, because somebody might want to validate columns that are deliberately left without constraints in the database.
It will work if you use @ManyToOne(optional=false).
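For illustration, the mapping from the question with that change applied could look like this (a sketch only; the imports assume the javax.* namespace used elsewhere in this thread, while newer Quarkus versions use jakarta.* instead):

import io.quarkus.hibernate.orm.panache.PanacheEntity;
import javax.persistence.Entity;
import javax.persistence.ManyToOne;
import javax.validation.constraints.Min;
import javax.validation.constraints.NotNull;

@Entity
public class EmployeeProject extends PanacheEntity {

    // optional = false makes the schema generator emit NOT NULL on employee_id
    @NotNull
    @ManyToOne(optional = false)
    private Employee employee;

    // same for project_id
    @NotNull
    @ManyToOne(optional = false)
    private Project project;

    // @Min stays a JVM-side check only; it is not translated into a SQL constraint
    @Min(1)
    private int quantity;
}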
Related
I am facing a very strange issue at the moment.
I have an entity that contains a property that is an element collection.
@ElementCollection(targetClass = Integer.class, fetch = FetchType.EAGER)
@CollectionTable(name = "campaign_publisher", joinColumns = @JoinColumn(name = "campaign_id"))
@Column(name = "publisher_id")
...
@NotEmpty(message = "campaign.publishers.missing")
public Set<Integer> getPublishers() {
    return this.publishers;
}

public Campaign setPublishers(Set<Integer> publisherId) {
    this.publishers = publisherId;
    return this;
}
This all works fine. The values are validated and saved correctly.
I also want this entity to have optimistic concurrency, so I applied a @Version annotation as well.
@Version
private Long etag = 0L;

...

public Long getEtag() {
    return etag;
}

public void setEtag(Long etag) {
    this.etag = etag;
}
By adding the @Version annotation, the @NotEmpty validation on my set of publishers always fails.
To try and diagnose this I have tried the following:
Creating a custom validator at the entity level so I can inspect the values in the entity. I found that the Set of values has been replaced with an empty PersistentSet, which is causing the validation to always fail.
I created some unit tests for the entity that use a validator retrieved from the ValidatorFactory, and this validator seems to work as expected.
I have also tried changing the ElementCollection to a many-to-many relationship and to a bidirectional one-to-many, but the issue persists.
Right now I am out of ideas. The only thing I have found that works correctly is disabling the Hibernate validation and manually calling the validator just before I save my data.
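That manual workaround looks roughly like this (a sketch only; the CampaignService class name is just a placeholder, and the actual persistence call is omitted):

import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.ConstraintViolationException;
import javax.validation.Validation;
import javax.validation.Validator;

public class CampaignService {

    private final Validator validator =
            Validation.buildDefaultValidatorFactory().getValidator();

    public void saveValidated(Campaign campaign) {
        // validate eagerly instead of waiting for Hibernate's flush-time validation
        Set<ConstraintViolation<Campaign>> violations = validator.validate(campaign);
        if (!violations.isEmpty()) {
            throw new ConstraintViolationException(violations);
        }
        // ... persist the campaign here (repository/EntityManager call omitted)
    }
}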
So my questions are:
Has anyone encountered this issue before?
Any advice on what I could try next?
Thank you all for reading!
Short answer: set the initial value of etag to null.
// this should do the trick
@Version
private Long etag = null;
Longer answer: when you add optimistic locking via the @Version annotation on a field with a default value, you make Hibernate/Spring Data think that the entity is not a new one (even though the id is null). So on the initial save, instead of persisting the entity, the underlying libraries try to do a merge. Merging a transient entity forces Hibernate to copy, one by one, all the properties from the source entity (the one you are persisting) to the target one (which is auto-created by Hibernate with all properties set to their default values, i.e. nulls). Here comes the problem: Hibernate only copies the values of FROM_PARENT associations, in other words associations held on the entity side, but in your case the association is TO_PARENT (a foreign key from child to parent), so Hibernate tries to postpone persisting the association until after the main entity is saved - and that save does not work, because the entity does not pass the @NotEmpty validation.
First, I would suggest removing the default value initialization from your @Version property. This property is maintained by Hibernate and should be initialized by it.
Second: are you sure that you are validating the fully constructed entity? I.e. you construct something, then do something else, and by the time of the actual persist/flush cycle your entity is in the wrong state.
To clarify this, since you are on the Spring side, I would suggest introducing service-level validation on your DAO layer, i.e. forcing the bean validation during the initial call to the DAO rather than relying on the bean validation of the entity during flush (yes, Hibernate batches lots of things, and the actual validation happens only during the flush cycle).
To achieve this, mark your DAO @Validated and have your method arguments validated: FancyEntity store(@Valid @NotNull FancyEntity fancyEntity) { em.persist(fancyEntity); em.flush(); return fancyEntity; }
By doing this, you can be sure that you are storing a valid entity: the validation happens before the store method is called. This will reveal where your entity becomes invalid: in your service layer, or in a badly behaving Hibernate layer.
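A rough sketch of what such a DAO could look like (the FancyEntityDao class name is just a placeholder; this assumes method validation is enabled for @Validated beans, which Spring Boot typically configures automatically when a Bean Validation provider is on the classpath):

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.validation.Valid;
import javax.validation.constraints.NotNull;
import org.springframework.stereotype.Repository;
import org.springframework.validation.annotation.Validated;

@Validated
@Repository
public class FancyEntityDao {

    @PersistenceContext
    private EntityManager em;

    // the arguments are validated before the method body runs,
    // so an invalid entity never reaches Hibernate's flush cycle
    public FancyEntity store(@Valid @NotNull FancyEntity fancyEntity) {
        em.persist(fancyEntity);
        em.flush();
        return fancyEntity;
    }
}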
I noticed that you use mixed access: methods and fields. In this case you can try to set @Version on the method:
@Version
public Long getEtag() {
    return etag;
}
not on the field.
Just started using Hibernate Validator. I have a case where a bean's id is auto-generated when saved. I'd like to validate the bean before the save, at which time the id can be null. However, when I want to update it, the id must be not-null.
So a plain @NotNull on the field won't work, because when I go to save the bean it will fail validation.
There are ways to work around this, but I was wondering if the spec or the Hibernate implementation has a standard way of doing this. I'd like to have no validation errors on save, while the id is still checked for not-null on update.
Something like applying a constraint that is ignored unless explicitly named, or something like that.
Thanks in advance.
You can achieve that with groups.
public class MyBean {

    @NotNull(groups = UpdateBean.class)
    private Long id;
}
Validate without the id:
validator.validate(myBean);
Validate with the id:
validator.validate(myBean, UpdateBean.class);
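UpdateBean here is just a marker interface that you define yourself, for example:

// a plain marker interface, used only to identify the validation group
public interface UpdateBean {
}

Constraints that don't declare a groups attribute belong to the Default group, which is why validator.validate(myBean) without a group argument skips the id check.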
The list of supported Javax validations is here.
The list of supported Hibernate validations (extending the Javax ones) is here.
Does each of these annotations "extend" or imply @NotNull?
For instance, if I annotate the following entity:
public class Widget {

    @Email
    private String email;

    @URL
    private String website;

    // etc...
}
Do @Email and @URL also enforce @NotNull? Or are they simply applied/enforced if those properties are defined for a particular Widget instance?
Which annotations extend/imply @NotNull, and which ones don't? Are there any other implied relationships between these annotations?
No. If you look at the source code of EmailValidator from Hibernate Validator, for example, it contains this code at the beginning of its isValid() method:
if ( value == null || value.length() == 0 ) {
    return true;
}
The same goes for URLValidator and so on. Generally, all validators for the corresponding annotations consider a null value valid, so a good rule of thumb is to always perform the not-null validation separately.
Edit: here is a quote from the JSR-303 specification related to this issue:
While not mandatory, it is considered a good practice to split the core constraint validation from the not null constraint validation (for example, an @Email constraint will return true on a null object, i.e. will not take care of the @NotNull validation)
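So if you want both checks, you stack the constraints. A minimal sketch, reusing the Widget class from the question (the imports assume Bean Validation 2.0; older setups used the org.hibernate.validator.constraints package for @Email):

import javax.validation.constraints.Email;
import javax.validation.constraints.NotNull;
import org.hibernate.validator.constraints.URL;

public class Widget {

    // @Email alone treats null as valid; @NotNull adds the presence check
    @NotNull
    @Email
    private String email;

    // same idea for the Hibernate-specific @URL constraint
    @NotNull
    @URL
    private String website;
}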
When they appear on a field/getter of an @Entity, what is the difference between them? (I persist the entity through Hibernate.)
What framework and/or specification each one of them belongs to?
@NotNull is located within javax.validation.constraints. In the javax.validation.constraints.NotNull javadoc it says
The annotated element must not be null
but it does not speak of the element's representation in the database, so why would I add the constraint nullable=false to the column?
@NotNull is a JSR 303 Bean Validation annotation. It has nothing to do with database constraints itself. As Hibernate is the reference implementation of JSR 303, however, it intelligently picks up on these constraints and translates them into database constraints for you, so you get two for the price of one. @Column(nullable = false) is the JPA way of declaring a column to be not-null. I.e. the former is intended for validation and the latter for indicating database schema details. You're just getting some extra (and welcome!) help from Hibernate on the validation annotations.
The most recent versions of the Hibernate JPA provider apply the Bean Validation constraints (JSR 303) like @NotNull to the DDL by default (thanks to the hibernate.validator.apply_to_ddl property, which defaults to true). But there is no guarantee that other JPA providers do that, or even have the ability to do it.
You should use Bean Validation annotations like @NotNull to ensure that bean properties are set to a non-null value when validating Java beans in the JVM (this has nothing to do with database constraints, but in most situations should correspond to them).
You should additionally use JPA annotations like @Column(nullable = false) to give the JPA provider hints to generate the right DDL for creating table columns with the database constraints you want. If you can or want to rely on a JPA provider like Hibernate, which applies the Bean Validation constraints to the DDL by default, you can omit them.
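Put together, a field that should be non-null both in the JVM and in the generated schema could look roughly like this (the Account entity is just an example):

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.validation.constraints.NotNull;

@Entity
public class Account {

    @Id
    @GeneratedValue
    private Long id;

    // @NotNull is checked by Bean Validation in the JVM before any SQL runs;
    // nullable = false tells the schema generator to emit a NOT NULL column
    @NotNull
    @Column(nullable = false)
    private String email;
}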
The JPA @Column Annotation
The nullable attribute of the @Column annotation has two purposes:
it's used by the schema generation tool
it's used by Hibernate when flushing the Persistence Context
Schema Generation Tool
The HBM2DDL schema generation tool translates the @Column(nullable = false) entity attribute into a NOT NULL constraint for the associated table column when generating the CREATE TABLE statement.
As I explained in the Hibernate User Guide, it's better to use a tool like Flyway instead of relying on the HBM2DDL mechanism for generating the database schema.
Persistence Context Flush
When flushing the Persistence Context, Hibernate ORM also uses the @Column(nullable = false) entity attribute:
new Nullability( session ).checkNullability( values, persister, true );
If the validation fails, Hibernate will throw a PropertyValueException and prevent the INSERT or UPDATE statement from being executed needlessly:
if ( !nullability[i] && value == null ) {
    //check basic level one nullablilty
    throw new PropertyValueException(
            "not-null property references a null or transient value",
            persister.getEntityName(),
            persister.getPropertyNames()[i]
    );
}
The Bean Validation @NotNull Annotation
The @NotNull annotation is defined by Bean Validation and, just like Hibernate ORM is the most popular JPA implementation, the most popular Bean Validation implementation is the Hibernate Validator framework.
When using Hibernate Validator along with Hibernate ORM, Hibernate Validator will throw a ConstraintViolationException if the entity fails validation.
Interestingly, all sources emphasize that @Column(nullable = false) is used only for DDL generation.
However, even if there is no @NotNull annotation, as long as the hibernate.check_nullability option is set to true, Hibernate will validate the entities to be persisted.
It will throw a PropertyValueException saying that a "not-null property references a null or transient value" if a nullable = false attribute has no value, even if such a restriction is not implemented at the database level.
More information about hibernate.check_nullability option is available here: http://docs.jboss.org/hibernate/orm/5.0/userguide/html_single/Hibernate_User_Guide.html#configurations-mapping.
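For completeness, if you bootstrap the EntityManagerFactory programmatically, the option can be passed like this (the persistence-unit name "my-pu" is a placeholder; the property can just as well be set in persistence.xml or your framework's configuration):

import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class Bootstrap {

    public static void main(String[] args) {
        // enable Hibernate's nullability check even without @NotNull annotations
        EntityManagerFactory emf = Persistence.createEntityManagerFactory(
                "my-pu",
                Map.of("hibernate.check_nullability", "true"));
        emf.close();
    }
}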
According to my JPA 2.0 book (and the online documentation), I should be able to mix field and property access within a single entity or entity hierarchy. The @Access annotation on the class specifies the default access type. When placed on a field or property getter, @Access specifies that the default should be overridden for that member.
@Entity
@Access(AccessType.FIELD)
public class Foo {

    @Id
    int id;

    @Column(name = "myfield")
    String myField;

    @Column(name = "myProp")
    @Access(AccessType.PROPERTY)
    public int getMyProp() {
        return 3;
    }

    public void setMyProp(int p) {
        // do nothing
    }
}
This class should result in a table with three columns. However, it doesn't with Hibernate... the "myProp" column is missing from the table, because apparently Hibernate takes its field-vs-property cue from the entity ID and runs with it, totally ignoring the JPA spec with regard to @Access.
Can anyone confirm this or did I make a stupid mistake somewhere?
I've seen similar (not the same but similar) issues like HHH-5004 so I wouldn't exclude that this might be a new one (the TCK doesn't seem exhaustive). But what version of Hibernate are you using? Did you try with the latest?
Based on the docs, your code seems to be right. The @Access(AccessType.FIELD) annotation on top is unnecessary, because you annotated the field int id; this already tells Hibernate to use field access.
I tried a very similar example with annotations and XML config mixed, which leads to the same behaviour, so it's probably a bug in Hibernate.
I tried with Hibernate 3.5.3.