I am validating fields of my Data Access Object classes. In one attempt, I started adding Bean Validation annotations to the properties (@NotNull, @NotBlank, @Min, @Max and so on). I also have additional annotations for Jackson (@JsonProperty(...)) and for the Swagger documentation library (@Api(...)). In my opinion the class becomes very "dirty" with so many annotations (each property has at least three). Example of one field:
@JsonProperty("ownName")
@Api(description = "it is my own name", required = true)
@Valid
@NotNull
private SomeObject object;
In another attempt, I performed my own validation with the Spring Validator interface. Using a custom validator based on the Spring interface seems cleaner and also gives you the freedom to create more than one validator for different situations. The class also does not end up overloaded with annotations, and the validations are independent of the class. Example of a validator:
public class UserValidator implements Validator {

    @Override
    public boolean supports(Class<?> clazz) {
        return User.class.isAssignableFrom(clazz);
    }

    @Override
    public void validate(Object obj, Errors errors) {
        User user = (User) obj;
        if (user.getPassword().length() < 10) {
            errors.reject("password.tooShort", "Password must be at least 10 characters long");
        }
        // more validations...
    }
}
When would you use one or the other?
What are the pros and cons of each?
I think it is a matter of taste and use case. I agree that sometimes it feels like you end up with some sort of annotation overload.
One reason for using Bean Validation is that it is a standard: the constraint annotations are standardized and many frameworks integrate with it, for example JPA, in case you want to add yet another annotation-based framework ;-)
Using something like Spring's Validator binds you to a specific library/framework, so your code will be less portable. Of course, if you never see a scenario in which you would leave Spring behind, this might not matter.
You could also do something entirely home-grown, but in that case you would need to write all the integration code yourself, e.g. for Spring, REST, JPA, etc.
Also, writing a general-purpose validation framework is not trivial; there are many things to consider.
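To illustrate the portability point: the same annotated bean can be validated through the plain Bean Validation bootstrap API, with no Spring involved at all. A minimal sketch, assuming a User bean annotated as in the question:

import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;

public class StandaloneValidationExample {

    public static void main(String[] args) {
        // Bootstraps whatever Bean Validation provider is on the classpath (e.g. Hibernate Validator)
        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();

        User user = new User(); // assumed to carry @NotNull, @NotBlank, ... on its fields
        Set<ConstraintViolation<User>> violations = validator.validate(user);

        violations.forEach(v -> System.out.println(v.getPropertyPath() + " " + v.getMessage()));
    }
}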
I am developing a RESTful API in Spring Boot 2+, for which I need to perform several validations. Nothing really fancy, just the typical @NotNull, @NotEmpty, @Max, @Min, @Email, @Regex, @Future, etc stuff...
Except that I have beans from an API that I must use yet cannot modify. This means that I cannot annotate the fields and methods in those DTOs.
It would be great if I could create mixin-like classes or interfaces with the same structure as the real DTOs I must use in the API, on which I would happily place Bean Validation's annotations.
For example, if I had the following DTOs that I couldn't modify:
public class Person {
    private String name;
    private String dateOfBirth;
    private Address address;
    // constructors, getters and setters omitted
}

public class Address {
    private String street;
    private String number;
    private String zipCode;
    // constructors, getters and setters omitted
}
I would create the following 2 interfaces that mimic their structure and annotate them as I need:
public interface PersonMixin {
    @NotBlank String name();
    @Past String dateOfBirth();
    @Valid @NotNull Address address();
}

public interface AddressMixin {
    @NotBlank String street();
    @Positive int number();
    @NotBlank String zipCode(); // Or maybe a custom validator
}
As you see, the name of the methods in the interfaces match the names of the properties of the bean classes. This is just one possible convention...
Then, ideally, somewhere while the app is loading (typically some @Configuration bean) I would be very happy to do something along the lines of:
ValidationMixinsSetup.addMixinFor(Person.class, PersonMixin.class);
ValidationMixinsSetup.addMixinFor(Address.class, AddressMixin.class);
Except that ValidationMixinsSetup.addMixinFor is pure fantasy, i.e. it doesn't exist.
I know that there exists a similar construct for Jackson regarding JSON serialization/deserialization. I've found it extremely useful many times.
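(For reference, this is roughly what that Jackson mechanism looks like; PersonJacksonMixin is just a made-up name, and the snippet is only here to illustrate the kind of construct I mean:)

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

// Mix-in carrying the Jackson annotations for the untouchable Person class
interface PersonJacksonMixin {
    @JsonProperty("full_name")
    String getName();
}

class JacksonMixinExample {
    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        // Treat Person as if it carried PersonJacksonMixin's annotations
        mapper.addMixIn(Person.class, PersonJacksonMixin.class);
    }
}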
Now, I've been looking at both Spring and Hibernate Validator's source code. But it's not a piece of cake... I've dug into ValidatorFactory, LocalValidatorFactoryBean, TraversableResolver implementations, but I haven't been able to even start a proof-of-concept. Could anyone shed some light into this? I.e. not how to implement the whole functionality, but just how and where to start. I'm after some hints regarding which are the essential classes or interfaces to extend and/or implement, which methods to override, etc.
EDIT 1: Maybe this approach is not the best one. If you think there's a better approach please let me know.
EDIT 2: As to this approach being overly complicated, too convoluted, Rube Goldberg, etc, I appreciate and respect these points of view, but I'm not asking whether validation through mixins is good or bad, convenient or inconvenient, neither why it might be like so. Validation through mixins has pros on its own and I think it could be a good approach for some valid use cases, i.e. having declarative validation instead of scripted or programmatic validation while also separating validation from the model, letting the underlying framework do the actual validation job while I only specify the constraints, etc.
Using the programmatic API (as mentioned in the comment), in the case of Person you could apply the following mappings for your constraints:
HibernateValidatorConfiguration config = Validation.byProvider( HibernateValidator.class ).configure();

ConstraintMapping mapping = config.createConstraintMapping();
mapping.type( Person.class )
        .field( "name" )
            .constraint( new NotNullDef() )
        .field( "address" )
            .constraint( new NotNullDef() )
            .valid()
    .type( Address.class )
        .field( "number" )
            // @Positive requires a numeric field; for the String "number" shown above use e.g. new NotBlankDef()
            .constraint( new PositiveDef() );

Validator validator = config.addMapping( mapping )
        .buildValidatorFactory()
        .getValidator();
And as you are using Spring, you would do that in one of your Spring config classes, where you define a validator bean.
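A minimal sketch of such a configuration class (the class name is a placeholder, and only part of the mapping from above is repeated):

import javax.validation.Validation;
import javax.validation.Validator;

import org.hibernate.validator.HibernateValidator;
import org.hibernate.validator.HibernateValidatorConfiguration;
import org.hibernate.validator.cfg.ConstraintMapping;
import org.hibernate.validator.cfg.defs.NotNullDef;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ValidationConfig {

    @Bean
    public Validator validator() {
        HibernateValidatorConfiguration config =
                Validation.byProvider( HibernateValidator.class ).configure();

        ConstraintMapping mapping = config.createConstraintMapping();
        mapping.type( Person.class )
                .field( "name" )
                    .constraint( new NotNullDef() );
        // ... the rest of the mappings shown above ...

        return config.addMapping( mapping )
                .buildValidatorFactory()
                .getValidator();
    }
}

This is enough to inject the Validator into your own services and call it; if Spring MVC should also use it for @Valid handling, you would additionally have to plug it into the MVC validation setup.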
In my current project almost every entity has a field recordStatus which can have 2 values:
A for Active
D for Deleted
In Spring Data one can normally use:
repository.findByLastName(lastName)
but with the current data model we have to remember the "active" part in every repository call, e.g.
repository.findByLastNameAndRecordStatus(lastName, A)
The question is: is there any way to extend spring data in such a way it would be able to recognize the following method:
repository.findActiveByLastName(lastName)
and append the
recordStatus = 'A'
automatically?
Spring Data JPA provides two additional options for dealing with circumstances that its DSL can't handle by default.
The first solution is custom queries with an @Query annotation:
@Query("select s from MyObject s where s.lastName = ?1 and s.recordStatus = 'A'")
public MyObject findActiveByLastName(String lastName);
The second solution is to add a completely custom method the "old-fashioned way". You create a new class named like MyRepositoryImpl; the Impl suffix is important, as it is how Spring knows to find your new method (note: you can avoid this, but then you have to wire things up manually; the docs can help you with that).
// Implementation
public class MyRepositoryImpl implements MyCustomMethodInterface {

    @PersistenceContext
    EntityManager em;

    @Override
    public Object myCustomJPAMethod() {
        // TODO custom JPA work similar to this
        String myQuery = "TODO";
        return em.createQuery(myQuery).getResultList();
    }
}

// Interface
public interface MyCustomMethodInterface {
    public Object myCustomJPAMethod();
}

// For clarity, extend your JPA repository as well so people see your custom work
public interface MySuperEpicRepository extends JpaRepository<Object, String>, MyCustomMethodInterface {
}
These are just some quick samples, so feel free to read the Spring Data JPA docs if you would like to get a bit more custom with it.
http://docs.spring.io/spring-data/jpa/docs/current/reference/html/
Finally, just a quick note: technically this isn't a built-in feature of Spring Data JPA, but you can also use Predicates. I will link you to a blog post on this one since I am not overly familiar with this approach.
https://spring.io/blog/2011/04/26/advanced-spring-data-jpa-specifications-and-querydsl/
You can use Spring Data's Specifications. Take a look at this article.
You can create a 'base' specification with the recordStatus filter and derive all other specifications from this one.
Of course, everybody in your team should then use the Specifications API, and not the default Spring Data query methods.
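A rough sketch of what that base specification could look like, assuming Spring Data JPA 2.x (where Specification has default and()/or() methods), an entity with recordStatus and lastName attributes, and a repository that also extends JpaSpecificationExecutor; all names here are illustrative:

import org.springframework.data.jpa.domain.Specification;

public final class ActiveSpecifications {

    private ActiveSpecifications() {
    }

    // Base specification: recordStatus = 'A'
    public static <T> Specification<T> active() {
        return (root, query, cb) -> cb.equal(root.get("recordStatus"), "A");
    }

    // Derived specification: active AND lastName = ?
    public static Specification<Customer> activeByLastName(String lastName) {
        return ActiveSpecifications.<Customer>active()
                .and((root, query, cb) -> cb.equal(root.get("lastName"), lastName));
    }
}

Callers would then write something like repository.findAll(ActiveSpecifications.activeByLastName(lastName)), so the recordStatus filter lives in exactly one place.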
I am not sure you can extend the syntax unless you override the base repository class (SimpleReactiveMongoRepository; that one is for reactive Mongo, but you can find the corresponding class for your DB type). What I can suggest is to extend the base methods and make your own method aware of the condition you want to apply; a rough JPA sketch of that idea follows below. If you check this post you will get an idea of what I did for the patch operation for all entities:
https://medium.com/@ghahremani/extending-default-spring-data-repository-methods-patch-example-a23c07c35bf9
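For the (non-reactive) JPA case, the idea could look roughly like this; it is only a sketch, the recordStatus attribute name and all type names are assumptions, and the base class has to be registered via @EnableJpaRepositories(repositoryBaseClass = ActiveAwareRepositoryImpl.class):

import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Root;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.support.JpaEntityInformation;
import org.springframework.data.jpa.repository.support.SimpleJpaRepository;
import org.springframework.data.repository.NoRepositoryBean;

// Interface your repositories extend instead of JpaRepository
@NoRepositoryBean
public interface ActiveAwareRepository<T, ID> extends JpaRepository<T, ID> {
    List<T> findAllActive();
}

// Shared base implementation for all repositories
public class ActiveAwareRepositoryImpl<T, ID> extends SimpleJpaRepository<T, ID>
        implements ActiveAwareRepository<T, ID> {

    private final EntityManager em;

    public ActiveAwareRepositoryImpl(JpaEntityInformation<T, ?> entityInformation, EntityManager em) {
        super(entityInformation, em);
        this.em = em;
    }

    @Override
    public List<T> findAllActive() {
        // Appends recordStatus = 'A' for whatever entity this repository manages
        CriteriaBuilder cb = em.getCriteriaBuilder();
        CriteriaQuery<T> query = cb.createQuery(getDomainClass());
        Root<T> root = query.from(getDomainClass());
        query.select(root).where(cb.equal(root.get("recordStatus"), "A"));
        return em.createQuery(query).getResultList();
    }
}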
spring-data-rest makes it possible to expose @Entity domain objects directly and even provide a DTO projection as follows:
@Projection(name = "personDTO", types = { Person.class })
public interface PersonDTO {

    @Value("#{target.firstName} #{target.lastName}") // SpEL
    String getFullName();
}
Question: what if I want to construct only some of the DTO fields myself? E.g. having some kind of condition on the firstName field and filling it one way or the other depending on it. Is that possible?
Spring mentions an example, but unfortunately it's not complete:
https://spring.io/blog/2014/05/21/what-s-new-in-spring-data-dijkstra
@Projection(name = "summary", types = Order.class)
interface OrderSummary {

    @Value("#{@shop.calculateTotal(target)}")
    Money getTotal();
}
Here the logic is delegated to @shop.calculateTotal(), BUT they don't say in the example how this shop bean is injected here. I assume it is a @Service, but I don't know how to get it in.
It says so right below the example you posted:
https://spring.io/blog/2014/05/21/what-s-new-in-spring-data-dijkstra
For advanced use cases you can even equip the projection methods with @Value to return the result of a SpEL expression to the marshaller. In our sample here, we invoke a method on a Spring bean named shop and hand the proxy target instance to it to calculate the order total, which could consider rebates, taxes etc.
Since your projections are already managed by Spring, you don't really need to inject it yourself; Spring resolves the bean by name when it evaluates the SpEL expression. A sketch of such a bean:
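A minimal sketch of what that bean could look like (the class name and body are illustrative; only the bean name "shop" matters, because the SpEL expression looks the bean up by name at render time):

import org.springframework.stereotype.Service;

@Service("shop")
public class Shop {

    public Money calculateTotal(Order order) {
        // placeholder; the real calculation (line items, rebates, taxes, ...) goes here
        return null;
    }
}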
I'm implementing several DAO classes for a web project and for some reasons I have to use JDBC.
Now I'd like to return an entity like this:
public class Customer{
// instead of int userId
private User user;
// instead of int activityId
private Activity act;
// ...
}
Using JPA, user and activity would be loaded easily (and automatically, by specifying the relations between entities).
But how, using JDBC? Is there a common way to achieve this? Should I load everything in my CustomerDAO? Is it possible to implement lazy initialization for referenced entities?
My first idea was to implement in my UserDAO:
public void initUser(Customer customer);
and in my ActivityDAO:
public void initActivity(Customer customer);
to initialize variables in customer.
Active Record route
You could do this with AspectJ ITDs and essentially turn your entities into Active Record-like objects.
Basically you make an aspect that advises classes implementing interfaces called "HasUser" and "HasActivity". Your HasUser and HasActivity interfaces will just define getters.
You will then make Aspects that will weave in the actual implementation of getUser() and getActivity().
Your aspects will do the actual JDBC work. Although the learning curve on AspectJ is initially steep it will make your code far more elegant.
You can take a look at one of my answers on AspectJ ITD stackoverflow post.
You should also check out Spring's @Configurable, which will autowire your dependencies (such as your data source or JdbcTemplate) into non-Spring-managed beans.
Of course the best place to see this in action is Spring Roo. Just look at the AspectJ files it generates (granted, Roo uses JPA) to get an idea of how you would use @Configurable (make sure to use the active record annotation).
DAO Route
If you really want to go the DAO route, then you need to do this:
public class Customer{
// instead of int userId
private Integer userId;
// instead of int activityId
private Integer activityId;
}
This is because in the DAO pattern your entity objects are not supposed to have behavior. Your services and/or DAOs will have to build transfer objects, onto which you could attach the lazy loading.
I'm not sure there is any automated approach for this. Without an ORM I usually make the getters lazy: reference-type fields are initialized to null by default, i.e. my fetching function loads the primitives and Strings and leaves the references null. Once getUser() is called, the getter checks whether the field is null and, if so, issues another select statement based on the ID of the customer, for example:
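A rough sketch of that pattern (UserDAO.findById and the constructor wiring are made-up names, just to show the idea):

public class Customer {

    private final int userId;
    private final UserDAO userDAO; // handed in by the CustomerDAO that loaded this row
    private User user;             // stays null until first accessed

    public Customer(int userId, UserDAO userDAO) {
        this.userId = userId;
        this.userDAO = userDAO;
    }

    public User getUser() {
        if (user == null) {
            // second SELECT, issued only when the reference is actually needed
            user = userDAO.findById(userId);
        }
        return user;
    }
}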
I have this composite constraint annotation (only for illustration):
@Target... @Retention...
@Constraint(validatedBy = {})
@Pattern(regexp = PasswordComplexity.AT_LEAST_TWO_NON_ALPHA_CHARS)
@Length(min = 6, max = 20)
public @interface PasswordComplexity {
    ...
}
And I use it in Spring Controllers and Entity Classes.
But now I need to apply the same constraint to a single String in a service method. Because the constraint is the same, I want to reuse the same constraint definition (@PasswordComplexity) as a single source of truth. Something like:
public void createUser(UserDto userDto, String password) {
    if (hasViolation(validator.validate(password, PasswordComplexity.class))) {
        throw new PasswordComplexityViolationException();
    } else {
        …
    }
}
But I do not know how to run the JSR 303 Validator against a plain, non-annotated object (a String). Is it even possible, and if so, how?
(I use Hibernate Validator as JSR 303 provider)
One way to do this would be to write a full custom validator and push the logic down into that class, having the annotation just use the validator. You would then have an independent compilation unit (a full class PasswordComplexityValidator implements ConstraintValidator<PasswordComplexity, String> ...) which you could use independently of the annotation. This approach would also make it easier for you to unit test the validation. For example:
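A sketch of that approach; the class is hypothetical, assumes the regex constant from your annotation, and would be wired in via @Constraint(validatedBy = PasswordComplexityValidator.class) instead of the empty {} above:

import java.util.regex.Pattern;

import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;

public class PasswordComplexityValidator implements ConstraintValidator<PasswordComplexity, String> {

    private static final Pattern PATTERN =
            Pattern.compile(PasswordComplexity.AT_LEAST_TWO_NON_ALPHA_CHARS);

    @Override
    public void initialize(PasswordComplexity constraintAnnotation) {
        // nothing to configure here
    }

    @Override
    public boolean isValid(String value, ConstraintValidatorContext context) {
        // leave null handling to @NotNull; otherwise check length and pattern
        return value == null
                || (value.length() >= 6 && value.length() <= 20 && PATTERN.matcher(value).matches());
    }
}

In the service method you can then call new PasswordComplexityValidator().isValid(password, null) directly (or keep a single instance around) without going through the Validator machinery.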
However, since you are using the annotation as a way of configuring the existing regex validator provided by Hibernate, you could also reuse that pattern directly, passing the constant from the annotation class to a plain java.util.regex.Pattern. You should also be able to fold your length constraint into the regex, which would be simpler and faster than having both annotations anyway.
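For instance, a single pattern along these lines could cover both rules (this is an assumption about the intended rule, i.e. 6 to 20 characters with at least two non-alphabetic ones):

// at least two non-alphabetic characters anywhere, total length between 6 and 20
public static final String AT_LEAST_TWO_NON_ALPHA_CHARS_AND_LENGTH =
        "^(?=(?:.*[^A-Za-z]){2}).{6,20}$";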