How to access Session information on service layer? - java

Is there a way I can share Http/Wicket Session information to the service layer without introducing servlet api/Wicket dependency?
I'll provide some context as to why I'm asking this question, just in case I'm missing something and asking the wrong question.
I've got several entities that have groups of attributes that are validatable.
Being validatable means there are fields indicating the validation value, the user who made the validation and the date it was validated on.
This is how these entities are modelled:
@Embeddable
public class ValidationBean<T> implements Serializable {

    private T validated;
    private String user;
    private Date date;

    // Constructors, getters, setters ahead.
    // ...
}
@Entity
@Table(name = "SOME_TABLE")
public class SomeEntity implements Serializable, SomeInterface {

    // Some attributes which conform to validation group 1
    public String attribute11;
    public String attribute12;
    public String attribute13;
    private ValidationBean<Integer> validationBean1 = new ValidationBean<Integer>();

    // Some attributes which conform to validation group 2
    public String attribute21;
    private ValidationBean<String> validationBean2 = new ValidationBean<String>();

    // Constructors, various attribute getters with JPA annotations
    // ...

    @Embedded
    @AttributeOverrides(/*various overrides, each entity/validation group has its own validation column names...*/)
    public ValidationBean<Integer> getValidationBean1() { return validationBean1; }

    @Embedded
    @AttributeOverrides(/*various overrides, each entity/validation group has its own validation column names...*/)
    public ValidationBean<String> getValidationBean2() { return validationBean2; }
}
ValidationBean's user and date fields are automatically modified in the presentation layer when a change in the validated field is detected.
All of this is working correctly. Now I'm trying to find an elegant & general solution that integrates with the current model to fulfill the following requirement: when any of the attributes in a validation group gets its value changed, and the related ValidationBean.validated doesn't change, user and date must also be modified with the current user's id and the current date.
There are, as I see it, two alternatives: putting that logic in the presentation layer, or in the business/service layer.
Putting it in the presentation layer would have an efficiency advantage. Entities are stored in the session, so the DB doesn't have to be queried again to check for field changes. But unfortunately, some entities have some of their fields ajax-updated, and it would be hard to tell whether the entity really changed. Apart from the fact that it's not the presentation layer's responsibility to fulfill this requirement.
Putting it in the service layer seems the best alternative, and I've already found a possible way to handle this properly: @PreUpdate. It would be easy to implement a @PreUpdate method on the @Entity classes to compare the values in the DB with the values about to be updated, and modify the related ValidationBeans accordingly. The problem here, and I suppose it's a common one, is that in the business layer I have nowhere to get the user id from. The current user id is stored in the session, which belongs to the presentation layer.
So, any tips, comments or recommendations on how I can share HTTP session information with the service layer (not necessarily Wicket-specific), or even alternatives to fulfill this requirement, will be welcome.
UPDATE: Following gkamal's suggestion, I'll try to integrate spring-security in the least intrusive way I can, just to take advantage of SecurityContext. I'd also appreciate tips on this matter.

The common approach used to solve this is to introduce a SecurityContext class that holds the details of the current user in a static thread-local variable. The variable is initialized (from the HttpSession) by the security filter or some other filter, and cleared after request processing is complete. The SecurityContext class itself is part of the business layer; it only exposes set/get methods and hence doesn't have any web-layer dependency.
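A minimal sketch of that approach, assuming a plain servlet filter and a session attribute named "userId" (all names here are illustrative, not taken from the question):

// Business-layer holder: no servlet or Wicket imports here.
public final class SecurityContext {

    private static final ThreadLocal<String> CURRENT_USER = new ThreadLocal<String>();

    private SecurityContext() {}

    public static void setCurrentUserId(String userId) { CURRENT_USER.set(userId); }
    public static String getCurrentUserId() { return CURRENT_USER.get(); }
    public static void clear() { CURRENT_USER.remove(); }
}

// Web-layer filter: the only place that touches the HttpSession.
// imports: javax.servlet.*, javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpSession, java.io.IOException
public class SecurityContextFilter implements Filter {

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpSession session = ((HttpServletRequest) req).getSession(false);
        try {
            if (session != null) {
                SecurityContext.setCurrentUserId((String) session.getAttribute("userId"));
            }
            chain.doFilter(req, res);
        } finally {
            SecurityContext.clear(); // request threads are pooled, so always clear
        }
    }

    public void init(FilterConfig config) {}
    public void destroy() {}
}

With this in place, a @PreUpdate callback can call SecurityContext.getCurrentUserId() without any dependency on the web layer.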

Related

What is the proper way to create and validate an Entity Model and its DTO in a RESTful API?

I am developing my first RESTful API from scratch with Spring Boot.
I have already created the endpoints, models and JPA repositories for "standalone" entities. But now that I've started linking them together, and after doing some research, I came to the conclusion that I may have to create DTOs. I don't think that every time I'm creating a new Order with a POST request I should make the client send the whole Customer and Employee objects inside the request as nested objects of Order (if I am also wrong in this, please let me know). I am thinking about creating a DTO by simply replacing the class relations with just IDs.
This is how my entity is currently defined:
@Data
@Entity
@Table(name = "Orders")
public class Order {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;

    @NotBlank
    @NotNull
    private String description;

    @NotNull // @NotBlank only applies to CharSequence, so the enum gets @NotNull only
    private Status status;

    @NotNull
    @ManyToOne
    @JoinColumn(foreignKey = @ForeignKey(name = "employee_id_fk"))
    private Employee employee;

    @NotNull
    @ManyToOne
    @JoinColumn(foreignKey = @ForeignKey(name = "customer_id_fk"))
    private Customer customer;

    protected Order() {}

    public Order(String description) {
        this.description = description;
        this.status = Status.IN_PROGRESS;
    }
}
And my endpoint (this is what I must change):
@PostMapping("/orders")
ResponseEntity<EntityModel<Order>> createOrder(@Valid @RequestBody Order order) {
    order.setStatus(Status.IN_PROGRESS);
    Order newOrder = repository.save(order);

    return ResponseEntity
            .created(linkTo(methodOn(OrderController.class).getOrder(newOrder.getId())).toUri())
            .body(assembler.toModel(newOrder));
}
Now, how should I validate the requests with this format?
Previously, as you can see, I would simply use @Valid and the request would automatically get validated against the Order model when the endpoint is called. However, if I create the DTO, I would have to validate the DTO with the same mechanism and duplicate all the annotations from the model (@NotNull, @NotBlank, etc.). Maybe I should validate the entity model after mapping it from the DTO, but I don't know how straightforward that would be, or whether that is a good practice for validating requests. I also can't remove the validations from the entity model, because I'm using Hibernate to map the entities to tables.
Great questions!
I don't think that every time I'm creating a new Order with a POST request I should make the client send the whole Customer and Employee objects inside the request as nested objects of Order (if I am also wrong in this, please let me know).
You're right. It's not because we can save bits and bytes (as it may look like), but because the less information you ask of the client, the better the experience (whether it's an external integrator or a front-end/back-end application within the same company). Less data to encompass means it's easier to comprehend and there's less room for error. It also makes your API cleaner from a design perspective. Is it possible to process your request without the field? Then it shouldn't be in your API.
Now, how should I validate the requests with this format? Previously, as you can see, I would simply use @Valid and it would automatically get validated when the endpoint is called against the Order model. However, if I create the DTO, I would have to validate the DTO with the same methodology and duplicate all the annotations from its model (@NotNull, @NotBlank, etc.).
You can also use @Valid to kick in validation for the DTO inside the controller, within the method mapped to the endpoint. But as you correctly mentioned, all validated fields within the DTO should then be annotated with @NotNull, @NotBlank, etc. As a solution to the "duplication" problem, you could create a base class, define all validations there, and inherit both the DTO and the Entity from it. But please, don't do that!
Having the same fields and validation rules in the DTO and the Entity isn't considered duplication, since they are separate concepts and each one serves its specific purpose within its layer (DTO - top tier, Entity - most often the lowest, data tier). There are a lot of examples demonstrating it (e.g. here and here)
Maybe I should validate the entity model after mapping it from the DTO but I don't know how straightforward that would be and whether that is a good practice of validating requests.
It's a best practice to validate the request, and a lot of projects follow it. In your example it's very straightforward (a direct mapping from DTO to Entity), but very often you would have a service layer that does some business logic before handing things off to the data layer (even in your example I recommend moving your code out of the controller into a service layer). You don't want a malformed request to pass beyond the controller only to be handled later with excessive if statements and null checks (that leads to defensive code that's hard to follow and error-prone).
Another note: you shouldn't sacrifice the client experience and force them (or yourself) to add two more fields just because it allows one object to serve as both DTO and Entity and simplifies development.
The last note: to map fields from the DTO to the Entity you can use one of the object-mapper libraries.
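For illustration, one possible shape for such a DTO and its mapping in the controller; OrderDTO, the repository fields and the *NotFoundException types are assumptions for the sketch, not part of the question:

// DTO: relations replaced by plain ids; the validation annotations are repeated here on purpose
@Data
public class OrderDTO {

    @NotBlank
    private String description;

    @NotNull
    private Long employeeId;

    @NotNull
    private Long customerId;
}

// Controller: validate the DTO, then map it to the entity before saving
@PostMapping("/orders")
ResponseEntity<EntityModel<Order>> createOrder(@Valid @RequestBody OrderDTO dto) {
    Order order = new Order(dto.getDescription());
    order.setEmployee(employeeRepository.findById(dto.getEmployeeId())
            .orElseThrow(() -> new EmployeeNotFoundException(dto.getEmployeeId())));
    order.setCustomer(customerRepository.findById(dto.getCustomerId())
            .orElseThrow(() -> new CustomerNotFoundException(dto.getCustomerId())));

    Order newOrder = repository.save(order);
    return ResponseEntity
            .created(linkTo(methodOn(OrderController.class).getOrder(newOrder.getId())).toUri())
            .body(assembler.toModel(newOrder));
}

Preferably the mapping and saving would live in a service method rather than in the controller, as noted above.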

How to use different validation rules on same entity in Hibernate?

Problem:
How do I save an Account object as a nested object, when only its ID is needed, without getting a constraint violation exception?
The problem is that I have set validation rules on the class, but when I want to save the same entity as a nested object I get an exception saying that some property values are missing. So I would like to have different validation rules for when I want to persist the object as a whole and for when I want to use it only as a nested object (when only the ID is needed).
public class Account {

    private int id;

    @NotNull
    private String name;

    @NotNull
    private String lastName;

    @NotNull
    private String userName;

    // getters & setters
}
If I include Account as a nested object I just need the ID to be able to use it as a FK (the Account entity is already in the DB), but because of the @NotNull annotations I get an exception.
Is there a way to ignore those annotations from Account when trying to save a Shop object, or to create different validation rules for Account so that only some of the properties are validated and not all of them?
public class Shop {

    private int id;

    private Account owner; // only ID is needed
    // ...
}
Do you have any basic example? I don't understand the ones in the documentation. I had already read the documentation before posting here.
You want to look at Bean Validation groups, which let you classify specific validations so they are only activated when that group is validated and ignored otherwise.
You can refer to the documentation here for details.
Taking an example from the documentation:
// This is just a stub interface used for tagging validation criteria
public interface DriverChecks {
}

// The model
public class Driver {

    @Min(value = 18, message = "You must be 18", groups = DriverChecks.class)
    private int age;

    // other stuff
}
A group is nothing more than a tag that allows you to enable/disable validations for specific use cases at run-time. If you don't specify the groups attribute on a Bean Validation annotation, it defaults to the Default group, which is what Bean Validation uses when no group is specified at validation time.
That means the following holds true:
// Age won't be validated since we didn't specify DriverChecks.class
validator.validate( driver );
// Age will be validated here because we specify DriverChecks.class
validator.validate( driver, DriverChecks.class );
This works great when you're triggering the validation yourself inside your service methods because you can manually control which group checks are applicable based on that method's use case.
When it comes to integrating directly with Hibernate ORM's event listeners, which can also trigger Bean Validation, group specification becomes a bit harder, as groups must be specified per event type raised by Hibernate:
javax.persistence.validation.group.pre-persist
javax.persistence.validation.group.pre-update
javax.persistence.validation.group.pre-remove
For each of the above properties, which you can set in the JPA properties supplied to Hibernate, you can provide a comma-delimited list of groups to be validated for that event type. This allows you to apply different checks during insert versus update versus removal.
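For instance, a rough sketch of setting these properties programmatically; the persistence-unit name and the fully-qualified group name are placeholders, and the same keys can equally go into persistence.xml:

// imports: java.util.HashMap, java.util.Map, javax.persistence.EntityManagerFactory, javax.persistence.Persistence
Map<String, String> props = new HashMap<String, String>();
props.put("javax.persistence.validation.group.pre-persist",
          "javax.validation.groups.Default,com.example.checks.DriverChecks");
props.put("javax.persistence.validation.group.pre-update",
          "javax.validation.groups.Default,com.example.checks.DriverChecks");
EntityManagerFactory emf = Persistence.createEntityManagerFactory("myPersistenceUnit", props);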
If that isn't sufficient, you can always create your own constraint-validator implementation and annotation, plug it into Bean Validation, and specify it at the class or property level.
I have often found this useful in cases where values from multiple fields must be validated as a cohesive unit, when the normal field-by-field validations don't suffice.
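As an illustration, a bare-bones class-level constraint; the Booking class, the annotation name and the validator are made up for the example:

// imports: java.lang.annotation.*, javax.validation.*
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = PeriodValidator.class)
public @interface ValidPeriod {
    String message() default "start must be before end";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}

// Validator doing a cross-field check on the whole object
public class PeriodValidator implements ConstraintValidator<ValidPeriod, Booking> {

    public void initialize(ValidPeriod constraintAnnotation) {
        // nothing to configure
    }

    public boolean isValid(Booking booking, ConstraintValidatorContext context) {
        if (booking == null) {
            return true; // let @NotNull handle null-ness
        }
        return booking.getStart().before(booking.getEnd());
    }
}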

How to write correct/reliable transactional code with JAX-RS and Spring

Basically, I am trying to understand how to write correct transactional code when developing a REST service with JAX-RS and Spring. We're also using jOOQ for data access, but that shouldn't be very relevant...
Consider a simple model where we have some organisations with these fields: "id", "name", "code", all of which must be unique. There's also a status field.
An organization might be removed at some point. But we don't want to remove the data altogether, because we want to keep it for analytical/maintenance purposes. So we just set the organization's 'status' field to 'REMOVED'.
Because we don't delete the organization row from the table, we can't simply put a unique constraint on the "name" column: we might delete an organization and then create a new one with the same name. But let's assume that codes have to be unique globally, so we DO have a unique constraint on the code column.
With that, let's look at this simple example that creates an organization, performing some checks along the way.
Resource:
@Component
@Path("/api/organizations/{organizationId: [0-9]+}")
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaTypeEx.APPLICATION_JSON_UTF_8)
public class OrganizationResource {

    @Autowired
    private OrganizationService organizationService;

    @Autowired
    private DtoConverter dtoConverter;

    @POST
    public OrganizationResponse createOrganization(@Auth Person person, CreateOrganizationRequest request) {
        if (organizationService.checkOrganizationWithNameExists(request.name())) {
            // this throws a special Exception which is intercepted and translated to a response with a 409 status code
            throw Responses.abortConflict("organization.nameExist", ImmutableMap.of("name", request.name()));
        }
        if (organizationService.checkOrganizationWithCodeExists(request.code())) {
            throw Responses.abortConflict("organization.codeExists", ImmutableMap.of("code", request.code()));
        }

        long organizationId = organizationService.create(person.user().id(), request.name(), request.code());
        return dtoConverter.from(organizationService.findById(organizationId));
    }
}
The DAO service looks like this:
@Transactional(DBConstants.SOME_TRANSACTION_MANAGER)
public class OrganizationServiceImpl implements OrganizationService {

    @Autowired
    @Qualifier(DBConstants.SOME_DSL)
    protected DSLContext context;

    @Override
    public long create(long userId, String name, String code) {
        Organization organization = new Organization(null, userId, name, code, OrganizationStatus.ACTIVE);
        OrganizationRecord organizationRecord = JooqUtil.insert(context, organization, ORGANIZATION);
        return organizationRecord.getId();
    }

    @Override
    public boolean checkOrganizationWithNameExists(String name) {
        return checkOrganizationExists(Tables.ORGANIZATION.NAME, name);
    }

    @Override
    public boolean checkOrganizationWithCodeExists(String code) {
        return checkOrganizationExists(Tables.ORGANIZATION.CODE, code);
    }

    private boolean checkOrganizationExists(TableField<OrganizationRecord, String> checkField, String checkValue) {
        return context.selectCount()
                .from(Tables.ORGANIZATION)
                .where(checkField.eq(checkValue))
                .and(Tables.ORGANIZATION.ORGANIZATION_STATUS.ne(OrganizationStatus.REMOVED))
                .fetchOne(DSL.count()) > 0;
    }
}
This brings some questions:
Should I put the @Transactional annotation on the Resource's createOrganization method? Or should I create one more service that talks to the DAO and put the @Transactional annotation on its method? Something else?
What would happen if two users concurrently send requests with the same "code" field? Before the first transaction is committed, the checks pass successfully, so no 409 response will be sent. Then the first transaction will be committed properly, but the second one will violate the DB constraint. This will throw an SQLException. How do I handle that gracefully? I still want to show a nice error message on the client side saying that the value is already used. But I can't really parse the SQLException or something... can I?
Similar to the previous one, but this time for "name", which is not unique at the DB level. In this case the second transaction will not violate any constraint, which leads to two organizations with the same name, violating our business constraints.
Where can I find tutorials/code/etc. that you consider great examples of how to write correct/reliable REST+DB code with complicated business logic? GitHub/books/blogs, whatever. I've tried to find something like that myself, but most examples just focus on the plumbing - add these libs to Maven, use these annotations, there is your simple CRUD, the end. They don't contain any transactional considerations at all.
UPDATE:
I know about isolation levels and the usual error/isolation matrix (dirty reads, etc.). The problem I have is finding a "production-ready" sample to learn from, or a good book on the subject. I still don't really get how to handle all the errors properly. I guess I need to retry a couple of times if a transaction fails, and then just throw some generic error and implement a client that handles it. But do I really have to use SERIALIZABLE mode whenever I use range queries? That would affect performance greatly, but otherwise how can I guarantee that the transaction will fail?
Anyway, I've decided that for now I need more time to learn about transactions and DB management in general to tackle this problem...
Generally speaking, and leaving transactionality aside, an endpoint should only grab parameters from the request and call the service. It shouldn't contain business logic.
It seems your checkXXX methods are part of the business logic, because they throw errors about domain-specific conflicts. Why not put them into the service, in one method, which is also transactional?
// service code
public Organization createOrganization(String userId, String name, String code) {
    if (this.checkOrganizationWithNameExists(name)) {
        throw ...
    }
    if (this.checkOrganizationWithCodeExists(code)) {
        throw ...
    }
    long organizationId = this.create(userId, name, code);
    return dao.findById(organizationId);
}
I assumed your parameters are Strings, but they can be anything. I'm not sure you want to throw Responses.abortConflict in the service layer, because it seems to be a REST concept, but you can define your own exception types for it if you want.
The endpoint code should look like this; it might, however, contain an additional try-catch block that converts the thrown exceptions to error responses:
// endpoint code
@POST
public OrganizationResponse createOrganization(@Auth Person person, CreateOrganizationRequest request) {
    String code = request.code();
    String name = request.name();
    String userId = person.user().id();

    return dtoConverter.from(organizationService.createOrganization(userId, name, code));
}
As for questions 2 and 3, transaction isolation levels are your friends. Set the isolation level high enough; I think 'repeatable read' is the suitable one in your case. Your checkXXX methods will detect whether some other transaction has committed entities with the same name or code, and it's guaranteed that the situation stays that way by the time the 'create' method is executed. One more useful read regarding Spring and transaction isolation levels.
In my understanding, the best way to handle DB-level transactions is to use Spring's transaction isolation effectively in the DAO layer. Below is sample industry-standard code for your case...
public interface OrganizationService {

    @Retryable(maxAttempts = 3, value = DataAccessResourceFailureException.class, backoff = @Backoff(delay = 1000))
    public boolean checkOrganizationWithNameExists(String name);
}

@Repository
@EnableRetry
public class OrganizationServiceImpl implements OrganizationService {

    @Transactional(isolation = Isolation.READ_COMMITTED)
    @Override
    public boolean checkOrganizationWithNameExists(String name) {
        // your code
        return true;
    }
}
Please ping me if I'm wrong here.
Separation of concerns:
JAX-RS resource (endpoint) layer: just handle the request, invoke the service, and wrap potential exceptions in an appropriate response code (either catch and wrap manually or use an exception mapper).
Service / business layer: expose a transactional method for each unit of work; business errors should be handled as checked exceptions, operational ones as unchecked (subclasses of RuntimeException).
Data access layer: just handle the data-access stuff (i.e. get the DB context, execute the query and eventually map the result).
I insist on one thing: the right place for transaction boundaries is where your business methods are defined. A transaction scope must be a business unit of work.
Regarding the concurrency issue, there are two ways to handle this kind of problem: pessimistic or optimistic locking.
Pessimistic :
Lock
do your stuff
Update
Release lock
Optimistic :
check version
do your stuff
update if version is same, fail otherwise
Pessimistic locking is an issue for scalability and performance; the problem with optimistic locking is that you sometimes end up returning an operational error to the end user.
I would personally go with optimistic locking in your case; jOOQ supports it.
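A rough sketch of what that could look like with jOOQ's optimistic locking support; the record and table names come from the question's generated classes, the rest (data source, dialect, conflict handling) is assumed:

// Enable optimistic locking for the DSLContext
Settings settings = new Settings().withExecuteWithOptimisticLocking(true);
DSLContext ctx = DSL.using(dataSource, SQLDialect.POSTGRES, settings);

// store() issues an UPDATE guarded by the originally fetched values and throws
// DataChangedException if another transaction changed the row in the meantime.
OrganizationRecord record = ctx.fetchOne(ORGANIZATION, ORGANIZATION.ID.eq(organizationId));
record.setName(newName);
try {
    record.store();
} catch (DataChangedException e) {
    // translate into a 409/conflict response for the client
}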
First off, the DAO layer should not even know it's being fronted by a REST web service. Be sure to separate responsibilities.
Keep the @Transactional on the DAO. If you are issuing only a single statement, then you need to decide whether you are OK with dirty reads; basically, figure out the lowest isolation level acceptable for your application. Every method will start a new transaction (unless called from another method that already started one), and if any exception is thrown it will roll back the calls. You can set up a custom exception handler/mapper to handle data-integrity exceptions (like your "code" insert example).
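For example, since the resources here are JAX-RS, that translation can live in an ExceptionMapper; the exact exception type to catch depends on your driver/Spring setup, DataIntegrityViolationException is only an assumption:

// imports: javax.ws.rs.core.Response, javax.ws.rs.ext.*, org.springframework.dao.DataIntegrityViolationException
@Provider
public class ConflictExceptionMapper implements ExceptionMapper<DataIntegrityViolationException> {

    public Response toResponse(DataIntegrityViolationException exception) {
        // map the unique-constraint violation to a 409 with a message key the client understands
        return Response.status(Response.Status.CONFLICT)
                       .entity("organization.codeExists")
                       .build();
    }
}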
Use an aggregate primary key that covers (id, name, code, status), so you can have an org with the same name where one is "CURRENT" and the other is "REMOVED".

Exclude field in JPA Entity Listener

I have an entity class in my Enterprise Java application that has an entity listener attached to it:
@Entity
@EntityListeners(ChangeListener.class)
public class MyEntity {

    @Id
    private long id;

    private String name;
    private Integer result;
    private Boolean dirty;

    ...
}
However, I would like the entity listener to get triggered for all fields except the boolean one. Is there any way to exclude a field from triggering the entity listener without making it transient?
I'm using Java EE 5 with Hibernate.
It is possible if you implement your own solution. I had the same need for an audit-log business requirement, so I designed my own AuditField annotation and applied it to the fields to be audit-logged.
Here's an example from one entity bean - Site.
@AuditField(exclude = {EntityActionType.DELETE})
@Column(name = "site_code", nullable = false)
private String siteCode;
So the example indicates that 'siteCode' is a field to audit-log, except on DELETE actions. (EntityActionType is an enum containing the CRUD operations.)
Also, the EntityListener has this piece of code:
@PostPersist
public void created(Site pEntity) {
    log(pEntity, EntityActionType.CREATE);
}

@PreUpdate
public void updated(Site pEntity) {
    log(pEntity, EntityActionType.UPDATE);
}

@PreRemove
public void deleted(Site pEntity) {
    log(pEntity, EntityActionType.DELETE);
}
Now what log() has to do is figure out which fields are to be audit-logged and, optionally, which custom actions are involved.
However, there's another thing to consider.
If you put the annotation on another entity-typed field, which fields of that entity have to be logged? (i.e. chained logging)
It's your choice whether to log only what is annotated with @AuditField in the entity or to go some other way. In my case, we decided to log only the entity ID, which is the PK of the DB table. However, I wanted to make it flexible, assuming the business requirements can change. So all the entities must implement an auditValue() method, which comes from a base entity class, and whose default implementation (which is overridable) returns the ID.
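A rough sketch of what log() could do with reflection, assuming the hypothetical @AuditField annotation exposes an exclude() attribute of type EntityActionType[]:

// imports: java.lang.reflect.Field, java.util.Arrays
private void log(Object entity, EntityActionType action) {
    for (Field field : entity.getClass().getDeclaredFields()) {
        AuditField audit = field.getAnnotation(AuditField.class);
        if (audit == null || Arrays.asList(audit.exclude()).contains(action)) {
            continue; // field is not audited, or this action is excluded for it
        }
        field.setAccessible(true);
        try {
            Object value = field.get(entity);
            // write an audit record here: entity id (auditValue()), field name, value, action
        } catch (IllegalAccessException e) {
            throw new IllegalStateException(e);
        }
    }
}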
There is some mixing of concepts here. EntityListeners are not notified about changes in attribute values - not for a single attribute, nor for all attributes.
They are called lifecycle callbacks for a reason: they are triggered by the following lifecycle events of an entity:
persist (pre/post)
load (post)
update (pre/post)
remove (pre/post)
For each of them there is a matching annotation. So the answer is that it is not possible to limit this functionality to specific persistent attributes.

Initialize JPA-like entities with JDBC

I'm implementing several DAO classes for a web project, and for certain reasons I have to use plain JDBC.
Now I'd like to return an entity like this:
public class Customer {
    // instead of int userId
    private User user;
    // instead of int activityId
    private Activity act;
    // ...
}
Using JPA, user and activity would be loaded easily (and automatically, by specifying the relations between entities).
But how do I do this with JDBC? Is there a common way to achieve it? Should I load everything in my CustomerDAO? Is it possible to implement lazy initialization for the referenced entities?
My first idea was to implement in my UserDAO:
public void initUser(Customer customer);
and in my ActivityDAO:
public void initActivity(Customer customer);
to initialize variables in customer.
Active Record route
You could do this with AspectJ ITDs and essentially turn your entities into Active Record-like objects.
Basically you make an aspect that advises classes implementing interfaces called "HasUser" and "HasActivity". Your HasUser and HasActivity interfaces will just define getters.
You then make aspects that weave in the actual implementations of getUser() and getActivity().
Your aspects will do the actual JDBC work. Although the learning curve of AspectJ is initially steep, it will make your code far more elegant.
You can take a look at one of my answers in an AspectJ ITD Stack Overflow post.
You should also check out Spring's @Configurable, which will autowire dependencies (such as your DataSource or JdbcTemplate) into beans that are not managed by Spring.
Of course the best place to see this in action is Spring Roo. Just look at the AspectJ files it generates to get an idea of how you would use @Configurable (granted, Roo uses JPA; make sure to use the Active Record annotation).
DAO Route
If you really want to go the DAO route, then you need to do this:
public class Customer {
    // instead of int userId
    private Integer userId;
    // instead of int activityId
    private Integer activityId;
}
This is because in the DAO pattern your entity objects are not supposed to have behavior. Your services and/or DAOs will have to create transfer objects, to which you could attach the lazy loading.
I'm not sure there is any automated approach for this. Without an ORM I usually make the getters lazy: reference-type fields are initialized to null by default, i.e. my fetching code loads the primitives and Strings and leaves the references null. When getUser() is called, the getter checks whether the field is null and, if so, issues another SELECT statement based on the customer's ID.
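A minimal sketch of that pattern; the UserDAO collaborator and its findById method are illustrative, not from the question:

public class Customer {

    private final long userId;     // FK column loaded by the CustomerDAO query
    private User user;             // resolved lazily
    private final UserDAO userDAO; // injected by the DAO that built this object

    public Customer(long userId, UserDAO userDAO) {
        this.userId = userId;
        this.userDAO = userDAO;
    }

    public User getUser() {
        if (user == null) {
            user = userDAO.findById(userId); // second SELECT, issued only on first access
        }
        return user;
    }
}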
