I have a problem accessing data inside a running transaction when the data came from another (supposedly closed) transaction. I have three classes like the ones below, with an entity (MyEntity) that has another entity ("OtherEntity") connected via a Hibernate mapping with lazy loading enabled. Notice how I have two transactions:
One to load a list of entities
And a new transaction for each new item
However, this fails inside the loop with "No session" even though I have an active transaction inside the method (TransactionSynchronizationManager.isActualTransactionActive is true).
I don't really understand the problem. It seems to me the object used by the second transaction(s) still "belongs" to the first one, even though the first transaction was supposed to be finished? Maybe it's a race condition?
@Service
class ServiceA {
    @Autowired
    private ServiceB serviceB;
    @Autowired
    private ServiceC serviceC;

    public void test() {
        // First transaction ran, getting a list of entities, but due to lazy loading we haven't loaded all the data
        List<MyEntity> allEntities = serviceC.loadAllEntities();
        for (MyEntity i : allEntities) {
            serviceB.doOnEach(i); // On each element a new transaction should start
        }
    }
}
@Service
class ServiceB {
    @Transactional
    public void doOnEach(MyEntity entity) {
        System.out.println(TransactionSynchronizationManager.isActualTransactionActive()); // true, so we have an active transaction here
        OtherEntity other = entity.getSomeOtherEntity(); // Want to load the lazily loaded entity here
        // "No Session" exception is thrown here
    }
}
@Service
class ServiceC {
    @Autowired
    private MyRepository myRepository;

    @Transactional
    public List<MyEntity> loadAllEntities() {
        return myRepository.findAll();
    }
}
A solution would be to re-load the MyEntity instance inside the doOnEach method, but that seems sub-optimal to me, especially for big lists. Why would I reload data which is already supposed to be there?
Any help is appreciated.
Obviously the real code is a lot more complicated than this, but I have to have these kinds of separate transactions for business reasons, so please no "solutions" which rewrite the core logic. I just want to understand what's going on here.
After the call to loadAllEntities() finishes, the Spring proxy commits the transaction and closes the associated Hibernate Session. This means Hibernate can no longer transparently load the uninitialized lazy associations without being explicitly told to do so.
If for some reason you really want your associated entities to be loaded lazily, the two options you have are either to use Hibernate.initialize(entity.getSomeOtherEntity()) in your doOnEach() method, or to set the spring.jpa.open-in-view property to true and have the OpenSessionInViewInterceptor do it for you.
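A sketch of the first option, applied to the doOnEach() method from the question:
@Transactional
public void doOnEach(MyEntity entity) {
    // Explicitly initialize the lazy proxy instead of relying on
    // transparent lazy loading (first option described above).
    Hibernate.initialize(entity.getSomeOtherEntity());
    OtherEntity other = entity.getSomeOtherEntity();
    // ... work with the now-initialized association
}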
Otherwise it's a good idea to load them together with the parent entity, either via JOIN FETCH in your repository query or via an entity graph.
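For illustration, the JOIN FETCH variant as a Spring Data JPA repository method might look like this (a sketch; the method name is made up and the someOtherEntity property name is inferred from the question's getter):
public interface MyRepository extends JpaRepository<MyEntity, Long> {
    // Loads each MyEntity together with its association in a single query,
    // so the association is already initialized when the Session closes.
    @Query("SELECT e FROM MyEntity e JOIN FETCH e.someOtherEntity")
    List<MyEntity> findAllWithOtherEntity();
}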
References:
https://www.baeldung.com/spring-open-session-in-view
https://www.baeldung.com/hibernate-initialize-proxy-exception
To clarify further:
Spring creates a transaction and opens a new Session (A) before entering the loadAllEntities() method, and commits/closes them upon returning. When you call entity.getSomeOtherEntity(), the original Session (A) that loaded entity is gone (i.e. entity is detached), but instead there's a new Session (B), which was created upon entering the doOnEach() transactional method. Session (B) doesn't know anything about entity and its relations, and at the same time the Hibernate proxy of someOtherEntity inside entity references the original Session (A) and doesn't know anything about Session (B). To make the Hibernate proxy of someOtherEntity actually use the current active Session (B), you can call Hibernate.initialize().
Recently I was working on a little RESTful API using Spring and I came across the ModelAttribute annotation.
I noticed some very interesting behavior associated with it: you can stick it onto a method, and that method will get called before the handler for a given request, allowing you to do anything before data is bound to the arguments of your handler method.
One usage that comes to mind is default values:
@ModelAttribute("defaultEntity")
public Entity defaultEntity() {
    final var entity = new Entity();
    entity.setName("default name");
    return entity;
}

@PostMapping("/entity")
public Entity createNewEntity(@Valid @ModelAttribute("defaultEntity") Entity entity) {
    dao.saveEntity(entity);
    return entity;
}
In this case, when a POST request comes to /entity, the first thing that will happen is that defaultEntity will get called, creating an entity with some default values pre-filled. Then, Spring will bind the incoming data into it (potentially overwriting the defaults or keeping them as-is) and then pass it into the createNewEntity handler. This is actually pretty nice, IMO.
Another surprising fact is that the annotated method can actually take parameters, in much the same way as the handler method. A simple way to do partial entity updates could be something like this:
// first fetch the original entity from the database
@ModelAttribute("originalEntity")
public Entity originalEntity(@PathVariable("id") long id) {
    return dao.getEntity(id);
}

// then let Spring bind data to the entity and validate it
@PostMapping("/entity/{id}")
public Entity updateEntity(@Valid @ModelAttribute("originalEntity") Entity entity) {
    // and finally we save it
    dao.saveEntity(entity);
    return entity;
}
Again, this is surprisingly easy.
Even more surprising is that different model attributes can depend on each other, so you can have a complicated multi-stage monster if you want:
// first fetch the original entity from the database
@ModelAttribute("originalEntity")
public Entity originalEntity(@PathVariable("id") long id) {
    return dao.getEntity(id);
}

// then let Spring bind data to the entity, validate it and do some processing to it
@ModelAttribute("boundAndValidatedEntity")
public Entity boundAndValidatedEntity(@Valid @ModelAttribute("originalEntity") Entity entity) {
    processEntity(entity);
    return entity;
}

// finally check that the entity is still valid and then save it
@PostMapping("/entity/{id}")
public Entity updateEntity(@Valid @ModelAttribute(value = "boundAndValidatedEntity", binding = false) Entity entity) {
    dao.saveEntity(entity);
    return entity;
}
Obviously not all of the model attributes have to be of the same type, some can depend on multiple arguments from different places. It's like a mini-DI container within a single controller.
However, there are some drawbacks:
As far as I can tell, it only works with query parameters; there is no way to make it work with other kinds of request parameters, such as the request body or path variables.
All of the ModelAttribute-annotated methods within a single controller will always be called, which can have a performance impact and be annoying to work with, since Spring will need to be able to gather all of the method's arguments (which may be impossible, for example when they reference a path variable that doesn't exist in the current request).
So, while ModelAttribute doesn't really seem too useful by itself because of these issues, I feel like the main idea behind it - essentially allowing you to control the construction of a method's parameter before it's bound/validated while being able to easily access other request parameters - is solid and could be very useful.
So, my question is simple - is there anything in Spring that would essentially act like ModelAttribute but without the drawbacks that I mentioned? Or maybe in some 3rd party library? Or maybe I could write something like this myself?
I am currently working on a medium-sized, desktop-based administration and configuration tool implemented in Java using JavaFX, google-guice, and Hibernate as its JPA implementation.
Until now I got away with a single EntityManager injected as a @Singleton, meaning that I had this EntityManager open from start to shutdown. All loaded entities were permanently known in the context, and I barely had any problems with this approach, although I know it is not the best solution (but it was easy, and I had no time to redesign the application).
Now the application is being extended, and I have to use multiple persistence units simultaneously.
I could try to get my current singleton approach working by using something like:
@Inject
@PersistenceContext(name = "JPA-Unit1")
@Singleton
private EntityManager em;
It never felt right, and that just feels ugly. And since I had severe problems getting multiple persistence contexts working with Guice, I had to do a lot of research on this topic.
I came across several blogs and SO questions either mentioning that an instance of the EntityManager should only live as long as it is needed, or recommending extended persistence contexts.
Since I use JavaFX, I use the *Property classes to bind the data directly into the UI.
Simplified user entity (property-based access):
@Entity
@Table(name = "USERS")
@NamedQuery(name = "User.findAll", query = "SELECT u FROM User u")
public class User implements Serializable {
    [...]
    private final SimpleStringProperty loginProperty = new SimpleStringProperty();

    public User() {
    }

    public String getLogin() {
        return this.loginProperty.get();
    }

    public void setLogin(String login) {
        this.loginProperty.set(login);
    }

    public SimpleStringProperty loginProperty() {
        return this.loginProperty;
    }
    [...]
}
If I start editing the user data in the UI, it gets directly updated in the entity:
this.login.textProperty().bindBidirectional(user.loginProperty());
There is no need for extensive "business logic"; it is all handled via (input) validation. If all input is valid, I simply save the data via
userService.update(user);
Parts of the UserService (more precisely: its abstract super-class):
public abstract class AbstractService<PK extends Serializable, Type> implements GenericService<PK, Type> {
    protected Class<Type> clazz;

    @PersistenceContext(name = "JPA-Unit1")
    @Inject
    protected Provider<EntityManager> emProvider;

    public AbstractService(Class<Type> clazz) {
        this.clazz = clazz;
    }

    @Transactional
    @Override
    public Type create(Type entity) {
        this.emProvider.get().persist(entity);
        return entity;
    }

    @Transactional
    @Override
    public Type update(Type entity) {
        this.emProvider.get().persist(entity);
        return entity;
    }
}
As you can see, the service class is pretty straightforward. I could even delete all these service classes and use the EntityManager directly in my UI controller.
In this service you can see the "problem": the user I edit was loaded earlier by its named query and put into a list. That loading is also done in a @Transactional method.
But every time I call this.emProvider.get() I get a new instance with an empty context. And if I want to save the previously edited user, persist actually performs an insert (I assume because the entity is not known in the new context, i.e. it is detached), which leads to a PK-constraint violation; or, if I null its ID property, a new user row is inserted.
My actual questions are:
1. Is this approach OK? If yes, what do I do with this always-new persistence context? Call contains and merge every single time?
2. Should I get rid of my service class and implement the persistence operations directly in my UI controller?
3. Can I do this.emProvider.get() once the User UI controller is loaded and use it for the entire lifetime of the application?
4. Something totally different?
My understanding is that your app uses Guice Persist.
The answer to this question depends on your use cases; however, you absolutely need to realize one thing:
For as long as an EntityManager is open, its underlying persistence context tracks every single change to each persistent entity.
This means that if you keep an entity manager open for the duration of the application, then whenever you call e.g. User.setLogin(), the change you just made is already regarded as persistent. Now, moving to your update method: calling persist on an entity that is already managed has no effect; however, since you're calling it from a @Transactional method, Guice wraps the call in a transaction, and consequently all the changes are flushed to the database once the method ends.
This means that if you modify multiple entities at once within your app and then call AbstractService.update on one of them, you will actually be saving all the changes your app has made to other entities in the meantime, even if AbstractService.update has not been called on them explicitly.
Using the entity manager-per-transaction approach is indeed much safer. Between transactions, there will be no open persistence context, and as a result all the entities will become detached, which will prevent any updates on them from accidentally being flushed to the database.
However, for the same reason, your update method will need to call em.merge on the entity you want to update in the database. merge basically tells the entity manager: "please put this entity back into the persistence context, and make it have the exact state that the provided entity has". Calling persist makes it look as though you were passing in a new entity, and PK-constraint violations will indeed follow.
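Applied to the question's AbstractService, the update method would then look roughly like this (a sketch reusing the emProvider field from the question):
@Transactional
@Override
public Type update(Type entity) {
    // merge() copies the detached entity's state onto a managed instance
    // (loading it first if needed) and returns that managed instance;
    // the return value, not the argument, is what the context tracks.
    return this.emProvider.get().merge(entity);
}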
Basically, I am trying to understand how to write correct (or "to correctly write"?) transactional code when developing a REST service with JAX-RS and Spring. Also, we're using JOOQ for data access, but that shouldn't be very relevant...
Consider a simple model where we have organisations with these fields: "id", "name", "code", all of which must be unique. There is also a status field.
An organization might be removed at some point. But we don't want to remove the data altogether, because we want to keep it for analytical/maintenance purposes. So we just set the organization's 'status' field to 'REMOVED'.
Because we don't delete the organization row from the table, we can't simply put a unique constraint on the "name" column: we might delete an organization and then create a new one with the same name. But let's assume that codes have to be unique globally, so we DO have a unique constraint on the "code" column.
So with that, let's look at this simple example that creates an organization, performing some checks along the way.
Resource:
@Component
@Path("/api/organizations/{organizationId: [0-9]+}")
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaTypeEx.APPLICATION_JSON_UTF_8)
public class OrganizationResource {
    @Autowired
    private OrganizationService organizationService;
    @Autowired
    private DtoConverter dtoConverter;

    @POST
    public OrganizationResponse createOrganization(@Auth Person person, CreateOrganizationRequest request) {
        if (organizationService.checkOrganizationWithNameExists(request.name())) {
            // this throws a special exception which is intercepted and translated to a response with a 409 status code
            throw Responses.abortConflict("organization.nameExist", ImmutableMap.of("name", request.name()));
        }
        if (organizationService.checkOrganizationWithCodeExists(request.code())) {
            throw Responses.abortConflict("organization.codeExists", ImmutableMap.of("code", request.code()));
        }
        long organizationId = organizationService.create(person.user().id(), request.name(), request.code());
        return dtoConverter.from(organizationService.findById(organizationId));
    }
}
The DAO service looks like this:
@Transactional(DBConstants.SOME_TRANSACTION_MANAGER)
public class OrganizationServiceImpl implements OrganizationService {
    @Autowired
    @Qualifier(DBConstants.SOME_DSL)
    protected DSLContext context;

    @Override
    public long create(long userId, String name, String code) {
        Organization organization = new Organization(null, userId, name, code, OrganizationStatus.ACTIVE);
        OrganizationRecord organizationRecord = JooqUtil.insert(context, organization, ORGANIZATION);
        return organizationRecord.getId();
    }

    @Override
    public boolean checkOrganizationWithNameExists(String name) {
        return checkOrganizationExists(Tables.ORGANIZATION.NAME, name);
    }

    @Override
    public boolean checkOrganizationWithCodeExists(String code) {
        return checkOrganizationExists(Tables.ORGANIZATION.CODE, code);
    }

    private boolean checkOrganizationExists(TableField<OrganizationRecord, String> checkField, String checkValue) {
        return context.selectCount()
                .from(Tables.ORGANIZATION)
                .where(checkField.eq(checkValue))
                .and(Tables.ORGANIZATION.ORGANIZATION_STATUS.ne(OrganizationStatus.REMOVED))
                .fetchOne(DSL.count()) > 0;
    }
}
This brings up some questions:
1. Should I put the @Transactional annotation on the Resource's createOrganization method? Or should I create one more service that talks to the DAO and put @Transactional on its method? Something else?
2. What would happen if two users concurrently send requests with the same "code" field? Before the first transaction is committed, the checks pass successfully, so no 409 response will be sent. Then the first transaction will be committed properly, but the second one will violate the DB constraint, which will throw an SQLException. How do I handle that gracefully? I still want to show a nice error message on the client side, saying that the code is already used. But I can't really parse the SQLException or something... can I?
3. Similar to the previous one, but this time for "name", which has no unique constraint. In this case the second transaction will not violate any constraints, which leads to two organizations with the same name, violating our business constraints.
4. Where can I see/learn tutorials/code/etc. that you consider great examples of how to write correct/reliable REST+DB code with complicated business logic? GitHub/books/blogs, whatever. I've tried to find something like that myself, but most examples just focus on the plumbing: add these libs to Maven, use these annotations, there is your simple CRUD, the end. They don't contain any transactional considerations at all.
UPDATE:
I know about isolation levels and the usual error/isolation matrix (dirty reads, etc.). The problem I have is finding a "production-ready" sample to learn from, or a good book on the subject. I still don't really get how to handle all the errors properly... I guess I need to retry a couple of times if a transaction fails, and then just throw some generic error and implement a client that handles it. But do I really have to use SERIALIZABLE mode whenever I use range queries? It will affect performance greatly. But otherwise, how can I guarantee that the transaction will fail?
Anyway, I've decided that for now I need more time to learn about transactions and DB management in general before tackling this problem...
Generally, without talking about transactionality, an endpoint should only grab parameters from the request and call the service. It shouldn't contain business logic.
It seems your checkXXX methods are part of the business logic, because they throw errors about domain-specific conflicts. Why not put them into the service, in one method, which is (by the way) transactional?
// service code
public Organization createOrganization(String userId, String name, String code) {
    if (this.checkOrganizationWithNameExists(name)) {
        throw ...
    }
    if (this.checkOrganizationWithCodeExists(code)) {
        throw ...
    }
    long organizationId = this.create(userId, name, code);
    return dao.findById(organizationId);
}
I assumed your parameters are Strings, but they can be anything. I'm not sure you want to throw Responses.abortConflict in the service layer, because it seems to be a REST concept, but you can define your own exception types for it if you want.
The endpoint code should look like this; however, it might contain an additional try-catch block which converts the thrown exceptions to error responses:
// endpoint code
@POST
public OrganizationResponse createOrganization(@Auth Person person, CreateOrganizationRequest request) {
    String code = request.code();
    String name = request.name();
    String userId = person.user().id();
    return dtoConverter.from(organizationService.createOrganization(userId, name, code));
}
As for questions 2 and 3, transaction isolation levels are your friends. Put the isolation level high enough; I think 'repeatable read' is the suitable one in your case. Your checkXXX methods will detect if some other transaction has committed entities with the same name or code, and it's guaranteed that the situation stays that way by the time the create method is executed. One more useful read regarding Spring and transaction isolation levels.
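For illustration, setting the isolation level on that transactional service method could look like this (a sketch; ConflictException is a placeholder for whatever exception type you define):
// service code
@Transactional(isolation = Isolation.REPEATABLE_READ)
public Organization createOrganization(String userId, String name, String code) {
    if (this.checkOrganizationWithNameExists(name)) {
        throw new ConflictException("organization.nameExists"); // placeholder exception type
    }
    if (this.checkOrganizationWithCodeExists(code)) {
        throw new ConflictException("organization.codeExists");
    }
    long organizationId = this.create(userId, name, code);
    return dao.findById(organizationId);
}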
In my understanding, the best way to handle DB-level transactions is to use Spring's isolation levels effectively in the DAO layer. Below is sample industry-standard code for your case...
public interface OrganizationService {
    @Retryable(maxAttempts = 3, value = DataAccessResourceFailureException.class, backoff = @Backoff(delay = 1000))
    public boolean checkOrganizationWithNameExists(String name);
}

@Repository
@EnableRetry
public class OrganizationServiceImpl implements OrganizationService {
    @Transactional(isolation = Isolation.READ_COMMITTED)
    @Override
    public boolean checkOrganizationWithNameExists(String name) {
        // your code
        return true;
    }
}
Please pinch me if I'm wrong here
Separation of concerns:
JAX-RS resource (endpoint) layer: just handle the request, invoke the service, and wrap any exception in an appropriate response code (catch and wrap manually, or use an exception mapper).
Service / business layer: expose a transactional method for each unit of work; business errors should be handled as checked exceptions, operational ones as unchecked (subclasses of RuntimeException).
Data access layer: just handle the data access stuff (i.e. get the DB context, execute queries and map the results).
I insist on one thing: the right place for transaction boundaries is where your business methods are defined. A transaction scope must be a business unit of work.
Regarding the concurrency issue, there are two ways to handle this kind of problem: pessimistic or optimistic locking.
Pessimistic:
lock
do your stuff
update
release the lock
Optimistic:
check the version
do your stuff
update if the version is the same, fail otherwise
Pessimistic locking is an issue regarding scalability and performance; the problem with optimistic locking is that you sometimes end up sending an operational error to the end user.
I would personally go with optimistic locking in your case; JOOQ supports it.
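A minimal sketch of the optimistic approach in plain JDBC terms (the table, the version column and the exception type are illustrative assumptions, not the question's actual schema):
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

class OrganizationUpdater {
    // The UPDATE only matches if the row still carries the version we read
    // earlier; a concurrent writer makes executeUpdate() return 0 instead
    // of silently overwriting the row.
    void rename(Connection connection, long id, long expectedVersion, String newName)
            throws SQLException {
        try (PreparedStatement ps = connection.prepareStatement(
                "UPDATE organization SET name = ?, version = version + 1 "
                        + "WHERE id = ? AND version = ?")) {
            ps.setString(1, newName);
            ps.setLong(2, id);
            ps.setLong(3, expectedVersion);
            if (ps.executeUpdate() == 0) {
                throw new IllegalStateException(
                        "organization " + id + " was modified concurrently");
            }
        }
    }
}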
First off, the DAO layer should not even know it's being fronted by a REST web service. Be sure to separate responsibilities.
Keep the @Transactional on the DAO. If you are issuing only a single statement, then you need to decide if you are OK with dirty reads. Basically, figure out what the lowest isolation level is for your application. Every method will start a new transaction (unless called from another method that already started one), and if any exceptions are thrown it will roll back any calls. You can set up a custom ExceptionHandler in your controller to handle SQL data-integrity exceptions (like your "code" insert example).
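With the JAX-RS stack from the question, such a handler could be an ExceptionMapper along these lines (a sketch; which exception type actually bubbles up depends on your JOOQ/Spring exception translation):
import java.sql.SQLIntegrityConstraintViolationException;
import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;

// Maps a unique-constraint violation to a 409 Conflict response instead
// of letting it surface as a generic 500.
@Provider
public class ConstraintViolationMapper
        implements ExceptionMapper<SQLIntegrityConstraintViolationException> {
    @Override
    public Response toResponse(SQLIntegrityConstraintViolationException e) {
        return Response.status(Response.Status.CONFLICT)
                .entity("organization.codeExists")
                .build();
    }
}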
Use a composite primary key that covers (id, name, code, status), so you can have two orgs with the same name, but one will be "CURRENT" and one will be "REMOVED".
I'm implementing several DAO classes for a web project, and for various reasons I have to use JDBC.
Now I'd like to return an entity like this:
public class Customer {
    // instead of int userId
    private User user;
    // instead of int activityId
    private Activity act;
    // ...
}
Using JPA, user and activity would be loaded easily (and automatically, by specifying relations between entities).
But how do I do this using JDBC? Is there a common way to achieve it? Should I load everything in my CustomerDAO? Is it possible to implement lazy initialization for referenced entities?
My first idea was to implement, in my UserDAO:
public void initUser(Customer customer);
and in my ActivityDAO:
public void initActivity(Customer customer);
to initialize the variables in customer.
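A sketch of that idea (the Customer accessors used here are hypothetical, they are not in the snippet above):
// in UserDAO: fill in the Customer's user reference with a second query
public void initUser(Customer customer) {
    User user = findUserById(customer.getUserId()); // hypothetical finder + accessor
    customer.setUser(user);
}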
Active Record route
You could do this with AspectJ ITDs and essentially make your entities into Active Record-like objects.
Basically, you make an aspect that advises classes implementing interfaces called "HasUser" and "HasActivity". Your HasUser and HasActivity interfaces will just define getters.
You will then make aspects that weave in the actual implementations of getUser() and getActivity().
Your aspects will do the actual JDBC work. Although the learning curve for AspectJ is initially steep, it will make your code far more elegant.
You can take a look at one of my answers on an AspectJ ITD Stack Overflow post.
You should also check out Spring's @Configurable, which will autowire dependencies (such as your DataSource or JdbcTemplate) into non-Spring-managed beans.
Of course, the best example of this in action is Spring Roo. Just look at the AspectJ files it generates to get an idea of how you would use @Configurable (granted that Roo uses JPA; make sure to use the active record annotation).
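A rough sketch of the @Configurable idea (the query and method are illustrative, and AspectJ weaving must be enabled for the injection to happen):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Configurable;
import org.springframework.jdbc.core.JdbcTemplate;

// Injection into a non-managed object: with AspectJ weaving enabled,
// Spring wires this bean even when it is created with plain 'new'.
@Configurable
public class Customer {
    @Autowired
    private transient JdbcTemplate jdbcTemplate;

    public int countActivities(long customerId) {
        // the woven-in template does the actual JDBC work
        return jdbcTemplate.queryForObject(
                "SELECT COUNT(*) FROM activity WHERE customer_id = ?",
                Integer.class, customerId);
    }
}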
DAO Route
If you really want to go the DAO route, then you need to do this:
public class Customer {
    // instead of int userId
    private Integer userId;
    // instead of int activityId
    private Integer activityId;
}
This is because in the DAO pattern your entity objects are not supposed to have behavior. Your services and/or DAOs will have to create transfer objects, to which you could attach the lazy loading.
I'm not sure if there is any automated approach to this. Without an ORM, I usually make such getters lazy: reference-type fields are initialized to null by default, i.e. my fetching function loads primitives + Strings and leaves the references as null. Once getUser() is needed, the getter checks whether the field is null, and if so issues another SELECT based on the ID of the customer.
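A sketch of that pattern (the UserDao and its findById method are assumptions used for illustration):
public class Customer {
    private final long userId;
    private final UserDao userDao; // hypothetical DAO used for the lazy fetch
    private User user; // stays null until first access

    public Customer(long userId, UserDao userDao) {
        this.userId = userId;
        this.userDao = userDao;
    }

    // Lazily load the referenced User on first access and cache it,
    // mimicking what a JPA proxy does behind the scenes.
    public User getUser() {
        if (user == null) {
            user = userDao.findById(userId);
        }
        return user;
    }
}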