I am using Spring transactions, so the transaction is still active when the POJO to DTO conversion occurs.
I would like to prevent Dozer from triggering lazy loading, so that hidden SQL queries never occur: all fetching has to be done explicitly via HQL (to get the best control over performance).
Is this a good practice (I can't find it documented anywhere)?
How can it be done safely?
I tried this before the DTO conversion:
PlatformTransactionManager tm = (PlatformTransactionManager) SingletonFactoryProvider.getSingletonFactory().getSingleton("transactionManager");
tm.commit(tm.getTransaction(new DefaultTransactionDefinition()));
I don't know what happens to the transaction, but the Hibernate session doesn't get closed, and the lazy loading still occurs.
I tried this:
SessionFactory sf = (SessionFactory) SingletonFactoryProvider.getSingletonFactory().getSingleton("sessionFactory");
sf.getCurrentSession().clear();
sf.getCurrentSession().close();
And it does prevent lazy loading, but is it good practice to manipulate the session directly in the application layer (which is called the "facade" in my project)? Which negative side effects should I fear? (I've already seen that tests involving POJO -> DTO conversions can no longer be run through the AbstractTransactionalDataSource Spring test classes, because those classes try to trigger a rollback on a transaction which is no longer linked to an active session.)
I've also tried setting propagation to NOT_SUPPORTED or REQUIRES_NEW, but it reuses the current Hibernate session and doesn't prevent lazy loading.
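For reference, a minimal sketch of what that attempt looked like (the facade method and DTO names here are illustrative, not from my actual code):

// Sketch only: even with NOT_SUPPORTED, the Hibernate session opened earlier
// stays bound to the thread, so attached proxies can still initialize lazily.
@Transactional(propagation = Propagation.NOT_SUPPORTED)
public ClientDto convertToDto(Client client) {
    return mapper.map(client, ClientDto.class);
}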
The only generic solution I have found for managing this (after looking into Custom Converters, Event Listeners & Proxy Resolvers) is to implement a custom field mapper. I found this functionality tucked away in the Dozer API (I don't believe it is documented in the User Guide).
A simple example is as follows:
// Note: in Hibernate 4+, AbstractPersistentCollection lives in
// org.hibernate.collection.internal rather than org.hibernate.collection.
import org.dozer.CustomFieldMapper;
import org.dozer.classmap.ClassMap;
import org.dozer.fieldmap.FieldMap;
import org.hibernate.collection.AbstractPersistentCollection;

public class MyCustomFieldMapper implements CustomFieldMapper
{
    public boolean mapField(Object source, Object destination, Object sourceFieldValue,
                            ClassMap classMap, FieldMap fieldMapping)
    {
        // Check if the field is a Hibernate collection proxy
        if (!(sourceFieldValue instanceof AbstractPersistentCollection)) {
            // Allow Dozer to map as normal
            return false;
        }
        // Check if the field is already initialized
        if (((AbstractPersistentCollection) sourceFieldValue).wasInitialized()) {
            // Allow Dozer to map as normal
            return false;
        }
        // Tell Dozer the field is already mapped; the destination field is
        // simply left unset (null). Note that assigning to the 'destination'
        // parameter would have no effect, since Java passes references by value.
        return true;
    }
}
This will return any non-initialized PersistentSet as null. I do this so that when they are passed to the client I can differentiate between a null (not loaded) collection and an empty collection. This allows me to define generic behaviour in the client to either use the pre-loaded set or make another service call to retrieve the set (if required). Additionally, if you decide to eagerly load any collections within the service layer, they will be mapped as usual.
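For instance, the client-side check can be as simple as this (a sketch; the DTO accessor and service method are assumed names, not from the code above):

// Hypothetical client-side handling: null means "not loaded",
// an empty set means "loaded but empty"
Set<ChildDto> children = parentDto.getChildren();
if (children == null) {
    // Collection was not fetched on the server; retrieve it with a dedicated call
    children = childService.findByParentId(parentDto.getId());
}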
I inject the custom field mapper using Spring:
<bean id="dozerMapper" class="org.dozer.DozerBeanMapper" lazy-init="false">
<property name="mappingFiles">
...
</property>
<property name="customFieldMapper" ref="dozerCustomFieldMapper" />
</bean>
<bean id="dozerCustomFieldMapper" class="my.project.MyCustomFieldMapper" />
I hope this helps anyone searching for a solution for this, as I failed to find any examples when searching the Internet.
A variation on the popular version above that makes sure to catch PersistentBag, PersistentSet, and any other persistent collection type, since they all extend AbstractPersistentCollection:
public class LazyLoadSensitiveMapper implements CustomFieldMapper {

    public boolean mapField(Object source, Object destination, Object sourceFieldValue,
                            ClassMap classMap, FieldMap fieldMapping) {
        // Not derived from a Hibernate persistent collection: let Dozer map as normal
        if (!(sourceFieldValue instanceof AbstractPersistentCollection)) {
            return false;
        }
        // Already initialized: let Dozer map as normal
        if (((AbstractPersistentCollection) sourceFieldValue).wasInitialized()) {
            return false;
        }
        // Uninitialized: skip mapping, leaving the destination field null
        return true;
    }
}
I didn't get the above to work (probably a version difference). However, this works fine:
import org.hibernate.Hibernate;

public class HibernateInitializedFieldMapper implements CustomFieldMapper {

    public boolean mapField(Object source, Object destination, Object sourceFieldValue,
                            ClassMap classMap, FieldMap fieldMapping) {
        // If the field is initialized, Dozer will continue mapping; otherwise skip it.
        // Hibernate.isInitialized handles entity proxies and collections alike.
        return !Hibernate.isInitialized(sourceFieldValue);
    }
}
Have you considered disabling lazy loading altogether?
It doesn't really seem to jibe with the pattern you state you would like to use:
"I would like to prevent Dozer from triggering lazy loading, so that hidden SQL queries never occur: all fetching has to be done explicitly via HQL (to get the best control over performance)."
This suggests you never want to use lazy loading.
Dozer and the Hibernate-backed beans you pass to it are blissfully ignorant of each other; all Dozer knows is that it is accessing properties in the bean, and the Hibernate-backed bean is responding to calls to get() a lazy-loaded collection just as it would if you were accessing those properties yourself.
Any tricks to make Dozer aware of the Hibernate proxies in your beans or vice versa would, IMO, break down the layers of your app.
If you don't want any "hidden SQL queries" fired at unexpected times, simply disable lazy-loading.
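For instance, per association with JPA annotations (a sketch; with .hbm.xml mappings the equivalent is lazy="false"):

// Eager fetching disables the lazy proxy for this association,
// so accessing the collection never fires a hidden query later
@OneToMany(mappedBy = "parent", fetch = FetchType.EAGER)
private Set<Child> children;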
A short version of this mapper would be:
return sourceFieldValue instanceof AbstractPersistentCollection
        && !((AbstractPersistentCollection) sourceFieldValue).wasInitialized();
Using a CustomFieldMapper may not be a good idea, since it is invoked for every field of your source class, while our concern is only the lazy association mapping (the child object list). Instead, we can return null from the getter of the entity object:
public Set<ChildObject> getChild() {
    if (Hibernate.isInitialized(child)) {
        return child;
    } else {
        return null;
    }
}
I have a service that gets data from the database, where one column stores an encrypted value.
After fetching from the DAO, I update the value of the property to the decrypted value and then send it as the response for the API.
I assume the entity has change tracking enabled even for select queries, because after I get the data, the DB gets updated with the decrypted password. I have googled and found that using the EntityManager solves the problem, but for that implementation I would have to make code changes in many entities.
From this link, I see that we have to write a custom stateless bean and inject it into the code, but that doesn't look right. Please suggest the best approach to handle this problem.
My DAO:
@Repository
public interface EnvironmentDao extends JpaRepository<Environment, Long> {
    // custom methods go here with native queries
}
My Service:
@Override
public List<Environment> getEnvironmentsByIds(List<Long> environmentIds) throws Exception {
    if (environmentIds == null || environmentIds.size() < 1) {
        return null;
    }
    return decryptPassword(environmentDao.findAllById(environmentIds));
}
Inside the decryptPassword method, I just loop through all the records and set the decrypted password like this:
e.setDB_Password(encryptionService.decrypt(e.getDB_Password()));
One case I noticed yesterday: for a similar entity, an error triggered a DB save and at that point the values got updated; after fixing the error, this change no longer happened.
Please help me, as I am not an expert in Java and am taking a long time to analyze this without understanding it. In C# I would use .AsNoTracking(), but I don't know Java well and am fiddling around.
I tried the following in the Service:
@Autowired
EntityManager entityManager;
In the method:
Optional<Environment> environment = environmentDao.findById(id);
entityManager.detach(environment.get());
return managePassword(environment.get(), false);
I would suggest two options to overcome the entity being updated unintentionally:
Instead of returning the entity itself, I would suggest creating a DTO class, instantiating it, and setting the relevant properties on the DTO instance, so that no changes are made to the entity itself. The code would look something like:
public List<EnvironmentDTO> getEnvironmentsByIds(List<Long> environmentIds) throws Exception {
    if (environmentIds == null || environmentIds.size() < 1) {
        return null;
    }
    return createEnvironmentDTOs(environmentDao.findAllById(environmentIds));
}

private List<EnvironmentDTO> createEnvironmentDTOs(List<Environment> environments) {
    return environments.stream().map(env -> {
        EnvironmentDTO envDto = new EnvironmentDTO();
        // Copy all relevant fields to the DTO (you can even use a mapper
        // library for this, e.g. http://modelmapper.org/)
        envDto.setDB_Password(encryptionService.decrypt(env.getDB_Password()));
        return envDto;
    }).collect(Collectors.toList());
}
If you want to return the entity no matter what, instead of creating a DTO class and an instance from it, you can detach the entity so that changes to it will not be reflected to the database. What you need to do is detach the entity after you are done with decrypting the password and setting it back on the entity: entityManager.detach(environment)
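A minimal sketch of that second option, reusing the names from your question (the decryptPassword step is folded into the loop here):

public List<Environment> getEnvironmentsByIds(List<Long> environmentIds) {
    List<Environment> environments = environmentDao.findAllById(environmentIds);
    for (Environment env : environments) {
        // Detach first so the decrypted value is never flushed back to the DB
        entityManager.detach(env);
        env.setDB_Password(encryptionService.decrypt(env.getDB_Password()));
    }
    return environments;
}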
I'd like to implement a repository method void touch(MyEntity myEntity) which forces an SQL UPDATE of the entity's columns to their current values. (The reason behind this is an ON UPDATE trigger which needs to be invoked at some point of execution.) The ideal use case is:
void serviceMethod(Long myEntityId) {
    MyEntity myEntity = myEntityRepository.findOne(myEntityId);
    ...
    myEntityRepository.touch(myEntity);
    ...
}
There are already similar questions on SO which don't work for me: Force update in Hibernate (my entity is detached), Implementing "touch" on JPA entity? (making some harmless change works, but is not general and has a bad impact on code readability), Hibernate Idempotent Update (similar example).
I am aware of the session interceptor method findDirty and also of CustomEntityDirtinessStrategy, both described in this article by Vlad Mihalcea. However, it seems that to use findDirty I would have to override the session interceptor, which is not possible from within a repository method, since the interceptor is a final field assigned to the session at session creation. And a CustomEntityDirtinessStrategy comes from the SessionFactory, which is global. I rather need some one-shot solution to temporarily consider one concrete entity of one concrete class dirty.
The best working solution so far is to put an invalid entity snapshot (an array of nulls) into the persistence context, so that the subsequent logic in flush() evaluates the entity as differing from the snapshot and forces an update. This works:
@Override
@Transactional
public void touch(final T entity) {
    SessionImpl session = (SessionImpl) em.getDelegate();
    session.update(entity);
    StatefulPersistenceContext pctx = (StatefulPersistenceContext) session.getPersistenceContext();
    Serializable id = session.getIdentifier(entity);
    EntityPersister persister = session.getEntityPersister(null, entity);
    EntityKey entityKey = session.generateEntityKey(id, persister);
    int length = persister.getPropertyNames().length;
    // Overwrite the cached snapshot with nulls so flush() sees every property as dirty
    Field entitySnapshotsByKeyField = FieldUtils.getField(pctx.getClass(), "entitySnapshotsByKey", true);
    Map<EntityKey, Object> entitySnapshotsByKey =
            (Map<EntityKey, Object>) ReflectionUtils.getField(entitySnapshotsByKeyField, pctx);
    entitySnapshotsByKey.put(entityKey, new Object[length]);
    session.flush();
    em.refresh(entity);
}
The advice in Force update in Hibernate didn't work for me because session.evict(entity) removes the entitySnapshotsByKey entry entirely, which causes the subsequent org.hibernate.event.internal.DefaultFlushEntityEventListener#getDatabaseSnapshot to load a fresh entity from the db. That question is 9 years old, and I'm not sure whether it's applicable to the current version of Hibernate (mine is 5.2.17).
I am not satisfied with such a hacky solution, though. Is there a straightforward way, or something simpler I could do?
I have the following question. From what I understand, the @Transactional annotation is supposed to keep the session alive, thus enabling lazy fetching of child entities without the need to perform a specific join query.
I have the following scenario where I do not understand why I'm still getting a LazyInitializationException.
My app runs a resolver in order to provide the various controller services with a resolved object so that it can be used directly.
Said resolver intercepts a header from the request and, using its value, attempts to query the db in order to fetch the object. Now, the object in question is quite simple in its doings, albeit it has a list of two sub-entities.
In order to perform the resolving action, I'm using an extra service where I basically wrap some JpaRepository methods. The complete class is below:
@Service
public class AppClientServiceImpl implements AppClientService {

    private static final Logger LOGGER =
            LoggerFactory.getLogger(AppClientServiceImpl.class.getCanonicalName());

    private final AppClientRepository repository;

    @Autowired
    public AppClientServiceImpl(AppClientRepository repository) {
        this.repository = repository;
    }

    @Override
    @Transactional(readOnly = true)
    public AppClient getByAppClientId(final String appClientId) {
        LOGGER.debug("Attempting to retrieve appClient with id:: {}", appClientId);
        return repository.findByAppClientId(appClientId);
    }

    @Override
    @Transactional
    public void saveAndFlush(final AppClient appClient) {
        LOGGER.debug("Attempting to save/update appClient:: {}", appClient);
        repository.saveAndFlush(appClient);
    }
}
As you can see, both methods are annotated as @Transactional, meaning that they should keep the session alive in the context of that method.
Now, my main questions are the following:
1) Using the debugger, I'm seeing that even at the level of getByAppClientId, the lazily loaded list containing the sub-entities has been resolved just fine.
2) On the resolver itself, where the object has been received from the delegating method, the list fails to be evaluated due to a LazyInitializationException.
3) Finally, in the final controller service method, which is also marked as @Transactional, the same as above occurs, meaning that this eventually fails to do its job (since it's performing a get on the list that has failed to initialize).
Based on all the above, I would like to know the best approach for handling this. For one, I do not want to use an eager fetch type, and I would also like to avoid using fetch queries. Also, marking my resolver as @Transactional, thus keeping the session open there as well, is out of the question.
I thought that since the @Transactional would keep the session open, the final service method would be able to obtain the list of sub-entities. This seems not to be the case.
It seems that I need a way for the final service method that gets called (which needs the list at hand) to fetch it somehow.
What would be the best approach to handle this? I've read quite a few posts here, but I cannot make out which is the most accepted method as of Spring Boot 2.0 and Hibernate 5.
Update:
It seems that annotating the sub-entities with the following:
@Fetch(FetchMode.SELECT)
@LazyCollection(LazyCollectionOption.TRUE)
resolves the problem, but I still don't know whether this is the best approach.
You initialize the collection by debugging. The debugger usually represents collections in a special way, calling the collection methods, which triggers initialization; that might be why everything seems to work fine while debugging. I suppose the resolver runs outside of the scope of getByAppClientId? At that point the session is closed, which is why you see the exception.
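If you need the collection to be readable after the transactional method returns, one option (a sketch, not from the code above; Hibernate.initialize is the standard Hibernate API, while getSubEntities is an assumed accessor name) is to initialize it explicitly while the session is still open:

@Override
@Transactional(readOnly = true)
public AppClient getByAppClientId(final String appClientId) {
    AppClient appClient = repository.findByAppClientId(appClientId);
    // Force-initialize the lazy collection while the session is open,
    // so callers outside the transaction can read it safely
    Hibernate.initialize(appClient.getSubEntities());
    return appClient;
}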
I created Blaze-Persistence Entity Views for exactly that use case. You essentially define DTOs for JPA entities as interfaces and apply them to a query. It supports mapping nested DTOs, collections, etc., essentially everything you'd expect, and on top of that it will improve your query performance, as it generates queries that fetch just the data you actually require for the DTOs.
The entity views for your example could look like this:
@EntityView(AppClient.class)
interface AppClientDto {
    String getName();
}
Querying could look like this:
List<AppClientDto> dtos = entityViewManager.applySetting(
        EntityViewSetting.create(AppClientDto.class),
        criteriaBuilderFactory.create(em, AppClient.class)
).getResultList();
I'm trying to figure out why the Jackson JSON serialization of a collection of 250 objects is taking 40 seconds, and I think I have narrowed it down to SDN lazy loading. I'm using @Fetch, but it still seems as if it's asking the database for the delegate for every attribute of every node in the collection. Please ignore any typos, as I had to hand-type this since copy-paste isn't an option. Rest assured the class compiles as expected. The (simplified) class being serialized:
@NodeEntity
public class NodeWithDelegate {

    @RelatedTo(type = "REL_NAME", direction = Direction.OUTGOING)
    @Fetch private DelegateNode delegate;

    private DelegateNode getInitializedDelegate() {
        if (delegate == null) {
            delegate = new DelegateNode();
        }
        return delegate;
    }

    public String getDelegateAttribute1() {
        return delegate == null ? null : delegate.getAttribute1();
    }

    public void setDelegateAttribute1(String attribute1) {
        getInitializedDelegate().setAttribute1(attribute1);
    }

    ....

    public String getDelegateAttribute15() {
        return delegate == null ? null : delegate.getAttribute15();
    }

    public void setDelegateAttribute15(String attribute15) {
        getInitializedDelegate().setAttribute15(attribute15);
    }
}
The DelegateNode class is exactly what you would expect, just a simple @NodeEntity POJO containing 15 String or Integer or Boolean attributes.
So two questions really:
How can I tell for sure whether an object is actually being eagerly loaded? I'm using Eclipse.
For debugging purposes, if the objects are all eagerly loaded and I put a breakpoint between the fetching of the collection from the database and the serializer which calls all the delegate getters, and while paused shut down the database, should it work? Is there any reason the objects would need to talk to the database at this point if it's all eagerly loaded?
I guess I should mention I'm using the REST API for Neo4j.
Many thanks in advance!
I am assuming you are using the 3.x version of Spring Data Neo4j.
This version is not very well optimized for the REST API. If you enable logging of the Cypher queries, you will see many of them. Example for log4j:
log4j.category.org.springframework.data.neo4j.support.query=DEBUG
You can work around this limitation by using a custom Cypher query and mapping the result with the @QueryResult annotation, for example:
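A rough sketch of that approach under SDN 3.x conventions (the query, labels, and result fields here are assumptions, not from your code):

// Hypothetical projection: fetch node and delegate attributes in one round trip
@QueryResult
public interface NodeWithDelegateView {
    @ResultColumn("attr1")  String getDelegateAttribute1();
    @ResultColumn("attr15") String getDelegateAttribute15();
}

public interface NodeWithDelegateRepository extends GraphRepository<NodeWithDelegate> {
    @Query("MATCH (n:NodeWithDelegate)-[:REL_NAME]->(d) " +
           "RETURN d.attribute1 AS attr1, d.attribute15 AS attr15")
    Iterable<NodeWithDelegateView> findAllWithDelegates();
}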
Using the logging, you should see your objects being loaded.
It should, unless there is something "lazy" in the DelegateNode itself.
I was trying to use the Hibernate @Size validation on a OneToMany collection which is lazily initialized. If I create the parent entity with children added to this collection, the validation is applied when trying to persist.
But if I simply find the parent entity and then do a getChildren(), the validation is not applied at all. I even tried putting the annotation on the getter. So I am using @Size(max = 1), but Hibernate still doesn't throw any exception even if there is more than one child. Even EAGER fetching does not help.
As of now I had to put the validation logic in the getter myself, but obviously this is not the clean way. Kindly let me know if someone has faced this issue before and whether there is an elegant way of doing this.
Event-based validation is triggered on persist, update, and remove. These are the events JPA defines for Bean Validation to occur (refer to the JSR-317 and JSR-338 specifications for more details). There is no validation when loading entities/associations from the database; the assumption is that persisted data has already been validated. If you need validation in your scenario, you do indeed need to validate manually.
Hibernate Validator provides two TraversableResolvers out of the box, which are enabled automatically depending on your environment. The first is DefaultTraversableResolver, which always returns true for isReachable() and isCascadable(). The second is JPATraversableResolver, which is enabled when Hibernate Validator is used in combination with JPA 2.
Create your own implementation of TraversableResolver, or use DefaultTraversableResolver, and configure Hibernate Validator accordingly.
import java.lang.annotation.ElementType;
import javax.validation.Path;
import javax.validation.Path.Node;
import javax.validation.TraversableResolver;

public class MyTraversableResolver implements TraversableResolver {

    @Override
    public boolean isReachable(
            Object traversableObject,
            Node traversableProperty,
            Class<?> rootBeanType,
            Path pathToTraversableObject,
            ElementType elementType) {
        return true;
    }

    @Override
    public boolean isCascadable(
            Object traversableObject,
            Node traversableProperty,
            Class<?> rootBeanType,
            Path pathToTraversableObject,
            ElementType elementType) {
        return true;
    }
}
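A minimal sketch of plugging the resolver in through the standard Bean Validation bootstrap API (your setup may instead configure it via validation.xml or your framework):

import javax.validation.Validation;
import javax.validation.Validator;

// Bootstrap a Validator that uses the custom resolver, so lazy
// associations are treated as reachable and therefore get validated
Validator validator = Validation.byDefaultProvider()
        .configure()
        .traversableResolver(new MyTraversableResolver())
        .buildValidatorFactory()
        .getValidator();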