Changing entity schema name before SessionFactory initialization - Java

During migration from Hibernate 3 to Hibernate 4 I ran into a problem.
I use Spring and Hibernate in my project, and during application startup I sometimes want to change the schema of my entity classes. With Hibernate 3 and Spring I did this by overriding the postProcessAnnotationConfiguration method of LocalSessionFactoryBean, like this:
@SuppressWarnings("unchecked")
@Override
protected void postProcessAnnotationConfiguration(AnnotationConfiguration config)
{
    Iterator<Table> it = config.getTableMappings();
    while (it.hasNext())
    {
        Table table = it.next();
        table.setSchema(schemaConfigurator.getSchemaName(table.getSchema()));
    }
}
This worked perfectly for me. But in the hibernate4 LocalSessionFactoryBean class, all of the post-process methods were removed. Some people suggest using the ServiceRegistryBuilder class, but I want to configure my session factory in Spring XML, and I don't know how to do that with ServiceRegistryBuilder. So maybe someone can suggest a solution to my problem.

Looking at the source code helped me find a solution. The LocalSessionFactoryBean class has a method called buildSessionFactory (newSessionFactory in the previous version). With the previous version of Hibernate (version 3), some operations were performed before this method call. You can see them in the official docs:
// Tell Hibernate to eagerly compile the mappings that we registered,
// for availability of the mapping information in further processing.
postProcessMappings(config);
config.buildMappings();
As I understand it (I may be wrong), this buildMappings method parses all classes that are specified as mapped classes or placed in packagesToScan and creates a Table representation of each of them. After that, the postProcessConfiguration method was called.
With Hibernate 4 we don't have such postProcess methods, but we can override the buildSessionFactory method like this:
@Override
protected SessionFactory buildSessionFactory(LocalSessionFactoryBuilder sfb) {
    // Compile the mappings first so that the Table representations exist.
    sfb.buildMappings();
    // For my task we need this: rewrite the schema of every mapped table.
    Iterator<Table> iterator = getConfiguration().getTableMappings();
    while (iterator.hasNext()) {
        Table table = iterator.next();
        if (table.getSchema() != null && !table.getSchema().isEmpty()) {
            table.setSchema(schemaConfigurator.getSchemaName(table.getSchema()));
        }
    }
    return super.buildSessionFactory(sfb);
}
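Since the goal was to keep Spring XML configuration, the subclass can simply be registered in place of the stock factory bean. A minimal sketch, assuming the override above lives in a hypothetical class called SchemaAwareSessionFactoryBean with a setter for the schemaConfigurator:

<bean id="sessionFactory" class="com.example.hibernate.SchemaAwareSessionFactoryBean">
    <property name="dataSource" ref="dataSource"/>
    <property name="packagesToScan" value="com.example.domain"/>
    <!-- the schemaConfigurator used in buildSessionFactory above -->
    <property name="schemaConfigurator" ref="schemaConfigurator"/>
</bean>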

Related

How can I avoid Stream<X> return type method in JpaRepository loading all data into memory with EclipseLink JPA?

We are using Spring Data 2.4.4 + EclipseLink 2.7.0 (JPA 2.2) in our project.
Recently we have been developing a feature that lets users download data as xlsx, and when I tested the interface with a large dataset, it ran out of memory (OutOfMemoryError), unsurprisingly. So we are considering using a Stream-returning method in JpaRepository, expecting EclipseLink to return a Stream backed by CursoredStream or ScrollableCursor. However, it seems to behave just like fetching a List.
To verify, I defined a method that fetches all job orders from the database:
@Query("select jo from JobOrder jo order by jo.oid")
Stream<JobOrder> streamAll();
And wrapped it in a transaction:
@Repository
public class JobOrderTestDAO {

    @Autowired
    private JobOrderRepository repository;

    @Transactional(readOnly = true)
    public Stream<JobOrder> testGetAllByStream() {
        return repository.streamAll();
    }
}
Finally, in the test, I limit the stream to 10 elements and print their oids to the console. If a cursor were used as the container, results should be returned immediately.
@Autowired
private JobOrderTestDAO testDAO;

@Test
void testGetAllByStream() {
    Stream<JobOrder> joStream = testDAO.testGetAllByStream();
    joStream.limit(10).forEach(System.out::println);
    joStream.close();
}
However, no results were returned, and we only found the memory exploding. We checked the source code: EclipseLink does not seem to provide a real streaming implementation of getResultStream(), which is supposed to let providers "provide additional capabilities".
default Stream<X> getResultStream() {
    return getResultList().stream();
}
For now we are using a somewhat tricky workaround: downgrading JPA to 2.1.x, since Spring Data's StreamExecutor then explicitly calls the cursor-based code path.
protected Object doExecute(final AbstractJpaQuery query, JpaParametersParameterAccessor accessor) {
    if (!SurroundingTransactionDetectorMethodInterceptor.INSTANCE.isSurroundingTransactionActive()) {
        throw new InvalidDataAccessApiUsageException(NO_SURROUNDING_TRANSACTION);
    }
    Query jpaQuery = query.createQuery(accessor);
    // JPA 2.2 on the classpath
    if (streamMethod != null) {
        return ReflectionUtils.invokeMethod(streamMethod, jpaQuery);
    }
    // Fall back to legacy stream execution
    PersistenceProvider persistenceProvider = PersistenceProvider.fromEntityManager(query.getEntityManager());
    // The implementation here is cursor-based
    CloseableIterator<Object> iter = persistenceProvider.executeQueryWithResultStream(jpaQuery);
    return StreamUtils.createStreamFromIterator(iter);
}
It may not be good practice to exclude a jar that matches our versions and re-include one that is out of date. Thus, we are looking for a solution that would let us keep JpaRepository and JpaSpecificationExecutor, instead of coding directly against ExpressionBuilder with a cursor underlying the Stream.
I have the same issue. What I found is that spring-data-jpa, starting from version 1.11.8, changed the implementation of JpaQueryExecution.doExecute: instead of running persistenceProvider.executeQueryWithResultStream, it calls the Query.getResultStream method. The default implementation of getResultStream is getResultList().stream(). That means that instead of real streaming with scrollable cursors, it tries to load all the data into memory. EclipseLink does not override the default behavior of getResultStream as of its current version, 3.0.
A few options could be used here:
Instead of Spring Data, go directly to the persistence layer the way spring-data did in versions before 1.11.8.
Instead of EclipseLink, use Hibernate or any other persistence provider with full support for JPA 2.2 features.
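Alternatively, the cursor-backed behavior the question hoped for can be requested from EclipseLink explicitly through a query hint, bypassing getResultStream. A minimal sketch, assuming a plain EntityManager and the JPQL from the question; this sits outside JpaRepository, so it is a workaround rather than a drop-in replacement:

import javax.persistence.EntityManager;
import javax.persistence.Query;
import org.eclipse.persistence.config.HintValues;
import org.eclipse.persistence.config.QueryHints;
import org.eclipse.persistence.queries.CursoredStream;

public class JobOrderCursorReader {

    // Reads job orders through an EclipseLink CursoredStream instead of a List.
    public void readAll(EntityManager em) {
        Query query = em.createQuery("select jo from JobOrder jo order by jo.oid");
        query.setHint(QueryHints.CURSOR, HintValues.TRUE);
        CursoredStream cursor = (CursoredStream) query.getSingleResult();
        try {
            while (cursor.hasNext()) {
                JobOrder jo = (JobOrder) cursor.next();
                System.out.println(jo);
                cursor.releasePrevious(); // free rows already consumed
            }
        } finally {
            cursor.close();
        }
    }
}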

touch equivalent for Hibernate entity

I'd like to implement a repository method void touch(MyEntity myEntity) which forces an SQL UPDATE of the entity's columns to their current values. (The reason behind this is an ON UPDATE trigger which needs to be invoked at a certain point of execution.) The ideal use case is:
void serviceMethod(Long myEntityId) {
    MyEntity myEntity = myEntityRepository.findOne(myEntityId);
    ...
    myEntityRepository.touch(myEntity);
    ...
}
There are already similar questions on SO whose answers don't work for me: Force update in Hibernate (my entity is detached), Implementing "touch" on JPA entity? (making some harmless change works, but it is not general and hurts code readability), Hibernate Idempotent Update (a similar example).
I am aware of the session interceptor method findDirty and also of CustomEntityDirtinessStrategy, both described in this article by Vlad Mihalcea. However, to use findDirty I would have to override the session interceptor, which is not possible from within a repository method since the interceptor is a final field assigned to the session at session creation. And CustomEntityDirtinessStrategy comes from the SessionFactory, which is global. I rather need some one-shot solution that temporarily considers one concrete entity of one concrete class dirty.
The best working solution so far is to put an invalid entity snapshot (an array of nulls) into the persistence context, so that the subsequent logic in flush() evaluates the entity as differing from its snapshot and forces an update. This works:
@Override
@Transactional
public void touch(final T entity) {
    SessionImpl session = (SessionImpl) em.getDelegate();
    session.update(entity);
    StatefulPersistenceContext pctx = (StatefulPersistenceContext) session.getPersistenceContext();
    Serializable id = session.getIdentifier(entity);
    EntityPersister persister = session.getEntityPersister(null, entity);
    EntityKey entityKey = session.generateEntityKey(id, persister);
    int length = persister.getPropertyNames().length;
    // Replace the cached snapshot with an all-null array so that flush()
    // sees every property as dirty and issues an UPDATE.
    Field entitySnapshotsByKeyField = FieldUtils.getField(pctx.getClass(), "entitySnapshotsByKey", true);
    Map<EntityKey, Object> entitySnapshotsByKey = (Map<EntityKey, Object>) ReflectionUtils.getField(entitySnapshotsByKeyField, pctx);
    entitySnapshotsByKey.put(entityKey, new Object[length]);
    session.flush();
    em.refresh(entity);
}
The advice in Force update in Hibernate didn't work for me because session.evict(entity) removes the entitySnapshotsByKey entry entirely, which causes the subsequent org.hibernate.event.internal.DefaultFlushEntityEventListener#getDatabaseSnapshot to load a fresh entity from the db. The question is 9 years old and I'm not sure it still applies to the current version of Hibernate (mine is 5.2.17).
I am not satisfied with such a hacky solution, though. Is there some straightforward way, or something simpler I could do?
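One less invasive avenue that might be worth checking, assuming the entity has a @Version column (an assumption not stated in the question): a forced version increment issues a real UPDATE on the row, which should fire an ON UPDATE trigger. A sketch, not the approach from the question:

@Override
@Transactional
public void touch(final T entity) {
    // Reattach the detached entity, then force a version bump.
    // OPTIMISTIC_FORCE_INCREMENT issues an UPDATE of the version column
    // against the row, which fires per-row ON UPDATE triggers.
    T managed = em.merge(entity);
    em.lock(managed, LockModeType.OPTIMISTIC_FORCE_INCREMENT);
    em.flush();
}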

@Transactional annotation, Spring Boot 2.0 and Hibernate LazyInitializationException

I have the following question. From what I understand, the @Transactional annotation is supposed to keep the session alive, thus enabling lazy fetching of child entities without the need to perform a specific join query.
I have the following scenario where I do not understand why I'm still getting a LazyInitializationException.
My app runs a resolver in order to provide the various controller services with a resolved object so that it can be used directly.
Said resolver intercepts a header from the request and, using its value, attempts to query the db in order to fetch the object. The object in question is quite simple in its doings, albeit it has a list of two sub-entities.
In order to perform the resolving action I'm using an extra service where I basically wrap some JpaRepository methods. The complete service is below:
@Service
public class AppClientServiceImpl implements AppClientService {

    private static final Logger LOGGER = LoggerFactory.getLogger(AppClientServiceImpl.class.getCanonicalName());

    private final AppClientRepository repository;

    @Autowired
    public AppClientServiceImpl(AppClientRepository repository) {
        this.repository = repository;
    }

    @Override
    @Transactional(readOnly = true)
    public AppClient getByAppClientId(final String appClientId) {
        LOGGER.debug("Attempting to retrieve appClient with id:: {}", appClientId);
        return repository.findByAppClientId(appClientId);
    }

    @Override
    @Transactional
    public void saveAndFlush(final AppClient appClient) {
        LOGGER.debug("Attempting to save/update appClient:: {}", appClient);
        repository.saveAndFlush(appClient);
    }
}
As you can see, both methods are annotated as @Transactional, meaning they should keep the session alive in the context of that method.
Now, my main questions are the following:
1) Using the debugger, I'm seeing that even at the level of getByAppClientId, the lazily loaded list containing the sub-entities has been resolved just fine.
2) On the resolver itself, where the object has been received from the delegating method, the list fails to be evaluated due to a LazyInitializationException.
3) Finally, in the final controller service method, which is also marked as @Transactional, the same as above occurs, meaning that it eventually fails to do its job (since it performs a get of the list that has failed to initialize).
Based on all the above, I would like to know the best approach to handling this. For one, I do not want to use an eager fetch type, and I would also like to avoid fetch queries. Also, marking my resolver as @Transactional, thus keeping the session open there as well, is out of the question.
I thought that since @Transactional keeps the session open, the final service method would be able to obtain the list of sub-entities. This seems not to be the case.
Based on all the above, it seems that I need a way for the final service method that gets called (which needs the list at hand) to fetch it somehow.
What would be the best approach to handle this? I've read quite a few posts here, but I cannot make out which is the most accepted method as of Spring Boot 2.0 and Hibernate 5.
Update:
It seems that annotating the sub-entity with the following:
@Fetch(FetchMode.SELECT)
@LazyCollection(LazyCollectionOption.TRUE)
resolves the problem, but I still don't know whether this is the best approach.
You initialize the collection by debugging. The debugger usually represents collections in a special way, using the collection methods, which triggers initialization; that might be why it seems to work fine while debugging. I suppose the resolver runs outside the scope of getByAppClientId? At that point the session is closed, which is why you see the exception.
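If the goal is simply to hand the resolver a fully loaded object without eager mappings or fetch queries, the lazy collection can be touched inside the still-open transactional method. A minimal sketch, where the getter name getSubEntities() is an assumption for illustration:

@Override
@Transactional(readOnly = true)
public AppClient getByAppClientId(final String appClientId) {
    AppClient appClient = repository.findByAppClientId(appClientId);
    if (appClient != null) {
        // Force the lazy collection to load while the session is open,
        // so callers outside the transaction can read it safely.
        Hibernate.initialize(appClient.getSubEntities());
    }
    return appClient;
}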
I created Blaze-Persistence Entity Views for exactly that use case. You essentially define DTOs for JPA entities as interfaces and apply them to a query. It supports mapping nested DTOs, collections, etc., essentially everything you'd expect, and on top of that it will improve your query performance, as it generates queries that fetch just the data you actually require for the DTOs.
The entity views for your example could look like this:
@EntityView(AppClient.class)
interface AppClientDto {
    String getName();
}
Querying could look like this:
List<AppClientDto> dtos = entityViewManager.applySetting(
        EntityViewSetting.create(AppClientDto.class),
        criteriaBuilderFactory.create(em, AppClient.class)
).getResultList();

How do I refactor Hibernate entities?

My use case is as follows: I've inherited a project that is using Hibernate. The entity I'm focused on right now is, for the purposes of this exercise, closed to modification. The object of the exercise is to replace the use of the legacy entity with an unrelated implementation that is better suited to the new requirements.
The goal is to be able to move functionality from the old entity to the new one incrementally.
As is, the use of the legacy entity looks something like:
//...
final Session currentSession = sessionFactory().getCurrentSession();
{
    LegacyEntity oldAndBusted = get(currentSession, "12345");
    oldAndBusted.change(...);
    put(currentSession, oldAndBusted);
}

LegacyEntity get(final Session currentSession, String businessId) {
    return (LegacyEntity) currentSession
        .createQuery("from PurpleMonkeyDishwasher where businessId = ?")
        .setParameter(0, businessId)
        .uniqueResult();
}

void put(final Session currentSession, LegacyEntity changed) {
    currentSession.saveOrUpdate(changed);
}
With configuration magic hidden away in an hbm.xml file:
<class name="LegacyEntity" table="PurpleMonkeyDishwasher">
    <!-- stuff -->
</class>
How do I arrange analogous code for a new entity mapped to the same table
BuzzwordCompliantEntity get(final Session currentSession, String businessId);
void put(final Session currentSession, BuzzwordCompliantEntity changed);
without breaking the code paths that are still using LegacyEntity in the same process?
"The entity I'm focused on right now is, for the purposes of this exercise, closed to modification. ... The goal is to be able to move functionality from the old entity to the new incrementally." I find this contradictory. When replacing a class by another, I have always had success by: 1) incrementally changing the old class API to be a subset of the new class API, 2) renaming the old type to have the same name and package as the new class, 3) removing the old class (that we renamed in step 2). While doing all of this, I rely as much as possible on the refactoring capabilities of the IDE.

Spring AOP for database operation

I am working on a Spring/Hibernate project, and the database is Oracle. I have a DAO layer for persistence-related operations.
In all my tables I have create_date and update_date columns, representing the timestamps when a row is inserted and updated in the table, respectively.
The requirement is that whenever any insert/update operation happens, I have to update those two timestamp columns in the table the request is meant for. For example, say my DAO layer has two methods, m1 and m2, responsible for affecting tables t1 and t2 respectively. When m1 is invoked, the timestamp columns of t1 must be updated: on insert, the create_date column, and on any update, the update_date column.
I have some knowledge of Spring AOP, so I was thinking of using AOP to implement the above requirement, though I am not quite sure whether it can be achieved that way.
Please let me know if I can use AOP to fulfill this requirement. And if it is possible, please give me some pointers on how to implement it.
I have implemented an update-date feature for one of the modules in my application using Spring AOP. Please find the code below for your reference; hope this helps.
I wonder if one can have pointcuts for variables as well. I know it might not be possible with Spring's AspectJ implementation, but any workarounds, guys? :P
/**
 * @author Vikas.Chowdhury
 * @version $Revision$ Last changed by $Author$ on $Date$ as $Revision$
 */
@Aspect
@Component
public class UpdateDateAspect
{
    @Autowired
    private ISurveyService surveyService;

    // Note: storing the id in an instance field is not thread-safe,
    // since the single aspect instance is shared across requests.
    Integer surveyId = null;

    Logger gtLogger = Logger.getLogger(this.getClass().getName());

    @Pointcut("execution(* com.xyz.service.impl.*.saveSurvey*(..))")
    public void updateDate()
    {
    }

    @Around("updateDate()")
    public Object myAspect(final ProceedingJoinPoint pjp)
    {
        // retrieve the runtime method arguments (dynamic)
        Object returnVal = null;
        for (final Object argument : pjp.getArgs())
        {
            if (argument instanceof SurveyHelper)
            {
                SurveyHelper surveyHelper = (SurveyHelper) argument;
                surveyId = surveyHelper.getSurveyId();
            }
        }
        try
        {
            returnVal = pjp.proceed();
        }
        catch (Throwable e)
        {
            gtLogger.debug("Unable to use JoinPoint :(");
        }
        return returnVal;
    }

    @After("updateDate()")
    public void updateSurveyDateBySurveyId() throws Exception
    {
        if (surveyId != null)
        {
            surveyService.updateSurveyDateBySurveyId(surveyId);
        }
    }
}
I'd use a Hibernate interceptor instead; that's what they are for. For example, the entities that need such fields could implement the following interface:
public interface Auditable {
    Date getCreated();
    void setCreated(Date created);
    Date getModified();
    void setModified(Date modified);
}
Then the interceptor always sets the modified field on save, and only sets the created field when it's not already set.
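A minimal sketch of such an interceptor, assuming the classic Interceptor API (method names per org.hibernate.EmptyInterceptor) and that the entity properties are named created and modified; the state array must be modified in place and true returned so Hibernate persists the change:

import java.io.Serializable;
import java.util.Date;
import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

public class AuditInterceptor extends EmptyInterceptor {

    // Called on INSERT: set created (if absent) and modified.
    @Override
    public boolean onSave(Object entity, Serializable id, Object[] state,
                          String[] propertyNames, Type[] types) {
        if (!(entity instanceof Auditable)) {
            return false;
        }
        boolean changed = false;
        for (int i = 0; i < propertyNames.length; i++) {
            if ("modified".equals(propertyNames[i])) {
                state[i] = new Date();
                changed = true;
            } else if ("created".equals(propertyNames[i]) && state[i] == null) {
                state[i] = new Date();
                changed = true;
            }
        }
        return changed;
    }

    // Called on UPDATE: bump modified.
    @Override
    public boolean onFlushDirty(Object entity, Serializable id, Object[] currentState,
                                Object[] previousState, String[] propertyNames, Type[] types) {
        if (!(entity instanceof Auditable)) {
            return false;
        }
        for (int i = 0; i < propertyNames.length; i++) {
            if ("modified".equals(propertyNames[i])) {
                currentState[i] = new Date();
                return true;
            }
        }
        return false;
    }
}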
Even though you have been asking for a Spring AOP solution, I would like to point out that the same result can be achieved using database triggers, e.g. automatically setting the created timestamp during INSERT operations and the modified timestamp during UPDATE statements.
This may be a good solution, especially if not all of your DB calls go through the AOP-captured logic (e.g. when a method does not fit the pointcut pattern, or when the code is bypassed completely by a standalone SQL client), since you could enforce the modified timestamp even when somebody updates the entries from a different application.
It has the drawback that you need to define the triggers on all affected tables, though.
It should be possible with Spring AOP using a @Before advice. If you pass an entity to a create method, have an advice set the create_date; for an update method, the update_date. You may want to consider the following to make your job easier (see the sketch after this list):
Have all entities implement a common interface to set create_date and update_date. This allows you to have a common advice without having to resort to reflection.
Have a naming convention to identify create and update methods on your DAOs. This will make your pointcuts simpler.
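A minimal sketch combining both suggestions, reusing the Auditable interface from the previous answer; the DAO package and the save*/update* naming convention are assumptions for illustration:

import java.util.Date;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class AuditTimestampAspect {

    // Any save* DAO method taking an Auditable entity: stamp create_date and update_date.
    @Before("execution(* com.example.dao.*.save*(..)) && args(entity)")
    public void stampCreateDate(Auditable entity) {
        entity.setCreated(new Date());
        entity.setModified(new Date());
    }

    // Any update* DAO method taking an Auditable entity: stamp update_date only.
    @Before("execution(* com.example.dao.*.update*(..)) && args(entity)")
    public void stampUpdateDate(Auditable entity) {
        entity.setModified(new Date());
    }
}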
