How to change Persistence Unit dynamically?

I have a Spring MVC + Hibernate + JPA app. I also have 4 different schemas in my DB with similar tables (one for each company).
Now, when I'm using my Hibernate app, can I switch the persistence unit so
that I can use the same form (with the same content) to save data in
all four schemas?
I'm aware that I can switch the persistence unit at runtime, but I want to be able to use the already loaded forms to save data to all four schemas by changing the persistence unit.

I had a similar problem some time ago.
I had 2 identical schemas; the application had to persist to the first or the second depending on some logic.
It was pure Hibernate, but talking in terms of JPA, I suggest having 4 persistence units defined in your persistence.xml:
persistence.xml
<persistence-unit name="PU1">
...
</persistence-unit>
<persistence-unit name="PU2">
...
</persistence-unit>
[...]
and a DAO class with injected EntityManager proxies, one for each PU:
@Repository
public class MyDaoImpl implements MyDao {

    @PersistenceContext(unitName = "PU1")
    private EntityManager em1;

    @PersistenceContext(unitName = "PU2")
    private EntityManager em2;
    ...

    public void saveToPU1(MyEntity e) {
        em1.persist(e);
    }

    public void saveToPU2(MyEntity e) {
        em2.persist(e);
    }
    ...
}
Of course, em1 annotated with @PersistenceContext(unitName = "PU1") is Spring's proxy to the Hibernate session; it becomes open and bound to the current thread only when that thread actually uses it.

I am not sure I understand your problem: of course you can change the persistence unit used at runtime with the Persistence#createEntityManagerFactory(String persistenceUnitName) method.
But if you want to
save data to all four Schemas
then you should repeat your operation (persist, I guess) four times, for example in a private method taking the persistence unit name as a parameter.
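That repetition could be sketched like this; the four unit names PU1..PU4 and the way the entity is reused across units are assumptions for illustration, not part of the original answer:

```java
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class MultiSchemaSaver {

    // assumed unit names; adjust to match your persistence.xml
    static final String[] UNITS = {"PU1", "PU2", "PU3", "PU4"};

    // persist the same entity data into every schema, one unit at a time
    public static void saveToAllSchemas(Object entity) {
        for (String unit : UNITS) {
            EntityManagerFactory emf = Persistence.createEntityManagerFactory(unit);
            EntityManager em = emf.createEntityManager();
            try {
                em.getTransaction().begin();
                em.persist(entity);
                em.getTransaction().commit();
            } finally {
                em.close();
                emf.close();
            }
        }
    }
}
```

In practice you would likely pass a fresh copy of the entity for each unit (an instance persisted in one context gets an identifier assigned), and cache the factories instead of recreating them per call.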
You could introduce a form cache if you want to reuse the already loaded forms, but that is a software architecture question.
As suggested in the Java EE 5 tutorial, from a software design point of view, having a form depend directly on the JPA layer is not a best practice. The other answer suggests it: a DAO could be the solution. It all comes down to your DAOs' lifecycle.
The Core J2EE Patterns book suggests it too (the online reference only mentions the topic briefly; the printed book is better): associating DAOs with a Factory pattern is a good idea. You could recycle the EntityManagerFactory or anything you wish.

Related

@Transactional annotation Spring Boot 2.0 and Hibernate LazyInitializationException

I have the following question. From what I understand, the @Transactional annotation is supposed to keep the session alive, thus enabling lazy fetching of child entities without the need to perform a specific joining query.
I have the following scenario where I do not understand why I'm still getting a LazyInitializationException.
My app runs a resolver in order to provide the various controller services with a resolved object so that it can be used directly.
Said resolver intercepts a header from the request and, using its value, attempts to query the DB in order to fetch the object. Now, the object in question is quite simple in its doings, albeit it has a list of two sub-entities.
In order to perform the resolving action I'm using an extra service where I basically wrap some JpaRepository methods. The complete service is below:
@Service
public class AppClientServiceImpl implements AppClientService {

    private static final Logger LOGGER = LoggerFactory.getLogger(AppClientServiceImpl.class.getCanonicalName());

    private final AppClientRepository repository;

    @Autowired
    public AppClientServiceImpl(AppClientRepository repository) {
        this.repository = repository;
    }

    @Override
    @Transactional(readOnly = true)
    public AppClient getByAppClientId(final String appClientId) {
        LOGGER.debug("Attempting to retrieve appClient with id:: {}", appClientId);
        return repository.findByAppClientId(appClientId);
    }

    @Override
    @Transactional
    public void saveAndFlush(final AppClient appClient) {
        LOGGER.debug("Attempting to save/update appClient:: {}", appClient);
        repository.saveAndFlush(appClient);
    }
}
As you can see, both methods are annotated as @Transactional, meaning that they should keep the session alive in the context of each method.
Now, my main questions are the following:
1) Using the debugger, I see that even at the level of getByAppClientId, the lazily loaded list containing the sub-entities has been resolved just fine.
2) On the resolver itself, where the object has been received from the delegating method, the list fails to be evaluated due to a LazyInitializationException.
3) Finally, in the final controller service method, which is also marked as @Transactional, the same as above occurs, meaning that it eventually fails to do its job (since it performs a get on the list that has failed to initialize).
Based on all the above, I would like to know the best approach to handling this. For once, I do not want to use an eager fetching type, and I would also like to avoid using fetch queries. Also, marking my resolver as @Transactional, thus keeping the session open there as well, is out of the question.
I thought that since @Transactional would keep the session open, the final service method would be able to obtain the list of sub-entities. This seems not to be the case.
Based on all the above, it seems that I need a way for the final service method that gets called (which needs the list on hand) to fetch it somehow.
What would be the best approach to handle this? I've read quite a few posts here, but I cannot make out which is the most accepted method as of Spring Boot 2.0 and Hibernate 5.
Update:
It seems that annotating the sub-entities with the following:
@Fetch(FetchMode.SELECT)
@LazyCollection(LazyCollectionOption.TRUE)
resolves the problem, but I still don't know whether this is the best approach.
You initialize the collection by debugging. The debugger usually represents collections in a special way, using the collection methods that trigger initialization, so that might be the reason why it seems to work fine during debugging. I suppose the resolver runs outside of the scope of getByAppClientId? At that point the session is closed, which is why you see the exception.
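If the collection must be populated before the object leaves the service, one hedged option (an assumption, not part of the original answer) is to touch the lazy list inside the transactional method while the session is still open; getSubEntities() is a hypothetical accessor for the list of sub-entities:

```java
@Override
@Transactional(readOnly = true)
public AppClient getByAppClientId(final String appClientId) {
    AppClient appClient = repository.findByAppClientId(appClientId);
    if (appClient != null) {
        // size() iterates the collection inside the open session,
        // forcing Hibernate to initialize it before the object is returned
        appClient.getSubEntities().size();
    }
    return appClient;
}
```

This keeps the fetch type lazy for everyone else while guaranteeing the resolver receives an initialized list.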
I created Blaze-Persistence Entity Views for exactly that use case. You essentially define DTOs for JPA entities as interfaces and apply them to a query. It supports mapping nested DTOs, collections, etc., essentially everything you'd expect, and on top of that it will improve your query performance, as it generates queries fetching just the data you actually require for the DTOs.
The entity views for your example could look like this:
@EntityView(AppClient.class)
interface AppClientDto {
    String getName();
}
Querying could look like this:
List<AppClientDto> dtos = entityViewManager.applySetting(
    EntityViewSetting.create(AppClientDto.class),
    criteriaBuilderFactory.create(em, AppClient.class)
).getResultList();

@Transactional not working in JPA entity

I have a static method in the entity:
@Transactional
public static void updateState() {
    entityManager().createNativeQuery("UPDATE TABLEA SET hide = 1 WHERE id = 1").executeUpdate();
}
But when I call the method, I catch an exception saying the update statement needs a transaction.
Am I using @Transactional in the wrong way?
It seems like you are trying to make your entity a fat domain model (as opposed to the thin models that are most common in the Java EE world), following the Active Record pattern.
What you are trying to do will not work as-is in Spring.
If you refactor your method to not be static (first problem), then one way to get @Transactional working on a JPA entity is to use the @Configurable annotation from Spring (making it managed by Spring, therefore fixing the second problem), along with load-time weaving and a Java agent. See this and this for more details.
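A rough sketch of what that could look like, assuming AspectJ load-time weaving is configured (e.g. with spring-instrument as the Java agent); the class and column names come from the question, the wiring details are assumptions:

```java
@Configurable
@Entity
public class TableA {

    @Id
    private Long id;

    // injected by Spring via AspectJ weaving, even though JPA/'new' instantiates
    // the entity; transient so it is not treated as a persistent field
    @PersistenceContext
    private transient EntityManager entityManager;

    @Transactional // honored only once the instance is Spring-managed through weaving
    public void updateState() { // note: no longer static
        entityManager
            .createNativeQuery("UPDATE TABLEA SET hide = 1 WHERE id = 1")
            .executeUpdate();
    }
}
```

Without the agent and weaving in place, @Configurable is silently ignored and the original exception remains.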
Maybe you should try with the annotation:
@Transactional(readOnly = false)

Handle multiple EntityManager in Java EE application

I have a Java EE application with about 10 EntityManagers (the number of EMs will probably increase). My application also contains many stateless, stateful, and message-driven beans.
Rather than injecting my EMs into each bean with @PersistenceContext (plus 2 methods to detect which EM to use for the user), I would rather store all of them inside a singleton bean and access it from the other beans. That way, no worries about maintainability.
Nevertheless, is it thread-safe to store EMs inside one singleton bean? Can a bottleneck appear?
Another solution is to create an abstract class that all beans will extend.
Which is the better solution?
An entity manager is not required to be thread-safe, so you shouldn't share one via a singleton. It's the same reason why you should not inject an entity manager into a Servlet, and why a lookup from JNDI in such a web component should return a different instance of the entity manager every time.
In practice, some implementations may provide an entity manager that is thread-safe, so during testing it may seem to work. However, for the sake of portability and to protect yourself against upgrade woes, you should never rely on this.
Instead of inheriting from a common base class, you could define all your entity managers in one bean, and inject that wherever you need an entity manager.
E.g.
@Stateless
public class EntityManagerProviderBean {

    @PersistenceContext(unitName = "foo")
    private EntityManager entityManagerFoo;

    @PersistenceContext(unitName = "bar")
    private EntityManager entityManagerBar;

    public EntityManager getEntityManager() {
        return ... ? entityManagerFoo : entityManagerBar;
    }
}
(where ... is the logic you use to select the right entity manager)
Inject this into a bean needing an entity manager:
@Stateless
public class MyService {

    @EJB
    private EntityManagerProviderBean entityManagerProvider;

    public void doStuff(MyEntity myEntity) {
        // merge() re-attaches and updates; EntityManager has no update() method
        entityManagerProvider.getEntityManager().merge(myEntity);
    }
}
Alternatively the following would perhaps be even neater:
@Stateless
@PersistenceContexts({
    @PersistenceContext(unitName = "foo", name = "fooENC"),
    @PersistenceContext(unitName = "bar", name = "barENC")
})
public class EntityManagerProviderBean {

    @Resource
    private EJBContext context;

    public EntityManager getEntityManager() {
        return (EntityManager) context.lookup(... ? "fooENC" : "barENC");
    }
}
The last example maps all persistence contexts into the ENC of the bean, where they can be conveniently retrieved programmatically.
Unfortunately, people forgot to add tests for the latter syntax to the TCK and subsequently major vendors forgot to implement it (see http://java.net/jira/browse/JPA_SPEC-38 and https://issues.jboss.org/browse/AS7-5549), so test if this works on your server.
Container-managed entity managers are automatically propagated with the current JTA transaction, and EntityManager references that are mapped to the same persistence unit provide access to the persistence context within that transaction. So it's not good practice to share an entity manager from a singleton; apart from concurrency problems, it would result in using the same transaction context for every method you call on your beans.
A simple solution to your need is to inject EntityManagerFactory references into your beans and create EntityManager objects by calling the createEntityManager() method.
The drawback is that you would have to manage transactions manually, no longer relying on the container.
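A hedged sketch of that approach, using bean-managed JTA transactions; the unit name "foo" and the class names are placeholders, not from the original answer:

```java
@Stateless
@TransactionManagement(TransactionManagementType.BEAN)
public class ManualTxService {

    @PersistenceUnit(unitName = "foo")
    private EntityManagerFactory emf;

    @Resource
    private UserTransaction utx;

    public void save(Object entity) throws Exception {
        utx.begin();
        // an EntityManager created inside an active JTA transaction joins it
        EntityManager em = emf.createEntityManager();
        try {
            em.persist(entity);
            utx.commit();
        } catch (Exception e) {
            utx.rollback();
            throw e;
        } finally {
            em.close();
        }
    }
}
```

Each call gets its own short-lived EntityManager, which sidesteps the thread-safety concern entirely.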
Otherwise another approach could be inject all of your entity managers in a main enterprise bean and implement business logic in service beans with methods to which you pass the appropriate managers.
An example of the latter solution:
@Stateless
class MainBean {

    @PersistenceContext EntityManager em1;
    @PersistenceContext EntityManager em2;
    ...
    @EJB WorkerBean1 workerBean1;
    @EJB WorkerBean2 workerBean2;
    ...

    void method1(Object param1, Object param2) {
        workerBean1.method1(em1, param1, param2);
    }

    void method2(Object param1, Object param2, Object param3) {
        workerBean2.method2(em2, param1, param2, param3);
    }
    ...
}

@Stateless
class WorkerBean1 {

    void method1(EntityManager em, Object param1, Object param2) {
        ...
    }
    ...
}

@Stateless
class WorkerBean2 {

    void method2(EntityManager em, Object param1, Object param2, Object param3) {
        ...
    }
    ...
}
Composite persistence units - Java EE
The way to handle multiple entity managers, i.e. multiple persistence units, in Java EE is to use composite persistence units (CPUs). Such a composite persistence unit can be accessed from one single point in the EE web application, a data layer. This needs to be a @Stateless EE bean, though, in order to work with @PersistenceContext.
Composite persistence units have been introduced to make it possible to reuse entity classes among various Java applications. CPUs are a feature of enterprise architecture. I chose EclipseLink as a showcase, as I have positive experience with it from a running production application.
Introduction
In some cases, entities contain general data that is needed across multiple web services in a server landscape. Take for example a general ‘name-address’ entity, a ‘user-password-role’ entity, a ‘document-keyword-index’ entity, etc. A composite persistence unit implementation ensures that the source of each entity definition is specified in only one place (‘single point of definition’). These entity definitions can subsequently be included in each Java web application that needs access to these entities.
Working of composite persistence unit
The working of a composite persistence unit is illustrated by the following tutorial: EclipseLink composite persistence units
The concept of composite persistence units works by first defining member persistence units. Each member persistence unit may be associated with a different database, but the member persistence units can also all refer to the same actual database. I have experience with the latter, where EclipseLink (version 2.6.4) was used in combination with one Postgress database.
Maven is needed to make possible the required modular approach.
Settings in persistence.xml
A composite persistence unit member is defined as follows: program a group of related entities (Java @Entity classes), one by one, in a dedicated Maven module. Also define in this Maven module a composite persistence unit member (important!). The composite unit member PuPersonData refers to this set of related entities that characterizes person data. Define the member persistence unit PuPersonData as:
<persistence-unit name="PuPersonData" transaction-type="JTA">
...
<jta-data-source>jdbc/PostgresDs</jta-data-source>
...
</persistence-unit>
In a second Maven module, define another composite persistence unit member, PuWebTraffic:
<persistence-unit name="PuWebTraffic" transaction-type="JTA">
...
<jta-data-source>jdbc/PostgresDs</jta-data-source>
...
</persistence-unit>
Include here the other entities (Java classes annotated with @Entity) that store data about web transactions, logons, sessions, etc.
Needless to say, the two composite persistence unit members must be disjoint with respect to entities; no overlap is allowed in entity names.
Both persistence unit members have in their XML-definitions the property:
<properties>
<property name="eclipselink.composite-unit.member" value="true"/>
...
</properties>
Composite persistence unit
We now define in a third Maven module the composite persistence unit CPuPersonSessionData that includes both the persistence units members PuPersonData and PuWebTraffic.
<persistence-unit name="CPuPersonSessionData" transaction-type="JTA">
This composite persistence unit CPuPersonSessionData refers to the two persistence unit members, PuPersonData and PuWebTraffic, by means of including the jars that result from compilation of the two pertaining Maven modules.
...
<jar-file>PuPersonData.jar</jar-file>
<jar-file>PuWebTraffic.jar</jar-file>
...
In the XML-definition of the composite persistence unit, the following property needs to be set
<properties>
<property name="eclipselink.composite-unit" value="true"/>
...
</properties>
This setting ensures that the composite persistence unit is treated differently by Java EE than its persistence unit members.
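Put together, the composite unit's descriptor could look roughly like this; the names come from the fragments above, while the element ordering is an assumption sketched from the EclipseLink tutorial:

```xml
<persistence-unit name="CPuPersonSessionData" transaction-type="JTA">
    <jar-file>PuPersonData.jar</jar-file>
    <jar-file>PuWebTraffic.jar</jar-file>
    <properties>
        <property name="eclipselink.composite-unit" value="true"/>
    </properties>
</persistence-unit>
```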
Use of the persistence unit in Java
In the Java web application that is going to store and retrieve entities with both person data and traffic data, only the composite persistence unit is included:
@Stateless
public class DataLayer {

    @PersistenceContext(unitName = "CPuPersonSessionData")
    EntityManager em;
    ...
The normal em operations such as persist, find, and merge can now be performed on each entity contained in one of the composite unit members.
Under Payara, no XA-transactions were needed for this composite persistence unit to address the entities pertaining to each of the persistence unit members.
Maven
The Maven parent POM file needs to contain the specifications for the pertaining modules.
...
<modules>
<module>PersonData</module>
<module>WebTraffic</module>
<module>PersonSessionData</module>
</modules>
...
The POM-file of each module needs to be configured as a normal Maven-project, referring to the parent POM-file.
Pitfalls:
You need to configure the Maven multi-module project correctly, which can be somewhat tricky. Each composite persistence unit member constitutes a separate Maven module. Also the composite persistence unit is a separate Maven module. The members need to be compiled first, in Maven sequence.
The ‘jars’ in the composite persistence unit need to be found when compiling the module of the composite persistence unit.
The entities of each composite persistence unit member need to be available in the resulting ‘jar’, directly in the ‘classes’ directory (adding extra paths to the entities, via Maven, is possible but complex).
The ‘jars’ of the persistence unit members need to be available in the ‘classes’ directory for the composite persistence unit to find them.
The benefit gained is a neat enterprise data layer that works with reusable entities, each with one central definition. Moreover, it is possible to perform cross-unit native SQL queries; I got this to work as well.
The documentation states that cross-unit native queries will not work when the composite persistence unit members run on different actual databases. This still needs to be verified.

How to refresh JPA entities when backend database changes asynchronously?

I have a PostgreSQL 8.4 database with some tables and views, which are essentially joins on some of the tables. I used NetBeans 7.2 (as described here) to create REST-based services derived from those views and tables and deployed them to a GlassFish 3.1.2.2 server.
There is another process which asynchronously updates contents in some of the tables used to build the views. I can directly query the views and tables and see that these changes have occurred correctly. However, when pulled from the REST-based services, the values are not the same as those in the database. I am assuming this is because JPA has cached local copies of the database contents on the GlassFish server and JPA needs to refresh the associated entities.
I have tried adding a couple of methods to the AbstractFacade class NetBeans generates:
public abstract class AbstractFacade<T> {

    private Class<T> entityClass;
    private String entityName;
    private static boolean _refresh = true;

    public static void refresh() { _refresh = true; }

    public AbstractFacade(Class<T> entityClass) {
        this.entityClass = entityClass;
        this.entityName = entityClass.getSimpleName();
    }

    private void doRefresh() {
        if (_refresh) {
            EntityManager em = getEntityManager();
            em.flush();
            for (EntityType<?> entity : em.getMetamodel().getEntities()) {
                if (entity.getName().contains(entityName)) {
                    try {
                        em.refresh(entity);
                        // log success
                    } catch (IllegalArgumentException e) {
                        // log failure ... typically complains entity is not managed
                    }
                }
            }
            _refresh = false;
        }
    }
    ...
}
I then call doRefresh() from each of the find methods NetBeans generates. What normally happens is that an IllegalArgumentException is thrown, stating something like Can not refresh not managed object: EntityTypeImpl#28524907:MyView [ javaType: class org.my.rest.MyView descriptor: RelationalDescriptor(org.my.rest.MyView --> [DatabaseTable(my_view)]), mappings: 12].
So I'm looking for suggestions on how to correctly refresh the entities associated with the views so that they are up to date.
UPDATE: It turns out my understanding of the underlying problem was not correct. It is somewhat related to another question I posted earlier, namely that the view had no single field which could be used as a unique identifier. NetBeans required me to select an ID field, so I just chose one part of what should have been a multi-part key. This exhibited the behavior that all records with a particular ID field appeared identical, even though the database had records with the same ID field where the rest differed. JPA didn't go any further than looking at what I told it was the unique identifier and simply pulled the first record it found.
I resolved this by adding a unique identifier field (I never was able to get the multi-part key to work properly).
I recommend adding an @Startup @Singleton class that establishes a JDBC connection to the PostgreSQL database and uses LISTEN and NOTIFY to handle cache invalidation.
Update: here's another interesting approach, using pgq and a collection of workers for invalidation.
Invalidation signalling
Add a trigger on the table that's being updated that sends a NOTIFY whenever an entity is updated. On PostgreSQL 9.0 and above this NOTIFY can contain a payload, usually a row ID, so you don't have to invalidate your entire cache, just the entity that has changed. On older versions where a payload isn't supported, you can either add the invalidated entries to a timestamped log table that your helper class queries when it gets a NOTIFY, or just invalidate the whole cache.
Your helper class now LISTENs on the NOTIFY events the trigger sends. When it gets a NOTIFY event, it can invalidate individual cache entries (see below) or flush the entire cache. You can listen for notifications from the database with PgJDBC's listen/notify support. You will need to unwrap any connection-pooler-managed java.sql.Connection to get to the underlying PostgreSQL implementation so you can cast it to org.postgresql.PGConnection and call getNotifications() on it.
As an alternative to LISTEN and NOTIFY, you could poll a change-log table on a timer, and have a trigger on the problem table append changed row IDs and change timestamps to the change-log table. This approach will be portable, except for the need for a different trigger for each DB type, but it's inefficient and less timely. It requires frequent, inefficient polling, and still has a time delay that the listen/notify approach does not. In PostgreSQL you can use an UNLOGGED table to reduce the costs of this approach a little.
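The LISTEN side could be sketched roughly as follows; the channel name "entity_changed", the trigger that NOTIFYs on it, and the eviction hook are all assumptions for illustration:

```java
import java.sql.Connection;
import java.sql.Statement;
import org.postgresql.PGConnection;
import org.postgresql.PGNotification;

public class CacheInvalidationListener {

    public void listen(Connection conn) throws Exception {
        // conn must be a plain (non-pooled) connection, or unwrapped from the pool
        PGConnection pgConn = conn.unwrap(PGConnection.class);

        try (Statement st = conn.createStatement()) {
            st.execute("LISTEN entity_changed");
        }

        while (!Thread.currentThread().isInterrupted()) {
            // getNotifications() returns pending notifications, or null if none
            PGNotification[] notifications = pgConn.getNotifications();
            if (notifications != null) {
                for (PGNotification n : notifications) {
                    String rowId = n.getParameter(); // payload: changed row id (9.0+)
                    evict(rowId);
                }
            }
            Thread.sleep(500); // crude polling interval; a dedicated thread is assumed
        }
    }

    private void evict(String rowId) {
        // e.g. emf.getCache().evict(MyView.class, Long.valueOf(rowId));
    }
}
```

This would run from the @Startup @Singleton bean suggested above, on its own background thread.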
Cache levels
EclipseLink/JPA has a couple of levels of caching.
The 1st level cache is at the EntityManager level. If an entity is attached to an EntityManager by persist(...), merge(...), find(...), etc, then the EntityManager is required to return the same instance of that entity when it is accessed again within the same session, whether or not your application still has references to it. This attached instance won't be up-to-date if your database contents have since changed.
The 2nd level cache, which is optional, is at the EntityManagerFactory level and is a more traditional cache. It isn't clear whether you have the 2nd level cache enabled. Check your EclipseLink logs and your persistence.xml. You can get access to the 2nd level cache with EntityManagerFactory.getCache(); see Cache.
@thedayofcondor showed how to flush the 2nd level cache with:
em.getEntityManagerFactory().getCache().evictAll();
but you can also evict individual objects with the evict(java.lang.Class cls, java.lang.Object primaryKey) call:
em.getEntityManagerFactory().getCache().evict(theClass, thePrimaryKey);
which you can use from your @Startup @Singleton NOTIFY listener to invalidate only those entries that have changed.
The 1st level cache isn't so easy, because it's part of your application logic. You'll want to learn about how the EntityManager, attached and detached entities, etc work. One option is to always use detached entities for the table in question, where you use a new EntityManager whenever you fetch the entity. This question:
Invalidating JPA EntityManager session
has a useful discussion of handling invalidation of the entity manager's cache. However, it's unlikely that an EntityManager cache is your problem, because a RESTful web service is usually implemented using short EntityManager sessions. This is only likely to be an issue if you're using extended persistence contexts, or if you're creating and managing your own EntityManager sessions rather than using container-managed persistence.
You can disable caching entirely (see: http://wiki.eclipse.org/EclipseLink/FAQ/How_to_disable_the_shared_cache%3F ), but be prepared for a fairly large performance loss.
Otherwise, you can clear the cache programmatically with
em.getEntityManagerFactory().getCache().evictAll();
You can map it to a servlet so you can call it externally; this is better if your database is modified externally very seldom and you just want to be sure JPA will pick up the new version.
Just a thought, but how do you obtain your EntityManager/Session/whatever?
If you queried the entity in one session, it will be detached in the next one, and you will have to merge it back into the persistence context to get it managed again.
Trying to work with detached entities may result in those not-managed exceptions; you should re-query the entity, or you could try merge (or similar methods).
JPA doesn't do any caching by default; you have to explicitly configure it. I believe it's a side effect of the architectural style you have chosen: REST. I think caching is happening at the web servers, proxy servers, etc. I suggest you read this and debug more.

Initialize JPA-like entities with JDBC

I'm implementing several DAO classes for a web project, and for some reasons I have to use JDBC.
Now I'd like to return an entity like this:
public class Customer {
    // instead of int userId
    private User user;
    // instead of int activityId
    private Activity act;
    // ...
}
Using JPA, user and activity would be loaded easily (and automatically, by specifying relations between entities).
But how, using JDBC? Is there a common way to achieve this? Should I load everything in my CustomerDAO? Is it possible to implement lazy initialization for referenced entities?
My first idea was to implement in my UserDAO:
public void initUser(Customer customer);
and in my ActivityDAO:
public void initActivity(Customer customer);
to initialize variables in customer.
Active Record route
You could do this with AspectJ ITDs and essentially make your entities into Active Record-like objects.
Basically, you make an aspect that advises classes implementing interfaces called "HasUser" and "HasActivity". Your HasUser and HasActivity interfaces will just define getters.
You will then make aspects that weave in the actual implementations of getUser() and getActivity().
Your aspects will do the actual JDBC work. Although the learning curve on AspectJ is initially steep, it will make your code far more elegant.
You can take a look at one of my answers on an AspectJ ITD Stack Overflow post.
You should also check out Spring's @Configurable, which will autowire your dependencies (such as your data source or JDBC template) into non-Spring-managed beans.
Of course, the best example to see this in action is Spring Roo. Just look at the AspectJ files it generates to get an idea (granted, Roo uses JPA) of how you would use @Configurable (make sure to use the active record annotation).
DAO Route
If you really want to go the DAO route, then you need to do this:
public class Customer {
    // instead of int userId
    private Integer userId;
    // instead of int activityId
    private Integer activityId;
}
This is because in the DAO pattern your entity objects are not supposed to have behavior. Your services and/or DAOs will have to make transfer objects, to which you could attach the lazy loading.
I'm not sure if there is any automated approach for this. Without an ORM, I usually define lazy getters: my reference types are initialized to null by default, i.e. my fetching function loads primitives + Strings and leaves the references as null. Once getUser() is needed, the getter checks whether the field is null and, if so, issues another select statement based on the ID of the customer.
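That idea can be sketched as follows; User and UserDAO are minimal stand-ins for the real classes, with the actual SQL hidden behind the DAO:

```java
class User {
    final int id;
    User(int id) { this.id = id; }
}

interface UserDAO {
    // would run e.g. "SELECT ... FROM users WHERE id = ?" via JDBC
    User findById(int id);
}

class Customer {
    private final int userId;      // primitive, loaded by the Customer query
    private User user;             // reference, left null until first access
    private final UserDAO userDao;

    Customer(int userId, UserDAO userDao) {
        this.userId = userId;
        this.userDao = userDao;
    }

    User getUser() {
        if (user == null) {                  // lazy initialization on first call
            user = userDao.findById(userId); // extra SELECT only when needed
        }
        return user;
    }
}
```

Note that this is not thread-safe; synchronize getUser() if Customer instances are shared across threads.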
