Java - Sync different entities between Client-Server database (JPA2+EJB3)

Can someone help me, please?
I am trying to synchronize N "client" databases with 1 big "server" database (a kind of repository/backup database).
So far, the synchronization itself works fine.
However, it is now required that the "server" also be a "client" application, with the same Java classes, models, entities, business logic, etc.
I am trying to find a way for this server database to hold all the content from all clients without making major changes to my entities (because I would also need to apply those changes on the client side).
This is an example of what I tried, but I could not figure out how to implement it with JPA and serialization through EJB remotes.
@Entity
@Table(name = "crud_sync")
public class CrudSync implements Serializable {
    private static final long serialVersionUID = 1L;
    @Id private Long id;
    private String data;
    // ...omitted...
}
Suppose I have this CrudSync entity.
On each of the N client databases, I would have a table with the columns:
ID - Number - PK
Data - Varchar
etc...
On the single server side, I would like to have a table with:
ID - Number - PK
ClientPK - Number // the ID in the client database table
ClientId - Number // the client code (1, 2, 3, 4, ...)
// UQ (ClientPK, ClientId)
Data - Varchar
etc...
Or even a composite PK:
ClientPK - Number - PK // the ID in the client database table
ClientId - Number - PK // the client code (1, 2, 3, 4, ...)
Data - Varchar
etc...
But I am not sure how to achieve that using the same entities on both sides, if it is even possible.
Another solution might be to create separate JAR/EAR files for client and server.
If I do that, what should I change in the entities/JPA annotations?
Which classes should be separated into the client and server JAR/EAR?
Any other suggestion on how to build an (N clients <-> 1 server) database?
It is mandatory that clients work offline, so they need to run their own database. It is also mandatory to have the same application on the server, because it will be possible to create entities on the server and send them to a client (and vice versa).
I am using JBoss AS 7.1, JPA 2, EJB 3.1.
Thanks in advance.
Anon.

Off the cuff, I'm not sure it's possible to use the same JPA entities to do that. The logic is probably going to be sophisticated enough that you will want it to reside in the application rather than in the database.
That said, you could use a decorator pattern so the server and client applications can share the same core JPA entities. Then have server-specific JPA entities that decorate the core entities and persist the data required to manage the synchronization between the client and server applications. A decorator object simply holds a reference to the core object, plus whatever data it needs to facilitate the push and pull of data between server and client.
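A minimal sketch of that idea, assuming a hypothetical server-side table crud_sync_server that records which client a row came from; the class, column names and unique constraint here are illustrative, not part of the original design:
import java.io.Serializable;
import javax.persistence.*;

// Server-only entity that "decorates" the shared CrudSync entity.
// The client application never sees this class; it keeps using CrudSync as-is.
@Entity
@Table(name = "crud_sync_server",
       uniqueConstraints = @UniqueConstraint(columnNames = {"client_id", "client_pk"}))
public class ServerCrudSync implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // Which client this row came from (1, 2, 3, ...).
    @Column(name = "client_id", nullable = false)
    private Long clientId;

    // The primary key of the row in the client's own crud_sync table.
    @Column(name = "client_pk", nullable = false)
    private Long clientPk;

    // The shared core entity, stored on the server under its own server-side id.
    @OneToOne(cascade = CascadeType.ALL)
    @JoinColumn(name = "crud_sync_id")
    private CrudSync payload;

    // getters/setters omitted
}
The client keeps persisting plain CrudSync rows; only the server module knows about the decorating entity, so the shared entity classes stay untouched.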
Hope this helps

Related

ID of java objects not synchronizing with database

The only link I found that's close to what I am experiencing is this one:
How do you synchronize the id of a java object to its associated db row?
and there's not much of a solution in it.
My problem is that my Java objects aren't updated after being added to my database, despite the .commit():
em.getTransaction().begin();
System.out.println(eleve.getID());
em.persist(eleve);
em.getTransaction().commit();
System.out.println(eleve.getID());
which refers to this class:
public class Eleve {
    private String _nom;
    private String _prenom;
    private float _ptsMerite;
    @Id
    private int _IDEleve;
and yields this output:
0
0
I think I've done everything properly when it comes to persistence, since it does create the object in the database (MySQL) with correct IDs, which I've set to auto-increment.
I am using javax.persistence for everything (annotations and such).
Did you try adding the @GeneratedValue annotation to your ID field?
There are four possible strategies you can choose from:
GenerationType.AUTO: The JPA provider will choose an appropriate strategy for the underlying database.
GenerationType.IDENTITY: Relies on an auto-increment column in your database.
GenerationType.SEQUENCE: Relies on a database sequence.
GenerationType.TABLE: Uses a generator table in the database.
More info: https://www.baeldung.com/jpa-strategies-when-set-primary-key
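For the Eleve class above, a minimal sketch of the fix (assuming the @Entity annotation that was omitted from the snippet, and GenerationType.IDENTITY since the column is a MySQL auto-increment column):
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Eleve {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY) // let MySQL's auto-increment assign the ID
    private int _IDEleve;

    private String _nom;
    private String _prenom;
    private float _ptsMerite;

    public int getID() {
        return _IDEleve;
    }
}
With that in place, eleve.getID() after commit() should return the database-assigned value instead of 0.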
If you ever switch to a more powerful framework, it will likely manage your transactions for you (CMT), so you can't (or don't want to) commit every time you want to access the ID of a new entity. In these cases you can use EntityManager#flush to synchronize the entity manager with the database.
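For example, a small sketch inside a container-managed transaction (no explicit commit):
em.persist(eleve);
em.flush();                        // forces the INSERT, so the generated key is read back
System.out.println(eleve.getID()); // the ID is now populated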

How to dynamically add Entity in Hibernate?

I'm a Java developer. I'm using Spring 4.0.1 and Hibernate 4.2.21. I have a class as follows:
@Entity
@Inheritance(...)
public abstract class Feature {
    @Id
    @GeneratedValue
    protected Long id;
    ...
}
Now I have many classes like the following:
Label.java class:
@Entity
public class Label extends Feature {
    protected String str;
    ...
}
Point.java class:
@Entity
public class Point extends Feature {
    protected Integer intg;
    ...
}
I have more than 20 entity classes that extend the Feature class. Is there any way to add these classes (such as Label and Point) to the project dynamically, without writing them by hand?
Update:
For example, Hibernate reads data from a database and then, according to this data, creates the models.
Is it possible?
How would I do it?
I think a database design that needs to change dynamically is not a good one. It sounds verbose and inconsistent. Look at your domain again and try to design proper entity relationships that won't change at runtime.
You can try to collect the data needed to build the model and generate a Hibernate hbm.xml file for each entity (it is XML and easy to generate with Java after reading the data you describe in your update).
After that, you can programmatically create a Hibernate Configuration object, following this: http://docs.jboss.org/hibernate/orm/3.3/reference/en/html/session-configuration.html#configuration-programmatic
I think that approach can achieve what you want, if I understood your question correctly.
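A rough sketch of the programmatic configuration step, assuming the generated hbm.xml files were written to some directory (the directory and the no-argument buildSessionFactory() call, which is deprecated in Hibernate 4.x but still works, are just for illustration):
import java.io.File;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class DynamicSessionFactoryBuilder {

    // Builds a SessionFactory from mapping files generated at runtime.
    public static SessionFactory buildFrom(File generatedMappingDir) {
        Configuration cfg = new Configuration();
        cfg.configure(); // reads hibernate.cfg.xml for connection settings
        for (File mapping : generatedMappingDir.listFiles()) {
            if (mapping.getName().endsWith(".hbm.xml")) {
                cfg.addFile(mapping); // e.g. Label.hbm.xml, Point.hbm.xml
            }
        }
        return cfg.buildSessionFactory();
    }
}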
I think you want to generate your entity classes at runtime instead of writing the Java files, compiling them, and so on.
If this is your requirement, you can use a bytecode generator such as Javassist to generate and annotate your class files at runtime. Then you can persist them to your tables using JPA, Hibernate, or any other ORM framework.
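A rough sketch of what that could look like with Javassist (the package of Feature and the single String field are assumptions; you would still have to register the generated class with your Hibernate/JPA configuration):
import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtField;
import javassist.Modifier;
import javassist.bytecode.AnnotationsAttribute;
import javassist.bytecode.ClassFile;
import javassist.bytecode.ConstPool;
import javassist.bytecode.annotation.Annotation;

public class EntityGenerator {

    // Generates a subclass of Feature with a single String field and an @Entity annotation.
    public static Class<?> generateLabelLikeEntity(String className, String fieldName) throws Exception {
        ClassPool pool = ClassPool.getDefault();

        CtClass cc = pool.makeClass(className);
        cc.setSuperclass(pool.get("com.example.Feature")); // assumed package of Feature

        CtField field = new CtField(pool.get("java.lang.String"), fieldName, cc);
        field.setModifiers(Modifier.PROTECTED);
        cc.addField(field);

        // Attach the javax.persistence.Entity annotation to the generated class.
        ClassFile classFile = cc.getClassFile();
        ConstPool constPool = classFile.getConstPool();
        AnnotationsAttribute attr =
                new AnnotationsAttribute(constPool, AnnotationsAttribute.visibleTag);
        attr.addAnnotation(new Annotation("javax.persistence.Entity", constPool));
        classFile.addAttribute(attr);

        return cc.toClass(); // load the generated class into the current class loader
    }
}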
As I understand it, you need to develop a tool that collects the names of tables that have a one-to-one relationship with the Feature table.
My suggestion is as follows (tested with Oracle):
1) From your DB, get the metadata of the tables that reference your Feature table.
The code below will print your Label, Point, etc. tables that have a foreign-key relationship to your table. If you only want to generate a subset (irrelevant tables might have this relationship too), you could use a common foreign-key column name and filter out unrelated tables with the help of such a marker.
Connection connection = jdbcTemplate.getDataSource().getConnection();
DatabaseMetaData metaData = connection.getMetaData();
ResultSet exportedKeys = metaData.getExportedKeys(connection.getCatalog(), "<your_schema_name>", "FEATURE");
while (exportedKeys.next()) {
    String fkTableName = exportedKeys.getString("FKTABLE_NAME");
    String fkColumnName = exportedKeys.getString("FKCOLUMN_NAME");
    System.out.println("[fkTableName:" + fkTableName + "], [fkColumnName:" + fkColumnName + "]");
}
exportedKeys.close();
2) For each table of concern collected above, get the table metadata for the columns and their types.
ResultSet columns = metaData.getColumns(connection.getCatalog(), "<your_schema_name>", "<your_table_name>", null);
while (columns.next()) {
    String columnName = columns.getString("COLUMN_NAME");
    String typeName = columns.getString("TYPE_NAME");
    System.out.println("[columnName:" + columnName + "], [typeName:" + typeName + "]");
}
columns.close();
3) From the result of step 2, generate your Java classes, with fields, getters/setters, annotations, etc. Then copy them into your source directory. You know the rest :)
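A minimal sketch of step 3, assuming the JDBC type names have already been mapped to Java type names (that mapping is not shown here):
import java.util.Map;

public class EntitySourceGenerator {

    // columns maps a field name to a Java type name, e.g. "str" -> "String".
    public static String generateSource(String className, Map<String, String> columns) {
        StringBuilder src = new StringBuilder();
        src.append("import javax.persistence.Entity;\n\n");
        src.append("@Entity\n");
        src.append("public class ").append(className).append(" extends Feature {\n");
        for (Map.Entry<String, String> column : columns.entrySet()) {
            src.append("    protected ").append(column.getValue())
               .append(" ").append(column.getKey()).append(";\n");
        }
        src.append("}\n");
        return src.toString(); // write this to <className>.java in your source directory
    }
}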
Hope this is helpful.
I think you can use Hibernate Reverse Engineering to generate entities for all the database tables. Please refer to this link, which explains the step-by-step process of generating entities from a database using Hibernate reverse engineering.
Do not repeat yourself.
If you really need those classes, use an IDE (like Eclipse) to generate them. Or use generics and inheritance to create a single class that is capable of storing Strings as well as Integers.
But if you do not actually need classes, generate SQL (not JPQL or HQL) and store the data in java.util.Map and similar data structures.
Classes are good for:
type safety
combining logic (methods) with data (fields)
describing relationships
In your case you might only need:
store structured data at runtime.
I think you could do this with Eclipse, but the classes would have to be modified a bit to preserve the inheritance hierarchy.
Right-click the project name and select Properties.
Enable project facets if they are not already enabled.
Check JPA if it's not selected, then click OK to close the project properties window.
After enabling JPA in the project properties, right-click your Eclipse project name again; you should see a new context menu item, JPA Tools. Choose Generate Entities from Tables.
Select a database connection to let Eclipse fetch the tables to generate classes from.
Here is how to set up a DB connection in Eclipse.
It's better to create the entities in a dummy project using the method above and then copy the entity classes to the real project.
Eclipse's class refactoring can be used to preserve the inheritance hierarchy you want.
Hope this helps.

How to dynamically query a database via the web?

Can you recommend a framework that enables querying data via the web?
Requirements:
ORM capabilities - I want the representation of the model at the server and the client to be independent of each other.
For example: let's say the server returns the following model to the client layer: Transaction(firstName, lastName, description, amount), while in the DAL layer it is stored like this: Customer(Id, fName, lName, address), Transaction(id, CustomerId, description, amount).
The option to write my own query provider (for example: HiveQL, SQL, etc.).
I have tried the following frameworks (but it seems the first requirement is not supported):
JayData: http://jaydata.org/
breezejs: http://www.breezejs.com/
Thanks in advance.
Breeze does provide this, but you will need to write the server-side code that translates an OData query into a query your chosen server implements. We already provide several implementations of this code for different server/database technologies and plan to do more in the future.
To date we have done this for .NET servers with both Entity Framework and NHibernate ORMs, and against Node servers with MongoDB backends. We also have other developers working on a Ruby server implementation. If you want to write your own, you should probably take a look at the Breeze/MongoDB source to see how this is done.
Alternatively, if your chosen server tech already has an OData provider, then Breeze can talk to it.
OData does provide a way to query a database via the web, for example:
GET http://myservice/Products?$filter=Id gt 3 and contains(Name,'abc')
GET http://myservice/Products?$select=Id,Name,Provider&$orderby=ManufactureDate desc
Here are some OData samples: https://aspnet.codeplex.com/SourceControl/latest#Samples/WebApi/OData/v4/. In a controller, you can use whatever framework/provider you like to retrieve data from persistence.
If you want to use Entity Framework, please follow this one: https://aspnet.codeplex.com/SourceControl/latest#Samples/WebApi/OData/v3/ODataActionsSample/.

How to refresh JPA entities when backend database changes asynchronously?

I have a PostgreSQL 8.4 database with some tables and views which are essentially joins on some of the tables. I used NetBeans 7.2 (as described here) to create REST-based services derived from those views and tables and deployed them to a Glassfish 3.1.2.2 server.
There is another process which asynchronously updates the contents of some of the tables used to build the views. I can directly query the views and tables and see that these changes have occurred correctly. However, when pulled from the REST-based services, the values are not the same as those in the database. I am assuming this is because JPA has cached local copies of the database contents on the Glassfish server and JPA needs to refresh the associated entities.
I have tried adding a couple of methods to the AbstractFacade class NetBeans generates:
public abstract class AbstractFacade<T> {

    private Class<T> entityClass;
    private String entityName;
    private static boolean _refresh = true;

    public static void refresh() { _refresh = true; }

    public AbstractFacade(Class<T> entityClass) {
        this.entityClass = entityClass;
        this.entityName = entityClass.getSimpleName();
    }

    private void doRefresh() {
        if (_refresh) {
            EntityManager em = getEntityManager();
            em.flush();
            for (EntityType<?> entity : em.getMetamodel().getEntities()) {
                if (entity.getName().contains(entityName)) {
                    try {
                        em.refresh(entity);
                        // log success
                    } catch (IllegalArgumentException e) {
                        // log failure ... typically complains entity is not managed
                    }
                }
            }
            _refresh = false;
        }
    }
    ...
}
I then call doRefresh() from each of the find methods NetBeans generates. What normally happens is that an IllegalArgumentException is thrown, stating something like Can not refresh not managed object: EntityTypeImpl#28524907:MyView [ javaType: class org.my.rest.MyView descriptor: RelationalDescriptor(org.my.rest.MyView --> [DatabaseTable(my_view)]), mappings: 12].
So I'm looking for suggestions on how to correctly refresh the entities associated with the views so they are up to date.
UPDATE: Turns out my understanding of the underlying problem was not correct. It is somewhat related to another question I posted earlier, namely the view had no single field which could be used as a unique identifier. NetBeans required I select an ID field, so I just chose one part of what should have been a multi-part key. This exhibited the behavior that all records with a particular ID field were identical, even though the database had records with the same ID field but the rest of it was different. JPA didn't go any further than looking at what I told it was the unique identifier and simply pulled the first record it found.
I resolved this by adding a unique identifier field (I was never able to get the multi-part key to work properly).
I recommend adding an @Startup @Singleton class that establishes a JDBC connection to the PostgreSQL database and uses LISTEN and NOTIFY to handle cache invalidation.
Update: Here's another interesting approach, using pgq and a collection of workers for invalidation.
Invalidation signalling
Add a trigger on the table that's being updated that sends a NOTIFY whenever an entity is updated. On PostgreSQL 9.0 and above this NOTIFY can contain a payload, usually a row ID, so you don't have to invalidate your entire cache, just the entity that has changed. On older versions where a payload isn't supported you can either add the invalidated entries to a timestamped log table that your helper class queries when it gets a NOTIFY, or just invalidate the whole cache.
Your helper class now LISTENs on the NOTIFY events the trigger sends. When it gets a NOTIFY event, it can invalidate individual cache entries (see below), or flush the entire cache. You can listen for notifications from the database with PgJDBC's listen/notify support. You will need to unwrap any connection pooler managed java.sql.Connection to get to the underlying PostgreSQL implementation so you can cast it to org.postgresql.PGConnection and call getNotifications() on it.
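A rough sketch of such a listener as an @Startup @Singleton that polls for notifications on a timer (the datasource name, channel name and eviction call are assumptions; in practice you would keep a dedicated, non-pooled connection open for LISTEN):
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.annotation.PostConstruct;
import javax.annotation.Resource;
import javax.ejb.Schedule;
import javax.ejb.Singleton;
import javax.ejb.Startup;
import javax.sql.DataSource;
import org.postgresql.PGConnection;
import org.postgresql.PGNotification;

@Startup
@Singleton
public class CacheInvalidationListener {

    @Resource(lookup = "java:jboss/datasources/MyDS") // hypothetical datasource name
    private DataSource dataSource;

    private Connection connection; // ideally a dedicated, non-pooled connection

    @PostConstruct
    void subscribe() {
        try {
            connection = dataSource.getConnection();
            Statement stmt = connection.createStatement();
            stmt.execute("LISTEN my_view_changed"); // hypothetical channel used by the trigger
            stmt.close();
        } catch (SQLException e) {
            throw new IllegalStateException("Could not subscribe to notifications", e);
        }
    }

    // Poll for pending notifications; getNotifications() does not block.
    @Schedule(hour = "*", minute = "*", second = "*/10", persistent = false)
    void poll() {
        try {
            PGConnection pgConnection = connection.unwrap(PGConnection.class);
            PGNotification[] notifications = pgConnection.getNotifications();
            if (notifications == null) {
                return;
            }
            for (PGNotification notification : notifications) {
                String changedId = notification.getParameter(); // the NOTIFY payload, e.g. a row ID
                // evict the matching 2nd-level cache entry here, for example:
                // emf.getCache().evict(MyView.class, Long.valueOf(changedId));
            }
        } catch (SQLException e) {
            // log and decide whether to re-subscribe
        }
    }
}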
As an alternative to LISTEN and NOTIFY, you could poll a change log table on a timer, and have a trigger on the problem table append changed row IDs and change timestamps to the change log table. This approach is portable except for the need for a different trigger for each DB type, but it's inefficient and less timely. It requires frequent inefficient polling, and still has a time delay that the listen/notify approach does not. In PostgreSQL you can use an UNLOGGED table to reduce the costs of this approach a little bit.
Cache levels
EclipseLink/JPA has a couple of levels of caching.
The 1st level cache is at the EntityManager level. If an entity is attached to an EntityManager by persist(...), merge(...), find(...), etc, then the EntityManager is required to return the same instance of that entity when it is accessed again within the same session, whether or not your application still has references to it. This attached instance won't be up-to-date if your database contents have since changed.
The 2nd level cache, which is optional, is at the EntityManagerFactory level and is a more traditional cache. It isn't clear whether you have the 2nd level cache enabled. Check your EclipseLink logs and your persistence.xml. You can get access to the 2nd level cache with EntityManagerFactory.getCache(); see Cache.
@thedayofcondor showed how to flush the 2nd level cache with:
em.getEntityManagerFactory().getCache().evictAll();
but you can also evict individual objects with the evict(java.lang.Class cls, java.lang.Object primaryKey) call:
em.getEntityManagerFactory().getCache().evict(theClass, thePrimaryKey);
which you can use from your @Startup @Singleton NOTIFY listener to invalidate only those entries that have changed.
The 1st level cache isn't so easy, because it's part of your application logic. You'll want to learn about how the EntityManager, attached and detached entities, etc work. One option is to always use detached entities for the table in question, where you use a new EntityManager whenever you fetch the entity. This question:
Invalidating JPA EntityManager session
has a useful discussion of handling invalidation of the entity manager's cache. However, it's unlikely that an EntityManager cache is your problem, because a RESTful web service is usually implemented using short EntityManager sessions. This is only likely to be an issue if you're using extended persistence contexts, or if you're creating and managing your own EntityManager sessions rather than using container-managed persistence.
You can disable caching entirely (see: http://wiki.eclipse.org/EclipseLink/FAQ/How_to_disable_the_shared_cache%3F ), but be prepared for a fairly large performance loss.
Otherwise, you can clear the cache programmatically with:
em.getEntityManagerFactory().getCache().evictAll();
You can map it to a servlet so you can call it externally - this works well if your database is modified externally only seldom and you just want to be sure JPA will pick up the new version.
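A minimal sketch of such a servlet (the URL pattern is hypothetical, and you would want to secure it):
import java.io.IOException;
import javax.persistence.EntityManagerFactory;
import javax.persistence.PersistenceUnit;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/admin/evict-cache") // hypothetical URL pattern
public class EvictCacheServlet extends HttpServlet {

    @PersistenceUnit
    private EntityManagerFactory emf;

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        emf.getCache().evictAll(); // drop every entry in the shared (2nd level) cache
        resp.getWriter().println("2nd level cache evicted");
    }
}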
Just a thought, but how do you receive your EntityManager/Session/whatever?
If you queried the entity in one session, it will be detached in the next one and you will have to merge it back into the persistence context to get it managed again.
Trying to work with detached entities may result in those "not managed" exceptions; you should re-query the entity, or you could try merge (or similar methods).
JPA doesn't do any caching by default; you have to configure it explicitly. I believe this is a side effect of the architectural style you have chosen: REST. I think the caching is happening at the web servers, proxy servers, etc. I suggest you read this and debug further.

Database design to track changes - w/ Hibernate

Hey all, I'm having a difficult time with a database design. As you can see from my current design, a Registration can have multiple EmployerRegistrations, which can have multiple ClientRegistrations. It's pretty simple from there. A new registration needs to be created by the user each year.
Unfortunately I need to be able to track changes / amendments. Changes can be made to the Registration information (name, address, etc.) or Client Registration information (remove / add client or remove / add employer).
I've tried a bunch of different designs, but so far nothing feels "right". Tracking amendments in the Registration table is easy, as this affects all tables above it; all IDs are updated. It's the changes to the ClientRegistration table that are throwing me for a loop. As you can see, I have a version column I was trying, but it's not helping me much. With Hibernate it feels like each amended client registration needs its own unique registration object, but creating a whole new registration for any client registration amendment doesn't seem right / efficient.
I've been battling this for about a week so any help would be greatly appreciated. Thanks!
Have you checked out Hibernate Envers? It's an automatic versioning plugin for Hibernate that makes tracking the history of object changes very easy. It is configured in an AOP-like way, so you can simply annotate the object you want audited and let Envers handle the details:
@Entity
@Audited
public class Person {

    @Id
    @GeneratedValue
    private int id;

    private String name;
    private String surname;

    @ManyToOne
    private Address address;
    ...
}
You can use Envers, which is now bundled with hibernate-core. Check the docs.
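Once an entity is audited, here is a small sketch of reading its history back with Envers' AuditReader (reusing the Person example above):
import java.util.List;
import javax.persistence.EntityManager;
import org.hibernate.envers.AuditReader;
import org.hibernate.envers.AuditReaderFactory;

public class PersonHistory {

    // Returns every revision number at which the given Person was created or modified.
    public static List<Number> revisionsOf(EntityManager em, int personId) {
        AuditReader reader = AuditReaderFactory.get(em);
        return reader.getRevisions(Person.class, personId);
    }

    // Returns the state of the Person as it was at a specific revision.
    public static Person personAtRevision(EntityManager em, int personId, Number revision) {
        AuditReader reader = AuditReaderFactory.get(em);
        return reader.find(Person.class, personId, revision);
    }
}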
Have you come across the concept of temporal data modeling? You might want to Google it. One very popular technique for temporal modeling is "effective dated" logic, used extensively in PeopleSoft. Briefly, it goes like this:
Every table in the system would have this design pattern:
Table {
    Primary_key,
    effdt,
    effseq,
    other data,
    modified_ts
};
Multiple versions of the record are "stacked up" using the primary key, effdt, and effseq. effdt stores the date only, not a datetime. effseq (int) is used to store multiple changes on the same day. modified_ts stores the timestamp of the data change.
The data in a table would look like this:
PrimaryKey1 2012-01-01 1 MyData1 MyData2
PrimaryKey1 2012-02-01 1 MyData1 Change1
PrimaryKey1 2012-02-01 2 Change2 Change1
To get the latest data from any table, you would use a query like this:
select *
from MyTable A
where effdt = (select max(effdt) from MyTable where PrimaryKey = A.PrimaryKey)
  and effseq = (select max(effseq) from MyTable
                where PrimaryKey = A.PrimaryKey
                  and effdt = A.effdt)
Will that help?
