I have a database with this structure.
I am using JSP + Servlet + Entity Classes from database + Session Beans for entity classes. As you can see, my tables are normalized, which in turn makes it necessary to join tables to obtain the whole details of a patient/staff. As I studied https://netbeans.org/kb/docs/javaee/ecommerce/intro.html I saw that they access the database using facade.find() and so on. Considering my case, I have tried the same approach.
For example, I have a session bean (Profile Manager) which accesses the entities and puts them in a map.
public Map<String, Object> getPatientDetails(int patientID)
{
    Map<String, Object> patientMap = new HashMap<>();
    // Each facade call is a separate lookup; the entities are then bundled into one map.
    Patient patient = patientFacade.find(patientID);
    User user = userFacade.find(patient.getUserId().getId());
    UserContact userContact = user.getUserContact();
    Family family = familyFacade.find(patient.getFamilyId().getId());
    String patientDOB = new SimpleDateFormat("MMMMM dd, yyyy").format(user.getDateOfBirth());
    patientMap.put("familyRecord", family);
    patientMap.put("patientRecord", patient);
    patientMap.put("patientDOB", patientDOB);
    patientMap.put("userRecord", user);
    patientMap.put("userContactRecord", userContact);
    return patientMap;
}
As I gave myself time to think about it, it occurred to me that I could instead join the entities with a named query and make it a single database access. Which is the right way to do this? Is using facades to access my database better than constructing an inner join query to get all the information at once? What would you guys suggest? Thanks!
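For reference, a minimal sketch of what such a named query might look like, assuming the Patient entity maps its user and family as relationships (called userId and familyId here to match the getters above; em is an injected EntityManager):
@Entity
@NamedQuery(
    name = "Patient.findDetails",
    query = "SELECT p FROM Patient p JOIN FETCH p.userId JOIN FETCH p.familyId WHERE p.id = :patientID")
public class Patient implements Serializable {
    // existing fields, getters and setters stay as they are
}

// In the session bean this replaces the separate facade.find() calls with one database access:
Patient patient = em.createNamedQuery("Patient.findDetails", Patient.class)
                    .setParameter("patientID", patientID)
                    .getSingleResult();
User user = patient.getUserId();
Family family = patient.getFamilyId();
The UserContact can still be read from the loaded user via user.getUserContact(); whether that triggers another query depends on how the relationship's fetch type is mapped.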
I would suggest avoiding joins in your SQL, as in my experience they are one of the main root causes of performance issues in the data access layer.
I would suggest fetching the entities one by one (as Hibernate does). With this approach there will be more round trips to the database, but the SQL statements will be simple and thus fast.
I have one table which has all the api audit information - Table name : api_audit
I have one table which has extra information about every api call - Table name : api_audit_info
Inside api_audit the primary key is "transaction_id".
I want all the data from api_audit table and some data from api_audit_info table.
I have written a custom query like this:
@Query("select c from ApiAudit c INNER JOIN ApiAudiInfo t ON c.transactionId = t.msgId")
But the issue is that the result I get this way contains only ApiAudit data.
What shall I do to get the data from both tables? Please help.
Note: I am using JpaRepository as I need paginated data.
I am fairly new to Spring Boot and JPA, so I am not sure exactly which direction to look in.
Whenever I need to join data from more than one table, I use Jdbi.
Here you have the official documentation:
Remember to include the required dependencies and configure a bean for Jdbi in your project.
Then I create a repository class, a POJO, and a query with all of the information I need. For example:
select c.transaction_id as transactionId, t.name as name from ApiAudit c INNER JOIN ApiAudiInfo t ON c.transactionId = t.msgId
Here you have some code samples from official documentation
After you map your data to a POJO, you can use
public PageImpl(List<T> content, Pageable pageable, long total)
to return paginated data.
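To make that concrete, here is a rough sketch of a Jdbi-based repository returning a Spring Data page; AuditRow is a hypothetical POJO with transactionId and name properties, and the table/column names and the count query are assumptions:
import java.util.List;
import org.jdbi.v3.core.Jdbi;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;

public class AuditRepository {

    private final Jdbi jdbi;

    public AuditRepository(Jdbi jdbi) {
        this.jdbi = jdbi;
    }

    public Page<AuditRow> findAll(Pageable pageable) {
        // Page of joined rows, mapped onto the POJO via the column aliases.
        List<AuditRow> content = jdbi.withHandle(handle -> handle
                .createQuery("select c.transaction_id as transactionId, t.name as name "
                        + "from api_audit c inner join api_audit_info t on c.transaction_id = t.msg_id "
                        + "limit :limit offset :offset")
                .bind("limit", pageable.getPageSize())
                .bind("offset", pageable.getOffset())
                .mapToBean(AuditRow.class)
                .list());

        // Total row count so PageImpl can report the overall number of pages.
        long total = jdbi.withHandle(handle -> handle
                .createQuery("select count(*) from api_audit c "
                        + "inner join api_audit_info t on c.transaction_id = t.msg_id")
                .mapTo(Long.class)
                .one());

        return new PageImpl<>(content, pageable, total);
    }
}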
There is a good chance that there is a better solution, but this works for me every time.
I have a Document entity and a managed document object for the doc with id=1.
Document managedDoc = entityManager.find(Document.class, 1);
managedDoc.setName("changedName");
As I understand it, the managed doc's state changes in the persistence context (further: PC) after calling the setter, but nothing changes in the database. Somewhere in my code I do the following:
Query query = entityManager.createQuery("from Document");
List<Document> list = query.getResultList();
return list;
When I perform the select-all query shown above, is the document with id=1 taken from the DB or from the PC? If it comes from the DB, the select will not see the new name, because the new name is still only in the PC.
Actually, my problem is with updating via merge() and flush() and then retrieving all objects: currently my select-all query doesn't see the new values of some fields. It looks like merge+flush is OK, but the JPA query reads not from the DB but from the PC. But even if I'm right, and both the PC and the DB contain the new value of the name, why doesn't my select-all see it?
Moreover, select-all sometimes returns correct/updated values and sometimes not.
UPDATE
Clarification:
I put some object into the PC via entityManager.find(Document.class, 1);
I create a new detached instance with the name property set. The id and other props are taken from the managed instance. For example:
managedDoc = getFromSomeDataStructure();
Document nonManaged = new Document(managedDoc.getId());
nonManaged.setName("newName");
I update the DB via em.merge(nonManaged); flush();
I can see my changes in the DB when I check it in Workbench.
I press F5 (and even CTRL+F5), which performs the select-all JPQL query; on each odd press I see the stale old value, and on each even press I see the correct value.
It will be taken from the Persistence Context, as long as it has it there. To be more precise: as long as you have an entity in a managed state (i.e. in the Persistence Context), it will not be overridden. This applies, of course, only when the same EntityManager instance is used.
If you want to refetch the value from DB, you have different possibilities:
Use another EntityManager, in a different transaction (important!).
Use EntityManager.detach() or if you want to clear the entire persistence context, use EntityManager.clear()
Use EntityManager.refresh() to throw out all changes made to an entity instance.
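A small sketch of the last two options, continuing the Document example from the question:
// refresh(): re-reads the row and discards in-memory changes on the managed instance.
Document managedDoc = entityManager.find(Document.class, 1);
managedDoc.setName("changedName");   // only visible in the Persistence Context so far
entityManager.refresh(managedDoc);   // name is back to the database value

// clear(): detaches everything, so the next query builds fresh instances from the DB.
entityManager.clear();
List<Document> fresh = entityManager
        .createQuery("SELECT d FROM Document d", Document.class)
        .getResultList();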
Let me try to clarify with a couple of examples; maybe this answers your question or, with luck, helps make the question clearer.
Scenario #1: Two Different Reads
Department department = em.find(Department.class, 1);
department.setName("Jedi Masters");
TypedQuery<Department> typedQuery = em.createQuery("SELECT d FROM Department d", Department.class);
List<Department> departments = typedQuery.getResultList();
for (Department found : departments) {
    if (found.getId().equals(1)) {
        assert found == department;
        assert found.getName().equals(department.getName());
    }
}
In this first scenario you can expect department and found to be the exact same instance and therefore to have the exact same values. Both assertions above pass.
Scenario #2: Merging Detached Entity
//detached entity
Department department = new Department();
department.setId(1);
department.setName("Jedi Masters");
em.merge(department);
TypedQuery<Department> typedQuery = em.createQuery("SELECT d FROM Department d", Department.class);
List<Department> departments = typedQuery.getResultList();
for (Department found : departments) {
    if (found.getId().equals(1)) {
        assert found != department;
        assert found.getName().equals(department.getName());
    }
}
At least with Hibernate, the behavior in this case is slightly different. The two objects are not the same instance, but they should still have the same contents.
So, depending on how your code compares them, you might get unexpected results, especially if you have not implemented a proper equals/hashCode protocol for detached cases like this.
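For instance, a minimal id-based equals/hashCode sketch for Department that treats a managed and a detached instance with the same id as equal (whether that is the right semantics depends on your model):
@Override
public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof Department)) return false;   // also covers null
    Department other = (Department) o;
    // Two instances without an assigned id are never considered equal.
    return id != null && id.equals(other.getId());
}

@Override
public int hashCode() {
    // A constant hash keeps the contract stable before and after the id is generated.
    return 31;
}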
As answered here, I should call refresh() for each item in the result list. But refreshing alone didn't work for me. After setting READ COMMITTED in persistence.xml by adding
<property name="hibernate.connection.isolation" value="2" />
everything worked perfectly.
P.S. Don't forget to mark the select method as @Transactional, because refresh() doesn't work without that annotation.
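Putting those pieces together, the select method ends up looking roughly like this (a sketch; the entity and method names are assumed):
@Transactional
public List<Document> findAllDocuments() {
    List<Document> list = entityManager
            .createQuery("SELECT d FROM Document d", Document.class)
            .getResultList();
    // Re-read every entity so stale state in the Persistence Context is discarded.
    for (Document doc : list) {
        entityManager.refresh(doc);
    }
    return list;
}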
I am using Spring JDBC and I am a bit unsure about how to work with multiple one-to-many relations (or many-to-many). In this case I am injecting a repository into one of my ResultSetExtractors so that I can retrieve its associations. Is this the way to do it? Is it bad? Are there other, better ways?
Note: I have left out the injection of the repository.
public class SomeResultSetExtractor implements ResultSetExtractor<List<SomeObject>> {
    public List<SomeObject> extractData(ResultSet rs) throws SQLException, DataAccessException {
        List<SomeObject> result = new LinkedList<>();
        while (rs.next()) {
            SomeObject object = new SomeObject(rs.getString(1), rs.getLong(2));
            result.add(object);
            List<AnotherObject> otherObjects = anotherRepository.findAllById(object.getId());
            object.setOtherObjects(otherObjects);
            // and so on
        }
        return result;
    }
}
Okay, so after reading Dmytro Polivenok's answer I have changed to the RowMapper interface instead, and I am currently using the other repositories to populate all associations as I show in my example. Is this a good way of doing it?
I think a good practice for Spring JDBC and SQL queries in general is to use one query for each entity.
E.g. assume this model:
Customer (customerId, name, age, ...)
Address (customerId, type, street, city, ...)
PaymentOption (customerId, cardnumber, cardtype, ...)
Customer 1---* Address
Customer 1---* PaymentOption
I would build 3 queries, 3 DAOs, and 3 ResultSetExtractors/RowCallbackHandlers:
CustomerDao with readCustomerData(Customer or List)
AddressDao with readAddressForCustomer(Customer or List)
PaymentOptionDao with readPaymentOptionsForCustomer(Customer or List)
If you baked this into one query, you would have to build some logic to untangle the Cartesian product.
I.e. if the customer has 3 addresses and 2 payment options, the query would return 6 rows.
This gets quite hard if Address or PaymentOption does not have its own primary key.
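To illustrate the per-entity approach, here is a rough sketch of the AddressDao; the Customer/Address classes, their accessors, the column names, and the use of NamedParameterJdbcTemplate are all assumptions:
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import org.springframework.jdbc.core.RowCallbackHandler;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;

public class AddressDao {

    private final NamedParameterJdbcTemplate jdbcTemplate;

    public AddressDao(NamedParameterJdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Loads all addresses for the given customers in one query and attaches them to their parents.
    public void readAddressForCustomer(List<Customer> customers) {
        Map<Long, Customer> byId = customers.stream()
                .collect(Collectors.toMap(Customer::getCustomerId, c -> c));

        jdbcTemplate.query(
                "select customer_id, type, street, city from Address where customer_id in (:ids)",
                Map.of("ids", byId.keySet()),
                (RowCallbackHandler) rs -> {
                    Address address = new Address(rs.getString("type"),
                                                  rs.getString("street"),
                                                  rs.getString("city"));
                    // Link the child row to its parent in memory.
                    byId.get(rs.getLong("customer_id")).addAddress(address);
                });
    }
}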
For many to many:
Customer * --recommends-- * Product
I would probably build (see the sketch after this list):
CustomerDao.readRecommendationsAndProductKeys
getDistinctListOfProductKeysFromRecommendations
ProductDao.readProducts
replaceProductKeysByProductsOnRecommendations
Like this you could reuse ProductDao.readProducts for
Customer * --buys-- * Product or
ProductGroup 1---* Product
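In code, those four steps could look roughly like this (a sketch; the DAO methods follow the names above, while Recommendation, Product, and their accessors are assumed):
// Load recommendations first, then resolve the referenced products in one extra query.
List<Recommendation> loadRecommendations(Customer customer) {
    // 1. Recommendations with only the product keys populated.
    List<Recommendation> recommendations = customerDao.readRecommendationsAndProductKeys(customer);

    // 2. Distinct product keys referenced by those recommendations.
    Set<Long> productKeys = recommendations.stream()
            .map(Recommendation::getProductKey)
            .collect(Collectors.toSet());

    // 3. Load each product once, keyed by id, so instances can be shared.
    Map<Long, Product> productsById = productDao.readProducts(productKeys).stream()
            .collect(Collectors.toMap(Product::getId, p -> p));

    // 4. Replace the keys by the loaded Product instances.
    recommendations.forEach(r -> r.setProduct(productsById.get(r.getProductKey())));
    return recommendations;
}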
I think your code will work, but the concern here is the usage of ResultSetExtractor, which is mainly intended for the JDBC framework itself; for most cases the documentation recommends using RowMapper.
So an alternative approach would be to have a method in your DAO that selects and maps the parent objects, then for each parent invoke another repository or a private method that selects and maps the child objects, and finally link the children with their parents based on your relationship type (unidirectional or bidirectional). This approach also lets you control whether you want to load the child objects or not.
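For example, a parent DAO built around a RowMapper could look like this (a sketch reusing the classes from the question; the table name and the injected dependencies are assumptions):
import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;

public class SomeObjectDao {

    private final JdbcTemplate jdbcTemplate;
    private final AnotherRepository anotherRepository;

    public SomeObjectDao(JdbcTemplate jdbcTemplate, AnotherRepository anotherRepository) {
        this.jdbcTemplate = jdbcTemplate;
        this.anotherRepository = anotherRepository;
    }

    // The RowMapper only maps the columns of the parent row; no nested queries in here.
    private static final RowMapper<SomeObject> SOME_OBJECT_MAPPER =
            (rs, rowNum) -> new SomeObject(rs.getString(1), rs.getLong(2));

    public List<SomeObject> findAll() {
        List<SomeObject> parents = jdbcTemplate.query("select name, id from some_object", SOME_OBJECT_MAPPER);
        // Children are loaded and linked afterwards, outside of the row mapping.
        for (SomeObject parent : parents) {
            parent.setOtherObjects(anotherRepository.findAllById(parent.getId()));
        }
        return parents;
    }
}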
For a real-world example, you may check the Spring PetClinic application, which has a SimpleJdbcClinic class.
If you can use other frameworks, you may consider MyBatis; it is more about mapping and allows you to control your SQL code.
I'm just getting to grips with JPA in a simple Java web app running on Glassfish 3 (Persistence provider is EclipseLink). So far, I'm really liking it (bugs in netbeans/glassfish interaction aside) but there's a thing that I want to be able to do that I'm not sure how to do.
I've got an entity class (Article) that's mapped to a database table (article). I'm trying to do a query on the database that returns a calculated column, but I can't figure out how to set up a property of the Article class so that the property gets filled by the column value when I call the query.
If I do a regular "select id,title,body from article" query, I get a list of Article objects fine, with the id, title and body properties filled. This works fine.
However, if I do the below:
Query q = em.createNativeQuery("select id,title,shorttitle,datestamp,body,true as published, ts_headline(body,q,'ShortWord=0') as headline, type from articles,to_tsquery('english',?) as q where idxfti @@ q order by ts_rank(idxfti,q) desc",Article.class);
(this is a fulltext search using tsearch2 on Postgres - it's a db-specific function, so I'm using a NativeQuery)
You can see I'm fetching a calculated column, called headline. How do I add a headline property to my Article class so that it gets populated by this query?
So far, I've tried setting it to be @Transient, but that just ends up with it being null all the time.
There is probably no good way to do it other than manually:
Object[] r = (Object[]) em.createNativeQuery(
"select id,title,shorttitle,datestamp,body,true as published, ts_headline(body,q,'ShortWord=0') as headline, type from articles,to_tsquery('english',?) as q where idxfti ## q order by ts_rank(idxfti,q) desc","ArticleWithHeadline")
.setParameter(...).getSingleResult();
Article a = (Article) r[0];
a.setHeadline((String) r[1]);
-
#Entity
#SqlResultSetMapping(
name = "ArticleWithHeadline",
entities = #EntityResult(entityClass = Article.class),
columns = #ColumnResult(name = "HEADLINE"))
public class Article {
#Transient
private String headline;
...
}
AFAIK, JPA doesn't offer standardized support for calculated attributes. With Hibernate, one would use a Formula, but EclipseLink doesn't have a direct equivalent. James Sutherland made some suggestions in Re: Virtual columns (@Formula of Hibernate) though:
There is no direct equivalent (please log an enhancement), but depending on what you want to do, there are ways to accomplish the same thing.
EclipseLink defines a TransformationMapping which can map a computed value from multiple field values, or access the database.
You can override the SQL for any CRUD operation for a class using its descriptor's DescriptorQueryManager.
You could define a VIEW on your database that performs the function and map your Entity to the view instead of the table.
You can also perform minor translations using Converters or property get/set methods.
Also have a look at the enhancement request that has a solution using a DescriptorEventListener in the comments.
All this is non-standard JPA, of course.
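As an illustration of the VIEW option: assuming you create a view (say, article_view) that already contains the computed headline column, the mapping itself stays plain JPA. Note that the parameterized to_tsquery('english', ?) case from the question does not fit a static view and would still need the native query approach.
@Entity
@Table(name = "article_view")   // hypothetical view exposing the computed column
public class ArticleView implements Serializable {

    @Id
    private Long id;

    private String title;

    private String body;

    // Filled by the view; never written back from JPA.
    @Column(name = "headline", insertable = false, updatable = false)
    private String headline;

    public String getHeadline() {
        return headline;
    }
}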
I am building an app based on google app engine (Java) using JDO for persistence.
Can someone give me an example or a point me to some code which shows persisting of multiple entities (of same type) using javax.jdo.PersistenceManager.makePersistentAll() within a transaction.
Basically I need to understand how to put multiple entities in one entity group so that they can be saved using makePersistentAll() inside a transaction.
This section of the docs deals with exactly that.
I did this:
public static final Key root_key = KeyFactory.createKey("Object", "RootKey");
...
So a typical datastore persistent object will set the id in the constructor instead of getting one automatically:
public DSO_MyType(String name, Key parent)
{
    KeyFactory.Builder b = new KeyFactory.Builder(parent);
    id = b.addChild(DSO_MyType.class.getSimpleName(), name).getKey();
}
and you pass root_key as the parent.
I'm not sure if you can pass different parents to objects of the same kind.
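Putting it together, a rough sketch of saving several such objects in one transaction; it assumes the DSO_MyType constructor above builds the key under the shared root_key, and that PMF is the usual helper class holding the PersistenceManagerFactory:
PersistenceManager pm = PMF.get().getPersistenceManager();
Transaction tx = pm.currentTransaction();
try {
    tx.begin();
    // Both keys are children of root_key, so the objects share one entity group.
    DSO_MyType first = new DSO_MyType("first", root_key);
    DSO_MyType second = new DSO_MyType("second", root_key);
    pm.makePersistentAll(Arrays.asList(first, second));
    tx.commit();
} finally {
    if (tx.isActive()) {
        tx.rollback();
    }
    pm.close();
}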
Thanks for the response Nick.
This document only talks about implicit handling of entity groups by App Engine when there is a parent-child relationship. I want to save multiple objects of the same type using PersistenceManager.makePersistentAll(list) within a transaction. If the objects are not in the same entity group, this throws an exception. Currently I can do it as below, but I think there must be a better and more appropriate approach:
User u1 = new User("a");
UserDAO.getInstance().addObject(u1);
// UserDAO.addObject uses PersistenceManager.makePersistent() in a transaction and the user
// object now has its Key set. I want to avoid this step.
User u2 = new User("x");
u2.setKey(KeyFactory.createKey(u1.getKey(),User.class.getSimpleName(), 100 /*some random id*/));
User u3 = new User("p");
u3.setKey(KeyFactory.createKey(u1.getKey(), User.class.getSimpleName(), 200));
UserDAO.getInstance().addObjects(Arrays.asList(new User[]{u2, u3}));
// UserDAO.addObjects uses PersistenceManager.makePersistentAll() in a transaction.
Although this approach works, the problem is that you have to depend on an already persistent entity to create an entity group.
Gopi, AFAIK you don't have to do that... this should work (haven't tested it):
List<User> userList = new ArrayList<User>();
userList.add(new User("a"));
userList.add(new User("b"));
userList.add(new User("c"));
UserDAO.getInstance().addObjects(userList);
Again, AFAIK, this should put all these objects in the same entity group. I'd love to know if I am wrong.