Multiple one-to-many relations in Spring JDBC - java

I am using Spring JDBC and I am a bit unsure how to work with multiple one-to-many relations (or many-to-many). In this case I am injecting a repository into one of my ResultSetExtractors so that I can retrieve its associations. Is this the way to do it? Is it bad? Are there better ways?
Note: I have left out the injection of the repository.
public class SomeResultSetExtractor implements ResultSetExtractor {
    public Object extractData(ResultSet rs) throws SQLException, DataAccessException {
        List<SomeObject> result = new LinkedList<>();
        while (rs.next()) {
            SomeObject object = new SomeObject(rs.getString(1), rs.getLong(2));
            result.add(object);
            List<AnotherObject> otherObjects = anotherRepository.findAllById(object.getId());
            object.setOtherObjects(otherObjects);
            // and so on
        }
        return result;
    }
}
Okay, so after reading Dmytro Polivenok's answer I have changed to the RowMapper interface instead, and I am currently using the other repositories to populate all associations as I show in my example. Is this a good way of doing it?

I think a good practice for Spring JDBC and SQL queries in general is to use one query for each entity.
E.g. assume this model:
Customer (customerId, name, age, ...)
Address (customerId, type, street, city, ...)
PaymentOption (customerId, cardnumber, cardtype, ...)
Customer 1---* Address
Customer 1---* PaymentOption
I would build 3 queries, 3 DAOs, 3 ResultSetExtractors/RowCallbackHandlers:
CustomerDao with readCustomerData(Customer or List)
AddressDao with readAddressForCustomer(Customer or List)
PaymentOptionDao with readPaymentOptionsForCustomer(Customer or List)
If you baked this into one query, you would have to build some logic to undo the Cartesian product.
I.e. if the customer has 3 addresses and 2 payment options, the query would return 6 rows.
This gets quite hard if Address or PaymentOption does not have its own primary key.
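To make this concrete, a minimal sketch of one of these per-entity reads with Spring JDBC (table and column names follow the example model above; Customer.addAddress and the Address constructor are assumed for illustration):
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowCallbackHandler;
import java.util.Map;
import java.util.stream.Collectors;

public class AddressDao {

    private final JdbcTemplate jdbcTemplate;

    public AddressDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Loads all addresses for the given customers in one query and attaches each row
    // to its parent, so there is no cartesian product to revert.
    public void readAddressForCustomer(Map<Long, Customer> customersById) {
        String placeholders = customersById.keySet().stream()
                .map(id -> "?")
                .collect(Collectors.joining(","));
        String sql = "select customerId, type, street, city from Address where customerId in (" + placeholders + ")";

        RowCallbackHandler handler = rs -> {
            Customer parent = customersById.get(rs.getLong("customerId"));
            parent.addAddress(new Address(rs.getString("type"), rs.getString("street"), rs.getString("city")));
        };
        jdbcTemplate.query(sql, customersById.keySet().toArray(), handler);
    }
}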
For many to many:
Customer * --recommends-- * Product
I would probably build:
CustomerDao.readRecommendationsAndProductKeys
getDistinctListOfProductKeysFromRecommendations
ProductDao.readProducts
replaceProductKeysByProductsOnRecommendations
Like this you could reuse ProductDao.readProducts for
Customer * --buys-- * Product or
ProductGroup 1---* Product
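A rough sketch of how those steps could be wired together (the Recommendation type and its accessors are assumptions for illustration; the DAO method names follow the list above):
public List<Recommendation> readRecommendationsWithProducts(Customer customer) {
    // 1. recommendations carrying only product keys
    List<Recommendation> recommendations = customerDao.readRecommendationsAndProductKeys(customer);

    // 2. distinct list of product keys
    Set<Long> productKeys = recommendations.stream()
            .map(Recommendation::getProductKey)
            .collect(Collectors.toSet());

    // 3. load the products in one go
    Map<Long, Product> productsByKey = productDao.readProducts(productKeys).stream()
            .collect(Collectors.toMap(Product::getId, p -> p));

    // 4. replace the keys by the loaded products
    recommendations.forEach(r -> r.setProduct(productsByKey.get(r.getProductKey())));
    return recommendations;
}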

I think your code will work, but the concern here is the usage of ResultSetExtractor, which is mainly intended for the JDBC framework itself; for most cases the documentation recommends using RowMapper.
So an alternative approach would be to have a method in your DAO that selects and maps the parent objects, then for each object invoke another repository or a private method that selects and maps the child objects, and finally link the child objects with the parents based on your relationship type (unidirectional or bidirectional). This approach also allows you to control whether you want to load the child objects or not.
For example, you may check the Spring PetClinic application, which has a SimpleJdbcClinic class.
If you can use other frameworks, you may consider MyBatis; it is more mapping-oriented and lets you control your SQL code.
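For reference, a minimal sketch of that RowMapper-based approach (a JdbcTemplate, the table/column names and the SomeObject accessors are assumed; the child loading mirrors the question's code):
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import java.util.List;

public class SomeObjectDao {

    private final JdbcTemplate jdbcTemplate;
    private final AnotherRepository anotherRepository;

    public SomeObjectDao(JdbcTemplate jdbcTemplate, AnotherRepository anotherRepository) {
        this.jdbcTemplate = jdbcTemplate;
        this.anotherRepository = anotherRepository;
    }

    // The RowMapper maps the parent row only...
    private static final RowMapper<SomeObject> MAPPER =
            (rs, rowNum) -> new SomeObject(rs.getString(1), rs.getLong(2));

    // ...and the children are loaded separately, so loading them stays optional.
    public List<SomeObject> findAll(boolean withChildren) {
        List<SomeObject> parents = jdbcTemplate.query("select name, id from some_object", MAPPER);
        if (withChildren) {
            parents.forEach(p -> p.setOtherObjects(anotherRepository.findAllById(p.getId())));
        }
        return parents;
    }
}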

Related

How to deal with transient entities after deserialization

Let's say I have a simple REST app with Controller, Service and Data layers. In my Controller layer I do something like this:
@PostMapping("/items")
void save(ItemDTO dto){
    Item item = map(dto, Item.class);
    service.validate(item);
    service.save(item);
}
But then I get errors because my Service layer looks like this:
public void validate(Item item) {
    if (item.getCategory().getCode().equals(5)) {
        throw new IllegalArgumentException("Items with category 5 are not currently permitted");
    }
}
I get a NullPointerException at .equals(5), because the Item entity was deserialized from a DTO that only contains category_id, and nothing else (all is null except for the id).
The solutions we have found and have experimented with, are:
Make a special deserializer that takes the ids and automatically fetches the required entities. This, of course, resulted in massive performance problems, similar to those you would get if you marked all your relationships with FetchType.EAGER.
Make the Controller layer fetch all the entities the Service layer will need. The problem is, the Controller needs to know how the underlying service works exactly, and what it will need.
Have the Service layer verify if the object needs fetching before running any validations. The problem is, we couldn't find a reliable way of determining whether an object needs fetching or not. We end up with ugly code like this everywhere:
(sample)
if (item.getCategory().getCode() == null)
    item.setCategory(categoryRepo.findById(item.getCategory().getId()).orElseThrow());
What other ways would you do it to keep Services easy to work with? It's really counterintuitive for us having to check every time we want to use a related entity.
Please note this question is not about finding any way to solve this problem. It's more about finding better ways to solve it.
From my understanding, it would be very difficult for ModelMapper to map an id that is in the DTO to the actual entity.
The problem is that ModelMapper or some service would have to do a lookup and inject the entity.
If the categories are a finite set, could you use an enum and static enum mapping?
You could switch the logic to read:
if (listOfCategoriesToAvoid.contains(item.getCategory())) {
    throw new IllegalArgumentException("Items with category 5 are not currently permitted");
}
and you could populate the listOfCategoriesToAvoid with a small query, or maybe even store it in a properties file/table where it could be a CSV?
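For instance, the list could be kept as category ids (which the DTO already carries) and read from a CSV property; a sketch with hypothetical property and class names:
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.util.Arrays;
import java.util.Set;
import java.util.stream.Collectors;

@Component
class CategoryBlacklist {

    private final Set<Long> categoryIdsToAvoid;

    // e.g. items.category-ids-to-avoid=5,17,23 in application.properties (hypothetical property)
    CategoryBlacklist(@Value("${items.category-ids-to-avoid:}") String csv) {
        this.categoryIdsToAvoid = Arrays.stream(csv.split(","))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .map(Long::valueOf)
                .collect(Collectors.toSet());
    }

    boolean isBlocked(Category category) {
        // Only the id is needed, so an id-only Category stub is enough here.
        return category != null && categoryIdsToAvoid.contains(category.getId());
    }
}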
When you call service.save(item), wouldn't it still fail because the category wouldn't be populated? Maybe you can send the category as a CategoryDTO inside the ItemDTO so that the Category entity is populated on the model.map() call.
Not sure if any of these would work for you.
From what I can gather the map(dto, Item.class) method does something like this:
Long categoryId = itemDto.getCategoryId();
Category cat = new Category();
cat.setId(categoryId);
outItem.setCategory(cat);
The simplest solution would be to have it do this inside:
Long categoryId = itemDto.getCategoryId();
Category cat = categoryRepo.getById(categoryId);
outItem.setCategory(cat);
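If ModelMapper is doing the mapping, that lookup can be registered once as a converter instead of being repeated in every service; a sketch assuming the getCategoryId()/categoryRepo names used above (org.modelmapper.Converter):
ModelMapper modelMapper = new ModelMapper();

// Converts the DTO's category id into a Category loaded from the repository.
Converter<Long, Category> idToCategory = ctx ->
        ctx.getSource() == null ? null : categoryRepo.getById(ctx.getSource());

modelMapper.typeMap(ItemDTO.class, Item.class).addMappings(mapper ->
        mapper.using(idToCategory).map(ItemDTO::getCategoryId, Item::setCategory));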
Another option: since you are hardcoding the category code 5 until it's finished, you could hard-code the category IDs that have it instead, if those are not something that you expect to be changed by users.
Why aren't you just using the code as primary key for Category? That way you don't have to fetch anything for this kind of check. The underlying problem, though, is that the object mapper is just not able to cope with the managed nature of JPA objects, i.e. it doesn't know that it should actually retrieve objects by PK through e.g. EntityManager#getReference. If it were doing that, then you wouldn't have a problem, as the proxy returned by that method would be lazily initialized on the first call to getCode.
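A sketch of that getReference-based resolution inside the mapping step (hypothetical code, not from the original posts; it assumes the mapper has access to the EntityManager):
// Returns a lazy proxy; no SELECT is issued until a non-id property such as getCode()
// is accessed, at which point the provider loads it.
Category categoryRef = entityManager.getReference(Category.class, dto.getCategoryId());
item.setCategory(categoryRef);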
I suggest you look at something like Blaze-Persistence Entity Views which has first class support for something like that.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(Item.class)
// You can omit the strategy to default to QUERY when using the code as PK of Category
@UpdatableEntityView(strategy = FlushStrategy.ENTITY)
public interface ItemDTO {
    @IdMapping
    Long getId();
    String getName();
    void setName(String name);
    CategoryDTO getCategory();
    void setCategory(CategoryDTO category);

    @EntityView(Category.class)
    interface CategoryDTO {
        @IdMapping
        Long getId();
    }
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
ItemDTO a = entityViewManager.find(entityManager, ItemDTO.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<ItemDTO> findAll(Pageable pageable);
The best part is, it will only fetch the state that is actually necessary!
And in your case of saving data, you can use the Spring WebMvc integration
that would look something like the following:
@PostMapping("/items")
void save(ItemDTO dto){
    service.save(dto);
}

class ItemService {
    @Autowired
    ItemRepository repository;

    @Transactional
    public void save(ItemDTO dto) {
        repository.save(dto);
        Item item = repository.getOne(dto.getId());
        validate(item);
    }
    // other code...
}

Use Spring Data JPA, QueryDSL to update a bunch of records

I'm refactoring a code base to get rid of SQL statements and primitive access and modernize with Spring Data JPA (backed by hibernate). I do use QueryDSL in the project for other uses.
I have a scenario where the user can "mass update" a ton of records, and select some values that they want to update. In the old way, the code manually built the update statement with an IN statement for the where for the PK (which items to update), and also manually built the SET clauses (where the options in SET clauses can vary depending on what the user wants to update).
In looking at QueryDSL documentation, it shows that it supports what I want to do. http://www.querydsl.com/static/querydsl/4.1.2/reference/html_single/#d0e399
I tried looking for a way to do this with Spring Data JPA, and haven't had any luck. Is there a repository interface I'm missing, or another library that is required... or would I need to autowire a queryFactory into a custom repository implementation and very literally implement the code in the QueryDSL example?
You can either write a custom method or use the @Query annotation.
For a custom method:
public interface RecordRepository extends RecordRepositoryCustom, CrudRepository<Record, Long> {
}

public interface RecordRepositoryCustom {
    // Custom method
    void massUpdateRecords(long... ids);
}

public class RecordRepositoryImpl implements RecordRepositoryCustom {
    @Override
    public void massUpdateRecords(long... ids) {
        // implement using em or querydsl
    }
}
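A sketch of what that implementation could look like with QueryDSL's update clause (QRecord is the generated query type; someColumn and the value are placeholders carried over from the snippets above):
import com.querydsl.jpa.impl.JPAQueryFactory;
import org.springframework.transaction.annotation.Transactional;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import java.util.Arrays;

public class RecordRepositoryImpl implements RecordRepositoryCustom {

    @PersistenceContext
    private EntityManager em;

    @Override
    @Transactional
    public void massUpdateRecords(long... ids) {
        QRecord record = QRecord.record;
        new JPAQueryFactory(em)
                .update(record)
                // add one set(...) per column the user chose to change
                .set(record.someColumn, "someValue")
                .where(record.id.in(Arrays.stream(ids).boxed().toArray(Long[]::new)))
                .execute();
    }
}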
For the @Query annotation:
public interface RecordRepository extends CrudRepository<Record, Long> {
    @Modifying
    @Query("update Record r set r.someColumn = :someValue where r.id in :ids")
    void massUpdateRecords(@Param("someValue") String someValue, @Param("ids") long... ids);
}
There is also the @NamedQuery option if you want your model class to be reusable with custom methods:
@Entity
@NamedQuery(name = "Record.massUpdateRecords", query = "update Record r set r.someColumn = :someValue where r.id in :ids")
@Table(name = "records")
public class Record {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    // rest of the entity...
}

public interface RecordRepository extends CrudRepository<Record, Long> {
    // this will use the named query
    @Modifying
    void massUpdateRecords(@Param("someValue") String someValue, @Param("ids") long... ids);
}
Check repositories.custom-implementations, jpa.query-methods.at-query and jpa.query-methods.named-queries in the Spring Data reference documentation for more info.
This question is quite interesting for me because I was solving this very problem in my current project with the same technology stack mentioned in your question. Particularly we were interested in the second part of your question:
where the options in SET clauses can vary depending on what the user wants to update
I do understand this is the answer you probably do not want to get, but we did not find anything out there :( Spring Data is quite cumbersome for update operations, especially when it comes to their flexibility.
After I saw your question I tried to look up something new for Spring and QueryDSL integration (you know, maybe something was released during the past months), but nothing was released.
The only thing that brought me quite close is .flush on the entity manager, meaning you could follow this scenario:
Get ids of entities you want to update
Retrieve all entities by these ids (first actual query to db)
Modify them in any way you want
Call entityManager.flush(), resulting in N separate updates to the database.
This approach results in N+1 actual queries to the database, where N = the number of ids to be updated. Moreover, you are moving the data back and forth, which is not good either.
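For illustration, that flush-based scenario looks roughly like this (placeholder names), and it is exactly this shape that degenerates into N separate UPDATE statements:
@Transactional
public void massUpdate(List<Long> ids, String newValue) {
    Iterable<Record> records = recordRepository.findAllById(ids); // 1 query (or a few IN batches)
    records.forEach(r -> r.setSomeColumn(newValue));              // modify them in memory
    entityManager.flush();                                        // dirty checking issues N UPDATEs
}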
I would advise to autowire a queryFactory into a custom repository implementation.
Also, have a look at the Spring Data and QueryDSL example. However, you will find only lookup examples.
Hope my pessimistic answer helps :)

Obtaining Lists from Database and Populating Array/Hashmap Using Hibernate

As part of my program, I'm using relational tables which hold information such as - user role, job category etc. Each table may have slightly differing fields - for example:
User Role Table has the following fields:
id (auto-generated)
role (eg Planner, Admin etc)
role_description (description of above role)
enabled (toggle this role on/off)
Job Category Table:
id (auto-generated)
category (eg Service, Maintenance etc)
category_description (description of above)
category_group (categories are grouped into management areas)
...
enabled (toggle category on/off)
The lists can be changed by the end user so I need to provide an admin section to enable new roles/categories to be added.
I had thought of creating a routine where I pass the entity class of the role/category etc. and have it generate an array which can be used to populate the admin section, but I have only been able to do this for the first two columns - e.g. id/role or id/category.
With the fields differing between each entity, is there a way that I can do this? Or will I have to create a method in each of the entities - such as getRoleList and getCategoryList etc?
Thanks.
After a bit of experimenting, I've decided to implement this in the following way.
I've added methods to my database helper class that will read the list and populate an array. I'll have to create a separate method for each entity but I've decided this would be necessary due to the differences between the classes.
I'm not 100% sure that this is the most efficient way of accomplishing this but it does what I need (for now).
One of the methods:
public static UserRole[] getUserRoleList(String order, Boolean reverseOrder) throws SQLException {
    Session session = openSession();
    List<UserRole> list;
    if (!reverseOrder) {
        // obtain list and sort by provided field in ascending order
        list = session.createCriteria(UserRole.class).addOrder(Order.asc(order)).list();
    } else {
        // sort descending
        list = session.createCriteria(UserRole.class).addOrder(Order.desc(order)).list();
    }
    // return UserRole[]
    return list.toArray((UserRole[]) Array.newInstance(UserRole.class, list.size()));
}
The rest of the methods will be pretty much identical (substituting the entity/class names). The only difference would be adding another argument for some entities (enabled Boolean, so I can return only items in the list which are enabled).
Edit:
Since posting the above, I changed my mind and moved to a generic method to obtain lists, passing in the entity class as below:
public static List getList(Class entity, String order, Boolean reverseOrder, Boolean enabled) {
    // stripped for brevity...
    list = session.createCriteria(entity)
            .add(Restrictions.eq("enabled", true))
            .addOrder(Order.asc(order)).list();
    // stripped more...
    return list;
}
Casting when calling the method:
List<User> userList = (List<User>) DatabaseHelper.getList(User.class);
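If the cast is a concern, the helper can also be made generic so the compiler infers the element type; a sketch in the same Hibernate Criteria style as above (openSession() is the existing helper method; the order column in the usage line is illustrative):
@SuppressWarnings("unchecked") // the legacy Criteria API returns a raw List
public static <T> List<T> getList(Class<T> entity, String order, boolean reverseOrder, boolean enabledOnly) {
    Session session = openSession();
    Criteria criteria = session.createCriteria(entity);
    if (enabledOnly) {
        criteria.add(Restrictions.eq("enabled", true));
    }
    criteria.addOrder(reverseOrder ? Order.desc(order) : Order.asc(order));
    return criteria.list();
}
Called as, for example: List<User> userList = DatabaseHelper.getList(User.class, "username", false, true);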

EJB named query preference

I have a database with this structure.
I am using JSP + Servlet + Entity Classes from database + Session Beans for entity classes. As you can see, my tables are normalized, which in turn makes it necessary to join tables to obtain the whole details of a patient/staff. As I studied https://netbeans.org/kb/docs/javaee/ecommerce/intro.html I saw that they access the database by using facade.find and so on. Considering my case, I have also tried using the same approach.
For example, I have a session bean (Profile Manager) which accesses the entities and puts them in a map.
public Map getPatientDetails(int patientID) {
    Map patientMap = new HashMap();
    Patient patient = patientFacade.find(patientID);
    User user = userFacade.find(patient.getUserId().getId());
    UserContact userContact = user.getUserContact();
    Family family = familyFacade.find(patient.getFamilyId().getId());
    String patientDOB = new SimpleDateFormat("MMMMM dd, yyyy").format(user.getDateOfBirth());
    patientMap.put("familyRecord", family);
    patientMap.put("patientRecord", patient);
    patientMap.put("patientDOB", patientDOB);
    patientMap.put("userRecord", user);
    patientMap.put("userContactRecord", userContact);
    return patientMap;
}
As I give myself time to think about it, I thought that I could join the entities by using a named query instead, making it a single access. Which is the right way to do this? Do you think using facades to access my database is better than constructing an inner join query to achieve getting all the information at once? What would you guys suggest? Thanks!
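For comparison, the single-query alternative described above could be expressed as a fetch-join named query; a sketch only, with the relationship names (userId, familyId) and the id attribute taken from the getters in the code above (the nested UserContact is reached through User and left to the provider here):
@Entity
@NamedQuery(
    name = "Patient.findDetails",
    query = "SELECT p FROM Patient p JOIN FETCH p.userId JOIN FETCH p.familyId WHERE p.id = :patientId")
public class Patient {
    // existing mappings unchanged
}

// in a session bean with an injected EntityManager:
Patient patient = em.createNamedQuery("Patient.findDetails", Patient.class)
        .setParameter("patientId", patientID)
        .getSingleResult();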
I would suggest you avoid joins in your SQL as, in my experience, they are one of the main root causes of performance issues associated with the data access layer.
I would suggest fetching the entities one by one (as Hibernate does). With this method there will be more round trips to the database, but the SQL statements will be simple and thus faster.

JPA - Setting entity class property from calculated column?

I'm just getting to grips with JPA in a simple Java web app running on Glassfish 3 (Persistence provider is EclipseLink). So far, I'm really liking it (bugs in netbeans/glassfish interaction aside) but there's a thing that I want to be able to do that I'm not sure how to do.
I've got an entity class (Article) that's mapped to a database table (article). I'm trying to do a query on the database that returns a calculated column, but I can't figure out how to set up a property of the Article class so that the property gets filled by the column value when I call the query.
If I do a regular "select id,title,body from article" query, I get a list of Article objects fine, with the id, title and body properties filled. This works fine.
However, if I do the below:
Query q = em.createNativeQuery("select id,title,shorttitle,datestamp,body,true as published, ts_headline(body,q,'ShortWord=0') as headline, type from articles,to_tsquery('english',?) as q where idxfti @@ q order by ts_rank(idxfti,q) desc",Article.class);
(this is a fulltext search using tsearch2 on Postgres - it's a db-specific function, so I'm using a NativeQuery)
You can see I'm fetching a calculated column, called headline. How do I add a headline property to my Article class so that it gets populated by this query?
So far, I've tried setting it to be @Transient, but that just ends up with it being null all the time.
There are probably no good ways to do it, only manually:
Object[] r = (Object[]) em.createNativeQuery(
        "select id,title,shorttitle,datestamp,body,true as published, ts_headline(body,q,'ShortWord=0') as headline, type from articles,to_tsquery('english',?) as q where idxfti @@ q order by ts_rank(idxfti,q) desc",
        "ArticleWithHeadline")
        .setParameter(...).getSingleResult();
Article a = (Article) r[0];
a.setHeadline((String) r[1]);
-
@Entity
@SqlResultSetMapping(
        name = "ArticleWithHeadline",
        entities = @EntityResult(entityClass = Article.class),
        columns = @ColumnResult(name = "HEADLINE"))
public class Article {
    @Transient
    private String headline;
    ...
}
AFAIK, JPA doesn't offer standardized support for calculated attributes. With Hibernate, one would use a Formula but EclipseLink doesn't have a direct equivalent. James Sutherland made some suggestions in Re: Virtual columns (@Formula of Hibernate) though:
There is no direct equivalent (please log an enhancement), but depending on what you want to do, there are ways to accomplish the same thing.
EclipseLink defines a TransformationMapping which can map a computed value from multiple field values, or access the database.
You can override the SQL for any CRUD operation for a class using its descriptor's DescriptorQueryManager.
You could define a VIEW on your database that performs the function and map your Entity to the view instead of the table.
You can also perform minor translations using Converters or property get/set methods.
Also have a look at the enhancement request that has a solution using a DescriptorEventListener in the comments.
All this is non standard JPA of course.
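For comparison only, the Hibernate-specific @Formula mapping mentioned above is a single annotation on the entity (not usable on EclipseLink, and shown here with a simple SQL expression rather than the ts_headline() call, which also needs the runtime search query as an argument):
// org.hibernate.annotations.Formula: the SQL fragment is inlined into every SELECT for the entity
@Formula("char_length(body)")
private Integer bodyLength; // illustrative property name, not from the original mapping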
