Basic Java MVC: Beans and associative entities with attributes - java

I'm working on a movie DB project for college where I need to create a Java-based website to run advanced searches against a huge PostgreSQL database. I'm not using Hibernate or similar tools. Here's part of the ER diagram for the database:
As you can see, the associative entity actormovie links the entities actor and movie while also listing the character portrayed. I have created two simple beans, Actor and Movie, with attributes, getters and setters.
This is my first java web project with focus on MVC, so I'm more than a little lost. My question is: Should I create a bean mapping the associative table? If not, what do I do with the as_character attribute?

The answer is yes.
This is because the relation Actor->Movie has its own property, as_character. You could find a way to avoid creating a class for it, but in the long run that will cause problems (maybe a silly bug because you forgot the attribute, or because someone else didn't know about it; something you don't want to deal with).
Since this is your first project of this kind, I think what may confuse you is how to represent the relationship.
The first approach that comes to mind, most of the time, is to have an ActorMovie class like:
public class ActorMovie {
    Integer actor_id;
    Integer movieid;
    String as_character;
    // getters, setters, equals, hashCode, toString
}
But you can also think of it as a value of Actor (or Movie) and have it like:
public class ActorMovie {
    Integer movieid;
    String as_character;
    // getters, setters, equals, hashCode, toString
}
and an Actor class:
public class Actor {
    Integer actor_id;
    String name;
    String sex;
    Set<ActorMovie> movies;
    // getters, setters, equals, hashCode, toString
}
Both of them solve the problem; they just change how you will interact with the data in your code. To learn when one is better than the other you have to try both and see what changes, so choose whichever feels more "natural" to you and look at the results.
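Since you are not using Hibernate, here is a minimal sketch of how the second variant could be populated with plain JDBC. The table and column names (actor, actormovie, actor_id, movie_id, as_character) and the setter names are assumptions based on the beans above; adjust them to your real schema.
import java.sql.*;
import java.util.HashSet;

public class ActorDao {

    private final Connection connection;

    public ActorDao(Connection connection) {
        this.connection = connection;
    }

    public Actor findById(int actorId) throws SQLException {
        String sql = "SELECT a.actor_id, a.name, a.sex, am.movie_id, am.as_character "
                   + "FROM actor a LEFT JOIN actormovie am ON am.actor_id = a.actor_id "
                   + "WHERE a.actor_id = ?";
        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            ps.setInt(1, actorId);
            try (ResultSet rs = ps.executeQuery()) {
                Actor actor = null;
                while (rs.next()) {
                    if (actor == null) {
                        // first row: build the Actor itself
                        actor = new Actor();
                        actor.setActor_id(rs.getInt("actor_id"));
                        actor.setName(rs.getString("name"));
                        actor.setSex(rs.getString("sex"));
                        actor.setMovies(new HashSet<>());
                    }
                    int movieId = rs.getInt("movie_id");
                    if (!rs.wasNull()) { // the actor may have no movies at all
                        ActorMovie am = new ActorMovie();
                        am.setMovieid(movieId);
                        am.setAs_character(rs.getString("as_character"));
                        actor.getMovies().add(am);
                    }
                }
                return actor;
            }
        }
    }
}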

No, you don't need to. The Actor class will have a list of Movie, and the Movie class a list of Actor. Just map it like this.
For the character attribute, you could create a map in the Actor class, where the key is the movie and the value is the character (or a list of characters, because an actor can play several characters in the same movie):
Map<Movie, List<String>> characters = new HashMap<>();
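A quick, hypothetical usage of that map (actor, movieDao and the getCharacters() getter are only illustrative). Note that Movie must override equals() and hashCode() to work correctly as a HashMap key:
// Movie must override equals() and hashCode() to behave as a HashMap key.
Movie movie = movieDao.findById(42);   // illustrative lookup
actor.getCharacters()
     .computeIfAbsent(movie, m -> new ArrayList<>())
     .add("Gandalf");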

It depends on how your project will develop. If you are not using object-relational mapping (as you stated in the question), you should create a bean for the associative table. If you introduce object-relational mapping later on, then Actor can own the as_character property, and you should not create a bean for the associative table.

Related

Apply filter on all annotated fields of a POJO (Java)

Imagine that there are such kind of POJO classes, that just keep data:
public class Pojo {
    @AnnotatedProp
    String someField;
    SubPojo someSubPojo;
    String someOtherFieldA;
    String someOtherFieldB;
}
public class SubPojo {
    @AnnotatedProp
    String someSubField;
    Integer someOtherFieldC;
}
someField of Pojo and someSubField of SubPojo are marked as special with the @AnnotatedProp annotation.
I'd like to modify an object of type Pojo. All String fields with #AnnotatedProp annotation should be modified. A "filter" should modify the values of these fields, e.g. replace some characters inside.
I tried with FieldUtils / simple reflection, but I ended up in stack overflows (the exception AND this forum).
What would be the best way to filter these fields?
Thanks for help.
I've written a recursive POJO filter/transformer doing just that, because I needed it for a project. I'm still cleaning it up and waiting for the green light to release it, but the key points are:
- Use FieldUtils.readField(field, node, true) for traversal. For getting all fields annotated with your annotation there is also a direct method, FieldUtils.getFieldsListWithAnnotation(), but I needed a more flexible way to scan all nodes first, so I can drill down into sub-objects.
- You need to check against your own base package name to identify your custom POJO classes and only traverse those sub-objects.
- Once you have your annotated field, you can simply use FieldUtils.writeField(field, node, getMyValueForField(field), true);
Caveats:
- To avoid a StackOverflow for back-references (child->ancestor), I keep a HashMap and store the FQCN of the nodes I traverse into.
- Before rolling my own, I checked Apache Commons ObjectGraphIterator, but found it not suitable for my purpose.
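The answer describes the approach without code; below is a minimal sketch of that kind of recursive traversal, assuming Apache Commons Lang's FieldUtils and the @AnnotatedProp annotation from the question (with runtime retention). Here the cycle guard is an IdentityHashMap of visited instances rather than a map of FQCNs, and both the base package and the character replacement are placeholders.
import java.lang.reflect.Field;
import java.util.IdentityHashMap;
import java.util.Map;
import org.apache.commons.lang3.reflect.FieldUtils;

public class PojoFilter {

    private static final String BASE_PACKAGE = "com.example"; // placeholder: your own base package

    public void filter(Object root) throws IllegalAccessException {
        filter(root, new IdentityHashMap<>());
    }

    private void filter(Object node, Map<Object, Boolean> visited) throws IllegalAccessException {
        if (node == null || visited.put(node, Boolean.TRUE) != null) {
            return; // already seen: guards against child->ancestor back-references
        }
        for (Field field : FieldUtils.getAllFieldsList(node.getClass())) {
            Object value = FieldUtils.readField(field, node, true);
            if (value == null) {
                continue;
            }
            if (field.isAnnotationPresent(AnnotatedProp.class) && value instanceof String) {
                // placeholder "filter": replace some characters of the annotated String
                FieldUtils.writeField(field, node, ((String) value).replace('a', '*'), true);
            } else if (value.getClass().getName().startsWith(BASE_PACKAGE)) {
                // only drill into our own POJO types, not JDK or library classes
                filter(value, visited);
            }
        }
    }
}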

DTOs with different granularity

I'm on a project that uses the latest Spring+Hibernate for persistence and for implementing a REST API.
The different tables in the database contain lots of records which are in turn pretty big as well. So, I've created a lot of DAOs to retrieve different levels of detail and their accompanying DTOs.
For example, if I have some Employee table in the database that contains tons of information about each employee. And if I know that any client using my application would benefit greatly from retrieving different levels of detail of an Employee entity (instead of being bombarded by the entire entity every time), what I've been doing so far is something like this:
class EmployeeL1DetailsDto
{
    String id;
    String firstName;
    String lastName;
}
class EmployeeL2DetailsDto extends EmployeeL1DetailsDto
{
    Position position;
    Department department;
    PhoneNumber workPhoneNumber;
    Address workAddress;
}
class EmployeeL3DetailsDto extends EmployeeL2DetailsDto
{
    int yearsOfService;
    PhoneNumber homePhoneNumber;
    Address homeAddress;
    BigDecimal salary;
}
And so on...
Here you see that I've divided the Employee information into different levels of detail.
The accompanying DAO would look something like this:
class EmployeeDao
{
    ...
    public List<EmployeeL1DetailsDto> getEmployeeL1Detail()
    {
        ...
        // uses a criteria-select query to retrieve only L1 columns
        return list;
    }
    public List<EmployeeL2DetailsDto> getEmployeeL2Detail()
    {
        ...
        // uses a criteria-select query to retrieve only L1+L2 columns
        return list;
    }
    public List<EmployeeL3DetailsDto> getEmployeeL3Detail()
    {
        ...
        // uses a criteria-select query to retrieve only L1+L2+L3 columns
        return list;
    }
    .
    .
    .
    // And so on
}
I've been using Hibernate's aliasToBean() to auto-map the retrieved entities into the DTOs. Still, I feel the amount of boilerplate in the process as a whole (all the DTOs, DAO methods, URL parameters for the level of detail wanted, etc.) is a bit worrying and makes me think there might be a cleaner approach to this.
So, my question is: Is there a better pattern to follow to retrieve different levels of detail from a persisted entity?
I'm pretty new to Spring and Hibernate, so feel free to point anything that is considered basic knowledge that you think I'm not aware of.
Thanks!
I would go with as few different queries as possible. I would rather make associations lazy in my mappings, and then let them be initialized on demand with appropriate Hibernate fetch strategies.
I think that there is nothing wrong in having multiple different DTO classes per one business model entity, and that they often make the code more readable and maintainable.
However, if the number of DTO classes tends to explode, then I would make a balance between readability (maintainability) and performance.
For example, if a DTO field is not used in a context, I would either leave it as null or fill it in anyway if that is really not expensive. Then, if it is null, you could instruct your object marshaller to exclude null fields when producing the REST service response (JSON, XML, etc.), if that really bothers the service consumer. Or, if you fill it in, it is always welcome later when you add new features to the application and the field starts being used in a context.
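As an illustration of the "exclude null fields" idea, assuming Jackson is the JSON marshaller, here is a minimal sketch; the class and field names are just examples.
import com.fasterxml.jackson.annotation.JsonInclude;

// Fields that were not fetched for the requested level of detail stay null
// and are simply omitted from the JSON response.
@JsonInclude(JsonInclude.Include.NON_NULL)
public class EmployeeDto {
    private String id;
    private String firstName;
    private String lastName;
    private Position position; // may remain null for a coarse-grained request
    // getters and setters...
}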
You will have to define in one way or another the different granularity versions. You can try to have subobjects that are not loaded/set to null (as recommended in other answers), but it can easily get quite awkward, since you will start to structure your data by security concerns and not by domain model.
So doing it with individual classes is after all not such a bad approach.
You might want to make it more dynamic (maybe because you want to extend your data model on the DB side with more data later).
If that's the case, you might want to move the definition out of the code into some configuration (it could even be dynamic at runtime). This will of course require a dynamic data model on the Java side as well, like using a HashMap (see here on how to do that). You thereby gain a dynamic data model, but lose the type safety (at least to a certain extent). In other languages that would probably feel natural, but in Java it's less common.
It would then be up to your HQL to define how you want to populate your object.
The path you want to take now depends a lot on the context and on how your object will get used.
Another approach is to use only domain objects at the DAO level, and define the needed subsets of information as DTOs for each usage. Then convert the Employee entity to each DTO using a generic DTO converter, as I have done lately in my professional Spring work. An MIT-licensed module is available in the Maven repository as the artifact dtoconverter.
and further info and user guidance at author's Wiki:
http://ratamaa.fi/trac/dtoconverter
You get the quickest idea from the example page there.
Happy hunting...
Blaze-Persistence Entity Views have been created for exactly such a use case. You define the DTO structure as an interface or abstract class with mappings to your entity's attributes. When querying, you just pass in the class and the library will take care of generating an optimized query for the projection.
Here is a quick example:
@EntityView(Cat.class)
public interface CatView {
    @IdMapping("id")
    Integer getId();
    String getName();
}
CatView is the DTO definition and here comes the querying part
CriteriaBuilder<Cat> cb = criteriaBuilderFactory.create(entityManager, Cat.class);
cb.from(Cat.class, "theCat")
.where("father").isNotNull()
.where("mother").isNotNull();
EntityViewSetting<CatView, CriteriaBuilder<CatView>> setting = EntityViewSetting.create(CatView.class);
List<CatView> list = entityViewManager
.applySetting(setting, cb)
.getResultList();
Note that the essential part is that the EntityViewSetting has the CatView type and is applied onto an existing query. The generated JPQL/HQL is optimized for the CatView, i.e. it only selects (and joins!) what it really needs.
SELECT
theCat.id,
theCat.name
FROM
Cat theCat
WHERE theCat.father IS NOT NULL
AND theCat.mother IS NOT NULL

Object formation with relationships(JDBC)

There are 3 entities (which match tables):
public class Enterprise {
    private long id;
    private String name;
    private List<Department> departments;
    // getters()/setters()
}
public class Department {
    private long id;
    private String name;
    private List<Employee> employees;
    // getters()/setters()
}
public class Employee {
    private long id;
    private String name;
    private List<Department> departments;
    // getters()/setters()
}
ENTERPRISE---|OneToMany|---DEPARTMENT---|ManyToMany|---EMPLOYEE
Can someone write this method using JDBC:
List<Enterprise> findAll();
The connection, statements, queries, etc. can be ignored. The main difficulty is to set all references on the correct objects (for example, to avoid:
enterprise.getDepartments().get(1).getEmployees().get(1).getDepartments() == null).
EXAMPLE (the beginning of the method):
List<Enterprise> findAll(){
    ResultSet rs = executeQuery(SELECT_ALL_ENTERPRISES);
    List<Enterprise> ents = createEnterprises(rs);
    .........
Mapping objects to relations is not as easy as it would seem. People have been working on it for decades now, with decent results only in some scenarios. The good news is that the scenarios that work can accommodate most programs.
I suggest that you take a different approach, but first I'll give you an example that will help you understand why I suggest the different approach.
Imagine a person who wants to look up all Departments, which will require a look up of all Employees (as they are part of a Department object). Which will require that for each employee, a list of departments would need to be looked up, which would require that those departments would need a list of employees, which would ....
Perhaps now you get the idea.
So many systems that are structured like yours don't actually return full Employees when looking up departments. They return "Employee identifiers". This allows one to look up all the Departments, but it guarantees that no Employees are going to be returned, preventing an infinite loop. Then, if a person is interested enough, they can use the employee identifiers to look up individual employees, which would of course contain department identifiers.
In short, I recommend that you don't really rebuild the association at this level. I suggest that you build disconnected graphs of the object mesh, such that one can easily navigate the disconnected graph at a later time. Then, if you really must connect them, you will at least have all the data loaded without recursion before you start knitting together references.
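To make the "load everything first, then knit references" idea concrete, here is a minimal sketch of findAll() with plain JDBC. It reuses the question's executeQuery() helper (here with inline SQL) and assumes the usual java.sql and java.util imports; the table, column and setter names are assumptions. The key point is that each object is created exactly once, kept in a map by id, and only then wired together, so no reference ends up null.
List<Enterprise> findAll() throws SQLException {
    Map<Long, Enterprise> enterprises = new HashMap<>();
    Map<Long, Department> departments = new HashMap<>();
    Map<Long, Employee> employees = new HashMap<>();

    // 1) Load each table once (table and column names are assumptions).
    ResultSet rs = executeQuery("SELECT id, name FROM enterprise");
    while (rs.next()) {
        Enterprise e = new Enterprise();
        e.setId(rs.getLong("id"));
        e.setName(rs.getString("name"));
        e.setDepartments(new ArrayList<>());
        enterprises.put(e.getId(), e);
    }

    rs = executeQuery("SELECT id, name, enterprise_id FROM department");
    while (rs.next()) {
        Department d = new Department();
        d.setId(rs.getLong("id"));
        d.setName(rs.getString("name"));
        d.setEmployees(new ArrayList<>());
        departments.put(d.getId(), d);
        enterprises.get(rs.getLong("enterprise_id")).getDepartments().add(d);
    }

    rs = executeQuery("SELECT id, name FROM employee");
    while (rs.next()) {
        Employee emp = new Employee();
        emp.setId(rs.getLong("id"));
        emp.setName(rs.getString("name"));
        emp.setDepartments(new ArrayList<>());
        employees.put(emp.getId(), emp);
    }

    // 2) Knit the many-to-many side from the join table: both directions point
    //    at the same shared instances, so getDepartments() is never null.
    rs = executeQuery("SELECT department_id, employee_id FROM department_employee");
    while (rs.next()) {
        Department d = departments.get(rs.getLong("department_id"));
        Employee emp = employees.get(rs.getLong("employee_id"));
        d.getEmployees().add(emp);
        emp.getDepartments().add(d);
    }
    return new ArrayList<>(enterprises.values());
}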
Many ORM libraries enable you to define one to many relationships as you described. Sormula can do this. See one to many example.
What I like about Sormula is that if you name the foreign key field on the "many" side the same as the field on the "one" side, then Sormula will deduce the relationship and no annotations are necessary.

How to add data of different records to a single record?

If you do not have time, please have a look at the example below.
I have two types of users, temporary users and permanent users.
Temporary users use the system as guests; they just provide their name and use it, but the system needs to track them.
Permanent users are those that are registered and permanent.
Once a user creates a permanent record for himself, I need to copy all the information that was tracked while he was a guest into his permanent record.
The classes are as follows:
@Entity
public class PermUser {
    @Id
    @GeneratedValue
    private long id;
    @OneToMany
    private List<Favorites> favorites;
    ....
}
@Entity
public class Favorites {
    @Id
    @GeneratedValue
    private long id;
    @OneToMany(cascade = CascadeType.ALL)
    @LazyCollection(LazyCollectionOption.FALSE)
    private List<FavoriteItems> items;
    ...
}
@Entity
public class FavoriteItems {
    @Id
    @GeneratedValue
    private long id;
    private int quantity;
    @ManyToOne
    private Ball ball;
    ..
}
@Entity
public class TempUser extends PermUser {
    private String date;
    ....
}
The problem is:
If I clone the tempUser object, I am copying the id parameters as well, so when saving the PermUser object it fails with a message like "Duplicate entry '10' for key ...". I cannot remove the tempUser first and then save the permUser, because if saving the permUser failed I would lose the data. And if I try to copy each ball of the favoriteItems separately, without the id of the item, it would not be an efficient way.
Example (question in one sentence: as shown below, a user may have more than one TempUser record and just one PermUser record, therefore I need to add the information of all the TempUser records to that single PermUser record.)
Type of record | name | favorites                                   | date
---------------+------+---------------------------------------------+----------
1) TempUser    | Jack | 2 items                                     | 1/1/2013
2) TempUser    | Jack | 3 items                                     | 1/4/2013
---------------+------+---------------------------------------------+----------
PermUser       | Jack | 5 items (2 + 3 items from his temp records) |
Please note: I need to find a solution, and I do not mind trying a new approach rather than cloning the object.
The reason I have two different classes is that TempUser has a few additional attributes. I may also need to add the favorites of several TempUsers to the favorites list of one PermUser, and, as mentioned above, a user may have many different, unrelated temp records.
Forgive me if I'm missing something, but I don't think that TempUser and PermUser should be different classes. TempUser extends PermUser, which is an "is-a" relationship. Clearly, temporary users are not a type of permanent user. Your question doesn't give enough information to justify making them different -- perhaps they're the same class, and the difference can be expressed as a few new attributes? Eg:
@Entity
public class User {
    @OneToMany(cascade = CascadeType.ALL)
    private List<Favorites> favorites;
    private boolean isTemporary;
    ....
}
The "transition" from temporary to permanent can be handled by some controller, making sure that isTemporary = false and that the other properties of a permanent user are appropriately set. This would completely side-step the cloning issue and would be much easier on your database.
I just had the same problem. I've been digging through many interesting articles and questions on boards like SO until I had enough inspiration.
At first I also wanted to have subclasses for different types of users. It turns out that the idea itself is a design flaw:
Don't use inheritance for defining roles!
More information here: Subtle design: inheritance vs roles
Think of a user as a big container which just harbors other entities like credentials, preferences, contacts, items, user information, etc.
With this in mind you can easily change certain abilities/behaviour of certain users.
Of course you can define a role many users can play. Users playing the same role will have the same features.
If you have many entities/objects depending on each other, you should think of a building mechanism/pattern that sets up a certain user role in a well-defined way.
Some thoughts: A proper way for JPA entities instantiation
If you had a builder/factory for users, your other problem wouldn't be that complex anymore.
Example (really basic, do not expect too much!)
public void changeUserRoleToPermanent(User currentUser) {
    UserBuilder builder = new UserBuilder();
    builder.setRole(Role.PERMANENT); // builder internally does all the plumbing
    // copy the stuff you want to keep
    builder.setId(currentUser.getId());
    builder.setPreferences(currentUser.getPreferences());
    // ...
    User newRoleUser = builder.build();
    newRoleUser = entityManager.merge(newRoleUser);
    entityManager.detach(currentUser);
    // delete old stuff
    entityManager.remove(currentUser.getAccountInfo()); // changed to a different implementation...
}
I admit, it is some work but you will have many possibilities once you have the infrastructure ready! You can then "invent" new stuff really fast!
I hope I could spread some ideas. I'm sorry for my miserable english.
I agree with the prior comments that, if possible, you should re-evaluate these entities. If that is not possible, I suggest that you return a general User from the database and then cast that user as either PermUser or TempUser, both of which would be extensions of User, based on the presence of certain criteria.
For part 2 of your problem:
You are using CascadeType.ALL for the favorites relation. This includes CascadeType.REMOVE, which means a remove operation on the user will cascade to that entity. So specify an array of CascadeType values that doesn't include CascadeType.REMOVE.
See http://webarch.kuzeko.com/2011/11/hibernate-understanding-cascade-types/.
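For illustration, a sketch of what that could look like on the collection that currently uses CascadeType.ALL in the question; which cascade types you keep is up to you.
import java.util.List;
import javax.persistence.*;
import org.hibernate.annotations.LazyCollection;
import org.hibernate.annotations.LazyCollectionOption;

@Entity
public class Favorites {
    @Id
    @GeneratedValue
    private long id;

    // Everything except REMOVE: removing the owning entity no longer deletes the items.
    @OneToMany(cascade = {CascadeType.PERSIST, CascadeType.MERGE,
                          CascadeType.REFRESH, CascadeType.DETACH})
    @LazyCollection(LazyCollectionOption.FALSE)
    private List<FavoriteItems> items;
    // getters and setters...
}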
What I am going to suggest might not be that OO, but I hope it will be effective. I am happy to keep PermUser and TempUser separate, not extending one from the other and not binding them into an is-a relationship. So I would have two separate tables in the database, one for TempUser and one for PermUser, treating them as two separate entities. Many will find this redundant, but read on: we all know that sometimes redundancy is good. So now...
1) I don't know when a TempUser will want to become a PermUser, so I always have all TempUsers in a separate table.
2) What would I do if a user always wants to stay a TempUser? I still have a separate TempUser table to refer to.
3) I am assuming that when a TempUser wants to become a PermUser, you read his TempUser name to get his TempUser records.
So now your job is easy: when a TempUser wants to become a PermUser, all you do is copy the TempUser objects, populate the required attributes, and create a new PermUser object with them. After that you can keep the TempUser records if you want to, or delete them. :)
Also, if you keep the records, you would have a history of how many of your TempUsers actually became permanent, and of the average time it takes a TempUser to become permanent.
I think you should do a manual deep clone. Not exactly a clone since you have to merge data from several tempUsers to a single permUser. You can use reflection and optionally annotations to automate the copy of information.
To automatically copy fields from an existing object to a new one you can follow this example. It is not a deep clone but may help you as starting point.
Class 'c' is used as a reference. src and dest must be instances of 'c' or instances of subclasses of 'c'. The method will copy the attributes defined in 'c' and in the superclasses of 'c'.
public static <E> E copyObject(E dest, E src, Class<?> c) throws IllegalArgumentException, IllegalAccessException{
// TODO: You may want to create new instance of 'dest' here instead of receiving one as parameter
if (!c.isAssignableFrom(src.getClass()))
{
throw new IllegalArgumentException("Incompatible classes: " + src.getClass() + " - " + c);
}
if (!c.isAssignableFrom(dest.getClass()))
{
throw new IllegalArgumentException("Incompatible classes: " + dest.getClass() + " - " + c);
}
while (c != null && c != Object.class)
{
for (Field aField: c.getDeclaredFields())
{
// We skip static and final
int modifiers = aField.getModifiers();
if ( Modifier.isStatic(modifiers) || Modifier.isFinal(modifiers))
{
continue;
}
// We skip the fields annotated with @Generated and @GeneratedValue
if (aField.getAnnotation(GeneratedValue.class) == null &&
aField.getAnnotation(Generated.class) == null)
{
aField.setAccessible(true);
Object value = aField.get(src);
if (aField.getType().isPrimitive() ||
String.class == aField.getType() ||
Number.class.isAssignableFrom(aField.getType()) ||
Boolean.class == aField.getType() ||
Enum.class.isAssignableFrom(aField.getType()))
{
try
{
// TODO: You may want to recursive copy value too
aField.set(dest, value);
}
catch(Exception e)
{
e.printStackTrace();
}
}
}
}
c = c.getSuperclass();
}
return dest;
}
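A short, hypothetical usage of that helper, merging several TempUsers into one PermUser. Scalar fields are overwritten by each later TempUser in the loop, and collections such as favorites still have to be merged by hand, as discussed above; tempUsers, entityManager and an initialized favorites list are assumed.
// Assumes the enclosing method declares throws IllegalAccessException.
PermUser permUser = new PermUser();
for (TempUser tempUser : tempUsers) {
    // copies primitives, Strings, Numbers, Booleans and enums declared on
    // PermUser and its superclasses, skipping @Id / @GeneratedValue fields
    copyObject(permUser, tempUser, PermUser.class);
    permUser.getFavorites().addAll(tempUser.getFavorites());
}
entityManager.persist(permUser);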
Like some have already suggested I would tackle this problem using inheritance + either shallow copies (to share references) or deep cloning with libraries that let me exclude / manipulate the auto-generated ids (when you want to duplicate items).
Since you don't want to bend your database model too much, start with a Mapped Superclass holding the common attributes. This will not be reflected in your database at all. If you could, I would go with Single Table Inheritance, which maps closely to your model (but may require some adjustments on the database layer).
@MappedSuperclass
public abstract class User {
    @Id
    @GeneratedValue
    private long id;
    // Common properties and relationships...
}
Then have both PermUser and TempUser inherit from User, so that they will have a lot of common state:
@Entity
@Table(name="USER")
public class PermUser extends User {
    // Specific properties
}
Now there are several possible approaches. If your classes don't have a lot of state you can, for instance, write a constructor that builds a PermUser by collecting the data of a List of TempUsers.
Mock code:
@Entity
@Table(name="PERMANENT_USER")
public class PermUser extends User {

    public PermUser() {} // default constructor

    public PermUser(List<TempUser> userData) {
        final Set<Favorites> f = new LinkedHashSet<>();
        // don't set the id
        for (TempUser u : userData) {
            this.name = u.getName();
            // Shallow copy that guarantees uniqueness and insertion order;
            // Favorites must override equals and hashCode
            f.addAll(u.getFavorites());
        }
        this.favorites = new ArrayList<>(f);
        // Logic to conciliate dates
    }
}
When you persist the PermUser it will generate a new id, cascaded unidirectional relationships should work fine.
On the other hand, if your class have a lot of attributes and relationships, plus there are a lot of situations in which you really need to duplicate objects, then you could use a Bean Mapping library such as Dozer (but be warned, cloning objects is a code smell).
Mapper mapper = new DozerBeanMapper();
mapper.map(tempUser.getFavorites(), user.getFavorites());
With dozer you can configure Mappings through annotations, API or XML do to such things as excluding fields, type casting, etc.
Mock mapping:
<mapping>
  <class-a>my.object.package.TempUser</class-a>
  <class-b>my.object.package.PermUser</class-b>
  <!-- common fields with the same name will be copied by convention -->
  <!-- exclude ids and fields exclusive to temp -->
  <field-exclude>
    <a>fieldToExclude</a>
    <b>fieldToExclude</b>
  </field-exclude>
</mapping>
You can, for example, exclude ids, or maybe copy permUser.id to all of the cloned bidirectional relationships back to User (if there is one), etc.
Also, notice that cloning collections is a cumulative operation by default.
From Dozer documentation:
If you are mapping to a Class which has already been initialized, Dozer will either 'add' or 'update' objects to your List. If your List or Set already has objects in it dozer checks the mapped List, Set, or Array and calls the contains() method to determine if it needs to 'add' or 'update'.
I've used Dozer in several projects; for example, in one project there was a JAXB layer that needed to be mapped to a JPA model layer. They were close enough, but unfortunately I couldn't bend either one. Dozer worked quite well, was easy to learn and spared me from writing 70% of the boring code. I can deeply clone recommend this library out of personal experience.
From a pure OO perspective it does not really make sense for an instance to morph from one type into another, Hibernate or not. It sounds like you might want to reconsider the object model independently of its database representation. FourWD seems more like a property of a car than a specialization, for example.
A good way to model this is to create something like a UserData class such that TempUser has-a UserData and PermUser has-a UserData. You could also make TempUser has-a PermUser, though that's going to be less clear. If your application needs to use them interchangeably (something you'd get with the inheritance you were using), then both classes can implement an interface that returns the UserData (or in the second option, getPermUser, where PermUser returns itself).
If you really want to use inheritance, the easiest might be to map it using "table per class hierarchy" and then use straight JDBC to update the discriminator column directly.
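A minimal sketch of that last idea; the table and column names (USERS, USER_TYPE) and the discriminator values are assumptions.
import javax.persistence.*;

// "Table per class hierarchy": one table, one discriminator column.
@Entity
@Table(name = "USERS")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(name = "USER_TYPE")
public abstract class User { /* common state */ }

@Entity
@DiscriminatorValue("TEMP")
class TempUser extends User { /* guest-only attributes */ }

@Entity
@DiscriminatorValue("PERM")
class PermUser extends User { /* permanent-only attributes */ }
The "morph" from temporary to permanent then becomes a plain SQL statement such as UPDATE USERS SET USER_TYPE = 'PERM' WHERE id = ?, executed over JDBC, after which you should clear or refresh the persistence context so Hibernate does not keep stale state.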

solrj: how to store and retrieve List<POJO> via multivalued field in index

My use case is an index which holds titles of online media. The provider of the data associates a list of categories with each title. I am using SolrJ to populate the index via an annotated POJO class
e.g.
#Field("title")
private String title;
#Field("categories")
private List<Category> categoryList;
The associated POJO is
public class Category {
private Long id;
private String name;
...
}
My question has two parts:
a) Is this possible via SolrJ? The docs only contain an example of @Field using a List of String, so I assume the serialization/marshalling only supports simple types?
b) How would I set up the schema to hold this? I have a naive assumption that I just need to set multiValued=true on the required field and it will all work by magic.
I'm just starting to implement this so any response would be highly appreciated.
The answer is as you thought:
a) You have only simple types available, so you will have a List of the same type, e.g. String. The point is that you can't represent complex types inside the Lucene document, so you won't be able to deserialize them either.
b) The problem is that what you are trying to do is represent relational thinking in a "document store". That will probably work only up to a certain point. If you want to represent categories inside a Lucene document, just use the string; it is not necessary to store an id as well.
The only reason to store an id as well is if, besides the search, you want to do a lookup in an RDBMS. If you want to do this, you need to make sure that the id and the category name stay soft-linked. This does not work for every 1:n relation. (Every 1:n relation where the related n-side table consists only of required fields is possible. If you have an optional field, you need to put something like a filler "empty" constant in the field, if possible.)
However, if these 1:n relations are not sparse, it is actually possible, provided you maintain the order in which you add values to the document. So the category relation can probably be represented if you don't sort the lists.
You could implement a method that reconstructs a Category by instantiating it with the values at position 0...n. So if you want the first category, it will be at position 0 of every list related to categories.
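A minimal sketch of that parallel-lists idea with SolrJ; the field names categoryIds and categoryNames and the Category getters/setters are assumptions, and both fields would be declared multiValued="true" in the schema. Insertion order must be preserved and the lists never sorted, so index i of each list refers to the same category.
import java.util.ArrayList;
import java.util.List;
import org.apache.solr.client.solrj.beans.Field;

public class MediaTitle {

    @Field("title")
    private String title;

    // Two parallel multivalued fields; entry i of each list belongs to the same category.
    @Field("categoryIds")
    private List<Long> categoryIds = new ArrayList<>();

    @Field("categoryNames")
    private List<String> categoryNames = new ArrayList<>();

    public void addCategory(Category category) {
        categoryIds.add(category.getId());
        categoryNames.add(category.getName());
    }

    public Category getCategory(int i) {
        Category category = new Category();
        category.setId(categoryIds.get(i));
        category.setName(categoryNames.get(i));
        return category;
    }
}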
