I have just started reading about ORMLite since I am interested in using it in an Android application.
I would like to get a feel for how object relations are/should be persisted with this framework.
For example, if I have these classes:
@DatabaseTable(tableName = "bill_items")
class BillItem { ... }

@DatabaseTable(tableName = "bills")
class Bill {
    @DatabaseField String x;
    List<BillItem> billItems = new ArrayList<BillItem>();

    public void setItems(List<BillItem> billItems) {
        this.billItems = billItems;
    }
}
As far as I understand, the recommended way to update a Bill object would be something similar to:
Dao<Bill, String> billDao = DaoManager.createDao(connectionSource, Bill.class);
Bill bill = billDao.queryForId(...);
List<BillItem> billItems = ...; // some new list of items
bill.setX("some new value for X");
bill.setItems(billItems);

// saving the billItems explicitly before saving the bill -- is this correct?
Dao<BillItem, String> billItemDao = DaoManager.createDao(connectionSource, BillItem.class);
for (BillItem billItem : billItems)
    billItemDao.update(billItem);
billDao.update(bill);
Is this the recommended way to update an object whose relations have changed (specifically, relations with a set of persistent objects, as in the code above)?
Somehow I had the impression that there should be a better way to do this.
Also, if I want to use this framework, am I supposed to put both domain attributes and persistence-related attributes (e.g., foreign and primary keys) in my model classes? I'm wondering if there is a way to avoid this mixing of concerns.
Thanks for any help.
Your code is basically correct, although there are some things you can do to improve it.
You should use the @ForeignCollectionField annotation to mark the billItems field. This will load the collection when you query for a Bill. See the docs on foreign collections.
@ForeignCollectionField
ForeignCollection<BillItem> billItems;
Instead of doing the updates by hand each time, you can create your own Dao class for Bill that overrides the update() method and updates the inner objects on its own.
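For example, here is a minimal sketch of such a DAO. It assumes a getBillItems() accessor on Bill (not shown in your code); the rest follows ORMLite's BaseDaoImpl API:

import com.j256.ormlite.dao.BaseDaoImpl;
import com.j256.ormlite.dao.Dao;
import com.j256.ormlite.dao.DaoManager;
import com.j256.ormlite.support.ConnectionSource;
import java.sql.SQLException;

public class BillDao extends BaseDaoImpl<Bill, String> {

    private final Dao<BillItem, String> billItemDao;

    public BillDao(ConnectionSource connectionSource) throws SQLException {
        super(connectionSource, Bill.class);
        billItemDao = DaoManager.createDao(connectionSource, BillItem.class);
    }

    @Override
    public int update(Bill bill) throws SQLException {
        // update the child rows first, then the parent row
        for (BillItem item : bill.getBillItems()) {
            billItemDao.update(item);
        }
        return super.update(bill);
    }
}

That way callers just do billDao.update(bill) and the items are kept in sync in one place.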
I'm using Spring Boot with MySQL to create a RESTful API. Here's an example of how I return a JSON response.
First, I have a model:
@Entity
public class Movie extends DateAudit {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;
    private Date releaseDate;
    private Time runtime;
    private Float rating;
    private String storyline;
    private String poster;
    private String rated;

    @OneToMany(mappedBy = "movie", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<MovieMedia> movieMedia = new ArrayList<>();

    @OneToMany(mappedBy = "movie", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<MovieReview> movieReviews = new ArrayList<>();

    @OneToMany(mappedBy = "movie", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<MovieCelebrity> movieCelebrities = new ArrayList<>();

    // Setters & Getters
}
and the corresponding repository:
@Repository
public interface MovieRepository extends JpaRepository<Movie, Long> {
}
I also have a payload class MovieResponse which represents a movie instead of the Movie model, for example when I need extra fields or need to return only specific fields.
public class MovieResponse {

    private Long id;
    private String name;
    private Date releaseDate;
    private Time runtime;
    private Float rating;
    private String storyline;
    private String poster;
    private String rated;

    private List<MovieCelebrityResponse> cast = new ArrayList<>();
    private List<MovieCelebrityResponse> writers = new ArrayList<>();
    private List<MovieCelebrityResponse> directors = new ArrayList<>();

    // Constructors, getters and setters

    public void setCelebrityRoles(List<MovieCelebrityResponse> movieCelebrities) {
        this.setCast(movieCelebrities.stream()
                .filter(movieCelebrity -> movieCelebrity.getRole().equals(CelebrityRole.ACTOR))
                .collect(Collectors.toList()));
        this.setDirectors(movieCelebrities.stream()
                .filter(movieCelebrity -> movieCelebrity.getRole().equals(CelebrityRole.DIRECTOR))
                .collect(Collectors.toList()));
        this.setWriters(movieCelebrities.stream()
                .filter(movieCelebrity -> movieCelebrity.getRole().equals(CelebrityRole.WRITER))
                .collect(Collectors.toList()));
    }
}
As you can see, I divide the movieCelebrities list into 3 lists (cast, directors, and writers).
And to map a Movie to a MovieResponse I'm using a ModelMapper class:
public class ModelMapper {

    public static MovieResponse mapMovieToMovieResponse(Movie movie) {
        // Create a new MovieResponse and assign the Movie data to it
        MovieResponse movieResponse = new MovieResponse(movie.getId(), movie.getName(), movie.getReleaseDate(),
                movie.getRuntime(), movie.getRating(), movie.getStoryline(), movie.getPoster(), movie.getRated());

        // Get the MovieCelebrities for the current Movie
        List<MovieCelebrityResponse> movieCelebrityResponses = movie.getMovieCelebrities().stream().map(movieCelebrity -> {
            // Get the Celebrity for the current MovieCelebrity
            CelebrityResponse celebrityResponse = new CelebrityResponse(movieCelebrity.getCelebrity().getId(),
                    movieCelebrity.getCelebrity().getName(), movieCelebrity.getCelebrity().getPicture(),
                    movieCelebrity.getCelebrity().getDateOfBirth(), movieCelebrity.getCelebrity().getBiography(), null);
            return new MovieCelebrityResponse(movieCelebrity.getId(), movieCelebrity.getRole(),
                    movieCelebrity.getCharacterName(), null, celebrityResponse);
        }).collect(Collectors.toList());

        // Assign the movieCelebrityResponses to the movieResponse
        movieResponse.setCelebrityRoles(movieCelebrityResponses);
        return movieResponse;
    }
}
And finally, here's my MovieService implementation, which I call in the controller:
@Service
public class MovieServiceImpl implements MovieService {

    private MovieRepository movieRepository;

    @Autowired
    public void setMovieRepository(MovieRepository movieRepository) {
        this.movieRepository = movieRepository;
    }

    public PagedResponse<MovieResponse> getAllMovies(Pageable pageable) {
        Page<Movie> movies = movieRepository.findAll(pageable);

        if (movies.getNumberOfElements() == 0) {
            return new PagedResponse<>(Collections.emptyList(), movies.getNumber(),
                    movies.getSize(), movies.getTotalElements(), movies.getTotalPages(), movies.isLast());
        }

        List<MovieResponse> movieResponses = movies.map(ModelMapper::mapMovieToMovieResponse).getContent();
        return new PagedResponse<>(movieResponses, movies.getNumber(),
                movies.getSize(), movies.getTotalElements(), movies.getTotalPages(), movies.isLast());
    }
}
So the question here: is it fine to have a payload class for JSON serialization for each model, or is there a better way?
Also, if there's anything wrong with my code, feel free to comment.
I had this dilemma not so long back; this was my thought process. I have it here: https://stackoverflow.com/questions/44572188/microservices-restful-api-dtos-or-not
The Pros of Just exposing Domain Objects
The less code you write, the fewer bugs you produce.
Despite having extensive (arguably) test cases in our code base, I have come across bugs due to missed/wrong copying of fields from domain to DTO or vice versa.
Maintainability: less boilerplate code.
If I have to add a new attribute, I don't have to add it in the Domain, the DTO, the Mapper, and the test cases, of course. Don't tell me that this can be achieved using reflection bean-copy utils like Dozer or MapStruct; it defeats the whole purpose.
Lombok, Groovy, Kotlin, I know, but they only save me the getter/setter headache.
DRY
Performance
I know this falls under the category of "premature optimization is the root of all evil". But still, this will save some CPU cycles by not having to create (and later garbage collect) at least one more object per request.
Cons
DTOs will give you more flexibility in the long run.
If only I ever needed that flexibility. At least, whatever I have come across so far are CRUD operations over HTTP which I can manage using a couple of @JsonIgnores. And if there are one or two fields that need a transformation which cannot be done using Jackson annotations, as I said earlier, I can write custom logic to handle just that (see the sketch after this list).
Domain objects getting bloated with annotations.
This is a valid concern. If I use JPA or MyBatis as my persistence framework, domain objects might have those annotations, and then there will be Jackson annotations too. If you are using Spring Boot you can get away with application-wide properties like mybatis.configuration.map-underscore-to-camel-case: true and spring.jackson.property-naming-strategy: SNAKE_CASE.
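To illustrate the @JsonIgnore point: a minimal sketch of shaping the JSON straight off the entity (field names are borrowed from the question's Movie entity; the annotations are the standard Jackson ones):

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;
import javax.persistence.*;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

@Entity
public class Movie {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    @JsonProperty("release_date")  // rename a single field without a DTO
    private Date releaseDate;

    @JsonIgnore                    // kept out of every JSON response
    @OneToMany(mappedBy = "movie", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<MovieMedia> movieMedia = new ArrayList<>();
}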
Short story: at least in my case, the cons didn't outweigh the pros, so it did not make any sense to repeat myself by having a new POJO as a DTO. Less code, fewer chances of bugs. So I went ahead with exposing the domain object and not having a separate "view" object.
Disclaimer: this may or may not be applicable in your use case. This observation is per my use case (basically a CRUD API with 15-ish endpoints).
We should keep each layer separate from the others. As in your case, you have defined the entity and response classes; this is the right way to separate things, and we should never send the entity in the response. Even for requests we should have a dedicated class.
What's the issue if we send the entity instead of a response DTO?
We are no longer free to modify the entity, because we have already exposed it to our clients.
Sometimes we don't want to serialize some fields and send them in the response.
There is some overhead in translating request to domain, entity to domain, etc., but it's worth it to keep things organized. ModelMapper is a good choice for the translation.
Try to use constructor injection instead of setter injection for mandatory dependencies.
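For example, a minimal sketch of the constructor-injected variant of your service (since Spring 4.3 a single constructor is autowired automatically, so no @Autowired is needed):

@Service
public class MovieServiceImpl implements MovieService {

    private final MovieRepository movieRepository; // mandatory dependency, can now be final

    public MovieServiceImpl(MovieRepository movieRepository) {
        this.movieRepository = movieRepository;
    }
}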
It is always recommended to separate DTO and Entity.
The Entity should interact with the DB/ORM, and the DTO should interact with the client layer (the layer for request and response), even if the structure of the Entity and the DTO is the same.
Here the Entity is Movie and
the DTO is MovieResponse.
Use your existing class MovieResponse for requests and responses.
Never use the Movie class for requests and responses.
The class MovieServiceImpl should contain the business logic for converting Entity to DTO, or you can use the Dozer API to do the conversion automatically.
The reasons for separating:
In case you need to add/remove elements in the request/response, you don't have to change much code.
If 2 entities have a two-way mapping (e.g., a one-to-many/many-to-many relationship), the
JSON object can't be created when an object has such nested data; this will throw an error while serializing (see the sketch after this list).
If anything changes in the DB or the Entity, it will not affect the JSON response (most of the time).
The code will be clearer and easier to maintain.
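To illustrate the serialization point: with a bidirectional mapping like the question's Movie/MovieReview, serializing the entity directly recurses forever (Movie -> reviews -> movie -> ...). A sketch of the cycle, plus Jackson's standard annotations to break it if you do serialize entities:

import com.fasterxml.jackson.annotation.JsonBackReference;
import com.fasterxml.jackson.annotation.JsonManagedReference;
import javax.persistence.*;
import java.util.List;

@Entity
public class Movie {
    @OneToMany(mappedBy = "movie")
    @JsonManagedReference      // serialized normally
    private List<MovieReview> movieReviews;
}

@Entity
public class MovieReview {
    @ManyToOne
    @JsonBackReference         // omitted during serialization, breaking the cycle
    private Movie movie;
}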
On one side, you should separate them because sometimes some of the JPA annotations you use in your model don't play well with the JSON processor annotations. And yes, you should keep things separated.
What if you later decide to change your data layer? Will you have to rewrite all your client-side code?
On the other side, there is the problem of mapping. For that, you can use a library, at a small performance penalty.
DTO is a design pattern that solves the problem of fetching as much useful data from a service as possible in a single call.
In the case of a simple application like yours, DTOs tend to be similar to the entity classes. However, for complex applications, DTOs can be extended to combine data from various entities, avoiding multiple requests to the server and thus saving valuable resources and request-response time.
I would suggest not duplicating the code in a simple case like this and using the model classes in the API responses as well. Using separate response classes as DTOs would not serve any purpose and would only make the code harder to maintain.
While most people have answered with the pros and cons of using DTO objects, I would like to give my 2 cents. In my case a DTO was necessary because not all fields persisted in the database were captured from the user: there were a few fields that were computed based on the user's input (of other fields) and were not exposed to users. A DTO can also reduce the size of the payload, which could result in better performance in such cases.
I advocate separating the "Payload" or "Data" object from the "Model" or "Display" object, pretty much always. This just keeps things easier to manage.
Here's an example:
Let's say you need to hit an API that gives you data about cats for sale. Then you parse the data into a cat model object and populate a list of cats that is then displayed to the user. Cool.
But now you want to integrate another API and pull cats from 2 databases. And you run into a problem: one API returns furColor for the color, and the new one returns catColor.
If you were using the same object to also display the info, you have some options:
Add both furColor and catColor to the model object, make them both optional, and do some kind of computed property that checks which one is set and uses it to display the color.
In reality, this is rarely an option, because the responses will usually differ by much more than one value, so you would likely need a whole new parser anyway.
Add a new data object and then also a new adapter, and then do some kind of check to know which adapter to use when.
Something else that still isn't pretty or fun to work with.
However, if you create a data object that catches the response, and then a display object that has only the info needed to populate the list, this becomes really easy:
You have a data object that captures the response from the first API.
Now make a data object that captures the response from the second API.
Now all you need is a simple mapper to map each response to the display object.
Both responses are then converted to a common, simple display object, and the same adapter can be used to display the new cats without additional work.
This also will make storing the data locally much cleaner.
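A minimal sketch of that shape (furColor/catColor come from the example above; the DisplayCat type and the mappers are hypothetical names):

// Two API-specific payload objects, one shared display object.
class ApiOneCat { String name; String furColor; }
class ApiTwoCat { String name; String catColor; }

class DisplayCat {                 // the only type the list adapter sees
    final String name;
    final String color;
    DisplayCat(String name, String color) {
        this.name = name;
        this.color = color;
    }
}

class CatMappers {
    static DisplayCat fromApiOne(ApiOneCat cat) { return new DisplayCat(cat.name, cat.furColor); }
    static DisplayCat fromApiTwo(ApiTwoCat cat) { return new DisplayCat(cat.name, cat.catColor); }
}

Adding a third API later means one new payload class and one new mapper; the adapter and display code stay untouched.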
I know that when using Wicket with JPA frameworks it is not advisable to serialize entities that have already been persisted to the database (because of problems with lazy fields and to save space). In such cases we are supposed to use LoadableDetachableModel. But what about the following use-case?
Suppose we want to create a new entity (say, a Contract) which will consist, among other things, of persisted entities (say, a Client which is selected from a list of clients stored in the DB). The entity under creation is a model object of some Wicket component (say, a Wizard). In the end (when we finish our wizard) we save the new entity to the DB. So my question is: what is the best generic solution to the serialization problem of such model objects? We can't use LDM because the entity is not in the DB yet but we don't want our inner entities (like Client) to be serialized wholly, too.
My idea was to implement a custom wicket serializer that checks if the object is an entity and if it is persisted. If so, store only its id, otherwise use the default serialization. Similarly, when deserializing use the stored id and get the entity from the DB or deserialize using the default mechanism. Not sure, though, how to do that in a generic way. My next thought was that if we can do it, then we do not need any LDM anymore, we can just store all our entities in simple org.apache.wicket.model.Model models and our serialization logic will take care of them, right?
Here's some code:
@Entity
class Client {
    String clientName;
    @ManyToOne(fetch = FetchType.LAZY)
    ClientGroup group;
}

@Entity
class Contract {
    Date date;
    @ManyToOne(fetch = FetchType.LAZY)
    Client client;
}

class ContractWizard extends Wizard {
    ContractWizard(String markupId, IModel<Contract> model) {
        super(markupId);
        setDefaultModel(model);
    }
}
Contract contract = DAO.createEntity(Contract.class);
ContractWizard wizard = new ContractWizard("wizard", ?);
How to pass the contract? If we just say Model.of(contract), the whole contract will be serialized along with the inner client (and it can be big); moreover, if we access contract.client.group after deserialization we can bump into this problem: https://en.wikibooks.org/wiki/Java_Persistence/Relationships#Serialization.2C_and_Detaching
So I wonder how people go about solving such issues, I'm sure it's a fairly common problem.
I guess there are 2 approaches to your problem:
a.) Only store the data the user actually sees in Models. In your example that might be contractStartDate, contractEndDate, and a list of clientIds. That's the main approach if you don't want your database objects in your view.
b.) Write your own LoadableDetachableModel and make sure you only serialize transient objects. For example (assuming that any negative id is not saved to the database):
public class MyLoadableDetachableModel extends LoadableDetachableModel<MyObject> {

    private MyObject myObject;
    private Integer id;

    public MyLoadableDetachableModel(MyObject myObject) {
        this.myObject = myObject;
        this.id = myObject.getId();
    }

    @Override
    protected MyObject load() {
        if (id < 0) {
            // not persisted yet: keep serving the transient instance
            return myObject;
        }
        return myObjectDao.getMyObjectById(id);
    }

    @Override
    protected void onDetach() {
        super.onDetach();
        if (myObject != null) {
            id = myObject.getId();
            if (id >= 0) {
                // persisted by now: drop the reference, reload by id next time
                myObject = null;
            }
        }
    }
}
The downside of this is that you'll have to make your database objects Serializable, which is not really ideal and can lead to all kinds of problems. You would also need to decouple the references to other entities from the transient object by using a ListModel.
Having worked with both approaches, I personally prefer the first. From my experience, the whole business of injecting DAO objects into Wicket can lead to disaster. :) I would only use it in view-only projects that aren't too big.
Most projects I know of just accept serializing referenced entities (e.g. your Clients) along with the edited entity (Contract).
Using conversations (keeping a Hibernate/JPA session open over several requests) is a nice alternative for applications with complex entity relations: the Hibernate session and its entities are kept separate from the page and are never serialized. The component just keeps an identifier with which to fetch its conversation.
This question is so simple, you can probably just read the code.
This is a very simple performance question. In the code example below, I wish to set the Owner on my Cat object. I have the ownerId, but the cat's setter requires an Owner object, not a Long, e.g. setOwner(Owner owner).
@Autowired OwnerRepository ownerRepository;
@Autowired CatRepository catRepository;

Long ownerId = 21L;
Cat cat = new Cat("Jake");
cat.setOwner(ownerRepository.findById(ownerId)); // What a waste of time
catRepository.save(cat);
I'm using the ownerId to load an Owner object just so I can call the setter on the Cat, which is simply going to pull out the id and save the Cat record with an owner_id. So essentially I'm loading an Owner for nothing.
What is the correct pattern for this?
First of all, you should pay attention to the method you use to load the Owner entity.
If you're using a Hibernate Session:
// will return the persistent instance and never returns an uninitialized instance
session.get(Owner.class, id);
// might return a proxied instance that is initialized on-demand
session.load(Owner.class, id);
If you're using an EntityManager:
// will return the persistent instance and never returns an uninitialized instance
em.find(Owner.class, id);
// might return a proxied instance that is initialized on-demand
em.getReference(Owner.class, id);
So you should lazy-load the Owner entity to avoid unnecessary hits to the cache or the database.
By the way, I would suggest inverting the relation between Owner and Cat.
For example :
Owner owner = ownerRepository.load(Owner.class, id);
owner.addCat(myCat);
Victor's answer is correct (+1 from me), but requires going through the EntityManager or Hibernate session. Assuming the repositories you have autowired are JPA repositories from Spring Data and you would prefer to go through them, use the JpaRepository#getOne method. It calls EntityManager#getReference, so it does the same thing, returning a proxy to the entity.
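A minimal sketch, reusing the repositories from the question (note that in recent Spring Data JPA versions getOne is deprecated in favor of getReferenceById):

Long ownerId = 21L;
Cat cat = new Cat("Jake");

// Returns a lazy proxy backed by EntityManager#getReference:
// no SELECT is issued, the proxy only carries the id.
Owner ownerRef = ownerRepository.getOne(ownerId);
cat.setOwner(ownerRef);

catRepository.save(cat); // the INSERT just needs the foreign key value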
I do not think the relationship necessarily needs to be reversed here; which mapping to use depends on the situation. In many cases many-to-one is preferred.
Probably not what you were looking for, but nothing in your question implies that you have to solve this with JPA. Some things are just much much simpler with plain old SQL:
INSERT INTO cat (name, owner_id) VALUES ('Jake', 21)
If you are using Hibernate you can do this:
Long ownerId = 21L;
Cat cat = new Cat("Jake");
Owner owner = new Owner();
owner.setId(ownerId);
cat.setOwner(owner);
catRepository.save(cat);
It's not standard JPA but, if you are not planning to migrate to another JPA provider, it's the best option from a performance perspective.
Update
As Nathan pointed out, you need to make sure the Owner is not already associated with the Persistence Context (in which case you can get a NonUniqueObjectException, since the Persistence Context can have at most one entity associated in the 1st-level cache).
Using EntityManager.contains(entity) doesn't help in this case, since Hibernate stores the entities in an IdentityHashMap, where the key is the object reference itself.
So you should use this method when, for example, you have a use case where you must insert these entities for the first time, or when you need to update them and the Owner hasn't been loaded in the currently running Persistence Context (either directly or through a JPQL or Criteria API query).
Otherwise, use EntityManager.getReference(Class entityClass, Object primaryKey).
One more way (it can come in handy sometimes with legacy code or a legacy DB schema):
@Entity
public class Cat {

    @Column(name = "OWNER_ID")
    private Long ownerId;

    @ManyToOne
    @JoinColumn(name = "OWNER_ID", insertable = false, updatable = false)
    private Owner owner;
}
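With that mapping, saving the Cat needs no Owner instance at all. A short sketch (setOwnerId is the accessor for the ownerId field above, assumed to exist):

Cat cat = new Cat("Jake");
cat.setOwnerId(21L);     // write the FK column directly
catRepository.save(cat); // no Owner is loaded or proxied

The read-only owner field is still populated normally whenever the Cat is loaded from the database.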
I have a unidirectional relation Project -> ProjectType:
@Entity
public class Project extends NamedEntity
{
    @ManyToOne(optional = false)
    @JoinColumn(name = "TYPE_ID")
    private ProjectType type;
}

@Entity
public class ProjectType extends Lookup
{
    @Min(0)
    private int progressive = 1;
}
Note that there's no cascade.
Now, when I insert a new Project, I need to increment the type's progressive.
This is what I'm doing inside an EJB, but I'm not sure it's the best approach:
public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
    /* is it necessary to set the type again? */
    project.setType(type);
    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1);
    project.setCode(type.getPrefix() + progressive);
}
I'm using EclipseLink 2.6.0, but I'd like to know if there's an implementation-independent best practice, and/or if there are behavioral differences between persistence providers in this specific scenario.
UPDATE
To clarify the context when entering the EJB create method (it is invoked by a JSF @ManagedBean):
project.projectType is DETACHED
project is NEW
no transaction (I'm using JTA/CMT) is active
I am not asking about the difference between persist() and merge(); I'm asking:
if em.persist(project) automatically "reattaches" project.projectType (I suppose not)
if it is legal to call first em.persist(project) and then em.merge(projectType), or if the order should be inverted
since em.merge(projectType) returns a different instance, if it is required to call project.setType(managedProjectType)
An explanation of why this works one way and not another is also welcome.
You need merge(...) only to make an unmanaged entity managed by your entity manager. Depending on the implementation of JPA (not sure about EclipseLink), the returned instance of the merge call might be a different copy of the original object.
MyEntity unmanaged = new MyEntity();
MyEntity managed = entityManager.merge(unmanaged);
assert(entityManager.contains(managed)); // true if everything worked out
assert(managed != unmanaged); // probably true, depending on JPA impl.
If you call merge(entity) where entity is already managed, nothing will happen.
Calling persist(entity) will also make your entity managed, but it returns no copy; instead it makes the original object itself managed. It might also call an ID generator (e.g. a sequence), which is not the case when using merge.
See this answer for more details on the difference between persist and merge.
Here's my proposal:
public void create(Project project) {
    ProjectType type = project.getType(); // maybe check if null

    if (!entityManager.contains(type)) {  // type is not managed
        type = entityManager.merge(type); // or load the type
        project.setType(type);            // update the reference
    }

    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1); // mark as dirty, update on flush

    // set "code" before persisting "project" ...
    project.setCode(type.getPrefix() + progressive);
    entityManager.persist(project);
    // ... now no additional UPDATE is required after the
    // INSERT on "project".
}
UPDATE
if em.persist(project) automatically "reattach" project.projectType (I suppose not)
No. You'll probably get an exception (Hibernate does, anyway) stating that you're trying to merge with a transient reference.
Correction: I tested it with Hibernate and got no exception. The project was created with the unmanaged project type (which was managed and then detached before persisting the project). But the project type's progressive was not incremented, as expected, since it wasn't managed. So yeah, merge it before persisting the project.
if it is legal the call order: first em.persist(project) then em.merge(projectType) or if it should be inverted
It's best practice to do so. But when both statements are executed within the same batch (before the entity manager gets flushed), it may even work (merging the type after persisting the project). In my test it worked anyway. But as I said, it's better to merge the entities before persisting new ones.
since em.merge(projectType) returns a different instance, if it is required to call project.setType(managedProjectType)
Yes. See the example above. A persistence provider may return the same reference, but it isn't required to. So to be sure, call project.setType(mergedType).
Do you need to merge? Well, it depends. According to the merge() javadoc:
Merge the state of the given entity into the current persistence context.
How did you get the instance of ProjectType that you attached to your Project? If that instance is already managed, then all you need to do is:
type.setProgressive(type.getProgressive() + 1);
and JPA will automatically issue an update, effective on the next context flush.
Otherwise if the type is not managed then you need to merge it first.
Although not directly related, this question has some good insight about persist vs merge: JPA EntityManager: Why use persist() over merge()?
With the call order of em.persist(project) vs em.merge(projectType), you should probably ask yourself what should happen if the type is gone from the database: if you merge the type first, it will get re-inserted; if you persist the project first and you have an FK constraint, the insert will fail (because it's not cascading).
In this code, merge basically gives you back the managed state in a different object. Let's say there is an Account POJO:

Account account = new Account();
account = entityManager.merge(account); // keep working with the returned instance

Then you can store the result of this call. But in your code you are using merge under a different condition:

public void create(Project project)
{
    em.persist(project);
    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
}

Here Project and ProjectType are two different POJOs. You use merge for the same POJO you pass in, and if there is a relationship between your POJOs then you can use it as well.
I think the best way to describe the problem is by using an example. I will keep it as simple as possible by removing unneeded details of the implementation.
Say there is a book store. The store keeps track of all books, customers, and orders using a backend database to store all the data, and a Java front-end to present them to the manager of the store.
The database contains the following relations:
Book ( id, title, author )
Customer ( id, name, tel, address )
Order ( id, date, custId, bookId )
On the other side, the Java front-end uses a JDBC driver to connect to the database and retrieve the data. The application consists of the following classes:
Book
BooksDataLoader
BooksTableModel
BooksView
Customer
CustomersDataLoader
CustomersTableModel
CustomersView
Order
OrdersDataLoader
OrdersTableModel
OrdersView
These classes follow the respective design guidelines; you can use the following source code as a reference:
public class Book {
    private String id;
    private String title;
    private String author;

    /*
     * Builder pattern is used so the constructor should be hidden. Book objects
     * are built in the BooksDataLoader SwingWorker thread.
     */
    private Book() {}
}

public class BooksDataLoader extends SwingWorker<List<Book>, Book> {
    private final BooksTableModel booksModel;
    private final List<Book> books = new ArrayList<Book>();
}

public class BooksTableModel extends AbstractTableModel {
    private final String[] columnNames = { "Book ID", "Book Title", "Book Author" };
    private final List<Book> books = new ArrayList<Book>();
}

public class BooksView extends JPanel {
    private final JTable booksTable;
    private final BooksTableModel booksModel;
}
I am using the Builder pattern to implement the classes Book, Customer, and Order. The instances of these classes are built using data retrieved from the database inside a SwingWorker thread and are published to the view using an AbstractTableModel. So the application actually consists of the following views (JPanels): BooksView, CustomersView, and OrdersView, each of which contains a single JTable with columns as shown below:
BooksView.booksTable: Book ID | Book Title | Book Author
CustomersView.customersTable: Customer ID | Customer Name
OrdersView.ordersTable: Order ID | Date | Customer Name | Book Title | Book Author
The problem appears when we try to resolve an instance variable which represents a foreign key in the database to the data it links to. For example, the OrdersTableModel has a List of all Order objects found in the database; however, columns 3, 4, and 5 of the OrdersView table cannot be directly accessed from an Order object, since it only contains ids for the book and the customer, not the actual data. One solution I tried was to create a static HashMap inside each of the Book, Customer, and Order classes to keep track of all retrieved objects, but it leads to data duplication, since we already have a List of the retrieved objects in the table model of each view.
I am looking for an efficient and extensible (object-oriented) design solution/recommendation.
Thank you in advance.
You should definitely use an ORM like Hibernate or EclipseLink, or whatever technology fits you. Currently JPA 2 is the common standard implemented by every such tool. You define the mapping between your object model and the DB model using annotations or XML files.
These tools also offer ways to generate your database schema from your object model (even the other direction is possible if you have a legacy schema).
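For your book store, the mapping might look roughly like this (a sketch only; the column names come from the relations in the question, everything else is assumed):

import javax.persistence.*;
import java.util.Date;

@Entity
@Table(name = "Orders") // "Order" is a reserved word in SQL, so renaming or quoting the table helps
public class Order {

    @Id
    private String id;

    @Temporal(TemporalType.DATE)
    private Date date;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "custId") // FK column from the Order relation
    private Customer customer;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "bookId")
    private Book book;

    // getters and setters
}

Querying then returns Order objects whose getCustomer()/getBook() resolve the foreign keys for you, which is exactly the problem described in the question.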
I recommend that you not use the JPA Criteria API directly, since its design is quite flawed. There are a number of frameworks out there that help you build your queries; QueryDSL is one that seems really nice to me. I used the specification pattern (which I actually implemented using the Criteria API under the hood) for abstracting query construction. See http://martinfowler.com/apsupp/spec.pdf and http://adrianhummel.wordpress.com/2010/07/02/composed-specifications-using-jpa-2-0/ for first references.
And do some research on the DAO pattern and repositories (a term coming from domain-driven design).
Well, this is a typical issue when mapping an OO design onto relational database tables.
Let's take your OrdersTableModel as an example:
Order ID | Date | Customer Name | Book Title | Book Author
For the last three columns, the Order row only carries database foreign key ids instead of the values you want to show.
To manage this correctly you have 2 possible solutions:
FIRST:
Design the Order class like this:
public class Order {
    private String id;
    private Date date;
    private Customer customer;
    private Book book;
    private Author author;

    // getters and setters
}
Note that customer is of type Customer, book of type Book, and so on.
Now suppose you query the DB to retrieve a list of orders. From the rows returned you have to build a list of Order objects. Of course the DB returns foreign key ids for customer, book and author, so (see the JDBC sketch below):
Query the orders table of the DB.
For each row, build an Order object and fill id and date with the values from the row.
Take the foreign key id for the customer. Build a new query on the customer DB table and fetch the right customer based on the id. Build a new Customer object, filling its values with the results of this second query. Assign the Customer object to the customer field of the Order object.
Do the same for Book and Author.
Add the Order object to a list.
Now you have a list of, say, 10 orders.
Iterate over it and fill the orders table you display.
To display, for example, the Customer field you will use:
listOrders.get(i).getCustomer().getName(); // getCustomer() returns a Customer object; getName() returns the customer's name
Same for Book and Author.
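A sketch of this first approach in plain JDBC (table and column names follow the schema in the question; loadCustomer/loadBook are hypothetical helpers that run the secondary queries and build the objects):

import java.sql.*;
import java.util.ArrayList;
import java.util.List;

List<Order> loadOrders(Connection con) throws SQLException {
    List<Order> orders = new ArrayList<>();
    String sql = "SELECT id, date, custId, bookId FROM \"Order\""; // Order is a reserved word, hence the quotes
    try (PreparedStatement ps = con.prepareStatement(sql);
         ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            Order order = new Order();
            order.setId(rs.getString("id"));
            order.setDate(rs.getDate("date"));
            order.setCustomer(loadCustomer(con, rs.getString("custId"))); // second query
            order.setBook(loadBook(con, rs.getString("bookId")));         // third query
            orders.add(order);
        }
    }
    return orders;
}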
SECOND APPROACH:
Design the Order class like this:

public class Order {
    private String id;
    private Date date;
    private int customer;
    private int book;
    private int author;

    // getters and setters
}
Note that now customer and the others are int fields: they hold the int reference keys retrieved from the DB.
Again you query the orders table.
For each row, build an Order object and fill the customer and other reference values simply with the ids from the DB.
Now you want to display a list of orders.
Iterate over the list.
When displaying the customer, use:
listOrders.get(i).getCustomer().getName();
Note that since the customer field is an int reference key, getCustomer() should (a sketch follows this list):
Execute a query on the DB customer table to retrieve the correct customer based on the id.
Build a Customer object, filling its fields.
Return the object.
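A sketch of such a lazy-loading accessor (customerDao and its findById are hypothetical; the cached field avoids re-querying on every call):

private Customer cachedCustomer; // loaded on first access

public Customer getCustomer() {
    if (cachedCustomer == null) {
        // hit the customer table only when the value is first needed
        cachedCustomer = customerDao.findById(customer); // 'customer' holds the FK id
    }
    return cachedCustomer;
}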
The differences between the two approaches:
The first builds a complete Order object that also contains a Customer object and so on. When you need to display something, you already have everything you need.
The second approach builds a light Order object. When you need to display, for example, the customer's data, you have to query the DB at that point (this is called lazy loading).
I suggest you consider using an ORM, which really helps with mapping an OO design onto the DB and helps you build queries that return objects directly instead of ids.