I want to develop a CMS using Java, Spring (Data/MVC/DI) and Hibernate, exposing a REST-like API.
I have the following model entities:
there are multiple Articles
each article has multiple Sections
each section can have subsections and/or Items
All these entities have properties of their own (e.g. name, type, etc.) and, obviously, they refer to their aggregated entities. I need to define CRUD API methods for each such entity.
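To illustrate, the model could be sketched as JPA entities roughly like this (the mappings and field names here are only indicative, not the actual model):

@Entity
public class Article {
    @Id @GeneratedValue
    private Long id;
    private String name;

    @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Section> sections = new ArrayList<>();
}

@Entity
public class Section {
    @Id @GeneratedValue
    private Long id;
    private String name;
    private String type;

    @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Section> subsections = new ArrayList<>();

    @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Item> items = new ArrayList<>();
}

@Entity
public class Item {
    @Id @GeneratedValue
    private Long id;
    private String name;
}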
I decided to stray a bit from dogmatic REST: when I modify an entity, I pass in only the entity-specific properties (like name, type, etc.) and do not affect the aggregations. Thus I have endpoints like:
post /articles - creates an article, no sections
put /articles/{article_id} - updates basic article properties, does not affect sections
post /articles/{article_id}/sections - creates a section in the article
delete /articles/{article_id}/sections/{section_id} - removes the section from the article
put /articles/{article_id}/sections/{section_id} - updates basic section properties, does not affect owning article properties, nor aggregated sections and items
etc...
So my question is:
When I receive a modify request, I get all the basic properties of the element along with the owning entity's identifier. How can I efficiently combine those with the existing relations in the database, so that I keep the relations and modify only the basic properties, without copying every property over one by one? Here is an example for the article-section relation.
public void modifySection(int articleId, int sectionId, Section section) {
    Article article = articleDao.findOne(articleId);
    assert article.owns(sectionId);

    Section dbSection = sectionDao.findOne(sectionId);
    copyOverProperties(section, dbSection); // this is the thing I do not know how to do
    sectionDao.save(dbSection);
}
You need Hibernate's session.merge(object).
Link: the Hibernate docs for merge().
Example from the edit functionality of our webapp:
@Repository
public class GroupCanvasDAOImpl implements GroupCanvasDAO {

    private final SessionFactory sessionFactory;

    @Autowired
    public GroupCanvasDAOImpl(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    @Override
    public void editGroupCanvas(GroupCanvas groupCanvas) {
        Session session = this.sessionFactory.getCurrentSession();
        GroupCanvas groupCanvas1 = (GroupCanvas) session.get(GroupCanvas.class, groupCanvas.getMcanvasid());
        // The two steps below are not necessary if the object was retrieved from the DB
        // and then persisted back again. If it was newly created to replace an old one,
        // they are needed.
        groupCanvas.setGroupAccount(groupCanvas1.getGroupAccount());
        groupCanvas.setCanvasowner(groupCanvas1.getCanvasowner());
        session.merge(groupCanvas);
        session.flush();
    }
}
If this is not what you are looking for, kindly let me know and I will delete my answer.
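For the copyOverProperties step in the original question, a minimal sketch (my suggestion, not part of the answer above) is to use Spring's BeanUtils.copyProperties and exclude the id and the association fields, so the existing relations stay untouched; "items" and "subsections" are assumed property names:

import org.springframework.beans.BeanUtils;

public void modifySection(int articleId, int sectionId, Section section) {
    Section dbSection = sectionDao.findOne(sectionId);
    // Copy the basic properties from the incoming object onto the managed one,
    // skipping the id and the aggregations so the relations are preserved.
    BeanUtils.copyProperties(section, dbSection, "id", "items", "subsections");
    sectionDao.save(dbSection);
}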
Related
I have the following question. From what I understand, the @Transactional annotation is supposed to keep the session alive, thus enabling lazy fetching of child entities without the need to perform a specific join query.
I have the following scenario where I do not understand why I'm still getting a LazyInitializationException.
My app runs a resolver in order to provide the various controller services with a resolved object so that it can be used directly.
Said resolver intercepts a header from the request and uses its value to query the DB and fetch the object. The object in question is quite simple in what it does, albeit it has a list of two sub-entities.
In order to perform the resolving action I'm using an extra service where I basically wrap some JpaRepository methods. The complete service is below:
@Service
public class AppClientServiceImpl implements AppClientService {

    private static final Logger LOGGER = LoggerFactory.getLogger(AppClientServiceImpl.class.getCanonicalName());

    private final AppClientRepository repository;

    @Autowired
    public AppClientServiceImpl(AppClientRepository repository) {
        this.repository = repository;
    }

    @Override
    @Transactional(readOnly = true)
    public AppClient getByAppClientId(final String appClientId) {
        LOGGER.debug("Attempting to retrieve appClient with id:: {}", appClientId);
        return repository.findByAppClientId(appClientId);
    }

    @Override
    @Transactional
    public void saveAndFlush(final AppClient appClient) {
        LOGGER.debug("Attempting to save/update appClient:: {}", appClient);
        repository.saveAndFlush(appClient);
    }
}
As you can see, both methods are annotated as @Transactional, meaning that they should keep the session alive in the context of the respective method.
Now, my main questions are the following:
1) Using the debugger, I see that even at the getByAppClientId level the lazily loaded list containing the sub-entities has been resolved just fine.
2) In the resolver itself, where the object has been received from the delegating method, the list fails to be evaluated due to a LazyInitializationException.
3) Finally, in the controller service method, which is also marked as @Transactional, the same as above occurs, meaning that it eventually fails to do its job (since it performs a get on the list that has failed to initialize).
Based on all the above, I would like to know what is the best approach to handling this. For one, I do not want to use an eager fetch type, and I would also like to avoid using fetch queries. Marking my resolver as @Transactional, thus keeping the session open there as well, is also out of the question.
I thought that since @Transactional keeps the session open, the final service method would be able to obtain the list of sub-entities. This seems not to be the case.
It therefore seems that I need a way for the final service method that gets called (which needs the list on hand) to fetch it somehow.
What would be the best approach to handle this? I've read quite a few posts here, but I cannot make out which is the most accepted method as of Spring Boot 2.0 and Hibernate 5.
Update:
Seems that annotating the sub-entity with the following:
@Fetch(FetchMode.SELECT)
@LazyCollection(LazyCollectionOption.TRUE)
resolves the problem, but I still don't know whether this is the best approach.
You initialize the collection by debugging. The debugger usually represents collections in a special way by using the collection methods, which triggers the initialization, so that might be the reason why it seems to work fine during debugging. I suppose the resolver runs outside of the scope of getByAppClientId? At that point the session is closed, which is why you see the exception.
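If you want to keep the mapping lazy and avoid eager fetching or fetch queries, one common option (a sketch of a workaround, not part of the answer above) is to touch the collection while the session is still open, i.e. inside the @Transactional service method; getSubEntities() is an assumed accessor name for the lazy list:

import org.hibernate.Hibernate;

@Override
@Transactional(readOnly = true)
public AppClient getByAppClientId(final String appClientId) {
    AppClient appClient = repository.findByAppClientId(appClientId);
    // Force the lazy collection to load while the session is still open,
    // so callers outside the transaction can safely read it.
    Hibernate.initialize(appClient.getSubEntities());
    return appClient;
}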
I created Blaze-Persistence Entity Views for exactly that use case. You essentially define DTOs for JPA entities as interfaces and apply them to a query. It supports mapping nested DTOs, collections, etc., essentially everything you'd expect, and on top of that it will improve your query performance, as it generates queries that fetch just the data you actually require for the DTOs.
The entity views for your example could look like this:
@EntityView(AppClient.class)
interface AppClientDto {
    String getName();
}
Querying could look like this:
List<AppClientDto> dtos = entityViewManager.applySetting(
        EntityViewSetting.create(AppClientDto.class),
        criteriaBuilderFactory.create(em, AppClient.class)
).getResultList();
I know that when using Wicket with JPA frameworks it is not advisable to serialize entities that have already been persisted to the database (because of problems with lazy fields and to save space). In such cases we are supposed to use LoadableDetachableModel. But what about the following use-case?
Suppose we want to create a new entity (say, a Contract) which will consist, among other things, of persisted entities (say, a Client which is selected from a list of clients stored in the DB). The entity under creation is a model object of some Wicket component (say, a Wizard). In the end (when we finish our wizard) we save the new entity to the DB. So my question is: what is the best generic solution to the serialization problem of such model objects? We can't use LDM because the entity is not in the DB yet but we don't want our inner entities (like Client) to be serialized wholly, too.
My idea was to implement a custom Wicket serializer that checks if the object is an entity and if it is persisted. If so, store only its id; otherwise use the default serialization. Similarly, when deserializing, use the stored id and get the entity from the DB, or deserialize using the default mechanism. I am not sure, though, how to do that in a generic way. My next thought was that if we can do it, then we do not need any LDM anymore; we can just store all our entities in simple org.apache.wicket.model.Model models and our serialization logic will take care of them, right?
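A rough sketch of that idea, using Java's writeReplace/readResolve serialization hooks instead of a Wicket-level serializer (BaseEntity, EntityProxy and the static DAO lookup are assumptions of mine), could look like this:

import java.io.ObjectStreamException;
import java.io.Serializable;

public abstract class BaseEntity implements Serializable {

    public abstract Long getId();

    // On serialization, replace a persisted entity with a lightweight proxy.
    protected Object writeReplace() throws ObjectStreamException {
        if (getId() != null) {
            return new EntityProxy(getClass(), getId()); // persisted: keep class + id only
        }
        return this; // not persisted yet: serialize the whole object
    }
}

class EntityProxy implements Serializable {

    private final Class<?> entityClass;
    private final Long id;

    EntityProxy(Class<?> entityClass, Long id) {
        this.entityClass = entityClass;
        this.id = id;
    }

    // On deserialization, swap the proxy back for the entity loaded from the DB.
    private Object readResolve() throws ObjectStreamException {
        return DAO.load(entityClass, id); // assumed static DAO lookup
    }
}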
Here's some code:
@Entity
public class Client {
    String clientName;

    @ManyToOne(fetch = FetchType.LAZY)
    ClientGroup group;
}

@Entity
public class Contract {
    Date date;

    @ManyToOne(fetch = FetchType.LAZY)
    Client client;
}

public class ContractWizard extends Wizard {
    ContractWizard(String markupId, IModel<Contract> model) {
        super(markupId);
        setDefaultModel(model);
    }
}
Contract contract = DAO.createEntity(Contract.class);
ContractWizard wizard = new ContractWizard("wizard", ?);
How to pass the contract? If we just say Model.of(contract), the whole contract will be serialized along with the inner client (and it can be big). Moreover, if we access contract.client.group after deserialization we can bump into this problem: https://en.wikibooks.org/wiki/Java_Persistence/Relationships#Serialization.2C_and_Detaching
So I wonder how people go about solving such issues, I'm sure it's a fairly common problem.
I guess there are 2 approaches to your problem:
a.) Only save the stuff the user actually sees in Models. In your example that might be "contractStartDate", "contractEndDate", List of clientIds. That's the main approach if you don't want your DatabaseObjects in your view.
b.) Write your own LoadableDetachableModel and make sure you only serialize transient objects, for example like this (assuming that any negative id is not saved to the database):
public class MyLoadableDetachableModel extends LoadableDetachableModel<MyObject> {

    private MyObject myObject;
    private Integer id;

    public MyLoadableDetachableModel(MyObject myObject) {
        this.myObject = myObject;
        this.id = myObject.getId();
    }

    @Override
    protected MyObject load() {
        if (id < 0) {
            return myObject;
        }
        return myObjectDao.getMyObjectById(id);
    }

    @Override
    protected void onDetach() {
        super.onDetach();
        if (myObject != null) { // may already have been dropped on an earlier detach
            id = myObject.getId();
            if (id >= 0) {
                myObject = null;
            }
        }
    }
}
The downfall of this is that you'll have to make your DatabaseObjects Serializable which is not really ideal and can lead to all kind of problems. You would also need to decouple the references to other entities from the transient object by using a ListModel.
Having worked with both approaches I personally prefer the first. From my experience, the whole business of injecting DAO objects into Wicket can lead to disaster. :) I would only use this in view-only projects that aren't too big.
Most projects I know of just accept serializing referenced entities (e.g. your Clients) along with the edited entity (Contract).
Using conversations (keeping a Hibernate/JPA session open over several requests) is a nice alternative for applications with complex entity relations:
The Hibernate session and its entities are kept separate from the page and are never serialized. The component just keeps an identifier to fetch its conversation.
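A very reduced sketch of that setup (all names here are illustrative, not a real conversation framework): the page stores only a serializable conversation id, and a registry keeps the open EntityManager per conversation:

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import javax.persistence.EntityManager;

public final class ConversationRegistry {

    private static final Map<String, EntityManager> OPEN = new ConcurrentHashMap<>();

    // Start a conversation: keep the EntityManager open across requests.
    public static String begin(EntityManager em) {
        String id = UUID.randomUUID().toString();
        OPEN.put(id, em);
        return id;
    }

    // Components serialize only the id and look the conversation up again.
    public static EntityManager get(String id) {
        return OPEN.get(id);
    }

    // End the conversation when the wizard finishes or is cancelled.
    public static void end(String id) {
        EntityManager em = OPEN.remove(id);
        if (em != null) {
            em.close();
        }
    }
}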
I have a unidirectional relation Project -> ProjectType:
@Entity
public class Project extends NamedEntity
{
    @ManyToOne(optional = false)
    @JoinColumn(name = "TYPE_ID")
    private ProjectType type;
}

@Entity
public class ProjectType extends Lookup
{
    @Min(0)
    private int progressive = 1;
}
Note that there's no cascade.
Now, when I insert a new Project I need to increment the type progressive.
This is what I'm doing inside an EJB, but I'm not sure it's the best approach:
public void create(Project project)
{
    em.persist(project);

    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());

    /* is it necessary to set the type again? */
    project.setType(type);

    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1);

    project.setCode(type.getPrefix() + progressive);
}
I'm using EclipseLink 2.6.0, but I'd like to know if there's an implementation-independent best practice and/or if there are behavioral differences between persistence providers in this specific scenario.
UPDATE
to clarify the context when entering the EJB create method (it is invoked by a JSF @ManagedBean):
project.projectType is DETACHED
project is NEW
no transaction (I'm using JTA/CMT) is active
I am not asking about the difference between persist() and merge(); I'm asking:
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
whether the call order is legal: first em.persist(project), then em.merge(projectType), or whether it should be inverted
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
An explanation of why this works one way and not another is also welcome.
You need merge(...) only to make a transient entity managed by your entity manager. Depending on the implementation of JPA (not sure about EclipseLink) the returned instance of the merge call might be a different copy of the original object.
MyEntity unmanaged = new MyEntity();
MyEntity managed = entityManager.merge(unmanaged);
assert(entityManager.contains(managed)); // true if everything worked out
assert(managed != unmanaged); // probably true, depending on JPA impl.
If you call merge(entity) where entity is already managed, nothing will happen.
Calling persist(entity) will also make your entity managed, but it returns no copy. Instead, the original object itself becomes managed, and persist might also call an ID generator (e.g. a sequence), which is not the case when using merge.
See this answer for more details on the difference between persist and merge.
Here's my proposal:
public void create(Project project) {
    ProjectType type = project.getType(); // maybe check if null

    if (!entityManager.contains(type)) {  // type is detached
        type = entityManager.merge(type); // or load the type
        project.setType(type);            // update the reference
    }

    int progressive = type.getProgressive();
    type.setProgressive(progressive + 1); // mark as dirty, update on flush

    // set "code" before persisting "project" ...
    project.setCode(type.getPrefix() + progressive);
    entityManager.persist(project);
    // ... now no additional UPDATE is required after the
    // INSERT on "project".
}
UPDATE
whether em.persist(project) automatically "reattaches" project.projectType (I suppose not)
No. You'll probably get an exception (Hibernate does, anyway) stating that you're trying to merge with a transient reference.
Correction: I tested it with Hibernate and got no exception. The project was created with the unmanaged project type (which was managed and then detached before persisting the project). But the project type's progressive was not incremented, as expected, since it wasn't managed. So yeah, manage it before persisting the project.
whether the call order is legal: first em.persist(project), then em.merge(projectType), or whether it should be inverted
It's best practice to do so. But when both statements are executed within the same batch (before the entity manager gets flushed), it may even work (merging the type after persisting the project). In my test it worked, anyway. But as I said, it's better to merge the entities before persisting new ones.
since em.merge(projectType) returns a different instance, whether it is required to call project.setType(managedProjectType)
Yes. See the example above. A persistence provider may return the same reference, but it isn't required to. So to be sure, call project.setType(mergedType).
Do you need to merge? Well, it depends. According to the merge() javadoc:
Merge the state of the given entity into the current persistence context.
Where did you get the instance of ProjectType that you attach to your Project from? If that instance is already managed then all you need to do is just
type.setProgressive(type.getProgressive() + 1)
and JPA will automatically issue an update effective on next context flush.
Otherwise if the type is not managed then you need to merge it first.
Although not directly related, this question has some good insight into persist vs merge: JPA EntityManager: Why use persist() over merge()?
Regarding the call order of em.persist(project) vs em.merge(projectType), you should probably ask yourself what should happen if the type is gone from the database. If you merge the type first, it will get re-inserted; if you persist the project first and you have an FK constraint, the insert will fail (because it's not cascading).
Here in this code, merge basically stores the record in a different object. Say there is an Account POJO:

Account account = new Account();
account = entityManager.merge(account);

You can then keep working with the returned, managed result.

But in your code you are using merge in a different situation:

public void create(Project project)
{
    em.persist(project);

    /* is it necessary to merge the type? */
    ProjectType type = em.merge(project.getType());
}

Here Project and ProjectType are two different POJOs. You use merge on the same POJO you want to manage; if there is a relationship between your POJOs, you can rely on that as well.
Is there a way I can share HTTP/Wicket session information with the service layer without introducing a servlet API/Wicket dependency?
I'll provide some context to why am I asking this question, just in case I'm missing something and asking the wrong question.
I've got several entities that have groups of attributes that can be validatable.
Being validatable means there are fields indicating the validation value, the user who made the validation and the date it was validated in.
This is how these entities are modelled:
@Embeddable
public class ValidationBean<T> implements Serializable {
    private T validated;
    private String user;
    private Date date;

    // Constructors, getters, setters ahead.
    // ...
}

@Entity
@Table(name="SOME_TABLE")
public class SomeEntity implements Serializable, SomeInterface {

    // Some attributes which conform validation group 1
    public String attribute11;
    public String attribute12;
    public String attribute13;
    private ValidationBean<Integer> validationBean1 = new ValidationBean<Integer>();

    // Some attributes which conform validation group 2
    public String attribute21;
    private ValidationBean<String> validationBean2 = new ValidationBean<String>();

    // Constructors, various attribute getters with JPA annotations
    // ...

    @Embedded
    @AttributeOverrides(/* various overrides, each entity/validation group has its own validation column names... */)
    public ValidationBean<Integer> getValidationBean1() { return validationBean1; }

    @Embedded
    @AttributeOverrides(/* various overrides, each entity/validation group has its own validation column names... */)
    public ValidationBean<String> getValidationBean2() { return validationBean2; }
}
ValidationBean's user and date fields are automatically modified in the presentation layer when a change in the validated field is detected.
All of this is working correctly. Now I'm trying to find an elegant and general solution, integrating with the current modelling, to the following requirement: when any of the attributes in a validation group gets its value changed and the related ValidationBean.validated doesn't change, user and date must also be updated with the current user's id and the current date.
There are, as I see it, two alternatives: putting that logic in the presentation layer, or in the business/service layer.
Putting it in the presentation layer would have an efficiency advantage. Entities are stored in the session, so the DB doesn't have to be queried again to check for field changes. But unfortunately, some entities have some of their fields ajax-updated, and it would be hard to tell if the entity really changed. Apart from the fact that it is not the presentation layer's responsibility to fulfill this requirement.
Putting it in the service layer seems the best alternative, and I've already found a possible way to handle this properly: @PreUpdate. It would be easy to implement a @PreUpdate method on the @Entity classes to compare the values in the DB with the values about to be updated, and modify the related ValidationBeans accordingly. The problem here, and I suppose it's a common problem, is that in the business layer I don't have anywhere to get the user id from. The current user's id is stored in the session, which belongs to the presentation layer.
So, any tips, comments, or recommendations on how I can share HTTP session information with the service layer (not necessarily Wicket-specific), or even alternatives to fulfill this requirement, will be welcome.
UPDATE: Following gkamal's suggestion, I'll try to integrate spring-security in the least intrusive way I can, just to take advantage of SecurityContext. I'd also appreciate tips on this matter.
The common approach used to solve this is to introduce a SecurityContext class that holds the details of the current user in a static thread-local variable. The variable is initialized (from the HttpSession) by the security filter or some other filter and cleared after the request processing is complete. The SecurityContext class is itself part of the business layer; it provides set/get methods and hence doesn't have any web-layer dependency.
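A minimal sketch of that approach (the class names, the "userId" session attribute, and the filter wiring are illustrative assumptions):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpSession;

// Business layer: no servlet or Wicket dependency here.
final class SecurityContext {

    private static final ThreadLocal<String> CURRENT_USER = new ThreadLocal<>();

    static void setCurrentUser(String userId) { CURRENT_USER.set(userId); }
    static String getCurrentUser() { return CURRENT_USER.get(); }
    static void clear() { CURRENT_USER.remove(); }
}

// Web layer: populates the context from the HTTP session on each request.
public class SecurityContextFilter implements Filter {

    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpSession session = ((HttpServletRequest) request).getSession(false);
        try {
            if (session != null) {
                // "userId" is an assumed session attribute name.
                SecurityContext.setCurrentUser((String) session.getAttribute("userId"));
            }
            chain.doFilter(request, response);
        } finally {
            SecurityContext.clear(); // always cleared after request processing
        }
    }

    @Override
    public void destroy() { }
}

A @PreUpdate callback in the business layer can then read SecurityContext.getCurrentUser() without ever touching the servlet API.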
Given that you have a lot of domain objects that all interact with one another, it would be very useful to know which objects have changed in a particular transaction.
Is this possible ? I would like to essentially do this :
public void someBusinessLogicMethod(someparams) {
    Session s = getSession();
    Transaction tr = s.beginTransaction();

    domainObject = s.load(...);
    domainObject.setSomethingOrOther(...);
    domainObject.getSomeLink().setSomethingElse(...);
    callSomeOtherBusinessLogicMethod();

    tr.commit();

    /* at this point many objects have changed, Hibernate knows which ones */
    for (Object o : tr.getAffectedObjects(?)) {
        ....
    }
}
Does this exist?
Assuming you want to do something like create audit entries for all the changes, you could use a Hibernate Listener or an Interceptor. If you hook the listener/interceptor at the right moment (e.g. onFlushDirty), you have access to the objects and properties that have changed.
More info: http://docs.jboss.org/hibernate/core/3.3/reference/en/html/events.html
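For illustration, a minimal sketch of the Interceptor variant (the class name and what you do with the detected changes are up to you):

import java.io.Serializable;
import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

public class ChangeTrackingInterceptor extends EmptyInterceptor {

    // Called by Hibernate for every dirty entity during a flush.
    @Override
    public boolean onFlushDirty(Object entity, Serializable id,
                                Object[] currentState, Object[] previousState,
                                String[] propertyNames, Type[] types) {
        for (int i = 0; i < propertyNames.length; i++) {
            Object before = (previousState == null) ? null : previousState[i];
            Object after = currentState[i];
            if (before == null ? after != null : !before.equals(after)) {
                // Record the change however you like (audit table, log, ...).
                System.out.println(entity.getClass().getSimpleName() + "#" + id
                        + ": " + propertyNames[i] + " changed from " + before + " to " + after);
            }
        }
        return false; // we did not modify the entity state
    }
}

The interceptor is attached when opening the session, e.g. sessionFactory.openSession(new ChangeTrackingInterceptor()).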
Hope this helps.