I am currently working on a project with Spring web services, Hibernate and JAXB.
1) I have generated the Hibernate beans using the IDE's Hibernate code generation,
2) and I have generated the JAXB beans using the Maven compiler.
Now, my question is,
1) Is this the right approach? (to have so many beans).
2) Should I use the JAXB beans for processing in the service layer? How can I keep the layers decoupled?
3) Or do I need to create another set of beans, i.e. map (JAXB beans) to (new beans) to (Hibernate beans)?
Please share your views.
Thanks,
adi
You know, you cannot have everything fully decoupled. There will always be a layer that knows about the other two layers.
Usually when I design a three-layer architecture, like:
Service Layer - the one that probably uses JAXB, exposes web services or other APIs
Business Layer - any real logic
Persistence Layer - Hibernate
I allow the Business Layer to know about the Service Layer (JAXB) and about the Persistence Layer (Hibernate beans). But I don't allow the Service Layer and the Persistence Layer to know about each other.
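As a rough sketch of that layering (all class names below are made up, with trivial stand-ins for the generated beans), the business layer is the only place that touches both models:

    // Stand-in for a JAXB-generated bean (service layer model)
    class CustomerXml {
        private String name;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    // Stand-in for a Hibernate-generated entity (persistence layer model)
    class CustomerEntity {
        private String name;
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    // Persistence layer interface, unaware of JAXB
    interface CustomerDao {
        void save(CustomerEntity entity);
    }

    // Business layer: the only place where one model is mapped to the other
    class CustomerBusinessService {
        private final CustomerDao dao;

        CustomerBusinessService(CustomerDao dao) {
            this.dao = dao;
        }

        public void register(CustomerXml xml) {
            CustomerEntity entity = new CustomerEntity();
            entity.setName(xml.getName()); // conversion lives here and nowhere else
            dao.save(entity);
        }
    }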
Note: I'm the EclipseLink JAXB (MOXy) lead and a member of the JAXB 2 (JSR-222) expert group. EclipseLink also provides an excellent JPA implementation (open sourced from TopLink).
There are costs to maintaining multiple models. Each model you add introduces a bean-to-bean conversion that must be written, tested, and maintained.
Another approach is to use the same beans for both the JPA and JAXB bindings. For this use case it is easier to start with the domain model and add JAXB and JPA metadata to map it to XML and to the database. Below is an example where a single model is leveraged to create a RESTful web service:
http://blog.bdoughan.com/2010/08/creating-restful-web-service-part-35.html
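As a rough illustration of that single-model idea (the class and fields here are made up, not taken from the linked post), the same POJO can carry both JPA and JAXB metadata:

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;
    import javax.xml.bind.annotation.XmlAccessType;
    import javax.xml.bind.annotation.XmlAccessorType;
    import javax.xml.bind.annotation.XmlAttribute;
    import javax.xml.bind.annotation.XmlRootElement;

    // One domain class carrying both the database (JPA) and XML (JAXB) mappings
    @Entity
    @XmlRootElement
    @XmlAccessorType(XmlAccessType.FIELD)
    public class Customer {

        @Id
        @GeneratedValue
        @XmlAttribute
        private Long id;

        private String firstName; // mapped by default to a column and an XML element

        private String lastName;

        // getters and setters omitted for brevity
    }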
Since EclipseLink provides both JAXB and JPA implementations we provide a number of extensions to make this easier:
http://wiki.eclipse.org/EclipseLink/Examples/MOXy/JPA
http://blog.bdoughan.com/2010/07/jpa-entities-to-xml-bidirectional.html
UPDATE
In response to:
I agree with what you are saying. However, using the same beans will couple
the code very tightly and make it highly dependent. A change in one
layer will need changes elsewhere as well. What do you say?
It all depends on how you look at things. My preference for building data access services is to design and build a solid domain model, and then use JPA and JAXB to solve the object-relational and object-XML impedance mismatches.
One Model Approach
Using one model for both JPA and JAXB means that when you make a change to the model you need to decide at that time how it will be handled for both JPA and JAXB (this can be good or bad). If you don't want every new addition to the model to affect the JAXB mapping, you can leverage JAXB concepts like @XmlAccessorType(XmlAccessType.NONE).
http://blog.bdoughan.com/2011/06/using-jaxbs-xmlaccessortype-to.html
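For instance, a minimal sketch of that idea (class and field names are illustrative): with XmlAccessType.NONE, only the explicitly annotated fields are bound to XML, so persistence-only additions never leak into the XML representation.

    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.xml.bind.annotation.XmlAccessType;
    import javax.xml.bind.annotation.XmlAccessorType;
    import javax.xml.bind.annotation.XmlElement;
    import javax.xml.bind.annotation.XmlRootElement;

    @Entity
    @XmlRootElement
    @XmlAccessorType(XmlAccessType.NONE) // nothing is bound to XML unless annotated
    public class Customer {

        @Id
        @XmlElement
        private Long id;

        @XmlElement
        private String name;

        private String internalAuditFlag; // persisted by JPA, never marshalled to XML

        // getters and setters omitted for brevity
    }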
Two (or More) Models Approach
When you want to add a field that is mapped to both the relational and the XML representation, you need to add it to two models and write the necessary conversion logic. That is the cost of keeping the models decoupled.
Related
As far as I know, there are two ways to configure JPA / Hibernate:
XML-based configuration through something like hibernate.cfg.xml. I don't like this approach because, well, XML ...
Through annotations in the entity object. Much better than the XML config, but it couples my entities to JPA.
As I am currently investigating an architecture where the domain model does not know anything about the database (the 'Onion' architecture), what I am looking for is a way to specify the mappings without changing my entities.
Of course I could create separate mapping classes, e.g. if I have a Customer domain object, create a JPA-annotated CustomerEntity and let the repository translate from one to another. But this approach doesn't feel quite right because the Customer and CustomerEntity will essentially be the same.
So it seems like I have to resort to Hibernate XML configuration, but as mentioned before, I don't like that approach.
Spring has a nice way of configuration: Java-based configuration. I was wondering if there is something similar for Hibernate/JPA configuration, and if not, why not?
My apologies if none of the above makes sense, but any help is welcome, even if it doesn't answer my question :-)
I've never heard of an Onion Theory of Java EE design. I did hear of The Onion, but that's a satirical newsletter (if that :-). Separation of concerns in Java EE is typically expressed as MVC, or Model View Controller, architecture. Your JSP or JSF pages will be your view, your @ManagedBean controllers will be your controllers, and your model will hold your entities.
The model, which is where the JPA will be, can usually be further separated into a Service Layer, a Persistence Layer and an EIS (Enterprise Information System) or database tier. The Service tier will hold your EJBs, annotated with @Stateless, @Stateful or @Singleton, and will encapsulate the business logic for the application. The Persistence Layer will have @Entity-annotated objects that are stored in the database.
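For illustration, a minimal sketch of those two layers (the class names are made up; each class would live in its own file):

    // Invoice.java - persistence layer: a JPA entity stored in the database tier
    import javax.persistence.Entity;
    import javax.persistence.Id;

    @Entity
    public class Invoice {
        @Id
        private Long id;
        private String status;
        // getters and setters omitted for brevity
    }

    // InvoiceService.java - service layer: a stateless session bean with the business logic
    import javax.ejb.Stateless;
    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;

    @Stateless
    public class InvoiceService {

        @PersistenceContext
        private EntityManager em;

        public Invoice find(Long id) {
            return em.find(Invoice.class, id);
        }

        public void create(Invoice invoice) {
            em.persist(invoice);
        }
    }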
Java EE defines these layers with JSRs, each with its own number. For example, the Java Persistence API (JPA) was defined under JSR-220. Tests are developed against these JSRs, and if a vendor passes those tests then their product can (more or less) be swapped out for another vendor's version.
Apparently there is something called Fluent NHibernate for C#, which does exactly what I was looking for.
Unfortunately, there seems to be no "Fluent Hibernate" for Java.
There is a fluent-hibernate project on GitHub, but that one seems to be about fluently writing HQL-queries, not about mapping configurations.
I have implemented a Data Access Layer by means of Spring Data's facilities. At the moment I use the Hibernate Tools (in Eclipse) to generate the annotated EJB3 entities from the tables of my DB.
Clearly the fetched data should flow through my application layers, so I need to implement the Domain (as Spring calls it) part of the application. Theoretically the Domain should be an implementation-agnostic POJO, meaning it should not contain the Entity's annotations.
Keeping in mind that the Entity is unavoidable to make things work, what is the usual approach? Should I create a Domain class library which simply mirrors the Entity data and is only a POJO (with the help of Dozer for the copying part), or should I use the Entity even outside the DAL (thus losing the Domain's implementation-agnostic nature)?
Thank you in advance
Giulio
We are now doing an SOA migration, and our old system's architecture is based on Spring and Hibernate. We use POs (persistence objects) across all the layers.
For the SOA migration, if we use DTOs for remote procedure calls, we have to create a great many DTOs.
What are some suggestions on how to avoid this?
Develop a Canonical Model, probably the most important SOA pattern there is.
- Define a representation using an XML Schema for that model.
- Use JAXB to create the Java POJO representations.
Once you have these, you could map them to your existing persistent objects and round-trip until they are equivalent.
Alternatively, given that you already use persistent objects, you could work bottom-up with JAXB, but in my experience that is a more difficult and work-intensive approach.
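A rough sketch of the top-down route (the customer type and elements below are hypothetical, standing in for classes xjc would generate from the canonical schema):

    import java.io.StringReader;
    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Unmarshaller;
    import javax.xml.bind.annotation.XmlRootElement;

    public class CanonicalModelDemo {

        // Stand-in for a class generated by xjc from the canonical XML Schema
        @XmlRootElement(name = "customer")
        public static class CanonicalCustomer {
            public String customerNumber;
            public String displayName;
        }

        public static void main(String[] args) throws Exception {
            String xml = "<customer>"
                       + "<customerNumber>42</customerNumber>"
                       + "<displayName>Acme Ltd</displayName>"
                       + "</customer>";

            // The canonical POJOs are what cross service boundaries; mapping them
            // to the existing persistent objects happens behind the service facade.
            JAXBContext ctx = JAXBContext.newInstance(CanonicalCustomer.class);
            Unmarshaller u = ctx.createUnmarshaller();
            CanonicalCustomer c = (CanonicalCustomer) u.unmarshal(new StringReader(xml));
            System.out.println(c.displayName);
        }
    }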
Greetings,
I have a complicated scenario to handle. I have a WSDL file which uses a particular XML schema.
The XML schema is actually a handcrafted implementation of a specification. There is also a Java-based implementation of the same specification. So the XSD used in the WSDL and the Java classes at hand are quite similar, but not exactly the same.
Almost all web service stacks allow creating classes from WSDL or creating WSDL from Java class annotations.
What I want to do is use the WSDL and bind the XSD used in the WSDL to the existing Java classes.
Should/can I do this by manually replacing generated Java classes with existing ones? Is it a matter of changing type names in config files and moving binding annotations to existing classes?
If you know any best practices, or Java web service stacks that support this kind of flexibility in a practical way, your response would be much appreciated.
Best Regards
Seref
I suggest Spring's Web Services module, which has no code generation involved but provides a clean separation of concerns. Different concerns are broken out nicely by allowing you to provide your WSDL and existing schema(s) on one side (contract first), your existing Java-based domain model on the other, and a way to plug in your OXM (Object-XML Mapping) technology of choice.
Since you have hand-crafted WSDL/schema and hand-crafted Java classes, the real work will be in configuring your OXM. I prefer JiBX as it keeps the concerns separated (no XML annotation garbage mixed into your domain) with JAXB as a backup if the learning curve looks too steep. Spring Web Services supports several other OXM frameworks, and you can even use several different ones at once.
As far as best practices go, I consider hand-crafted code a best practice, though I may be in the minority. If you generate classes from XML you end up with classes that are simple data containers with no behavior (assuming you want to regenerate them whenever your WSDL/XSD changes). This is bad if you favor the object-oriented paradigm, because you end up having to place your "business logic" in utilities/helpers/services etc. instead of in the domain objects where it really belongs. This is one reason I favor JiBX. I can make very nice OO objects with behavior, a nice clean schema that doesn't necessarily match the objects, and can manage changes to either side with a mapping file, similar to how Hibernate does it for ORM (Object-Relational Mapping). You can do the same with JAXB, but that requires embedding the XML structure into your object model and binds a single XML representation to it (whereas with JiBX you can have many).
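As a hedged sketch of what the Spring Web Services side can look like (the namespace, element names and classes below are invented, and the marshaller wiring is omitted; the OXM could be JiBX, JAXB or MOXy):

    import org.springframework.ws.server.endpoint.annotation.Endpoint;
    import org.springframework.ws.server.endpoint.annotation.PayloadRoot;
    import org.springframework.ws.server.endpoint.annotation.RequestPayload;
    import org.springframework.ws.server.endpoint.annotation.ResponsePayload;

    // Hand-crafted classes standing in for your existing domain model
    class GetOrderRequest { }
    class GetOrderResponse { }

    // Contract-first endpoint: Spring-WS routes the payload by its root element,
    // and the configured OXM marshaller maps it onto these classes.
    @Endpoint
    public class OrderEndpoint {

        private static final String NS = "http://example.com/orders"; // hypothetical namespace

        @PayloadRoot(namespace = NS, localPart = "GetOrderRequest")
        @ResponsePayload
        public GetOrderResponse getOrder(@RequestPayload GetOrderRequest request) {
            // delegate to your existing business logic here
            return new GetOrderResponse();
        }
    }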
MOXy (I'm the tech lead) was designed for instances where you have an existing XML Schema and an existing object model. It accomplishes this through XPath-based mapping and can even handle cases where the models are not that similar:
parse google geocode with xstream
MOXy also has an external binding file:
http://wiki.eclipse.org/EclipseLink/Examples/MOXy/EclipseLink-OXM.XML
MOXy is a JAXB implementation with extensions (a couple of which are mentioned above). If you go ahead with Spring, MOXy is configured as a JAXB implementation, and you need to add a jaxb.properties file alongside your model classes with the following entry:
javax.xml.bind.context.factory=org.eclipse.persistence.jaxb.JAXBContextFactory
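With that jaxb.properties file sitting in the same package as your model classes, the standard JAXB bootstrap picks MOXy up transparently; a small sketch (the Customer class is just a placeholder):

    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.Marshaller;
    import javax.xml.bind.annotation.XmlRootElement;

    public class MoxyBootstrapDemo {

        // Placeholder model class; jaxb.properties must live in this package
        @XmlRootElement
        public static class Customer {
            public String name = "Acme";
        }

        public static void main(String[] args) throws Exception {
            // Because jaxb.properties names the EclipseLink JAXBContextFactory,
            // this standard call returns a MOXy-backed context.
            JAXBContext ctx = JAXBContext.newInstance(Customer.class);

            Marshaller m = ctx.createMarshaller();
            m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
            m.marshal(new Customer(), System.out);
        }
    }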
I would like to know if there are any tools to automatically generate EJB3 Entity Beans (for JPA) from a database schema.
Thanks.
Dali supports top-down, bottom-up (this is what you're looking for), and meet-in-the-middle development approaches.
Some IDEs have such a feature, for example NetBeans.
OpenJPA has a tool which will generate your Entity definitions.
From the OpenJPA user manual:
OpenJPA includes a reverse mapping tool for generating persistent class definitions, complete with metadata, from an existing database schema. You do not have to use the reverse mapping tool to access an existing schema; you are free to write your classes and mappings yourself, as described in Section 3, “ Meet-in-the-Middle Mapping ”. The reverse mapping tool, however, can give you an excellent starting point from which to grow your persistent classes.
No IDE required!
-Rick
I have a solution for you: auto-generate domain objects with all table relationships properly mapped in the classes. Try DAL4j; you can find it at sourceforge.net/p/dal4j/wiki/ DAL4j is a command-line and framework tool that can be used to reverse engineer a MySQL or SQLServer database schema into a set of JPA Entity Beans.
DAL4j can be useful for scenarios where there is an existing database schema but a technology other than JPA is used by applications to interact with the database. DAL4j can provide an easy way to migrate your code base from other technologies such as JDBC or Hibernate to JPA.
The generated beans can be one of two types: Simple or Framework. Simple beans are standard POJO classes managed by your application using JPA semantics. Framework-generated POJOs use the DAL4j framework's generic DAO to simplify CRUD operations.
DAL4j provides optional hooks to allow you to integrate encryption/decryption of data fields that must be encrypted in the database.
Last, DAL4j provides a set of Generic classes that can be used to simplify creation of Session Beans which perform CRUD operations using generated Entities.
I think you will find this article useful.