Business logic in domain objects - Java

I am coding a ribbon/achievements system for a website and I have to write some logic for each ribbon in my system. For example, you could earn a ribbon if you're among the first 2,000 people to register on the website, or after 1,000 posts in the forum. The idea is very similar to Stack Overflow's badges, really.
So, every ribbon is obviously in the database but they also need a bit of logic to determine when a user has earned the ribbon.
In the way I coded it, Ribbon is a simple interface:
public interface Ribbon {
    public void setId(int id);
    public int getId();
    public String getTitle();
    public void setTitle(String title);
    public boolean isEarned(User user);
}
RibbonJpa is an abstract class that implements the Ribbon interface but leaves the isEarned() method undefined:
@Entity
@Table(name = "ribbon")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(name = "ribbon_type")
public abstract class RibbonJpa implements Ribbon {

    @Id
    @Column(name = "id", nullable = false)
    int id;

    @Column(name = "title", nullable = false)
    private String title;

    @Override
    public int getId() {
        return id;
    }

    @Override
    public void setId(int id) {
        this.id = id;
    }

    @Override
    public String getTitle() {
        return title;
    }

    @Override
    public void setTitle(String title) {
        this.title = title;
    }
}
You can see I define the inheritance strategy as SINGLE_TABLE (since I have to code like 50 ribbons and I don't need additional columns for any of them).
Now, a specific ribbon will be implemented like this:
@Entity
public class FirstUsersRibbon extends RibbonJpa implements Ribbon {

    public FirstUsersRibbon() {
        super.setId(1);
        super.setTitle("First 2,000 users registered to the website");
    }

    @Override
    public boolean isEarned(User user) {
        // My logic to check whether the specified user has won the award
    }
}
This code works fine, the tables are created in the database in the way I expect (I use DDL generation in my local environment).
The thing is, it feels wrong to code business logic in a domain object. Is this good practice? Can you suggest a better solution? Also, I'm not able to @Autowire any DAOs into the entity (FirstUsersRibbon), and I need them in the business logic (in this case, I need a DAO to check whether the user is among the first 2,000 users registered to the website).
Any help very appreciated.
Thank you!

The thing is, it feels wrong to code business logic in a domain object.
Many would say the reverse is true: that it is an anti-pattern (an anaemic domain model) to have business logic anywhere else. See Domain-Driven Design for more information.
You might then wonder what the middle tier of the conventional 3-tier architecture was for. It provides a service layer for the application. See my related question "What use are EJBs?".
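For illustration, here is a minimal sketch (hypothetical names, not from the question) of what pushing the check into a service layer might look like: the ribbon entity stays a plain data holder, while a service that can have DAOs injected owns the registration-rank logic.

```java
// Hypothetical sketch: ribbon logic in a service that can have DAOs injected,
// instead of inside the JPA entity. All names here are illustrative.
interface UserDao {
    // 1-based position at which the user registered
    long registrationRank(long userId);
}

class FirstUsersRibbonChecker {
    private final UserDao userDao; // in Spring this would be @Autowired

    FirstUsersRibbonChecker(UserDao userDao) {
        this.userDao = userDao;
    }

    boolean isEarned(long userId) {
        return userDao.registrationRank(userId) <= 2000;
    }
}
```

In a real application the checker would be a Spring-managed bean looked up by the ribbon's discriminator value, so each ribbon's logic lives in the service layer rather than in the entity.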

Also, I'm not able to Autowire any DAOs in the entity
If you're using Spring and Hibernate, have a look at http://jblewitt.com/blog/?p=129: this gives a good description of a similar problem with a variety of solutions.
If you're looking for a rich domain model in the way that you describe, then it can be a good idea to instantiate domain objects via Spring, and hence be able to inject DAOs into your domain objects.

Related

jOOQ: Allowed-Character constraints?

I am considering moving from Hibernate to jOOQ but I can't find e.g.
how to have Pattern-Constraints on a String like this in Hibernate:
@NotEmpty(message = "Firstname cannot be empty")
@Pattern(regexp = "^[a-zA-Z0-9_]*$", message = "First Name can only contain characters.")
private String firstname;
How would I do that in jOOQ?
The "jOOQ way"
The "jOOQ way" to do such validation would be to create either:
A CHECK constraint in the database.
A trigger in the database.
A domain in the database.
After all, if you want to ensure data integrity, the database is where such constraints and integrity checks belong (possibly in addition to functionally equivalent client-side validation). Imagine a batch job, a Perl script, or even a JDBC statement that bypasses JSR-303 validation. You'll find yourself with corrupt data in no time.
If you do want to implement client-side validation, you can still use JSR-303 on your DTOs, which interact with your UI, for instance. But you will have to perform validation before passing the data to jOOQ for storage (as artbristol explained).
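If you go the DTO-validation route, the same two constraints can also be checked without any JSR-303 machinery at all. A dependency-free stand-in (a hypothetical helper, not jOOQ API; a real project would call a javax.validation Validator instead) might look like:

```java
// Hypothetical helper mirroring the @NotEmpty/@Pattern constraints in plain Java.
final class FirstnameValidator {
    private static final String ALLOWED = "^[a-zA-Z0-9_]*$";

    static void check(String firstname) {
        if (firstname == null || firstname.isEmpty()) {
            throw new IllegalArgumentException("Firstname cannot be empty");
        }
        if (!firstname.matches(ALLOWED)) {
            throw new IllegalArgumentException("First Name can only contain characters.");
        }
    }
}
```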
Using a Converter
You could, however, use your own custom type by declaring a Converter on individual columns and by registering such Converter with the source code generator.
Essentially, a Converter is:
public interface Converter<T, U> extends Serializable {
    U from(T databaseObject);
    T to(U userObject);
    Class<T> fromType();
    Class<U> toType();
}
In your case, you could implement your annotations as such:
public class NotEmptyAlphaNumericValidator implements Converter<String, String> {
    // Validation
    public String to(String userObject) {
        assertNotEmpty(userObject);
        assertMatches(userObject, "^[a-zA-Z0-9_]*$");
        return userObject;
    }
    private static void assertNotEmpty(String v) {
        if (v == null || v.isEmpty()) throw new IllegalArgumentException("Value must not be empty");
    }
    private static void assertMatches(String v, String regex) {
        if (!v.matches(regex)) throw new IllegalArgumentException("Value must match " + regex);
    }
    // Boilerplate
    public String from(String databaseObject) { return databaseObject; }
    public Class<String> fromType() { return String.class; }
    public Class<String> toType() { return String.class; }
}
Note that this is more of a workaround, as Converter hasn't been designed for this use-case, even if it can perfectly implement it.
Using formal client-side validation
There's also a pending feature request #4543 to add more support for client-side validation. As of jOOQ 3.7, this is not yet implemented.
I recommend you don't try to use jOOQ in a 'hibernate/JPA' way. Leave the jOOQ generated classes as they are and map to your own domain classes manually, which you are free to annotate however you like. You can then call a JSR validator before you attempt to persist them.
For example, jOOQ might generate the following class
public class BookRecord extends UpdatableRecordImpl<BookRecord> {
    private String firstname;
    public void setId(Integer value) { /* ... */ }
    public Integer getId() { /* ... */ }
}
You can create your own domain object
public class Book {
    @NotEmpty(message = "Firstname cannot be empty")
    @Pattern(regexp = "^[a-zA-Z0-9_]*$", message = "First Name can only contain characters.")
    private String firstname;
    public void setId(Integer value) { /* ... */ }
    public Integer getId() { /* ... */ }
}
and map by hand once you've retrieved a BookRecord, in your DAO layer
Book book = new Book();
book.setId(bookRecord.getId());
book.setFirstname(bookRecord.getFirstname());
This seems quite tedious (and ORM tries to spare you this tedium) but actually it scales quite well to complicated domain objects, in my opinion, and it's always easy to figure out the flow of data in your application.
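To keep the tedium in one place, the per-field copying can live in a single static mapper in the DAO layer. The classes below are plain stand-ins (not the real jOOQ-generated record), just to show the shape:

```java
// Stand-in for the jOOQ-generated record (hypothetical, simplified).
class BookRecord {
    private Integer id;
    private String firstname;
    Integer getId() { return id; }
    String getFirstname() { return firstname; }
    void setId(Integer id) { this.id = id; }
    void setFirstname(String firstname) { this.firstname = firstname; }
}

// The hand-written domain class, with a single mapping entry point.
class Book {
    private Integer id;
    private String firstname;
    Integer getId() { return id; }
    String getFirstname() { return firstname; }
    void setId(Integer id) { this.id = id; }
    void setFirstname(String firstname) { this.firstname = firstname; }

    static Book fromRecord(BookRecord record) {
        Book book = new Book();
        book.setId(record.getId());
        book.setFirstname(record.getFirstname());
        return book;
    }
}
```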

GAE Endpoints (Java) with objectify - how to model partial data (for client)?

I have the following entities:
@Entity
public class Person {
    @Id public Long id;
    public String name;
    public Ref<Picture> picture;
    public String email;
    public byte age;
    public short birthday; // day of year
    public String school;
    public String very_long_life_story;
    // ... some extra fields ...
}

@Entity
public class Place {
    @Id public Long id;
    public String name;
    public String comment;
    public long createdDateMS;
    public long visitors;
    @Load public List<Ref<Person>> owners;
}
A few notes:
(A) Maximum size of owners, in Place entity, is 4 (~)
(B) The Person class is presumably very big, and when querying a Place I would like to show only a subset of the Person data. This optimization is aimed at both server-client and server-database communication. Since Objectify (GAE, actually) only loads/saves entire entities, I would like to do the following:
@Entity
public class PersonShort {
    @Id public Long id;
    public Ref<Picture> picture;
    public String name;
}
and inside Place, I would like to have (instead of owners):
@Load public List<PersonShort> owners;
(C) The problem with this approach, is that now I have a duplication inside the datastore.
Although this isn't such a bad thing, the real problem is when a Person will try to save a new picture, or change name; I will not only have to update it in his Person class,
but also search for every Place that has a PersonShort with same id, and update that.
(D) So the question is, is there any solution? Or am I simply forced to select between the options?
(1) Loading multiple Person entities, which are big, when all I need is some really small information about them.
(2) Data duplication with many writes
If so, which one is better (Currently, I believe it's 1)?
EDIT
What about loading the entire class (1), but sending only part of it?
@Entity
public class Person {
    @Id public Long id;
    public String name;
    public Ref<Picture> picture;
    public String email;
    public byte age;
    public short birthday; // day of year
    public String school;
    public String very_long_life_story;
    // ... some extra fields ...
}

public class PersonShort {
    public long id;
    public String name;

    public PersonShort(Person person) {
        this.id = person.id;
        this.name = person.name;
    }
}

@Entity
public class Place {
    @Id public Long id;
    public String name;
    public String comment;
    public long createdDateMS;
    public long visitors;

    // Ignore saving to datastore
    @Ignore
    public List<PersonShort> owners;

    // Do not serialize when sending to client
    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    @Load public List<Ref<Person>> ownersRef;

    @OnLoad private void loadOwners() {
        owners = new ArrayList<PersonShort>();
        for (Ref<Person> r : ownersRef) {
            owners.add(new PersonShort(r.get()));
        }
    }
}
It sounds like you are optimizing prematurely. Do you know you have a performance issue?
Unless you're talking about hundreds of K, don't worry about the size of your Person object in advance. There is no practical value in hiding a few extra fields unless the size is severe - and in that case, you should extract the big fields into some sort of meaningful entity (PersonPicture or whatnot).
No definite answer, but some suggestions to look at:
Lifecycle callbacks.
When you put your Person entity, you can have an @OnSave handler to automatically store your new PersonShort entity. This has the advantage of being transparent to the caller, but obviously you are still dealing with 2 entity writes instead of 1.
You may also find you are having to fetch two entities too; initially you may fetch the PersonShort and then later need some of the detail in the corresponding Person. Remember Objectify's caching can reduce your trips to Datastore: it's arguably better to have a bigger, cached, entity than two separate entities (meaning two RPCs).
Store your core properties (the ones in PersonShort) as separate properties in your Person class and then have the extended properties as a single JSON string which you can deserialize with Gson.
This has the advantage that you are not duplicating properties, but the disadvantage is that anything you want to be able to search on cannot be in the JSON blob.
Projection Queries. You can tell Datastore to return only certain properties from your entities. The problem with this method is that you can only return indexed properties, so you will probably find you need too many indexes for this to be viable.
Also, use @Load annotations with care. For example, in your Place class, think whether you really need all those owners' Person details when you fetch the owners. Perhaps you only need one of them? i.e., instead of getting a Place and 4 Persons every time you fetch a Place, maybe you are better off just loading the required Person(s) when you need them? (It will depend on your application.)
It is a good practice to return a different entity to your client than the one you get from your database. So you could create a ShortPerson or something that is only used as a return object in your REST endpoints. It will accept a Person in its constructor and fill in the properties you want to return to the client from this more complete object.
The advantage of this approach is actually less about optimization and more that your server models will change over time, but that change can be completely independent of your API. Also, you choose which data is publicly accessible, which is what you are trying to do.
As for the optimization between db and server, I wouldn't worry about it until it is an issue.
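A sketch of that idea, with a trimmed-down Person (field names follow the question; everything here is a hypothetical stand-in): the REST-only DTO copies just the public subset in its constructor.

```java
// Hypothetical stand-ins: ShortPerson is the API return type, built from the
// full entity, so the server model can evolve independently of the API.
class Person {
    Long id;
    String name;
    String email;                // not exposed to clients
    String very_long_life_story; // not exposed to clients
}

class ShortPerson {
    public final Long id;
    public final String name;

    ShortPerson(Person person) {
        this.id = person.id;
        this.name = person.name;
    }
}
```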

Objectify Key/Ref roundtrip between backend and client [without GWT]

There are a lot of articles here and all over the web, but these all target different Objectify versions and seem not to work for one or the other reason.
I have an entity, which references another entity (e.g. an Account entity references a User entity):
@Cache
@Entity
public final class Account {

    @Id Long id;
    @Index private Ref<User> user;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public User getUser() {
        return user.get();
    }

    public void setUser(User user) {
        this.user = Ref.create(user);
    }
}
I am trying to do this:
From the client, GET the account entity over REST/Google Cloud Endpoints.
Modify the resource.
UPDATE it on the server.
As discussed in "Objectify loads object behind Ref<?> even when @Load is not specified", the above code always returns the referenced user as well, which I don't want.
One option would be, as @svpino suggested, "Make your @ApiMethod return a different Account object without the user property (thus avoiding fetching the user if you don't need it)." This works as long as I don't want to UPDATE the resource. If I need to UPDATE, the Key/Ref needs to be preserved (even though I don't need it on the client).
One possible approach that I can see would be using Key instead of Ref and rendering a web-safe string, then recreating the user during UPDATE.
private Key<User> user;

public String getUser() {
    return user.toString();
}

public void setUser(String user) {
    this.user = Key.create(user);
}
The string looks like "Key(User(5723348596162560))", but it seems not to be reconstituted (at least I get an exception here, haven't tracked it down yet).
Another approach would be writing an @ApiTransformer, which did not solve the problem either.
Jeff (@stickfigure) has posted about this several times over the last years, and the issue still seems not to be solved.
What's the current state with Objectify 5.0.2 and what's the recommendation for preserving the key between roundtrips, when the key is not needed on the client?
You need to annotate the property that you want to omit with @ApiResourceProperty(ignored = AnnotationBoolean.TRUE).
The Google documentation says the following about @ApiResourceProperty:
@ApiResourceProperty provides more control over how resource properties are exposed in the API. You can use it on a property getter or setter to omit the property from an API resource. You can also use it on the field itself, if the field is private, to expose it in the API. You can also use this annotation to change the name of a property in an API resource.
I encourage you to read more by visiting this link: https://developers.google.com/appengine/docs/java/endpoints/annotations#apiresourceproperty
So in your case your class should look like this after the modification.
@Cache
@Entity
public final class Account {

    @Id Long id;
    @Index private Ref<User> user;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    public User getUser() {
        return user.get();
    }

    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    public void setUser(User user) {
        this.user = Ref.create(user);
    }
}
The following code serializes an entity's reference to a web-safe string so it can be transferred over REST. When the entity is sent back to the server, the Ref<> is reconstituted, so the server-side reference is not lost while the object does a round-trip to the client. Referenced objects are not transferred to the client and back, but the reference can still be worked with as a Ref<> on the server.
@Index private Ref<User> user;

// for serialization
public String getUser() {
    // .toWebSafeString() will be added in a future version of Objectify
    // and will do the same as .getString()
    return user.getKey().getString();
}

public void setUser(String webSafeString) {
    Key<User> key = Key.create(webSafeString);
    this.user = Ref.create(key);
}
Two separate functions (not named well, I admit) are there for loading the actual object on the server and for creating the reference in the first place:
// for load and create reference
public User loadUser() {
    return user.get();
}

public void referenceUser(User user) {
    this.user = Ref.create(user);
}
I hope this solves the problem for everybody. This did not yet go through thorough testing, so comments are still welcome.
I have run a test to compare using a Key<> with a Ref<>, and to me it looks like even with Ref<> the entity is only reconstituted when loadEntity()/.get() is called. So Ref<> is probably better, as @Load annotations will work. Maybe the Objectify guys can confirm this.
You can create a class that extends Ref<User> and use an #ApiTransformer to transfer that class between backend and client
@ApiTransformer(UserRefTransformer.class)
public class UserRef extends LiveRef<User> {
}

public class UserRefTransformer implements Transformer<UserRef, User> {
    // Your transformation code goes here
}

JPA POJO of joined table in Java

What I would like to realize is the following:
I have a dashboard class and a user class.
In my (Java EE project) Java code I would like to get all dashboards, to which the user has been subscribed.
The database contains a table (dashboard_users) with the following fields: idUser, idDashboard, isDefault and ID.
There should also be a Java POJO of the joined table.
My question:
What should the JPA M:N mapping between these three classes (Dashboard.java/User.java/UserDashboard.java) look like?
I followed a lot of tutorials and examples, but for some reason there are always errors or other problems. It would be very welcome if someone could give an example, so I can see what I am doing wrong.
Thank you
Given the extra attribute on the association table you are going to need to model it (via a UserDashboard.java class as you asked). Quite unfortunate, as it adds a significant amount of work to your model layer.
If you find you do not need the extra attribute after all then I would model User with a set of Dashboards, linked directly via a @JoinTable.
One way you could do this would be to see the relationship between User and Dashboard as a map in which the Dashboard is a key, there being an entry for every Dashboard associated with the User, and the value is a flag indicating whether that Dashboard is the default for that User. I admit this is a bit forced; it's an odd way to look at the relationship, perhaps even suspect as has been charged.
But the advantage of this view is that it lets you map the living daylights out of everything like this:
@Entity
public class Dashboard {

    @Id
    private int id;
    private String name;

    public Dashboard(int id, String name) {
        this.id = id;
        this.name = name;
    }

    protected Dashboard() {}
}

@Entity
public class User {

    @Id
    private int id;
    private String name;

    @ElementCollection
    private Map<Dashboard, Boolean> dashboards;

    public User(int id, String name) {
        this.id = id;
        this.name = name;
        this.dashboards = new HashMap<Dashboard, Boolean>();
    }

    protected User() {}

    // disclaimer: the following 'business logic' is not necessarily of the finest quality

    public Set<Dashboard> getDashboards() {
        return dashboards.keySet();
    }

    public Dashboard getDefaultDashboard() {
        for (Entry<Dashboard, Boolean> dashboard : dashboards.entrySet()) {
            if (dashboard.getValue()) {
                return dashboard.getKey();
            }
        }
        return null;
    }

    public void addDashboard(Dashboard dashboard) {
        dashboards.put(dashboard, false);
    }

    public void setDefaultDashboard(Dashboard newDefaultDashboard) {
        Dashboard oldDefaultDashboard = getDefaultDashboard();
        if (oldDefaultDashboard != null) {
            dashboards.put(oldDefaultDashboard, false);
        }
        dashboards.put(newDefaultDashboard, true);
    }
}
This maps a table structure which looks like this Hibernate-generated SQL, which I think is roughly what you want. The generated names on the User_dashboards table are pretty shoddy; you could customise them quite easily with some annotations or some XML. Personally, I like to keep all the filthy details of the actual mapping between the objects and the database in an orm.xml; here's what you'd need to add to use more sensible names:
<entity class="User">
    <attributes>
        <element-collection name="dashboards">
            <map-key-join-column name="Dashboard_id" />
            <column name="is_default" />
        </element-collection>
    </attributes>
</entity>
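Stripped of the JPA annotations, the default-flag logic above can be exercised in plain Java. One caveat the mapping glosses over (my assumption, not stated in the answer): for Map<Dashboard, Boolean> lookups to survive a reload, Dashboard should define equals/hashCode on its id.

```java
// Plain-Java demo of the map-as-flag idea, with id-based equality added.
import java.util.HashMap;
import java.util.Map;

class Dashboard {
    final int id;
    final String name;

    Dashboard(int id, String name) { this.id = id; this.name = name; }

    @Override public boolean equals(Object o) {
        return o instanceof Dashboard && ((Dashboard) o).id == id;
    }
    @Override public int hashCode() { return id; }
}

class User {
    private final Map<Dashboard, Boolean> dashboards = new HashMap<>();

    void addDashboard(Dashboard dashboard) { dashboards.put(dashboard, false); }

    Dashboard getDefaultDashboard() {
        for (Map.Entry<Dashboard, Boolean> entry : dashboards.entrySet())
            if (entry.getValue()) return entry.getKey();
        return null;
    }

    void setDefaultDashboard(Dashboard newDefault) {
        Dashboard oldDefault = getDefaultDashboard();
        if (oldDefault != null) dashboards.put(oldDefault, false);
        dashboards.put(newDefault, true);
    }
}
```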

GWT RequestFactory, ValueProxy or EntityProxy for collection/lookup table

I am not sure what the best practice is for dealing with collection/lookup tables in RequestFactory.
For example if I have following two Domain objects:
@Entity
public class Experiment {

    private Long id;
    private String name;

    @ManyToOne(cascade = {CascadeType.PERSIST, CascadeType.MERGE})
    private UnitOfMeasure unitOfMeasure;

    public Experiment() { }

    public String getName() {
        return name;
    }

    public Long getId() {
        return id;
    }

    public void setName(String name) {
        this.name = name;
    }

    public UnitOfMeasure getUnitOfMeasure() {
        return unitOfMeasure;
    }

    public void setUnitOfMeasure(UnitOfMeasure unitOfMeasure) {
        this.unitOfMeasure = unitOfMeasure;
    }
}

@Entity
public class UnitOfMeasure {

    private Long id;
    private String unit_type;

    public UnitOfMeasure() { }

    public String getUnitType() {
        return unit_type;
    }

    public Long getId() {
        return id;
    }

    public void setUnitType(String unitType) {
        this.unit_type = unitType;
    }
}
This is a normal unidirectional 1:n relationship between Experiment and UnitOfMeasure, using a foreign key in the Experiment table.
I have a limited amount of different UnitOfMeasure instances which usually don't change.
The web-app provides a view where the user can change some properties of the Experiment instance. The view uses the Editor framework. For changing the UnitOfMeasure of a specific Experiment I use a ValueListBox and render the unit_type property.
Because the list of available UnitOfMeasure instances is static, I use AutoBeanFactory to create a JSON string which I put into the HTML host page; during application start I parse it (same thing for all other collection-like table values) and store the values in a singleton class instance (AppData), which I pass to setAcceptableValues.
Currently I derive UnitOfMeasureProxy from EntityProxy but in order to decode/encode it with AutoBeanFactory I have to annotate the Factory with EntityProxyCategory. I somehow suspect that a ValueProxy would be a better fit.
However with a ValueProxy when I change the UnitOfMeasure of a specific Experiment the entire ValueProxy instance is transmitted over the wire.
From a database point of view however only changing the value for the foreignkey in the Experiment table is required.
So what is the best practice (ValueProxy vs EntityProxy) for collection like tables and child values respectively?
In many cases, references to other entities are best modelled using their IDs rather than the EntityProxys themselves (it's debatable, but I think it's also true for server-side code, or actually any code that crosses unit-of-work boundaries –JPA EntityManager lifetime, Hibernate session, etc.–)
BTW, the proper way to serialize RequestFactory proxies is to use a ProxySerializer.
Make sure you use GWT 2.5.0-rc1 though if you have lists of ValueProxys (see issue 6961)
