Separate database model from network model - Java

I'm using greenDAO and Volley, and I have the following problem: when I make a network request I need to parse the response with GSON, so I have one model to represent the entities retrieved from the server and another model to represent the greenDAO objects. Is there any way to have only one class per model that serves both as the GSON model and the ORM entity?
class Product {
    @SerializedName("id")
    private String id;
    @SerializedName("pictures")
    private List<Picture> pictures;
    // getters & setters
}
class PersistentProduct {
    private Long id;
    private List<PersistencePicture> pictures;
    private transient DaoSession daoSession; // set by greenDAO when the entity is attached

    /** To-many relationship, resolved on first access (and after reset). Changes to to-many relations are not persisted; make changes to the target entity. */
    public List<PersistencePicture> getPictures() {
        if (pictures == null) {
            if (daoSession == null) {
                throw new DaoException("Entity is detached from DAO context");
            }
            PersistencePictureDao targetDao = daoSession.getPersistencePictureDao();
            List<PersistencePicture> picturesNew = targetDao._queryPersistenceProduct_Pictures(id);
            synchronized (this) {
                if (pictures == null) {
                    pictures = picturesNew;
                }
            }
        }
        return pictures;
    }
}
First I thought of using an interface, but when you retrieve data from a DAO it returns the concrete class, not the interface, so I don't think it can be done that way. The only solution I found is to write a "ProductUtils" class that converts a "PersistentProduct" to a "Product" and vice versa; a sketch of that is below.
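For reference, a rough sketch of such a converter, assuming the two classes shown above (the method names and the PictureUtils helper are made up):

public final class ProductUtils {

    // Hypothetical mapper between the network model and the greenDAO model.
    public static Product fromPersistent(PersistentProduct p) {
        Product product = new Product();
        product.setId(String.valueOf(p.getId()));
        List<Picture> pictures = new ArrayList<>();
        for (PersistencePicture pp : p.getPictures()) {
            pictures.add(PictureUtils.fromPersistent(pp)); // analogous helper for pictures
        }
        product.setPictures(pictures);
        return product;
    }

    public static PersistentProduct toPersistent(Product p) {
        PersistentProduct persistent = new PersistentProduct();
        persistent.setId(Long.valueOf(p.getId()));
        // pictures have to be persisted separately, since greenDAO resolves
        // to-many relations through the session rather than through setters
        return persistent;
    }

    private ProductUtils() {}
}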

The most elegant way would be to implement a small extension for greenDAO, so that you can specify the serialized name during schema creation.
For example, in de.greenrobot.daogenerator.Property.java:
// in PropertyBuilder, append these lines
public PropertyBuilder setSerializedName(String sname) {
    // check sname for correctness (i.e. not empty, no illegal characters)
    property.serializedName = sname;
    return this;
}

// in Property, append these lines
private String serializedName = null;

public boolean isSerialized() {
    return serializedName != null;
}
In entity.ftl add this line after line 24 (after package ${entity.javaPackage};):

<#if property.serializedName??>
import com.google.gson.annotations.SerializedName;
</#if>

And after line 55 (after <#list entity.properties as property>):

<#if property.serializedName??>
@SerializedName("${property.serializedName}")
</#if>
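With that in place, a hypothetical schema definition in your generator project could look like this (setSerializedName is the extension added above; the entity and property names are just examples):

Schema schema = new Schema(1, "com.example.model");
Entity product = schema.addEntity("Product");
product.addIdProperty();
// the builder extension attaches the GSON name to the generated field
product.addStringProperty("name").setSerializedName("name");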
Afterwards you should be able to use your generated greenDAO entity for Volley, with the following restrictions:

1. If you get a Product over the network, nothing is changed in the DB yet. You have to call insertOrReplace().
2. If you get a Product from the DB and send it via the network, some undesired fields might be serialized (i.e. myDao and daoSession).
3. If you get a Product via the network and call insertOrReplace(), the "network" Product will be persisted and an already existing Product will be replaced by it, BUT the referenced entities won't get updated or persisted unless insertOrReplace() is called for each of them!
4. If you get a Product via the network and call insertOrReplace() for every referenced entity, toMany entities that were referenced by the DB Product are still referenced by the updated Product, although they are not listed in the updated Product. You have to call resetPictures() and getPictures() to get the correct list, which will contain all toMany entities referenced by either the original Product stored in the DB or the updated Product from the network.
Update addressing 2.
To prevent daoSession and myDao from being serialized, you can use the following ExclusionStrategy:
private static class TransientExclusionStrategy implements ExclusionStrategy {
    @Override
    public boolean shouldSkipClass(Class<?> clazz) {
        return (clazz.getModifiers() & java.lang.reflect.Modifier.TRANSIENT) != 0;
    }

    @Override
    public boolean shouldSkipField(FieldAttributes f) {
        return f.hasModifier(java.lang.reflect.Modifier.TRANSIENT);
    }
}
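As a usage sketch, the strategy is registered on the GsonBuilder (setExclusionStrategies is standard Gson API and applies the strategy to both serialization and deserialization):

Gson gson = new GsonBuilder()
        .setExclusionStrategies(new TransientExclusionStrategy())
        .create();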
Update addressing 1.,3. and 4.
As a fast solution you can add the following method in the KEEP-SECTIONS of your entity:
public void merge(DaoSession s) {
    s.insertOrReplace(this);
    // do this for all toMany relations accordingly
    for (Picture p : getPictures()) {
        s.insertOrReplace(p);
    }
    resetPictures();
}
This will result in the original entity being updated and attached to the session and DAO. Also, every Picture that is referenced by the network Product will be persisted or updated. Pictures referenced by the original entity, but not by the network entity, remain untouched and get merged into the list.
This is far from perfect, but it shows where to go and what to do. The next steps would be to do everything that is done in merge() inside one transaction, and then to integrate different merge methods into dao.ftl.
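For the transaction part, a minimal sketch using greenDAO's session API (runInTx is part of AbstractDaoSession):

final Product product = ...; // entity received from the network
daoSession.runInTx(new Runnable() {
    @Override
    public void run() {
        // everything inside run() executes in a single transaction
        product.merge(daoSession);
    }
});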
NOTE
The code given in this answer is neither complete nor tested and is meant as a hint on how to solve this. As pointed out above, this solution still has some restrictions that have to be dealt with.

Related

Trouble converting between java.sql.Timestamp & java.time.Instant with jOOQ

I'm having trouble converting between java.sql.Timestamp and java.time.Instant using JOOQ converters.
Here's a simplified version of the code I'm working with.
public class User {
    private static final Converter<Timestamp, Instant> MY_CONVERTER = Converter.of(
        Timestamp.class,
        Instant.class,
        t -> t == null ? null : t.toInstant(),
        i -> i == null ? null : Timestamp.from(i)
    );

    public static Table<?> table = DSL.table("user");
    public static Field<String> name = DSL.field(DSL.name(table.getName(), "name"), String.class);
    public static Field<Instant> created = DSL.field(DSL.name(table.getName(), "created"), SQLDataType.TIMESTAMP.asConvertedDataType(MY_CONVERTER));
}
public class UserDto {
    private String name;
    private Instant created;
    // getters, setters, etc.
}
public class UserWriter {
    // constructor with injected DefaultDSLContext etc.
    public void create(UserDto user) {
        dslContext.insertInto(User.table, User.name, User.created)
            .values(user.getName(), user.getCreated())
            .execute();
    }
}
public class UserReader {
    // constructor with injected DefaultDSLContext etc.
    public Result<Record> getAll() {
        return dslContext.select().from(User.table).fetch();
    }
}
public class UserService {
    // constructor with injected UserReader etc.
    public Collection<UserDto> getAll() {
        return userReader
            .getAll()
            .stream()
            .map(Users::from)
            .collect(Collectors.toList());
    }
}
public class Users {
    public static UserDto from(Record record) {
        UserDto user = new UserDto();
        user.setName(record.get(User.name));
        user.setCreated(record.get(User.created));
        return user;
    }
}
When I create a new User the converter is called and the insertion works fine. However, when I select the Users the converter isn't called and the record.get(User.created) call in the Users::from method returns a Timestamp (and therefore fails as UserDto.setCreated expects an Instant).
Any ideas?
Thanks!
Why the converter isn't applied
From the way you phrased your question (you didn't post the exact SELECT statement that you've tried), I'm assuming you didn't pass all the column expressions explicitly. But then, how would jOOQ be able to find out what columns your table has? You declared some column expressions in some class, but that class isn't following any structure known to jOOQ. The only way to get jOOQ to fetch all known columns is to make them known to jOOQ, using code generation (see below).
You could, of course, let User extend the internal org.jooq.impl.TableImpl class and use internal API to register the Field values. But why do that manually, if you can generate this code?
Code generation
I'll repeat the main point of my previous question, which is: Please use the code generator. I've now written an entire article on why you should do this. Once jOOQ knows all of your meta data via code generation, you can just automatically select all columns like this:
UserRecord user = ctx
    .selectFrom(USER)
    .where(USER.ID.eq(...))
    .fetchOne();
Not just that, you can also configure your data types as INSTANT using a <forcedType>, so you don't need to worry about data type conversion every time.
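For example, a <forcedType> entry in the code generator configuration could look roughly like this (a sketch reusing your existing converter; the converter class name and the match expression are assumptions about your project):

<forcedType>
    <userType>java.time.Instant</userType>
    <converter>com.example.TimestampToInstantConverter</converter>
    <includeExpression>.*\.CREATED</includeExpression>
    <includeTypes>TIMESTAMP</includeTypes>
</forcedType>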
I cannot stress this enough: I'm frequently surprised how many projects try to use jOOQ without code generation, which removes so much of jOOQ's power. The main reason not to use code generation is when your schema is dynamic, but since you have that User class, yours obviously isn't.

Data/Object Consistency in noSQL database

I ran into a question while developing a pretty large project. Say I store an object in a NoSQL DB, like Google Cloud Datastore, and then I add a new field to its class. When I make a query and get this object back, what will be the value of the new field? Does it depend on the serializer, the DB, or the programming language?
For example, in java:
public class Car {
    private int numOfDoors;

    public Car(int nod) {
        numOfDoors = nod;
    }
}
Then I save an object car1 to Datastore, but I update my code after that.
public class Car {
    private int numOfDoors;
    private Set<String> tags;
    private boolean condition;

    public Car(int nod, Set<String> tags, boolean cod) {
        numOfDoors = nod;
        this.tags = tags;
        condition = cod;
    }

    public Set<String> getTags() {
        return tags;
    }
}
What will happen if I call getTags() right after updating the code, on an object just fetched from Datastore?
What if tags and condition are not initialized in the constructor, but inline, like:
private Set<String> tags = new HashSet<>();
What about deleting a field?
Thanks!
In Defining Data Classes with JDO in the Google Cloud documentation, specifically the Object Fields and Entity Properties section, it is explained that:

If a datastore entity is loaded into an object and doesn't have a property for one of the object's fields and the field's type is a nullable single-value type, the field is set to null. When the object is saved back to the datastore, the null property becomes set in the datastore to the null value. If the field is not of a nullable value type, loading an entity without the corresponding property throws an exception...
So basically the newly added nullable properties (like tags) will be set to null, because the saved entity doesn't have them. Note that condition is a primitive boolean, which is not nullable, so per the quoted documentation, loading an old entity into the new class can throw an exception instead; making such fields nullable (Boolean) avoids that.
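In code, that means guarding against null the first time old entities are loaded after the schema change (a sketch; getTags() is from the Car class above):

// After loading an old entity with the new class, tags may be null:
Set<String> tags = car.getTags();
if (tags == null) {
    tags = new HashSet<>(); // treat a missing property as an empty set
}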

#JsonIgnore field in frontend UI and include it in Jsonb json serialization to store in Postgres

I need a field to be ignored in the frontend UI, whereas the same field will be calculated in the backend and needs to be stored in the Postgres DB as a Jsonb object. Other than transforming the value object into a new one, is there any feature in Jackson for this use case?
Test.java
public class Test {
    private Integer score;
    private Date dateValidated = null;
    private Boolean consent = false;
    private Date dateConsented;

    public void setConsent(Boolean consent) {
        this.consent = consent;
        this.dateConsented = consent ? new Date() : null;
    }
}
Based on consent, dateConsented will be set, and I don't want it to be settable while calling my service. I can use @JsonIgnore for this.
Problem
I will store this Test as a JSON object in Postgres (Jsonb). So if I use @JsonIgnore, dateConsented will be ignored in the DB as well. I don't want that to happen. Any suggestions/solutions for this?
Just create a separate resource class, convert this entity to it, and return that resource class to the frontend UI. Take a look at ConverterFactory from Spring.
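A minimal sketch of that approach (TestResource and its factory method are hypothetical names, and the usual getters on Test are assumed): the full Test object is what gets serialized to Jsonb for Postgres, while the resource class simply has no dateConsented field, so it never reaches the UI.

public class TestResource {
    private Integer score;
    private Date dateValidated;
    private Boolean consent;
    // no dateConsented field, so it is never serialized for the frontend

    public static TestResource from(Test test) {
        TestResource r = new TestResource();
        r.score = test.getScore();
        r.dateValidated = test.getDateValidated();
        r.consent = test.getConsent();
        return r;
    }
}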

DB management using Entity Objects

I have a bean in my Fusion Web Application where I'm supposed to insert new data into a table of my database through Java code (after appropriate validation).
The question is: how should I do the insertion?
Should I use Entity Objects?
How?
P.S.: This is not the way it should work http://jneelmani.blogspot.com/2009/11/adf-insert-using-storeprocedure.html
I created an Entity Object and a View Object from the database table "Employees", and then created an application module that includes this View Object (Java classes were also generated for the Entity Object, the View Object, and the AppModule; EmployeeInfo is just a POJO). Inside the application module I created these methods:
public EmployeeViewRowImpl saveEmployee(EmployeeInfo employeeInfo) {
    // get the ViewObject
    EmployeeViewImpl employeeView = getEmployeeView1();
    // prepare a new row
    EmployeeViewRowImpl employee = createEmployeeViewRowImpl(employeeView, employeeInfo);
    // perform the insert
    employeeView.insertRow(employee);
    // commit
    try {
        getDBTransaction().commit();
        return employee;
    } catch (JboException e) {
        getDBTransaction().rollback();
        return null;
    }
}

private EmployeeViewRowImpl createEmployeeViewRowImpl(EmployeeViewImpl employeeView, EmployeeInfo employeeInfo) {
    EmployeeViewRowImpl employee = (EmployeeViewRowImpl) employeeView.createRow();
    employee.setName(employeeInfo.getName());
    return employee;
}
And to use it, one should just call:

public static AppModuleImpl getApp() {
    return (AppModuleImpl) Configuration.createRootApplicationModule(
        "com.test.service.AppModule", // where your module is stored
        "AppModuleShared");           // chosen configuration
}

and then:

...
AppModuleImpl app = getApp();
app.saveEmployee(employeeInfo);
...
Maybe I'm not too clear on the dynamics of what you are trying to do, but with Oracle ADF, CRUD operations (such as insert) are easily handled by exposing them from Data Controls. To be more specific: once you have an EO, you should create a View Object and an Application Module. After that, inside AppModule -> Data Model, add the created VO. This way it will be exposed in the Data Controls panel, and you can expand the 'Operations' folder and drag and drop the CreateInsert operation, possibly within a form or an updatable table.
Please refer to this link: CreateInsert Operation - ADF.
If for some other reason you want to handle this process programmatically, I can think of two possible ways:
1. Get hold, in your managed bean code, of an instance of the above mentioned AppModule, and from that, a VO instance:

AppModule mod = (AppModule) Configuration.createRootApplicationModule("packageName.AppModule", "AppModuleLocal");
ViewObject vo = mod.getViewObject1();

After that, create a new row and commit the newly added values, as sketched below.
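A minimal sketch of those steps (createRow, insertRow, and getTransaction are standard ADF BC API; the attribute name "Name" is an assumption about your View Object):

Row newRow = vo.createRow();             // create a detached row
newRow.setAttribute("Name", "John");     // populate its attributes
vo.insertRow(newRow);                    // add it to the row set
mod.getTransaction().commit();           // persist the change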
2. If you have already exposed a UI component (such as a table), you can grab the binding context of the current page and create a new row via the table's iterator:

DCBindingContainer bc = (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
DCIteratorBinding iterator = bc.findIteratorBinding("ViewObject1Iterator");
Row r = iterator.getCurrentRow();
r.setAttribute("attribName", attribValue);
You can do the insertion using an entity object as below:

/* Create a new Customer and return the new id */
public long createCustomer(String name, String city, Integer countryId) {
    EntityDefImpl customerDef = CustomerImpl.getDefinitionObject();
    CustomerImpl newCustomer =
        (CustomerImpl) customerDef.createInstance2(getDBTransaction(), null);
    newCustomer.setName(name);
    newCustomer.setCity(city);
    newCustomer.setCountryId(countryId);
    try {
        getDBTransaction().commit();
    } catch (JboException ex) {
        getDBTransaction().rollback();
        throw ex;
    }
    DBSequence newIdAssigned = newCustomer.getId();
    return newIdAssigned.getSequenceNumber().longValue();
}

How to store created/lastUpdate fields in AppEngine DataStore using Java JDO 3?

Abstract
I have a working application in Appengine using Java and JDO 3.
I found these arguments (auto_now and auto_now_add), which correspond exactly to what I want to implement in Java. So essentially the question is: how do I convert AppEngine's Python DateTimeProperty behavior to Java JDO?
Constraints
1. Converting my application to Python is not an option.
2. Adding two Date properties and manually populating their values whenever a create/update happens is not an option.
3. I'm looking for a solution which corresponds to what the JDO/AppEngine/database authors had in mind for this scenario when they created the APIs.
4. It would be preferable to have a generic option: say I have 4 entities in classes C1, C2, C3, C4, and the solution is to add a base class C0 which all 4 entities extend, so the 4 entities don't even know they're being "audited".
[Update] What I tried (using a simple entity):

@PersistenceCapable
public class MyEntity {
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true")
    private Long id;
    @Persistent private String name;
    ...
}

1. @Persistent public Date getLastUpdate() { return new Date(); }
As suggested by the answer, but it seems to always update the value, even when I just load the entity from the datastore or just modify an unrelated field (e.g. String name).
You can easily enough have a property (setter/getter) on a java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore.
Without the following hack I can't read the value stored in the datastore [and not even with the hack :( ]:

@Persistent public Date getLastUpdate() { return new Date(); }
private Date prevUpdate;
public void setLastUpdate(Date lastUpdate) { this.prevUpdate = lastUpdate; }
public Date getPrevUpdate() { return prevUpdate; }
Is there any way to differentiate whether a persistence operation is in progress or my own code is calling the getter?
2. @Persistent(customValueStrategy = "auto_now_add") private Date lastUpdate;
I modeled auto_now_add after org.datanucleus.store.valuegenerator.TimestampGenerator, replacing Timestamp with java.util.Date.
But it was only populated once, at the first makePersistent call, regardless of how many times I modified other fields and called makePersistent. Also note that it doesn't seem to behave as the documentation says (or my English is rusty):
Please note that by defining a value-strategy for a field then it will, by default, always generate a value for that field on persist. If the field can store nulls and you only want it to generate the value at persist when it is null (i.e you haven't assigned a value yourself) then you can add the extension "strategy-when-notnull" as false
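A sketch of what that extension looks like on the field (javax.jdo.annotations.Extension with DataNucleus's vendor name; whether it fixes the behavior above is exactly what is in question):

@Persistent(customValueStrategy = "auto_now_add")
@Extension(vendorName = "datanucleus", key = "strategy-when-notnull", value = "false")
private Date lastUpdate;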
3. preStore using PersistenceManager.addInstanceLifecycleListener
Works as expected, but I couldn't make it work across multiple entities using a base class.
pm.addInstanceLifecycleListener(new StoreLifecycleListener() {
    @Override
    public void preStore(InstanceLifecycleEvent event) {
        MyEntity entity = (MyEntity) event.getPersistentInstance();
        entity.setLastUpdate(new Date());
    }

    @Override
    public void postStore(InstanceLifecycleEvent event) {}
}, MyEntity.class);
4. implements StoreCallback and public void jdoPreStore() { this.setLastUpdate(new Date()); }
Works as expected, but I couldn't make it work across multiple entities using a base class.
To satisfy my 4th constraint (using solutions 3 or 4):
Whatever I do, I can't make the following structure work:

public abstract class Dateable implements StoreCallback {
    @Persistent private Date created;
    @Persistent private Date lastUpdate;

    public Dateable() { created = new Date(); }
    public void jdoPreStore() { this.setLastUpdate(new Date()); }
    // ... normal get/set properties for the above two
}

@PersistenceCapable
public class MyEntity extends Dateable {
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true")
    private Long id;
    @Persistent private String name;
}
The problems when the enhancer runs:

public abstract class Dateable:
DataNucleus.MetaData Registering class "[...].Dateable" as not having MetaData.

public abstract class Dateable, with the above log, but running the code anyway:
the creation date changes whenever I create or read the data from the datastore.

@PersistenceCapable public abstract class Dateable:
DataNucleus.MetaData Class "[...].MyEntity" has been specified with 1 primary key fields, but this class is using datastore identity and should be application identity.
JDO simply provides persistence of Java classes (and their fields/properties), so I don't see what the design of JDO has to do with it.
You can easily enough have a property (setter/getter) on a Java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore. Either that, or you use a preStore listener to be able to set things just before persistence, so the desired value goes into the datastore.
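For the preStore route, a minimal sketch of keeping the timestamp logic in one place (a hypothetical listener registered once; addInstanceLifecycleListener accepts the classes to listen for, so passing the base class is the idea being attempted above, and whether the enhancer accepts the abstract base class is the open question):

pm.addInstanceLifecycleListener(new StoreLifecycleListener() {
    @Override
    public void preStore(InstanceLifecycleEvent event) {
        Object pc = event.getPersistentInstance();
        if (pc instanceof Dateable) {
            // one listener stamps every audited entity before it is stored
            ((Dateable) pc).setLastUpdate(new Date());
        }
    }

    @Override
    public void postStore(InstanceLifecycleEvent event) {}
}, Dateable.class);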
