ModelMapper DTO-->Entity. How to skip unconditionally all fields not mapped - java

I have two classes (entity and DTO)
public class Deliver {
    private Long id;
    private String uri;
    private Instant moment;
    private DeliverStatus status; // enum: PENDING, ACCEPTED, REJECTED
    private String feedback;      // feedback about the received task
    private Integer correctCount; // number of correct questions
    private Enrollment enrollment;
    private Lesson lesson;
    // constructors, getters and setters..
}

public class DeliverRevisionDto {
    private DeliverStatus status;
    private String feedback;
    private Integer correctCount;
    // constructors, getters and setters..
}
The goal is pretty simple: update the entity fields conveyed by the DTO class. I have the following code at the service layer (Spring Boot version 2.4.4):
@Service
public class DeliverService {

    @Autowired
    private DeliverRepository deliverRepository;

    @Autowired
    private ModelMapper modelMapper;

    @Transactional
    public void saveRevision(Long id, DeliverRevisionDto dto) {
        Deliver deliver = deliverRepository.getOne(id);
        System.out.println("BEFORE MAPPING: " + deliver.toString()); // debug purpose

        deliver = modelMapper.map(dto, Deliver.class);

        // debug purpose
        TypeMap<DeliverRevisionDto, Deliver> tm = modelMapper.getTypeMap(DeliverRevisionDto.class, Deliver.class);
        List<Mapping> list = tm.getMappings();
        for (Mapping m : list) {
            System.out.println(m);
        }
        System.out.println("AFTER MAPPING: " + deliver.toString()); // debug purpose

        deliverRepository.save(deliver);
    }
}
The console output is:
BEFORE MAPPING: Deliver [id=1, uri=https://github/someone.com, moment=2020-12-10T10:00:00Z, status=PENDING, feedback=null, correctCount=null, enrollment=com.devsuperior.dslearnbds.entities.Enrollment#7e0, lesson=com.devsuperior.dslearnbds.entities.Task#23]
PropertyMapping[DeliverRevisionDto.correctCount -> Deliver.correctCount]
PropertyMapping[DeliverRevisionDto.feedback -> Deliver.feedback]
PropertyMapping[DeliverRevisionDto.status -> Deliver.status]
AFTER MAPPING: Deliver [id=null, uri=null, moment=null, status=ACCEPTED, feedback=Muito bem cabra, tarefa aceita., correctCount=5, enrollment=null, lesson=null]
The mapping of the 3 fields present in the DTO is done correctly, BUT all the other fields of my entity are set to null. I know that I can skip fields as described in http://modelmapper.org/user-manual/property-mapping/.
The problem is that I don't want to couple the code to specific field names/getters/setters; that's the reason I'm using ModelMapper. I wonder if there is any configuration by which, upon mapping, ModelMapper says "Hey, the TARGET class has more fields than the SOURCE class, so I will leave them untouched unconditionally" (meaning I don't need to say which fields they are).
In short: I'm mapping fields between 2 classes with different sets of fields (some are the same), and when I map the class with the smaller set of fields onto the one with the bigger set, the mapper sets the fields that don't match to null. I want those fields left untouched (with their original values) without my telling it which ones they are; after all, the mapper knows which ones match.

The ModelMapper documentation is not the best part of that framework. Let's see what happens in your code.
Here you fetch the entity to be updated from the repo:
Deliver deliver = deliverRepository.getOne(id);
and log it with all the fields set as they should be. However, this line:
deliver = modelMapper.map(dto, Deliver.class);
re-assigns your variable deliver. This method creates a new instance of the Deliver class and assigns it to the variable deliver, discarding the entity fetched from the repository.
In this new instance, every field that does not exist in, or is not set by, the DTO will be null.
This is the API doc that my IDE provides for these two different methods:
String org.modelmapper.ModelMapper.map(Object source, Class destinationType)
Maps source to an instance of destinationType. Mapping is performed according to the corresponding TypeMap. If no TypeMap exists for source.getClass() and destinationType then one is created.
Versus
void org.modelmapper.ModelMapper.map(Object source, Object destination)
Maps source to destination. Mapping is performed according to the corresponding TypeMap. If no TypeMap exists for source.getClass() and destination.getClass() then one is created.
It might not be clearly stated that the first method actually creates a new instance based on the type (Class) passed, but it should be clear that ModelMapper cannot alter an arbitrary variable just by knowing its type. You need to pass the object to alter as a method parameter.
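A minimal sketch of the fix under the assumptions above: map the DTO onto the entity you already fetched by using the two-argument overload, so the unmatched target fields keep the values loaded from the database.
@Transactional
public void saveRevision(Long id, DeliverRevisionDto dto) {
    Deliver deliver = deliverRepository.getOne(id);

    // Map onto the existing managed entity instead of creating a new one.
    // Only the properties present in the DTO (status, feedback, correctCount)
    // are touched; id, uri, moment, enrollment and lesson stay as loaded.
    modelMapper.map(dto, deliver);

    deliverRepository.save(deliver);
}
If the DTO itself may carry nulls that should not overwrite existing values, newer ModelMapper versions also offer a skip-null setting, modelMapper.getConfiguration().setSkipNullEnabled(true) (check the version you are using).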

Related

Android Realm copyToRealmOrUpdate creates duplicates of nested objects

I have the following classes:
public class Note extends RealmObject {
    @PrimaryKey
    private String id;
    private Template template;
    // other primitive fields, getters & setters
}

public class Template extends RealmObject {
    private String name;
    private String color;
    // other primitive fields, getters & setters
}
I get my data from backend via Retrofit & Gson, so I have ready-to-use java objects in response.
Let's imagine that the backend returns the same three Notes each time I call it.
When I get the list of Note objects, I do the following:
private void fetchNotesAndSave() {
    List<Note> notes = getNotesViaRetrofit();

    Realm realm = Realm.getInstance(mContext);
    realm.beginTransaction();
    realm.copyToRealmOrUpdate(notes);
    realm.commitTransaction();
    realm.close();
}
After that I call these lines to check count of stored objects:
int notesCount = mRealm.where(Note.class).findAll().size();
int templatesCount = mRealm.where(Template.class).findAll().size();
For the first time:
notesCount == 3;
templatesCount == 3;
That's right. But if I call the server again, get the same notes (same primary key ids), and call fetchNotesAndSave() again, I'll get these results:
notesCount == 3;
templatesCount == 6;
Each time I call copyToRealmOrUpdate(), nested objects that are inside objects with a primary key are duplicated, not updated.
Is there any way to change this behaviour?
Please let me know if you need more information. Thanks in advance!
It is because your Template class doesn't have any primary key. In that case these objects are inserted again, as there is no guarantee that the referenced template objects can safely be updated, even if they are part of another object that has a primary key.
If you add a @PrimaryKey to your Template class it should work as you expect it to.
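A minimal sketch of that fix, assuming (hypothetically) that the template name can serve as a stable identifier; any unique, server-provided id works the same way:
public class Template extends RealmObject {
    @PrimaryKey
    private String name;   // assumed unique; use a real server-side id if you have one
    private String color;
    // other primitive fields, getters & setters
}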
If you can't provide a PK as suggested, you might want to use the following workaround to avoid duplicates.
// Run inside a write transaction: delete the old nested Template objects first,
// then re-insert the notes so the templates are not duplicated.
for (Note note : notes) {
    Note managed = realm.where(Note.class)
            .equalTo("id", note.getId())
            .findFirst();
    if (managed != null && managed.getTemplate() != null) {
        managed.getTemplate().deleteFromRealm();
    }
}
realm.copyToRealmOrUpdate(notes);

Multiple instances of generic class

I'm trying to create a generic DAO in order to avoid having more or less the same code in many separate DAOs.
My problem is that in the following lines of code:
private BaseDAOImpl<Artist> baseDAOArtist = new BaseDAOImpl<>(Artist.class);
private BaseDAOImpl<ArtistRelation> baseDAOArtistRelation = new BaseDAOImpl<>(ArtistRelation.class);
The first one seems to be skipped.
An excerpt of the BaseDAOImpl:
public class BaseDAOImpl<T> implements BaseDAO<T> {
    private Class<T> entity;
    private DAOFactory daoFactory = Config.getInstance().getDAOFactory();
    private static String SQL_FIND_BY_ID;

    public BaseDAOImpl(Class<T> entity) {
        this.entity = entity;
        SQL_FIND_BY_ID = "SELECT * FROM VIEW_" + entity.getSimpleName() + " WHERE id = ?";
    }
}
Is it not possible to instantiate multiple objects this way?
Yes. It's not clear what you mean by "The first one seems to be skipped.", but it could be that you're using a static field for SQL_FIND_BY_ID. As it stands:
private BaseDAOImpl<Artist> baseDAOArtist = new BaseDAOImpl<>(Artist.class);
creates an instance (with its own entity and daoFactory fields) and sets the static SQL_FIND_BY_ID, then:
private BaseDAOImpl<ArtistRelation> baseDAOArtistRelation = new BaseDAOImpl<>(ArtistRelation.class);
creates another instance and changes the value of SQL_FIND_BY_ID for both instances, since the field is shared.
Without a more detailed description of the error I am more or less guessing now, but judging from variable names and the code snippet I would suspect the static field SQL_FIND_BY_ID to be the cause.
When you instantiate the two DAOs, the second execution of the constructor BaseDAOImpl will overwrite the value of the static field. If the DAO relies on the SQL query stored there, it will always query for the entity of the last instantiated DAO.
Static fields and methods are shared among all instances of a class even if they differ on their generic parameters. In contrast to e.g. C++'s templates, there are no separate classes generated for each generic parameter.
To achieve the desired behavior of separate queries for each entity you may change the static field to a non-static member.
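A minimal sketch of that change, keeping the rest of the class as in the question (BaseDAO, DAOFactory and Config are taken from the original snippet):
public class BaseDAOImpl<T> implements BaseDAO<T> {
    private final Class<T> entity;
    private final DAOFactory daoFactory = Config.getInstance().getDAOFactory();
    // non-static: each instance now keeps the query for its own entity
    private final String sqlFindById;

    public BaseDAOImpl(Class<T> entity) {
        this.entity = entity;
        this.sqlFindById = "SELECT * FROM VIEW_" + entity.getSimpleName() + " WHERE id = ?";
    }
}
With this, baseDAOArtist queries VIEW_Artist and baseDAOArtistRelation queries VIEW_ArtistRelation, independently of construction order.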

How to find the fields of all member variables contained in a Java bean

I want to make a GUI using Java in which a user can select a bean, edit its fields, and then add an instance of the created bean to a queue. My question though is about accessing the fields. I have a class MyCompositeObject that inherits from MyParentObject. The MyParentObject is composed of multiple beans, each being composed of more beans. The class MyCompositeObject is also composed of beans. I want to find all accessible fields from MyCompositeObject.
class MyParentObject
{
    MyObjectOne fieldOne;
    MyObjectTwo fieldTwo;
    String name;
    ...
}

class MyCompositeObject extends MyParentObject
{
    MyObjectThree fieldThree;
    Integer number;
    ...
}

class MyObjectThree
{
    boolean aBoolean;
    MyObjectFour fieldFour;
    ...
}
I have been trying to use the BeanUtils api, but I'm getting stuck trying to get the fields of all the member beans. What I am imagining is a depth first search of all fields that could be accessed from an instance of MyCompositeObject. For example, this would include, but not be limited to, the fields: MyCompositeObject.fieldOne, MyCompositeObject.number, MyCompositeObject.fieldThree.aBoolean.
I realized when I tried:
Fields[] allFields = BeanUtils.getFields(myCompositeObject);
that I was in over my head. My research has so far not turned up any prebuilt methods that could do what I describe. Please let me know of any API methods that can do this or tell me how I can go about building my own. Thanks.
It's kind of a pain, but you have to go in two dimensions:
yourBeanClass.getSuperclass(); (and recursively get all superclasses until Object)
and then you can get the fields of each one
eachClass.getDeclaredFields() NOT getFields(), so you can get all the private fields as well.
Once you have each field,
field.getType() returns the Class of that field,
and then of course you need to go up that dude's superclass chain again to make sure you get ALL the fields of that class, including the ones in its superclasses.
Once you have that chain of classes for that field, you can then get its fields by repeating the above... yes, the JDK made this fun!!!! I wish to god they had a getAllDeclaredFields method so I didn't have to go up the superclass hierarchy.
IMPORTANT: you need to call field.setAccessible(true) so you can read and write to it when it is a private field, by the way!!!
Here is code that gets all the fields for a Class including the superclasses..
private static List<Field> findAllFields(Class<?> metaClass) {
    List<Field[]> fields = new ArrayList<Field[]>();
    findFields(metaClass, fields);
    // flatten the per-class arrays into a single list
    List<Field> allFields = new ArrayList<Field>();
    for (Field[] f : fields) {
        allFields.addAll(Arrays.asList(f));
    }
    return allFields;
}

private static void findFields(Class<?> metaClass2, List<Field[]> fields) {
    Class<?> next = metaClass2;
    while (true) {
        Field[] f = next.getDeclaredFields();
        fields.add(f);
        next = next.getSuperclass();
        // stop once the chain reaches Object (or runs out, e.g. for interfaces)
        if (next == null || next.equals(Object.class)) {
            return;
        }
    }
}
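For the nested traversal described above (recursing into each field's own type), here is a rough sketch built on findAllFields; the depth limit and the skipping of primitives and JDK types are my own assumptions to keep it from looping forever on cyclic object graphs:
// Hypothetical helper: collects dotted field paths such as "fieldThree.aBoolean".
private static void collectFieldPaths(Class<?> type, String prefix, List<String> paths, int depth) {
    if (depth > 5) {   // assumed depth limit, guards against cycles
        return;
    }
    for (Field field : findAllFields(type)) {
        field.setAccessible(true);   // so private fields can be read/written later
        String path = prefix.isEmpty() ? field.getName() : prefix + "." + field.getName();
        paths.add(path);
        Class<?> fieldType = field.getType();
        // only recurse into bean-like types, not primitives or JDK classes
        if (!fieldType.isPrimitive() && !fieldType.getName().startsWith("java.")) {
            collectFieldPaths(fieldType, path, paths, depth + 1);
        }
    }
}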
later,
Dean

Need to know if each field has changed, how should I model this in Hibernate

So I have a class with three fields that maps to a table using Hibernate:
class Widget
{
    String field1;
    String field2;
    String field3;
}
On application startup a number of instances of these widgets will be added to the database from external files, but when I exit the application I need to know which (if any) of these fields have been changed by the user since the application was started, so the changes can be saved back to the files. I also need to store the original value for logging purposes.
I can't work out whether I need a status field in the table or whether there is already a way of doing this with Hibernate/the database.
EDIT: A good solution to the problem was given below. However, the main reason I am using Hibernate is to reduce memory consumption, so storing the original values in memory when they change is not a good solution for me; I want everything stored in the database. So I have created this new question: How do I store a copy of each entity I add to the database in Hibernate.
Given an entity like the following you can track changes on one of its fields (while preserving its original value too).
@Entity
@Table(schema = "test", name = "test")
public final class Test {
    private static final int ORIGINAL = 0;
    private static final int CURRENT = 1;

    private Integer id;
    // holds the original and current state of the field
    private final AtomicReferenceArray<String> field = new AtomicReferenceArray<>(2);

    @Id
    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    @Transient
    public String getOriginalField() {
        return field.get(ORIGINAL);
    }

    @Basic
    public String getField() {
        return field.get(CURRENT);
    }

    public void setField(String field) {
        this.field.compareAndSet(ORIGINAL, null, field);
        this.field.set(CURRENT, field);
    }

    @PreUpdate
    public void preUpdate() {
        System.out.format("Original: %s, New: %s\n", getOriginalField(), getField());
    }

    ...
}
If there is a single row in a database like this:
id: 1
field: a
version: 2011-12-02 11:24:00
before the field gets updated (say, from a to b) you'll get the following output.
Original: a, New: b
The original value gets preserved even if the entity is updated multiple times, and both states can be accessed through the corresponding getters (getField and getOriginalField; you can get more creative than me in the naming :).
This way you can spare yourself from creating version columns in your database and can also hide the implementation details from clients.
Instead of an AtomicReferenceArray you could use arrays, lists, etc., to track every change in the same way.
The @PreUpdate isn't necessary of course, but this way you can be notified of changes in the entity's state and atomically save the updated fields to a file. There are more annotations like these; see the javax.persistence documentation for the other annotation types.
If you are using MySQL then you can get the table's last update time from the information_schema database, like:
SELECT UPDATE_TIME FROM `information_schema`.`tables`
WHERE TABLE_SCHEMA = 'dbName' AND TABLE_NAME = 'tableName'
Alternatively, a simple solution is to add a column for an update timestamp. With this you can even monitor which particular row has been updated.
If you need to synchronize with the files as soon as you save into the database, you can use the Hibernate event mechanism to intercept any save to the database and write it to a file; here's a sample doing that.
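A rough sketch of that event-mechanism idea, assuming the classic Hibernate Interceptor API (EmptyInterceptor and onFlushDirty, available in Hibernate 3 to 5); the file writing itself is only indicated by a comment:
import java.io.Serializable;
import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

// Register via Configuration#setInterceptor (or the session factory interceptor
// property, depending on your setup).
public class WidgetChangeInterceptor extends EmptyInterceptor {

    @Override
    public boolean onFlushDirty(Object entity, Serializable id,
                                Object[] currentState, Object[] previousState,
                                String[] propertyNames, Type[] types) {
        if (entity instanceof Widget) {
            for (int i = 0; i < propertyNames.length; i++) {
                Object before = previousState == null ? null : previousState[i];
                Object after = currentState[i];
                if (before == null ? after != null : !before.equals(after)) {
                    // propertyNames[i] changed from 'before' to 'after':
                    // log it and/or write the change back to the external file here
                }
            }
        }
        return false; // we did not modify the entity state
    }
}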

Best practice for storing global data in J2EE App using Hibernate

I'm looking for the best solution to store a Java EE application's global data using Hibernate. It will consist of key-value pairs. Example:
projectStarted = "10-11-11"
developerNumber = 3
teamLeader = "John"
As you see, all of these entries have different types.
For now I see two options:
1) Create a GlobalData entity. Each of its fields will be represented as a unique column in the table and will contain one setting. This way I have no problems with type casting, but I would like to avoid it in case there is a large number of settings.
2) Create a Setting entity. Each instance will contain two fields: key (primary key) and value, and will be represented as a unique record in the table. This is the preferable solution, but it seems to me that I will get a lot of type casting, because settings can be of any type.
So basically, I'm looking for a way to implement the second solution without getting into a lot of trouble with the different types. Can anybody help me?
Thanks.
Edit 1.
Yeah, thanks Christian. Just got a similar idea.
What if I have a Setting entity like this:
@Entity
@Table(name = "settings")
public class Setting {
    @Column
    private String key;
    @Column
    private String value;
    @Column
    private String converterClassFullName; // example: by.lugovsky.MyConverter

    // Getters, setters
}
And a GlobalData class:
public class GlobalData {
    private Date projectStarted;
    private int developerNumber;
    private String teamLeader;

    Set<Setting> settings;

    // Getters and setters for all, except settings.
}
So basically my idea is to convert the Setting entity before persisting/updating/etc. I can do this in my DAO, but I was wondering if I could annotate the GlobalData class with @Entity as well, without creating a new table. This way I could put a @OneToMany annotation on the settings set and perform the conversions in internal @PrePersist etc. methods.
Will Hibernate allow me to do this?
Thanks again
You could store a converter class in the db and then run a property through the given converter before using the value. JSF offers a Converter API:
public interface Converter {
    public Object getAsObject(FacesContext fc, UIComponent component, String value) throws ConverterException;
    public String getAsString(FacesContext fc, UIComponent component, Object obj) throws ConverterException;
}
If you have a schema with
name: String
value: String
converter: Class
then you could do something like this:
PropertyEntry pe = // Get from OR-Mapper
Converter c = (Converter) pe.getConverter().newInstance();
Object o = c.getAsObject(null, null, pe.getValue());
// use the object o instead of value
For even more coolness you could also define a field in the class which will not be persisted, and use it to hold the converted value within the object.
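A hedged sketch of how the Setting entity from the edit could cache such a converted, non-persisted value; lazily instantiating the converter named in converterClassFullName and using key as the primary key are illustrative assumptions, not part of the original design:
@Entity
@Table(name = "settings")
public class Setting {
    @Id
    private String key;                     // assumed primary key, as in option 2
    @Column
    private String value;
    @Column
    private String converterClassFullName;  // fully qualified converter class name

    @Transient
    private Object convertedValue;          // not persisted; caches the converted value

    public Object getConvertedValue() throws Exception {
        if (convertedValue == null) {
            Converter c = (Converter) Class.forName(converterClassFullName).newInstance();
            convertedValue = c.getAsObject(null, null, value);   // nulls as in the sample above
        }
        return convertedValue;
    }
}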
