I have recently upgraded to Genson 1.3 and I am not 100% sure whether this issue is new, as I previously patched the 0.98 version to make it work.
Context
We are using our own implementation of the BeanMutatorAccessorResolver. This is so that we can dynamically decide whether a property should be serialized or not. Basically we have integrated Genson into our generic Jersey REST API interface. Genson does all the serializing and deserializing. When doing a GET request it is possible for a user to pass fields in the URL in order to request only the ones they specifically need (this is especially necessary for large objects where you only need 3 fields or so to display a table overview). For example: ?fields=field1, field2, field3. In our implementation of BeanMutatorAccessorResolver we then know exactly which fields to serialize and which ones to ignore. This is mainly intended to speed up requests and parsing, as we are then working with less data.
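Roughly, the resolver does something like this (simplified sketch; the ThreadLocal holding the requested fields is illustrative plumbing, and the BaseResolver/Trilean signatures are from the 1.x API, so double-check them):

import java.lang.reflect.Field;
import java.util.Set;

import com.owlike.genson.Trilean;
import com.owlike.genson.reflect.BeanMutatorAccessorResolver;

// Sketch only: the ThreadLocal carrying the parsed "fields" request parameter is an
// assumption about how the per-request state reaches the resolver.
public class RequestFieldsResolver extends BeanMutatorAccessorResolver.BaseResolver {

    // filled by the JAX-RS layer before (de)serialization, cleared afterwards
    public static final ThreadLocal<Set<String>> REQUESTED_FIELDS = new ThreadLocal<>();

    @Override
    public Trilean isAccessor(Field field, Class<?> fromClass) {
        Set<String> requested = REQUESTED_FIELDS.get();
        if (requested == null || requested.isEmpty()) {
            return Trilean.UNKNOWN; // no filter given, let Genson decide as usual
        }
        return requested.contains(field.getName()) ? Trilean.TRUE : Trilean.FALSE;
    }
}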
Problem
Unfortunately it seems that once Genson has read in all the fields via reflection, it caches them. This would be no problem if we were always requesting the same fields. Unfortunately, on some occasions we need more fields than before, but because Genson does not visit our BeanMutatorAccessorResolver a second time, it only returns the few fields that it has already cached.
Is there any way around this? Perhaps there is a better solution than turning caching off completely, because that would most probably hurt performance, right?
Update
It seems that I have found the location where this is happening. Basically, Genson returns a cached converter in Genson.provideConverter(Type forType) (line 154).
Converter<T> converter = (Converter<T>) converterCache.get(forType);
At the top of the method I noticed that it checks for a __GENSON$DO_NOT_CACHE_CONVERTER flag:
if (Boolean.TRUE.equals(ThreadLocalHolder.get("__GENSON$DO_NOT_CACHE_CONVERTER", Boolean.class))) {
Should I perhaps set this value or is there a better solution?
The problem has been solved thanks to Eugen. The solution can be found here: https://groups.google.com/forum/#!topic/genson/Z1oFHJfA-5w.
Basically you need to extend 3 classes to get this working:
GensonBundle, which you can register with the GensonBuilder.
BaseBeanDescriptorProvider, which gets created in GensonBundle.
BeanDescriptor, which gets created in BaseBeanDescriptorProvider and which contains the serialize method to adapt to your needs.
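In outline, the wiring looks something like this (only the bundle registration is shown; the exact constructors of BaseBeanDescriptorProvider and BeanDescriptor differ between Genson versions, so see the linked thread for the full code):

import com.owlike.genson.Genson;
import com.owlike.genson.GensonBuilder;
import com.owlike.genson.ext.GensonBundle;

// Rough outline of the solution from the Google Groups thread linked above.
public class DynamicFieldsBundle extends GensonBundle {

    @Override
    public void configure(GensonBuilder builder) {
        // Plug in your subclass of BaseBeanDescriptorProvider here; it creates your
        // BeanDescriptor subclass whose serialize method applies the per-request filter.
    }

    public static Genson createGenson() {
        return new GensonBuilder()
            .withBundle(new DynamicFieldsBundle())
            .create();
    }
}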
Related
I'm using Axon 2.4.6.
I have a Saga whose payload was serialized in binary form using the XStreamSerializer.
The saga looks like this:
public class MySaga extends AbstractAnnotatedSaga {
    ...
    private MyEvent myEvent;
    ...
}
It contains one event, which is the initialization event of the related aggregate object.
Right now I'm having a deserialization problem because I changed MyEvent by adding one property to it.
I figured out a workaround by adding the serialization id that the deserializer is expecting; however, this solution might not be the best since I'm working with production data right now, and it would be nice if I were able to somehow upcast the sagas.
So what I intend to do is create a custom serializer that extends JavaSerializer and tweak the SerializedObject<S> that comes in. The problem is that the SerializedObject is in hex/binary, so I need a way to convert it into an org.dom4j object, for instance, so that I could add the missing property and then be able to deserialize it into MySaga.
I tried several approaches like
ByteArrayInputStream bos = new ByteArrayInputStream((byte [])serializedObject.getData());
or new XStream();
but they all go from the binary representation straight to the object deserialization; what I need is to get the dom4j or even XML representation first.
I can't figure out how to do it.
I have to say that Axon 2 is not something I have experience with but let me try to help you nonetheless.
As far as I can find in the docs, Axon provides an example of how to write an upcaster here (using the correct 2.4 documentation link).
What is not clear to me, based on your question, is whether you are using the JavaSerializer or the XStreamSerializer (or the JacksonSerializer, to make the list complete).
In case you are using XML, the docs provide an example of an upcaster. What is good to mention (and check) is that you can also look into xStream.ignoreUnknownElements(), which will make your serializer lenient, meaning it won't fail when trying to deserialize something that contains an attribute it does not know (very useful, I would say).
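For example, something along these lines (assuming the Axon 2.x XStreamSerializer exposes its underlying XStream instance via getXStream(); please verify against your version):

import org.axonframework.serializer.xml.XStreamSerializer;

public class LenientXmlSerializerFactory {

    // Sketch: make the XML serializer ignore elements it does not recognize,
    // so deserializing payloads whose class has gained or lost fields does not fail.
    public static XStreamSerializer create() {
        XStreamSerializer serializer = new XStreamSerializer();
        serializer.getXStream().ignoreUnknownElements();
        return serializer;
    }
}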
If you are using JSON, you also have the FAIL_ON_UNKNOWN_PROPERTIES "feature", which can be disabled in this case to make it lenient.
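Something like this, assuming the Axon 2.x JacksonSerializer accepts a pre-configured ObjectMapper (check the constructor in your version):

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.axonframework.serializer.json.JacksonSerializer;

public class LenientJsonSerializerFactory {

    // Sketch: disable FAIL_ON_UNKNOWN_PROPERTIES so unknown JSON fields are ignored.
    public static JacksonSerializer create() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        return new JacksonSerializer(mapper);
    }
}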
Making your serializers lenient seems to be the correct route if you ask me. If you really need to add a default/derived value to the new field, then the upcaster route is the one you should pick.
KR,
Edit 1: triggered by Steven's comment, I'd like to ask how long you expect this Saga to live. Now that I've noticed the Event is part of your Saga, I would rather write a new Saga that does not contain the Event itself but just fields, so it is not coupled to any specific Event.
I have defined a class which acts as a model/POJO. The class has many keys/variables. I have implemented a custom solution for storing the POJO on disk for future use. Now what I want to do is, whenever any value in the class/POJO is changed, call a method which syncs the fresh changes with the file on disk.
I know I can define a setter for each variable. But it's quite tedious to do for hundreds of direct and sub fields, and even if I define a setter for each field, I have to call the sync function from all the setters.
What I need is a single proxy setter or interceptor for all change pushes to variables in the class.
I am using this in an Android application, so whenever the user enters new details in his/her account I have to store those details at that specific instant to prevent data loss. I am using GSON for serialising and de-serialising.
Sorry for using vague terminologies, never been to college :|.
The easiest solution is indeed to use a setter. You only have to create one for each field you want to monitor, and most IDEs generate them for you or you can use something like Koloboke, so it being tedious isn't really an argument.
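As a minimal sketch of that (the file name and the GSON-based sync method are just placeholders for your existing persistence code):

import com.google.gson.Gson;
import java.io.FileWriter;
import java.io.IOException;

public class UserAccount {

    private static final Gson GSON = new Gson();
    private String email;

    public void setEmail(String email) {
        this.email = email;
        sync(); // persist immediately after every change
    }

    // Hypothetical sync: serialize this object with GSON and overwrite the file on disk.
    private void sync() {
        try (FileWriter writer = new FileWriter("user_account.json")) {
            GSON.toJson(this, writer);
        } catch (IOException e) {
            throw new RuntimeException("Could not persist account", e);
        }
    }
}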
A proxy class or reflection would also be possible, but that is pretty hacky. Another way would be an asynchronous watcher/worker that checks for changes in your POJO instances, but even that seems unnecessarily complicated.
Apart from that, you might need to rethink your POJO's structure if it has that many fields.
The problem with persisting (in your case, writing to disk) an entity on each property update is that most updates modify more than one property. So in case you have code like this:
entity.setA(avalue);
entity.setB(bvalue);
entity.setC(cvalue);
You would write it to the disk 3 times, which is probably not the best way, as it takes more resources, and 2 out of 3 writes are unnecessary.
There are several ways to deal with this. Imagine you have some service for saving this data to disk; let's name it entityRepository. One option is to manually call this entityRepository each time you want to save/update your entity. That seems less comfortable than having it called automatically from the setters, but this approach makes it clear when and why your entity is persisted/updated. With the automatic approach it's unclear, which can lead to future problems and mistakes. For example, if in the future you decide you need to update one of the properties without immediately persisting, you would suddenly need two setters: one that persists and one that doesn't.
Another way is to add a version property and, inside its setter, call entityRepository.save(this).
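A rough sketch of that idea (EntityRepository here is just a placeholder for whatever actually writes to disk):

public class Entity {

    public interface EntityRepository {
        void save(Entity entity);
    }

    private final EntityRepository entityRepository;
    private String a;
    private String b;
    private long version;

    public Entity(EntityRepository entityRepository) {
        this.entityRepository = entityRepository;
    }

    public void setA(String a) { this.a = a; }   // no write yet
    public void setB(String b) { this.b = b; }   // no write yet

    // Bumping the version acts as the explicit "commit": one single write,
    // no matter how many properties were changed before it.
    public void setVersion(long version) {
        this.version = version;
        entityRepository.save(this);
    }
}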
The other way is to look at AOP; however, in any case I don't recommend persisting the entity on every change without having control over it.
You are talking about data binding. There is no built-in way for that, so you do indeed have to sync it yourself. Look into How to Write a Property Change Listener. There are also lots of other approaches to this, but as said, no built-in way.
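A minimal sketch of that approach with java.beans.PropertyChangeSupport (the listener that actually writes to disk is only indicated in the usage comment):

import java.beans.PropertyChangeListener;
import java.beans.PropertyChangeSupport;

public class UserProfile {

    private final PropertyChangeSupport changes = new PropertyChangeSupport(this);
    private String email;

    public void addPropertyChangeListener(PropertyChangeListener listener) {
        changes.addPropertyChangeListener(listener);
    }

    public void setEmail(String email) {
        String old = this.email;
        this.email = email;
        changes.firePropertyChange("email", old, email); // notify all listeners
    }

    // usage (e.g. in your Android code):
    //   UserProfile profile = new UserProfile();
    //   profile.addPropertyChangeListener(evt -> writeToDisk(profile)); // writeToDisk is your own sync code
}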
I'm using Spring to manage communication between my Android client and a Java backend. In particular, the class MappingJackson2HttpMessageConverter does the job of converting between JSON and Java objects on Android.
My issue is the following: sometimes I need to update the app, which often results in additional fields being added to some of the classes that make up my model, and consequently additional fields in the JSON data that travels between the clients and my server. When I do that, it is crucial for the "old" versions of the app to remain compatible with the new, slightly enhanced object model. In particular, if the server sends the client a JSON payload with too many fields (compared to what the client "knows"), the client should just ignore those fields without complaining that it cannot create the object properly. Unfortunately this is not the case now: if the server sends an additional field, say "country", the client throws the following exception when trying to convert the object:
Could not read JSON: Unrecognized field "country" (class com.example.MyUser), not marked as ignorable (19 known properties: ...)
Thanks for any help!
Use the @JsonIgnoreProperties annotation.
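For example, either on the class itself or globally on the converter's ObjectMapper (class and field names below are placeholders):

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.http.converter.json.MappingJackson2HttpMessageConverter;

// Option 1: per class - unknown JSON fields are silently ignored when deserializing.
@JsonIgnoreProperties(ignoreUnknown = true)
class MyUser {
    private String name;
    // getters/setters omitted
}

// Option 2: globally, on the ObjectMapper used by the message converter.
class ConverterConfig {
    static MappingJackson2HttpMessageConverter lenientConverter() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
        converter.setObjectMapper(mapper);
        return converter;
    }
}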
This advice will sound slightly trollish, but it's honest advice: don't use Spring. Complicated frameworks like that just cause more problems than they're worth unless you want to use them exactly the way the devs envisioned. They also make your app slow as hell. You're better off without them; you'll spend more time working around problems like this than they save you.
I need to serialize Java objects to JSON while applying customizations such as renaming and excluding fields. The objects use classes from a jar whose source code is not available.
I looked through many libraries (Jackson, Gson), but found none solving this particular problem. Most of them are annotation-based, which I can't use given that I don't have the source code.
One way to solve this problem is to use reflection and recursively go through the object until you find a property whose name should be replaced, or an object that should be excluded from the serialized JSON.
I need a solution for this. Better if it is already implemented and tested.
You can also have a look at Genson library http://code.google.com/p/genson/.
You can rename and filter with quite concise code:
// renames all "fieldOfName" to "toName", excludes from serialization
// and deserialization fields named "fieldNamed" and declared in DefinedInClass
// and uses fields with all visibility (protected, private, etc)
Genson genson = new Genson.Builder().rename("fieldOfName", "toName")
.exclude("fieldNamed", DefinedInClass.class)
.setFieldFilter(VisibilityFilter.ALL)
.create();
genson.serialize(myObject);
If you want to do some more complex filtering (based on annotations for example) you can implement BeanMutatorAccessorResolver or extend BaseResolver.
Same for property renaming: you can implement PropertyNameResolver and have full control.
And finally, if you want to filter fields, methods or constructors according to their modifiers, you can define your own VisibilityFilter.
Concerning the performance of filtering/renaming, there should be no problem as it is done only once per class and then cached.
To start using Genson you can have a look at the Getting Started Guide.
Found a solution to the problem.
Google Gson has a class called GsonBuilder which has methods for setting an exclusion strategy and a field naming strategy.
Using these two methods I implemented a custom solution where all the mapping and exclusion rules are stored in an XML file and used at the time of serialization and deserialization.
It works perfectly, though I'm not sure about its performance.
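For anyone looking for the shape of such a setup, here is a stripped-down sketch (the rename/exclude rules are hard-coded here, whereas my actual solution loads them from the XML file):

import com.google.gson.ExclusionStrategy;
import com.google.gson.FieldAttributes;
import com.google.gson.FieldNamingStrategy;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

import java.lang.reflect.Field;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class GsonMappingFactory {

    public static Gson create() {
        // In the real solution these rules come from an XML file.
        final Map<String, String> renames = new HashMap<>();
        renames.put("internalName", "name");
        final Set<String> excluded = new HashSet<>(Arrays.asList("password"));

        return new GsonBuilder()
            .setFieldNamingStrategy(new FieldNamingStrategy() {
                @Override
                public String translateName(Field f) {
                    String renamed = renames.get(f.getName());
                    return renamed != null ? renamed : f.getName();
                }
            })
            .setExclusionStrategies(new ExclusionStrategy() {
                @Override
                public boolean shouldSkipField(FieldAttributes f) {
                    return excluded.contains(f.getName());
                }

                @Override
                public boolean shouldSkipClass(Class<?> clazz) {
                    return false;
                }
            })
            .create();
    }
}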
AFAIK JSR-303 is the standard bean validation system.
I don't know whether it could do validations like this (I guess no):
if an object has a deleted flag set, you cannot modify the object
you cannot change the start date property, after the date is passed
you cannot decrease some integer properties in the bean
So how can I handle validations, which depend on the previous state of an object?
I would like to solve problems like that in a Hibernate 3.5 / Spring 3 / JPA 2 environment.
Thanks
My solution was to mess with Hibernate and reload the object to see the old state (after evicting the new object). This time I need a smarter solution...
I don't think this can be done using JSR 303 validation (or any other validation framework I've used). Validation is usually stateless - you pass it an instance of an object, and your validation framework tests things to make sure the current values of your object are valid. There's no real knowledge of previous states of the object.
You can do this - just not with validation. You could use a constrained property, or you could make this work using the proxy pattern or AOP.
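As an illustration of the constrained-property idea (java.beans vetoable changes; the Order/quantity example is made up):

import java.beans.PropertyChangeEvent;
import java.beans.PropertyVetoException;
import java.beans.VetoableChangeSupport;

public class Order {

    private final VetoableChangeSupport vetoes = new VetoableChangeSupport(this);
    private int quantity;

    public Order() {
        // Reject any change that would decrease the value, mirroring the
        // "you cannot decrease some integer properties" rule from the question.
        vetoes.addVetoableChangeListener("quantity", (PropertyChangeEvent evt) -> {
            if ((Integer) evt.getNewValue() < (Integer) evt.getOldValue()) {
                throw new PropertyVetoException("quantity may not decrease", evt);
            }
        });
    }

    public void setQuantity(int newQuantity) throws PropertyVetoException {
        vetoes.fireVetoableChange("quantity", this.quantity, newQuantity);
        this.quantity = newQuantity; // only reached if no listener vetoed the change
    }
}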
It sounds like the fields which you want to validate (with regards to previous state) are all metadata about the records as opposed to real data. All of these fields (idDeleted, createdDate, etc.) are better left out of your domain layer and therefor do not require validation. I would put the logic for determining & setting these values in you data-access layer so that the systems using your repository interfaces do not need to know or care about getting them right.
If my assumption about these fields being meta-data is not correct and you have user-entered data which validation depends on previous state, then I do not think that an extra lookup for the previous values is absurd and should not be out of the question. It makes sense in your case. Hibernate itself does a lookup under then hood to determine whether to INSERT or UPDATE when using it's save function.
Hope you find a reasonable solution.
how can I handle validations, which depend on the previous state of an object?
I'm not 100% sure it's doable, but the only way I can think of would be to create an object graph made of the "new state" and the "old state" (transient) and to validate the object graph as a whole using custom constraints. That's at least what I would try.
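A sketch of what that could look like with a class-level constraint (the StateTransition wrapper, MyEntity and its getters are all made-up names for illustration):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import javax.validation.Constraint;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import javax.validation.Payload;

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = StateTransitionValidator.class)
@interface ValidTransition {
    String message() default "illegal state transition";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}

// Wrapper object graph holding both states; only this wrapper gets validated.
@ValidTransition
class StateTransition {
    MyEntity oldState; // previous (transient) state, loaded or copied beforehand
    MyEntity newState; // state about to be persisted
}

class MyEntity {
    boolean deleted;
    int counter;

    boolean isDeleted() { return deleted; }
    int getCounter() { return counter; }
}

class StateTransitionValidator implements ConstraintValidator<ValidTransition, StateTransition> {

    @Override
    public void initialize(ValidTransition constraint) {
    }

    @Override
    public boolean isValid(StateTransition t, ConstraintValidatorContext ctx) {
        if (t == null || t.oldState == null) {
            return true; // nothing to compare against, treat as valid
        }
        if (t.oldState.isDeleted()) {
            return false; // deleted objects must not be modified
        }
        return t.newState.getCounter() >= t.oldState.getCounter(); // counters may not decrease
    }
}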
I would probably create a transient field, say previousVersion, which points to a copy of the data representing its previous state. This object is created on construction, but since it is marked as transient it is not serialized. Then do the validations against it.
The simplest implementation would be to add a method called makeACopy() which makes a copy of the object and puts it into the field.
You can add complexity by implementing Cloneable or creating a utility class that uses reflection, but that's up to you. I suggest makeACopy() and refactoring later, since it is easier to think about.
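A bare-bones version of that (field names are made up):

public class Account {

    private String owner;
    private int balance;

    // Snapshot of the previous state; transient so it is not serialized itself.
    private transient Account previousVersion;

    public void makeACopy() {
        Account copy = new Account();
        copy.owner = this.owner;
        copy.balance = this.balance;
        this.previousVersion = copy;
    }

    // Example check against the previous state.
    public boolean balanceNotDecreased() {
        return previousVersion == null || this.balance >= previousVersion.balance;
    }
}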
I don't know any ready-to-use solution either. As you suspect, JSR-303 won't do the job, because its validation is 'static'.
But...
An idea would be to use some AOP techniques to do that. So...
if an object has a deleted flag set, you cannot modify the object
This one I would implement as a proxy method registered around every setter. The proxy method would check the 'deleted' flag. If it was set to true, an exception would be thrown, otherwise the original method would be executed.
you cannot change the start date property, after the date is passed
This one is similar. This time you wouldn't access another property in the intercepted setter, but rather the original (not yet changed) value of the field and the setter argument.
you cannot decrease some integer properties in the bean
That one is the same as with the dates; the only difference is the data type (date vs. integer).
One can argue whether AOP is a good choice for this task, but it is still a solution. I am doubtful too.
One more concern is that I guess you would want to enforce these constraints on JPA entities. So using Spring AOP wouldn't be that easy, since the entities wouldn't be Spring-managed.
A completely different approach is to put the validation checks into the setters of properties. The downside is that you would lose declarativeness.
Example:
public void setCounter(int newCounter) {
    if (newCounter < this.counter) {
        throw new IllegalArgumentException("Cannot decrease the counter");
    } else {
        this.counter = newCounter;
    }
}
You might want to look at OVal instead. We do this kind of validation all the time. Normally, it's done using the SimpleCheck where you get the object and the value and can do all kinds of cross-checking.
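A sketch of what that tends to look like (from memory of OVal's @CheckWith/SimpleCheck API, so verify the exact signatures; the Task/priority example and how the previous state gets onto the instance are assumptions):

import net.sf.oval.constraint.CheckWith;
import net.sf.oval.constraint.CheckWithCheck;

public class Task {

    // Previous state kept on the instance; how it gets populated is up to you.
    private transient Integer previousPriority;

    @CheckWith(value = PriorityNotDecreased.class, message = "priority may not decrease")
    private Integer priority;

    public static class PriorityNotDecreased implements CheckWithCheck.SimpleCheck {

        @Override
        public boolean isSatisfied(Object validatedObject, Object value) {
            Task task = (Task) validatedObject;
            if (task.previousPriority == null || value == null) {
                return true; // nothing to compare against
            }
            return ((Integer) value) >= task.previousPriority;
        }
    }
}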