Which Camel construct is suited for transforming?

Apache Camel offers several ways of performing data transformations: the Transform EIP, custom DataFormats, and custom Type Converters.
I have a situation where I need to do a very complex transform from inside a Camel route. Should I be implementing my own Type Converter, my own DataFormat, or should I implement org.apache.camel.Expression and put all the transform stuff in there:
public class MyTransformer implements Expression {
    @Override
    public <T> T evaluate(Exchange arg0, Class<T> arg1) {
        // ...
    }
}
I guess I'm confused about where/when it's appropriate to use your own Type Converter, when to use the .transform(myTransformer) processor, or when to use a custom DataFormat. Thanks in advance!

The differences are subtle, though they are all used for different things (a short route sketch follows this list). You should use:
a transformer when you are converting a "business payload" from one shape to another. For example, when you are converting value objects that you pulled from a DAO into JAXB-annotated objects that you are going to use to invoke a web service.
a data format when you want to marshal a high-level representation, such as some type of Object, into a lower-level representation - something that you would send over the wire. Data formats include Java serialization, Google Protocol Buffers, JSON, JAXB, etc.
a type converter when you are changing the way you access a representation of a message. For example, a String, a byte array and an InputStream can all carry the same characters, so you might write converters (though built-in ones actually exist) that convert between any two of these.
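To make the distinction concrete, here is a minimal route sketch (illustrative only; the endpoint names and the choice of JSON library are assumptions, and camel-jackson would need to be on the classpath) that touches all three mechanisms:

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.JsonLibrary;

public class OrderRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("direct:orders")                     // assumed input endpoint
            .transform(new MyTransformer())       // 1. reshape the business payload
            .marshal().json(JsonLibrary.Jackson)  // 2. data format: object -> wire representation
            .convertBodyTo(String.class)          // 3. type converter: change how the body is accessed
            .to("mock:result");
    }
}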

Just to add to what Jake said above: it all depends.
You do not need to use any of the Camel APIs for this. There can be situations where you need to transform the message payload only once or a few times, and for that you can use a plain POJO and invoke it from a Camel route.
For example, a method in a POJO that converts a String to a MyOrder instance:
public MyOrder doSomething(String data) {
    MyOrder order = new MyOrder();
    // ... populate the order from the incoming data ...
    return order;
}
And then use a method call as the message transformer in the route:
.transform().method(MyBusinessClass.class, "doSomething")
That said, using any of the Camel ways for message transformation, as Jake answered, allows you to integrate it seamlessly into Camel and use it as a first-class citizen, as if it were provided out of the box by Camel itself. It also allows you to reuse it in other routes and Camel applications.
Implementing org.apache.camel.Expression to transform the message payload is not used that often; there are better ways, as Jake says, or you can use a POJO as shown above. Note, though, that the POJO above eventually gets evaluated as an org.apache.camel.Expression, which is why you can also implement that interface once yourself and use it directly (a minimal sketch follows).
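For completeness, a minimal sketch of what such an Expression-based transformer could look like (MyOrder and the parsing logic are placeholders from the example above):

import org.apache.camel.Exchange;
import org.apache.camel.Expression;

public class MyOrderTransformer implements Expression {
    @Override
    public <T> T evaluate(Exchange exchange, Class<T> type) {
        String data = exchange.getIn().getBody(String.class);
        MyOrder order = new MyOrder();   // placeholder: build the order from the incoming data
        // let Camel's own type converters satisfy the requested return type
        return exchange.getContext().getTypeConverter().convertTo(type, exchange, order);
    }
}

It can then be plugged into a route with .transform(new MyOrderTransformer()).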
If you have a copy of the Camel in Action book, chapter 3 is all about transforming data with Camel.

Related

Returning superclasses from Jersey resource

I'm doing a very simple thing that should just work, IMO. I've got a resource like:
@GET
@Produces(MediaType.APPLICATION_JSON)
@Path("{nodeType}/{uuid}")
public Object getResourceInfo(@PathParam("nodeType") String nodeType,
                              @PathParam("uuid") String uuid,
                              @Context SecurityContext authority) { ...
Note I'm returning type Object. This is because depending on the call (here depending on the nodeType argument) I want to return a different concrete class (which will always be @XmlRootElement) and have that get marshalled out into the response.
However, this does not work. I get an exception like:
Exception Description: A descriptor for class com.mycompany.XmlElementTypeInstance was not found in the project. For JAXB, if the JAXBContext was bootstrapped using TypeMappingInfo[] you must call a marshal method that accepts TypeMappingInfo as an input parameter.
If I change Object to a single subclass, it works. But I want it to be able to handle any subclass: XmlElementTypeInstance, XmlElementTypeInstance2, etc.
I tried making a common interface from which all of the XmlElementTypeInstance subclasses derive, but then I only get the properties declared in the interface, not the extra properties in the subclasses. Playing with @XmlElementRef and adding all possible properties to the common interface is extremely ugly and can't quite generate the JSON I want, so please don't suggest that. =)
Is there any way to do this? It seems like simple, basic, necessary functionality... any other REST framework I've used, no problem...
The solution, it turns out, is simple (I had to read the JSR instead of the actual Jersey docs, however!).
Instead of returning Object, returning Response (section 3.3.3 of JSR 339) with the object set as the entity forces the implementation to pick an appropriate MessageBody{Writer,Reader} at runtime.
return Response.ok().entity(<the object>).build();
Lost way too much time on this. Hope it helps someone later. =/
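Put together, the resource from the question would look roughly like this (the lookup method is hypothetical):

@GET
@Produces(MediaType.APPLICATION_JSON)
@Path("{nodeType}/{uuid}")
public Response getResourceInfo(@PathParam("nodeType") String nodeType,
                                @PathParam("uuid") String uuid,
                                @Context SecurityContext authority) {
    Object result = lookup(nodeType, uuid);   // hypothetical: returns the concrete @XmlRootElement subclass
    return Response.ok().entity(result).build();
}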

Multiple JaxB Marshalling profiles

I'm looking to have some conditional marshalling completed with jaxb. Something like this:
class A {
    // Only marshal when condition X applies
    public String fieldOne;
    // Only marshal when condition Y applies
    public String fieldTwo;
    // Always marshal
    public String fieldThree;
}
Essentially I have 2 different Web Service methods which use the same model, but I need the information sent to be different on each of these web service methods.
My best option so far would be to create a custom XmlJavaTypeAdapter which verifies some conditional logic. The adapter would return null when I don't want the object; when I do need it marshalled it would return itself.
I'm looking to see if anyone has a better alternative. My JAXB context is quite complex and already has a few layers of adapters.
Thanks in advance.
My best option so far would be to create a custom XmlJavaTypeAdapter which verifies some conditional logic. The adapter would return null when I don't want the object; when I do need it marshalled it would return itself.
I've been there and done that; it gets very messy very fast. If you can use MOXy (I see your post is tagged with moxy), you can use the @XmlNamedObjectGraph annotation to create named profiles of elements that are included when your instance is serialized.
Blaise Doughan (team lead for the MOXy project) explains it better than I can.
Blaise's blog post shows how to use annotations, but he also wrote a page on the EclipseLink wiki that shows how to do it programmatically.
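As a rough illustration of the idea (the graph names here are made up and the details are best taken from Blaise's post), the named object graphs on the class from the question would look something like this:

import org.eclipse.persistence.oxm.annotations.XmlNamedAttributeNode;
import org.eclipse.persistence.oxm.annotations.XmlNamedObjectGraph;
import org.eclipse.persistence.oxm.annotations.XmlNamedObjectGraphs;

@XmlNamedObjectGraphs({
    @XmlNamedObjectGraph(name = "conditionX",
        attributeNodes = { @XmlNamedAttributeNode("fieldOne"), @XmlNamedAttributeNode("fieldThree") }),
    @XmlNamedObjectGraph(name = "conditionY",
        attributeNodes = { @XmlNamedAttributeNode("fieldTwo"), @XmlNamedAttributeNode("fieldThree") })
})
class A {
    public String fieldOne;
    public String fieldTwo;
    public String fieldThree;
}

At marshalling time the profile is then selected with marshaller.setProperty(MarshallerProperties.OBJECT_GRAPH, "conditionX"), where MarshallerProperties comes from the org.eclipse.persistence.jaxb package.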

In Jersey, how do you deal with @POST parameters of a deeply nested, complex object?

I'm using Jersey 1.x here and I have a @POST method that requires sending over a deeply nested, complex object. I'm not sure of all my options, but it seems like a lot are described in this documentation:
In general the Java type of the method parameter may:
1. Be a primitive type;
2. Have a constructor that accepts a single String argument;
3. Have a static method named valueOf or fromString that accepts a single String argument (see, for example, Integer.valueOf(String) and java.util.UUID.fromString(String)); or
4. Be List<T>, Set<T> or SortedSet<T>, where T satisfies 2 or 3 above. The resulting collection is read-only.
Ideally, I wish that I could define a method like this:
@POST
@Consumes(MediaType.APPLICATION_FORM_URLENCODED)
@Path("complexObject")
public void complexObject(@FormParam("complexObject") ComplexObject complexObject) throws Exception {
But I guess I can only do that if my object satisfies the requirements above (which in my case, it does not). To me it seems that I have a choice.
Option 1: Implement fromString
Implement item #3 above.
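For reference, such a fromString would be a static factory that Jersey calls with the raw form value; a sketch, assuming the value is sent as JSON and parsed with a library like Jackson:

public class ComplexObject {
    // ... fields ...

    public static ComplexObject fromString(String raw) {
        try {
            // delegate the heavy lifting to a JSON library
            return new com.fasterxml.jackson.databind.ObjectMapper().readValue(raw, ComplexObject.class);
        } catch (java.io.IOException e) {
            throw new IllegalArgumentException("Cannot parse ComplexObject", e);
        }
    }
}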
Option 2: Pass in the complexObject in pieces
Break up the complexObject into pieces so the parameters become this:
@POST
@Consumes(MediaType.APPLICATION_FORM_URLENCODED)
@Path("complexObject")
public void complexObject(@FormParam("piece1") LessComplexPiece lessComplexPiece1,
                          @FormParam("piece2") LessComplexPiece lessComplexPiece2,
                          @FormParam("piece3") LessComplexPiece lessComplexPiece3) throws Exception {
This may not be enough if LessComplexPiece does not satisfy the requirements above. I'm wondering what the best option is here. What do people usually do in this situation? Here are the pros and cons I can think of:
Cons of implementing fromString
Have to maintain a custom deserializer. Every time the class is modified, this deserializer may break. There's more risk for bugs in general.
It will probably be impossible to generate documentation that describes the pieces of the complex object. I'll have to write that by hand.
For each piece of the complex object, I'll have to write my own casting and validation logic.
I'm not sure what the post data would look like. But this may make it very difficult for someone to call the API from a web page form. If the resource accepted primitives, it would be easy, e.g. complexObject=serializedString vs. firstName=John and lastName=Smith.
You may not be able to modify the class for various reasons (thankfully, this is not a limitation for me)
Pros of implementing fromString
This could avoid a method with a ton of parameters. This will make the API less intimidating to use.
This argument is at the level of abstraction I want to work at in the body of my method:
I won't have to combine the pieces together by hand (well technically I will, it'll just have to be in the deserializer method)
The deserializer can be a library that automates the process (XStream, Genson, etc.) and saves me a lot of time. This can mitigate the bug risk.
You may run into "namespace" clashes if you flatten the object to send over pieces. For example, imagine sending over an Employee. If he has a Boss, you now have to provide a EmployeeFirstName and a BossFirstName. If you were just deserializing an object, you could nest the data appropriately and not have to include context in your parameter names.
So which option should I choose? Is there a 3rd option I'm not aware of?
I know that this question is old, but in case anybody else runs into this problem, there is a new and better solution since JAX-RS 2.0: @BeanParam. From the documentation:
The annotation that may be used to inject custom JAX-RS "parameter aggregator" value object into a resource class field, property or resource method parameter.
The JAX-RS runtime will instantiate the object and inject all it's fields and properties annotated with either one of the @XxxParam annotation (@PathParam, @FormParam ...) or the @Context annotation. For the POJO classes same instantiation and injection rules apply as in case of instantiation and injection of request-scoped root resource classes.
If you are looking for an extended explanation of how this works, take a look at this article I found:
http://java.dzone.com/articles/new-jax-rs-20-%E2%80%93-beanparam
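A minimal sketch of how that could look for the example above (the field names are illustrative):

public class ComplexObjectParam {
    @FormParam("firstName")
    private String firstName;

    @FormParam("lastName")
    private String lastName;

    // getters/setters ...
}

@POST
@Consumes(MediaType.APPLICATION_FORM_URLENCODED)
@Path("complexObject")
public void complexObject(@BeanParam ComplexObjectParam params) {
    // work with params.getFirstName(), params.getLastName(), ...
}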
For complex object models, you may want to consider using JSON or XML binding instead of a URL-encoded string to pass your objects to your resource call, so that you can rely on the JAXB framework.
The Jersey client library is compatible with JAXB and can handle all the marshaling transparently for you if you annotate your classes with @XmlRootElement.
For documentation, XSDs are a good starting point if you choose the XML binding.
Other REST documentation tools like enunciate can take the automatic generation to the next level.
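In that case the resource method simply declares the bound type and lets the JAXB/JSON provider do the work; a sketch (assuming ComplexObject and its nested types are annotated with @XmlRootElement):

@POST
@Consumes(MediaType.APPLICATION_JSON)
@Path("complexObject")
public void complexObject(ComplexObject complexObject) {
    // complexObject arrives fully populated, nested fields included
}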
What about a special handler which transforms the object to e.g. JSON - or Kryo if you prefer performance? You've got a couple of options.
Look also at persistence ignorance.

Am I abusing/misusing Java reflection?

I'm writing a program to read data from a file, which may be in one of several formats (different versions of the same format, actually), and I'm using reflection to call the appropriate function for each format. Assume that the file format is a number specified in the first byte of the file:
class DataFile extends Model {
    ...
    Blob file;
    ...
    public void parse() throws Exception {
        InputStream is = file.get();
        Class c = Class.forName("models.DataFile");
        Method m = c.getMethod("parse_v" + is.read(), (Class[]) null);
        m.invoke(this, (Object[]) null);
    }

    public void parse_v0() throws Exception {
        ...
    }

    public void parse_v1() throws Exception {
        ...
    }
}
My question is, am I abusing/misusing reflection? I have the feeling that I should be using inheritance and creating a different class for each file type with its own "parse" procedure, but I don't know the file type until I start parsing... and then I cannot "downcast" and just use something like ((DataFile_v1) this).parse(), so I am a little lost.
Thank you for your time!
There's nothing fundamentally wrong with this, but a more flexible and extensible way to do the same thing would be to use the version information as a key in a Map, and have the values in the Map be handler objects. Then any code can register a handler (the handlers can all implement a common interface) and your reader code can just look up the handler in the Map and invoke it.
Be sure to handle the case where the Map doesn't include a handler for a particular version!
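A rough sketch of that registry approach (the interface and class names are made up for illustration):

import java.io.IOException;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;

interface VersionHandler {
    void parse(InputStream is) throws IOException;
}

class VersionedReader {
    private final Map<Integer, VersionHandler> handlers = new HashMap<Integer, VersionHandler>();

    void register(int version, VersionHandler handler) {
        handlers.put(version, handler);
    }

    void read(InputStream is) throws IOException {
        int version = is.read();                    // first byte selects the format version
        VersionHandler handler = handlers.get(version);
        if (handler == null) {
            throw new IOException("No handler registered for version " + version);
        }
        handler.parse(is);
    }
}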
If you make a DataFile interface define a parse method, and implement the interface with multiple classes (DataFile_v1, etc.), then the calling code doesn't have to know which implementation was chosen.
DataFile dataFile = dataFileFactory.getForVersion(is.read());
dataFile.parse(file);
I'd argue that this is a better approach from a general design perspective. However, at some point you will need to create some kind of mapping between the version number and the DataFile implementations. (In this case I'm doing it in an imaginary dataFileFactory.) You'll have to determine whether it would be more appropriate to select an implementation using reflection or some other method.
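The factory itself can be as simple as a switch over the version byte; a sketch, with names mirroring the imaginary example above:

public class DataFileFactory {
    public DataFile getForVersion(int version) {
        switch (version) {
            case 0: return new DataFile_v0();
            case 1: return new DataFile_v1();
            default: throw new IllegalArgumentException("Unknown file version: " + version);
        }
    }
}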
I think it's OK to use reflection here. The alternative would be using inheritance or an enum (i.e. the Strategy pattern), and a map from the version code to the proper Strategy. Once you have initialized all the desired mappings, you just get the right parser object from the map and invoke it. However, setting up this solution still requires a significant amount of boilerplate code, which diminishes its readability.
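For the enum variant, a sketch with a map from version code to strategy (version names and payload layouts are placeholders):

import java.io.IOException;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;

enum FormatParser {
    V0 { void parse(InputStream is) throws IOException { /* version 0 layout */ } },
    V1 { void parse(InputStream is) throws IOException { /* version 1 layout */ } };

    abstract void parse(InputStream is) throws IOException;

    private static final Map<Integer, FormatParser> BY_CODE = new HashMap<Integer, FormatParser>();
    static {
        BY_CODE.put(0, V0);
        BY_CODE.put(1, V1);
    }

    static FormatParser forCode(int code) {
        FormatParser parser = BY_CODE.get(code);
        if (parser == null) {
            throw new IllegalArgumentException("Unknown version " + code);
        }
        return parser;
    }
}

Usage is then a one-liner: FormatParser.forCode(is.read()).parse(is);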
What you're doing isn't bad. If you want to have the different parsers in different classes, you can't downcast as you say, but you could instantiate a new parser object. So your existing class would be a facade in front of the actual parsers which aren't instantiated until you know which format you're parsing.
You could use a collection, but a reflective lookup is essentially a collection lookup as well. Provided your mapping doesn't change, I would use reflection:
getClass().getMethod("parse_v"+is.read()).invoke(this);

How can I have more flexible serialization and deserialization in Java?

If I serialize an object in Java, and then later add an extra field to the java class, I can't deserialize the object into the modified class.
Is there a serialization library or some way that I can have deserialization be less strict, like if there is an extra field added to the class then it just fills that with null upon deserialization of the old version of the class?
You need to keep a serialVersionUID on your class. Check out the section "Version Control" in this article by Sun.
You've got lots of potential options.
You could use a graph serialisation library to define and manage your format e.g. Google's protocol buffers or Kryo. I believe both of these have built-in support for versioning.
You can write your own custom serialisation code and handle the versions explicitly - e.g. serializing to a flexible format like XML. When reading the XML you can configure it to use default values if a particular field isn't specified.
Or you could design your class in a "flexible" way, e.g. have all the fields stored in a HashMap and indexed by Strings. Depending on what you are trying to do, this may be a convenient option.
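As an illustration of that last point, a minimal sketch of such a map-backed class (the API here is invented for the example):

public class FlexibleRecord implements java.io.Serializable {
    private static final long serialVersionUID = 1L;

    // all state lives in the map, so fields added later never break old serialized data
    private final java.util.HashMap<String, Object> fields = new java.util.HashMap<String, Object>();

    public void set(String name, Object value) {
        fields.put(name, value);
    }

    @SuppressWarnings("unchecked")
    public <T> T get(String name, T defaultValue) {
        Object value = fields.get(name);
        return value == null ? defaultValue : (T) value;
    }
}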
There are a fair few serialization libraries; take a look at Simple, though:
http://simple.sourceforge.net/
or, as mentioned above, Google Protocol Buffers:
http://code.google.com/apis/protocolbuffers/
Implement Externalizable and you can do whatever you want. This puts the onus of serialization/deserialization completely upon the class being serialized.
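A short sketch of that approach, writing an explicit version number into the stream (the class and fields are made up):

public class MyRecord implements java.io.Externalizable {
    private String name;
    private int score;               // imagine this field was added in version 2

    public MyRecord() { }            // Externalizable requires a public no-arg constructor

    @Override
    public void writeExternal(java.io.ObjectOutput out) throws java.io.IOException {
        out.writeInt(2);             // stream version
        out.writeUTF(name);
        out.writeInt(score);
    }

    @Override
    public void readExternal(java.io.ObjectInput in) throws java.io.IOException, ClassNotFoundException {
        int version = in.readInt();
        name = in.readUTF();
        score = version >= 2 ? in.readInt() : 0;   // default when reading an old stream
    }
}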
Did you add a serialVersionUID? This must be present (and unchanged) if you want to serialize/deserialize different versions of a class.
Furthermore you can add the following two methods to your class to define exactly the serialization process:
private void writeObject(java.io.ObjectOutputStream stream)
throws IOException;
private void readObject(java.io.ObjectInputStream stream)
throws IOException, ClassNotFoundException;
The Javadoc of ObjectInputStream gives more detail on its usage.
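For example, a sketch of a class that tolerates a field added in a later version (the names are illustrative):

public class Person implements java.io.Serializable {
    private static final long serialVersionUID = 1L;   // keep this value stable across versions

    private String name;
    private String nickname;                           // added in a later version of the class

    private void readObject(java.io.ObjectInputStream stream)
            throws java.io.IOException, ClassNotFoundException {
        stream.defaultReadObject();                    // fields missing from an old stream stay null
        if (nickname == null) {
            nickname = name;                           // optionally backfill a default
        }
    }
}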
If I serialize an object in Java, and then later add an extra field to the java class, I can't deserialize the object into the modified class.
That's untrue for a start. You need to have a good look at the Versioning section of the Object Serialization specification before you go any further.
