Bean with generic field that is not always used - java

This is a curious situation: I have a bean like the one below that stores some information, and I need generics because the field "data" can contain different types of data.
public class Request<T> {
    private String name;
    private Integer code;
    private T data;
    /* Getters and setters */
}
The thing is that "data" is not always used and can sometimes be null. And if I want to avoid raw types, I have to define a type parameter anyway, like this:
Request<Object> req = ....
Is there a better way of doing this, where I can both 1) avoid raw types and 2) have a generic data field in my request objects?

If you don't care about the request's type parameter, use Request<?> in your declaration.
If the request is empty (meaning there is no type that could be set as data), declare it as Request<Void>.

You could always use the Void type, e.g.
Request<Void> req = ...
or a wildcard:
Request<?> req = ...

Maybe you should consider changing the object hierarchy. If you don't use data in some cases, maybe you should have two classes:
class Request {
    private String name;
    private Integer code;
    /* Getters and setters */
}
class DataRequest<T> extends Request {
    private T data;
    /* Getters and setters */
}
and use:
Request req1 = new Request();
Request req2 = new DataRequest<String>();

Maybe you should think about it in a different way: in your case a Request does not always have associated data. Modelling this with null values is a bad decision, because you have to check for null every time you want to use the data. At some point you will want to handle a Request without data differently from a Request with data.
Maybe you should turn your Request into an interface containing methods like getName() and getCode(), and create two concrete classes, RequestWithData<T> and RequestWithoutData, implementing this interface. Then you can check on creation of a RequestWithData<T> instance that non-null data is provided. Furthermore, you can express in your method signatures whether you require a Request with or without data. This leads to a cleaner design and avoids your problem. A minimal sketch of that split follows.
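For example (class names follow the answer; the constructor null check is the important part):
public interface Request {
    String getName();
    Integer getCode();
}

public class RequestWithoutData implements Request {
    private final String name;
    private final Integer code;

    public RequestWithoutData(String name, Integer code) {
        this.name = name;
        this.code = code;
    }

    @Override public String getName() { return name; }
    @Override public Integer getCode() { return code; }
}

public class RequestWithData<T> implements Request {
    private final String name;
    private final Integer code;
    private final T data;

    public RequestWithData(String name, Integer code, T data) {
        this.name = name;
        this.code = code;
        // reject requests that claim to carry data but don't
        this.data = java.util.Objects.requireNonNull(data, "data must not be null");
    }

    @Override public String getName() { return name; }
    @Override public Integer getCode() { return code; }
    public T getData() { return data; }
}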

Use the new Optional type in Java 8. It was made specifically for cases like these. If you cannot use Java 8 yet for whatever reason, the Google Guava library also provides that type. Check this example: https://gist.github.com/9833966
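For example, callers can wrap the possibly-null field instead of checking for null by hand (a minimal sketch; Payload and process are placeholder names, not from the question):
import java.util.Optional;

Request<Payload> req = ...;
Optional<Payload> data = Optional.ofNullable(req.getData());
data.ifPresent(payload -> process(payload)); // runs only when data was actually set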

Related

Can you rename a variable in a generic Class?

I have a class PagedResult<E>. The class is there to help me produce a JSON output of different objects in a paged format. E is the type of the objects wrapped in the list. It all works fine, but one thing still bothers me: I would like the list of objects to not always get the same name. I would like to adapt the name to the corresponding objects.
Class PagedResult:
public class PagedResult<E> {
    Long totalItems;
    Integer totalPages;
    Integer currentPage;
    List<E> elements;
    [... Getter & Setter ...]
}
The actual JSON output with an object like MyPojo looks like this:
{
    "totalItems": 2,
    "totalPages": 1,
    "currentPage": 1,
    "elements": [
        {
            "myPojoAttr1": "hello",
            "myPojoAttr2": "there"
        },
        {
            "myPojoAttr1": "hello",
            "myPojoAttr2": "folks"
        }
    ]
}
So for each response, no matter which objects it contains, the array is named "elements". I don't want that generic name in my JSON response, given the changing objects in the PagedResult class. When I get a response with objects like MyPojo, the name of the JSON array should be "myPojos", and when I get a response with objects like MyWin it should be "myWins".
I tried a lot with @JsonProperty, but I can't find a way to make this "object-array name" generic as well. Can someone assist me with this problem please? Thanks in advance.
No. You can't do that. Generic types have parameters for types, not for identifiers. AFAIK, nothing in the Java language allows you to treat a Java identifier as a parameter when producing a type. (Certainly, nothing you could use in this context!)
Alternatives:
Don't do it. (Take a good hard look at your reasons for wanting the JSON attribute name to vary. What does it actually achieve? Is it worth the effort?)
Don't use a generic type. Define a different class for each kind of "paged result". (Clunky. Not recommended.)
Use a map, and populate it with a different map key for the elements attribute of each kind of "paged result". (The disadvantage is that you lose static type checking and take a small performance and storage penalty, but these are unlikely to be significant. A sketch of this option follows this answer.)
Write a custom mapper to serialize and deserialize the PagedResult as per your requirements.
For what it is worth, identifiers as parameters are the kind of thing you could do with a macro pre-processor. The Java language doesn't have standard support for that kind of thing.
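One way to realize the map option with Jackson is @JsonAnyGetter, which flattens the map entries into the surrounding JSON object (a minimal sketch; the field and method names are illustrative):
import com.fasterxml.jackson.annotation.JsonAnyGetter;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class PagedResult {
    Long totalItems;
    Integer totalPages;
    Integer currentPage;
    private final Map<String, List<?>> elementsByName = new LinkedHashMap<>();

    public void putElements(String name, List<?> elements) {
        elementsByName.put(name, elements); // e.g. "myPojos" or "myWins"
    }

    @JsonAnyGetter // serializes each map entry as a top-level JSON property
    public Map<String, List<?>> any() {
        return elementsByName;
    }
}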
Yes, it's possible, using a custom serializer. But even with a custom serializer you still have a problem: generic type parameters are erased at compile time, so we need to somehow get the element type at runtime.
Here is an example that just checks the type of the first element in the elements list. Definitely not the cleanest way to do it, but you don't have to adjust your PagedResult class.
import java.io.IOException;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;

public class PagedResultSerializer extends JsonSerializer<PagedResult<Object>> {

    @Override
    public void serialize(PagedResult<Object> value, JsonGenerator gen, SerializerProvider provider) throws IOException {
        gen.writeStartObject();
        gen.writeNumberField("totalItems", value.getTotalItems());
        // Your other attributes
        if (!value.getElements().isEmpty()) {
            Object firstElement = value.getElements().get(0);
            String elementsFieldName;
            if (firstElement instanceof MyPojo) {
                elementsFieldName = "myPojos";
            } else if (firstElement instanceof MyWin) {
                elementsFieldName = "myWins";
            } else {
                throw new IllegalArgumentException("Unknown type");
            }
            provider.defaultSerializeField(elementsFieldName, value.getElements(), gen);
        }
        gen.writeEndObject();
    }
}
Now you just need to tell Jackson to use this serializer instead of the default one.
@JsonSerialize(using = PagedResultSerializer.class)
public class PagedResult<T> {
    // Your code
}
Improvement: add a Class<T> elementsType attribute to your PagedResult and use this attribute in your serializer instead of checking the first element in the list, as sketched below.
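A minimal sketch of that improvement (the field name and the lookup are illustrative):
// In PagedResult: carry the element type explicitly
private Class<T> elementsType;
public Class<T> getElementsType() { return elementsType; }

// In the serializer: no need to peek at the first element any more,
// and the name can be resolved even when the list is empty
String elementsFieldName = value.getElementsType() == MyPojo.class ? "myPojos"
        : value.getElementsType() == MyWin.class ? "myWins"
        : "elements";
provider.defaultSerializeField(elementsFieldName, value.getElements(), gen);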
Another approach: use inheritance.
Have an abstract base class PagedResult that contains all the common fields, and then subclass it as PagedResultWithMyPojo, PagedResultWithMyWin, and so on, with each subclass containing just its type-specific list.
As a drawback, you get some code duplication. On the other hand, you get much more control over what happens, without overly complicated custom (de)serialization code.
So, when you know the different flavours of element types up front, and we are talking about, say, 3 to at most 5 different classes, using inheritance might be a viable solution. A sketch follows.
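A minimal sketch of the inheritance variant (class and field names are illustrative):
public abstract class PagedResult {
    Long totalItems;
    Integer totalPages;
    Integer currentPage;
    // getters and setters for the common fields
}

public class PagedResultWithMyPojo extends PagedResult {
    List<MyPojo> myPojos; // serialized as "myPojos" by default
}

public class PagedResultWithMyWin extends PagedResult {
    List<MyWin> myWins; // serialized as "myWins" by default
}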

How to use getResources.getString(R.string) in an Interface?

I have an interface with a static key, and I want to retrieve this key from a string resource file. But if I just put R.string.key, it shows incompatible types, because that is the integer resource id; and if I put R.string.key + "", it becomes a String, but it contains the resource id, not the value. It would be best to use getResources().getString(R.string.key), but there is no way to call getResources().getString() in the interface.
Working:
public interface NotificacaoService {
    @Headers({"34853485734",
            "Content-Type:application/json"})
    @POST("send")
    Call<NotificacaoDados> salvarNotificacao(@Body NotificacaoDados notificacaoDados);
}
I would like it to look like this:
public interface NotificacaoService {
    @Headers({getResources.getString(R.string.key),
            "Content-Type:application/json"})
    @POST("send")
    Call<NotificacaoDados> salvarNotificacao(@Body NotificacaoDados notificacaoDados);
}
Sorry, what you want is not possible. Annotation parameters need to be constants.
One solution is to switch from string resources to BuildConfig. Use buildConfigField in your Gradle script to define your key (buildConfigField "String", "API_KEY", "\"34853485734\""). Then you can reference the generated constant in your interface (e.g. BuildConfig.API_KEY).
Alternatively, if this interface is for Retrofit, you could add your headers via an OkHttp interceptor instead of via a @Headers annotation, as sketched below.
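A minimal sketch of the interceptor approach (assuming Retrofit with OkHttp and the Gson converter; the header name and the way you obtain a Context are placeholders):
OkHttpClient client = new OkHttpClient.Builder()
        .addInterceptor(chain -> {
            okhttp3.Request original = chain.request();
            okhttp3.Request withKey = original.newBuilder()
                    // string resources are available here at runtime
                    .header("Api-Key", context.getString(R.string.key))
                    .build();
            return chain.proceed(withKey);
        })
        .build();

Retrofit retrofit = new Retrofit.Builder()
        .baseUrl("https://example.com/")
        .client(client)
        .addConverterFactory(GsonConverterFactory.create())
        .build();

NotificacaoService service = retrofit.create(NotificacaoService.class);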

DesignPattern: Partial Object creation of unknown instance members at runtime

Currently I have an endpoint which returns a fairly large data object.
A call for all objects of that type can generate 20 MB of data. However, clients do not always need all the information in the object; quite often a subset of the data is all that is required. I want to give the client the option to pass in some parameters to determine which parts of the object they require.
For example, by specifying an array of restriction fields, with each field representing a group of instance members on the object, a user can restrict how much of the object they want. Looking at the MyObject example below, the restriction field value r1 might refer to instance members a and b.
Example request: "myurl/restrict=r1,r2"
MyObject() {
    a;
    b;
    c;
    d;
    e;
    f;
    g;
    h;
    // ... many more fields
}
So with that in mind, I created an enum to model the restriction fields and the subset of instance members that each restriction field represents.
Now in the DB query I want to use the enum values to decide which parts of the object I want.
So the select query will select the object, and the object can be partially instantiated by calling whatever get/set methods are required. I have implemented this on the query side by taking the request params (i.e. the groupings of instance members) and using reflection on the object returned from the DB to get/set the instance fields on the object to return.
I am, however, unsure whether there is an existing design pattern for this problem, other than refactoring or creating new endpoints for the "lighter" objects. I don't want to argue that case; I just want to discuss, for the problem at hand, whether reflection is a valid way of fulfilling the requirement, and if not, why not and what the alternatives are.
I believe this solution can cater for change easily: only the enum needs updating if instance members change or a restriction grouping needs adapting. The REST layer and data layer require no change. However, I am unsure of the performance impact; I only implemented this today, so I haven't been able to benchmark it yet.
I guess the real question is: is there a design pattern for partial object creation of unknown member fields at runtime?
Below is pseudocode of how I implemented the select aspect.
select(MyObj obj) {
    // get all the restricted fields from the request
    // instantiate a new object
    // for each restriction field (i.e. group of instance members):
    //     use reflection to invoke the set method of the new (partial) object,
    //     passing the result of the get method of the method argument (the full object)
}
You can use Jackson's ObjectMapper for that.
Say Employee has the fields name, id, uid and address.
Using the ObjectMapper you can read the value into a tree, which gives you a JsonNode/ObjectNode:
json = { "name": "xyz", "id": 101, "uid": "xoz", "address": "xqp street" }
Select the keys you need to construct the new JSON, deleting the keys you don't need with jsonNode.remove(key), then parse the JsonNode back into an object:
Employee em = objectMapper.readValue(json, Employee.class);
A sketch of the full round trip follows.
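A minimal sketch of that round trip (a hypothetical Employee bean is assumed):
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

ObjectMapper mapper = new ObjectMapper();

// full object -> tree
ObjectNode node = mapper.valueToTree(employee);

// drop the keys the client did not ask for
node.remove("address");
node.remove("uid");

// tree -> object (the removed fields stay at their defaults)
Employee trimmed = mapper.treeToValue(node, Employee.class);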
I think I may have found a really nice approach for this task, leveraging the functional features of Java 8. (It could also be implemented pre Java 8 using anonymous classes.)
I can make use of a BiConsumer in the enum and construct each constant with one.
I can then implement a copy method and invoke it while iterating through the passed-in arguments.
Now I have the behaviour I had with reflection, but without the performance impact.
import java.util.function.BiConsumer;

public enum RestrictFields {
    R1((source, target) -> {
        target.setA(source.getA());
        target.setB(source.getB());
        target.setC(source.getC());
    }),
    R2((source, target) -> {
        target.setD(source.getD());
    });

    private final BiConsumer<MyObj, MyObj> copier;

    RestrictFields(final BiConsumer<MyObj, MyObj> copier) {
        this.copier = copier;
    }

    public void copy(final MyObj source, final MyObj target) {
        this.copier.accept(source, target);
    }
}
Now in the select method I can cycle through the passed enum values, invoke the copy method, and build the restricted object based on that.
public MyObj select(MyObj source) {
    MyObj myObj = new MyObj();
    if (!restrictedFields.isEmpty()) {
        // Instead of reflection here I can use the BiConsumer in the enum
        for (RestrictFields field : restrictedFields) {
            field.copy(source, myObj);
        }
        return myObj;
    }
    return source;
}

ObjectMapper using TypeReference not working when passed type in generic method

This is the method:
protected <T> TestPageResult<T> getTestPageResutForRequest(MockHttpServletRequestBuilder request) throws Exception {
    String responseJson = mockMvc.perform(request).andReturn().getResponse()
            .getContentAsString();
    TestPageResult<T> response = getObjectMapper().readValue(responseJson,
            new TypeReference<TestPageResult<T>>() {
            });
    return response;
}
I call it like this:
TestPageResult<SomeDto> pageResult = this.<SomeDto>getTestPageResutForRequest(getRequest());
TestPageResult is:
protected static class TestPageResult<T> {
    private List<T> items;
    private long totalCount = -1;

    public TestPageResult() {
    }

    // omitted getters and setters
}
The resulting pageResult.getItems() contains a List of LinkedHashMap instead of a list of SomeDto. If I were to just hardcode the SomeDto type in the objectMapper.readValue method I'd get the correct results.
What's the problem?
Edit: the suggested duplicate did solve my problem, kind of.
I used:
JavaType type = getObjectMapper().getTypeFactory().constructParametricType(TestPageResult.class, clazz);
TestPageResult<T> response = getObjectMapper().readValue(responseJson, type);
The problem is that there is no way around passing a Class argument down to the method, so the method looks ugly because it takes both a generic type parameter and the same thing again as a Class. You could of course drop the generic type now, but then a cast would be required, along with adding @SuppressWarnings and so on.
The problem is erasure. All these <T> parameters don't exist in the compiled code; after they're erased, your new TypeReference<TestPageResult<T>>() looks like new TypeReference<TestPageResult>(), which is not what you want. (Similar to how a List<String> ends up being a List in compiled code, and it's just compile-time validation that you don't add Integers to your String List.)
I think there's roughly two ways to deal with this (in this case), both of these you already stumbled upon:
Either you create a type that properly represents what you want, such as: new TypeReference<TestPageResult<SomeDto>>(), or class SomeDtoPageResult extends TestPageResult<SomeDto> which you can then use in places like readValue(..., SomeDtoPageResult.class);
Or you create a complete class representation, like you were doing with JavaType
What you really want won't work. Your best bet is to tinker and come up with the cleanest code that solves it. Generics let you express really elaborate structures, and when you serialize an actual instance (nested objects), that comes out just fine; but when the classes need to be introspected at runtime, e.g. for deserialization (your use case) or to build a model (e.g. to generate Swagger docs), this becomes problematic. A sketch combining the two fragments above follows.
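A minimal sketch that combines the Class argument with JavaType (names follow the question; the extra parameter is the price of erasure):
protected <T> TestPageResult<T> getTestPageResutForRequest(MockHttpServletRequestBuilder request, Class<T> itemType) throws Exception {
    String responseJson = mockMvc.perform(request).andReturn().getResponse().getContentAsString();
    // build the full parameterized type at runtime from the Class token
    JavaType type = getObjectMapper().getTypeFactory()
            .constructParametricType(TestPageResult.class, itemType);
    return getObjectMapper().readValue(responseJson, type);
}

// usage
TestPageResult<SomeDto> pageResult = getTestPageResutForRequest(getRequest(), SomeDto.class);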

Java type conversion where types are not known until runtime

I'm trying to write a data access layer for an AJAX web project. This DAL has to convert data coming in via an AJAX servlet to objects that can be passed to a PreparedStatement for execution.
Data in the AJAX servlet, retrieved by using HttpServletRequest.getParameter(...), come in as strings.
In each data class, I have a known set of fields as well as their data types, e.g. CustomerId(integer), CustomerName(string).
I can of course write a method in the Customer class to handle the conversion, but that means I would have to do it for every data object's class. I would much rather have a generic conversion method, e.g.
Object convert(String value, Class<?> targetType) { ... }
Can anyone point me in the right direction?
Create a utility class with all the conversion methods you would like to use. Inside its static initializer, use reflection to collect those methods into a map keyed by parameter type and return type. Then, in the convert() method, just pick the method which suits the given source and target type and invoke it. Use generics to fix the return type to be the same as the target type:
public static <T> T convert(Object from, Class<T> to)
You can find an example in this article. A minimal sketch is also shown below.
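A minimal sketch of such a registry, specialised to String input as in the question (the individual converters are just examples; error handling is kept short):
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;

public final class Converter {

    // maps a target type to the static method that produces it from a String
    private static final Map<Class<?>, Method> CONVERTERS = new HashMap<>();

    static {
        for (Method method : Converter.class.getDeclaredMethods()) {
            if (method.getParameterCount() == 1
                    && method.getParameterTypes()[0] == String.class
                    && method.getReturnType() != void.class) {
                CONVERTERS.put(method.getReturnType(), method);
            }
        }
    }

    public static <T> T convert(String value, Class<T> to) throws Exception {
        Method converter = CONVERTERS.get(to);
        if (converter == null) {
            throw new IllegalArgumentException("No converter to " + to.getName());
        }
        return to.cast(converter.invoke(null, value));
    }

    // example converters; add more as needed
    public static Integer toInteger(String s) { return Integer.valueOf(s); }
    public static Long toLong(String s) { return Long.valueOf(s); }
    public static java.math.BigDecimal toBigDecimal(String s) { return new java.math.BigDecimal(s); }
}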
But as bmargulies pointed out, JSON is also an interesting option. You could let the Ajax side send all parameters as one JSON string. Then you can use a JSON-to-JavaBean converter like Google Gson to convert the JSON string to a full-fledged JavaBean like Customer. It'll be as simple as:
String jsondata = request.getParameter("jsondata");
Customer customer = new Gson().fromJson(jsondata, Customer.class);
// ...
See also this answer for another example.
There are JSON libraries that will do data type conversion; Jackson is one. Or you could code the whole thing using a JAX-RS service framework instead of a raw servlet, and it will take care of all this for you. Apache CXF is one framework that contains this support. Since you are asking for a generic solution, why not use one that's already out there?
We do this exact thing using a plethora of static converters in a utility class. It isn't elegant but it sure is easy and effective.
class Util {
    public static Long StringToLong(String s) { ... }
    public static Integer StringToInt(String s) { ... }
    public static Date StringToDate(String s) { ... }
    public static Date StringToDateYYMMDD(String s) { ... }
    public static BigDecimal StringToBigDecimal(String s) { ... }
    // etc. ad nauseam
}
Since you want to use the parameters in your PreparedStatement, why do you have to convert them at all?
When you use setString(index, parameter), SQL will be happy to do the conversion for you.
Thus the only thing you might want to do is some kind of validation that the input is really valid (or you could even leave that part to your SQL engine, which will throw an exception if it doesn't understand you).
