I have two enums in my Java project, and both of them need to be serialized and deserialized using Gson. The problem is, I need to use a field on each enum constant as the serialized value.
As an example, I have these enums:
Options Enum
Language Enum
My hope is that I am able to serialize both enums using the key value provided to both. This is a really simplified example, but it still perfectly describes my situation.
I tried using custom serializer classes for both:
Options Serializer
Language Serializer
And yes, I did register both using registerTypeAdapter(type, adapter)
The strange thing is, it would work for one enum, serializing to the correct value, but not the other. I suspect it's because the class being serialized is shaped similar to this:
public class Item {
    public Language language;
    public List<Options> options;
}
In this case, Language is serialized properly, but the Options enum is not; it just emits the enum constant's name.
I'm not sure if there's some special way I need to handle this, but it's getting frustrating.
EDIT: I know about the @SerializedName() annotation, but both of the enums I'm using have hundreds of entries, and the keys that are part of the enums are used elsewhere throughout the program as well. Using @SerializedName(), at least in my case, doesn't seem feasible.
Write an adapter for Item. I would add more, but my laptop is on 1%.
static class ItemAdapter extends TypeAdapter<Item> {
    @Override
    public void write(JsonWriter out, Item value) throws IOException {
        out.beginObject();
        out.name("language").value(value.language.key);
        out.name("options");
        out.beginArray();
        for (Options option : value.options) {
            out.value(option.key);
        }
        out.endArray();
        out.endObject();
    }

    @Override
    public Item read(JsonReader in) throws IOException {
        in.beginObject();
        in.nextName();
        String languageKey = in.nextString();
        in.nextName();
        in.beginArray();
        List<String> optionKeys = new ArrayList<>();
        while (in.hasNext()) {
            optionKeys.add(in.nextString());
        }
        in.endArray();
        in.endObject();
        return new Item(Language.BY_KEY.get(languageKey),
                optionKeys.stream()
                        .map(Options.BY_KEY::get)
                        .collect(Collectors.toList()));
    }
}
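Note that the adapter above assumes each enum exposes a public key field and a static BY_KEY reverse-lookup map; neither appears in the question, so here is a minimal sketch of what the Language enum might look like under that assumption (the constants and keys are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public enum Language {
    ENGLISH("en"),
    GERMAN("de"),
    FRENCH("fr");

    // The serialized value; per the question, also used elsewhere in the program.
    public final String key;

    // Reverse index so deserialization can map a key back to its constant.
    public static final Map<String, Language> BY_KEY = new HashMap<>();

    static {
        for (Language language : values()) {
            BY_KEY.put(language.key, language);
        }
    }

    Language(String key) {
        this.key = key;
    }
}
```

The Options enum would follow the same pattern with its own keys.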
I have a POJO that contains the following attributes
public class Example {
    @JsonProperty("inputFoo")
    private String foo;
    @JsonProperty("inputBar")
    private String bar;
    @JsonProperty("inputBaz")
    @JsonDeserialize(using = MyDeserializer.class)
    private Set<String> baz;
}
The JSON that I am working with to represent this data currently represents the baz attribute as a single string:
{"inputFoo":"a", "inputBar":"b", "inputBaz":"c"}
I am using the Jackson ObjectMapper to attempt to convert the JSON to my POJO. I know that the input baz String from the JSON won't map cleanly to the Set that I am trying to represent it as, so I defined a custom Deserializer:
public class MyDeserializer extends StdDeserializer<Set<String>> {
    public MyDeserializer() {}

    public MyDeserializer(Class<?> vc) {
        super(vc);
    }

    @Override
    public Set<String> deserialize(JsonParser p, DeserializationContext cxt) throws IOException, JsonProcessingException {
        String input = p.readValueAs(String.class);
        Set<String> output = new HashSet<>();
        if (input != null) {
            output.add(input);
        }
        return output;
    }
}
I am getting an IllegalArgumentException referencing the "inputBaz" attribute, which I can provide details on. Does anyone see any obvious issue with my deserializer implementation? Thanks
You do not need to implement a custom deserialiser; use the ACCEPT_SINGLE_VALUE_AS_ARRAY feature. It works for sets as well:
Feature that determines whether it is acceptable to coerce non-array
(in JSON) values to work with Java collection (arrays,
java.util.Collection) types. If enabled, collection deserializers will
try to handle non-array values as if they had "implicit" surrounding
JSON array. This feature is meant to be used for
compatibility/interoperability reasons, to work with packages (such as
XML-to-JSON converters) that leave out JSON array in cases where there
is just a single element in array. Feature is disabled by default.
See also:
com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of java.util.ArrayList out of START_OBJECT token
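A minimal sketch of enabling that feature (the POJO here is a simplified, annotation-free stand-in for the question's Example class; names are illustrative):

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Set;

public class SingleValueDemo {
    // Simplified POJO matching the JSON field names directly.
    public static class Example {
        public String inputFoo;
        public String inputBar;
        public Set<String> inputBaz;
    }

    public static Example parse(String json) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Coerce the single string "c" into a one-element Set
        // without any custom deserializer.
        mapper.enable(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY);
        return mapper.readValue(json, Example.class);
    }
}
```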
Replace the 2 constructors with this no-arg constructor:
public MyDeserializer() {
    super(TypeFactory.defaultInstance().constructCollectionType(Set.class, String.class));
}
ACCEPT_SINGLE_VALUE_AS_ARRAY as suggested is a good option.
Maybe your actual problem is more complicated, but if not, you could also try @JsonCreator instead of a custom deserializer. Like:
public class Example {
    @JsonCreator
    public Example(@JsonProperty("inputFoo") String foo,
                   @JsonProperty("inputBar") String bar,
                   @JsonProperty("inputBaz") String strBaz) {
        this.foo = foo;
        this.bar = bar;
        this.baz = new HashSet<>();
        baz.add(strBaz);
    }

    private String foo;
    private String bar;
    private Set<String> baz;
}
Just to show that, in the more general case, you might avoid implementing a custom deserializer with @JsonCreator as well, while still doing some simple conversions.
I wrote a TypeAdapter for a class that contains an enum attribute. This is the write method, which uses standard GSON serialization for the enum value:
@Override
public void write(JsonWriter writer, MyClass object) throws IOException {
    if (object == null) {
        writer.nullValue();
        return;
    }
    writer.beginObject();
    writer.name("type").value(gson.toJson(object.getType())); // this is the enum
    writer.endObject();
}
When using this TypeAdapter, the produced JSON contains this part for the enum:
"type":"\"ENUM_VALUE\""
But when I use gson.toJson(object) on a class which contains this enum without a TypeAdapter, it produces:
"type":"ENUM_VALUE"
All Gson objects use the standard configuration. The first version produces the same result whether I test the TypeAdapter directly or register it with a Gson instance.
Why is there a difference? I guess escaping is not needed here, so I’d like to avoid it.
Interestingly, deserialization works for both serialized versions with the TypeAdapter (with gson.fromJson(reader.nextString())).
I guess the problem occurs because gson.toJson(object.getType()) already produces quotes ("ENUM_VALUE"), and when that string is added to the JsonWriter with writer.value(...), it gets escaped again. But how do I handle this correctly, the way GSON does?
Simply put, your TypeAdapter is wrong. Replace it with:
@Override
public void write(JsonWriter writer, MyClass object) throws IOException {
    if (object == null) {
        writer.nullValue();
        return;
    }
    writer.beginObject();
    writer.name("type").value(object.getType().toString()); // this is the enum
    writer.endObject();
}
In your code, you create a string from the enum by doing a full JSON serialization. gson.toJson(object.getType()) produces "ENUM_VALUE" (quotes included), and that string is then serialized again into a JSON string, so the result is \"ENUM_VALUE\".
In my code, I get the string representation of the enum using the toString() method, so no additional quotes are created.
I have been tinkering with this idea for a few days, and I was wondering if anyone else has thought of doing this. I would like to try to create a ResourceBundle whose values I can access with an enum. The benefits of this approach would be that my keys would be well defined and, hopefully, my IDE could pick up on the types and auto-complete the variable names for me. In other words, I'm after a sort of refined ListResourceBundle.
Essentially, this is what I'm after...
I have an enum that consists of various bundles set up like so:
interface Bundle {
    String getBundleName();
    EnumResourceBundle<??????> getEnumResourceBundle();
}
enum Bundles implements Bundle {
    BUNDLE1("com.example.Bundle1", Keys.class);

    private final String bundleName;
    private final EnumResourceBundle<??????> bundle;

    /**
     * I understand here I need to do some cast with ResourceBundle.getBundle(bundleName);
     * in order to have it back-track through parents properly. I'm fiddling with this
     * right now using either what I specified earlier (saving bundleName and then
     * retrieving the ResourceBundle as needed), and saving a reference to the
     * ResourceBundle.
     */
    private <E extends Enum<E> & Key> Bundles(String bundleName, Class<E> clazz) {
        this.bundleName = bundleName;
        this.bundle = new EnumResourceBundle<??????>(clazz);
    }

    @Override
    public String getBundleName() {
        return bundleName;
    }

    @Override
    public EnumResourceBundle<??????> getEnumResourceBundle() {
        return bundle;
    }
}
interface Key {
    String getValue();
}

enum Keys implements Key {
    KEY1("This is a key"),
    KEY2("This is another key");

    private final String value;

    private Keys(String value) {
        this.value = value;
    }

    @Override
    public String getValue() {
        return value;
    }
}
class EnumResourceBundle<E extends Enum<E> & Key> extends ResourceBundle {
    // Can also store Object in case we need it
    private final EnumMap<E, Object> lookup;

    public EnumResourceBundle(Class<E> clazz) {
        lookup = new EnumMap<>(clazz);
    }

    public String getString(E key) {
        return (String) lookup.get(key);
    }
}
So my overall goal would be to have the code look something like this:
public static void main(String[] args) {
    Bundles.BUNDLE1.getEnumResourceBundle().getString(Keys.KEY1);
    Bundles.BUNDLE1.getEnumResourceBundle().getString(Keys.KEY2);
    // or Bundles.BUNDLE1.getString(Keys.KEY1);
}
I'd also like to provide support for formatting replacements (%s, %d, ...).
I realize that it isn't possible to back-track a type from a class, and that wouldn't help me because I've already instantiated Bundles#bundle, so I was wondering if I could somehow declare EnumResourceBundle, where the generic type is an enum which has implemented the Key interface. Any ideas, help, or thoughts would be appreciated. I would really like to see if I can get it working like this before I resort to named constants.
Update:
I had a thought that maybe I could also try changing EnumResourceBundle#getString(E) to take a Key instead, but this would not guarantee that it's a valid Key specified in the enum, or any enum for that matter. Then again, I'm not sure how that method would work when using a parent enum Key within a child EnumResourceBundle, so maybe Key is a better option.
I've done something like this before but I approached it the other way around and it was pretty simple.
I just created an enum translator class that accepts the enum, and then maps the enum name to the value from the property file.
I used a single resource bundle and then the translate just looked something like (from memory):
<T extends Enum<T>> String translate(T e) {
    return resources.getString(e.getClass().getName() + "." + e.name());
}

<T extends Enum<T>> String format(T e, Object... params) {
    return MessageFormat.format(translate(e), params);
}
Now for any enum you can just add a string to the file:
com.example.MyEnum.FOO = This is a foo
com.example.MyEnum.BAR = Bar this!
If you want to ensure that the passed class is the correct enum for this, you could either define a shared interface for those enums, or you could turn this into a class with T defined on the class type and create an instance of it for each enum you want to translate. You could then create a translator for any enum just by doing new EnumFormatter(). Making format() protected would also let you enforce a specific format for each enum type by implementing it in the EnumFormatter.
Using the class idea even lets you go one step further: when you create the class, you can specify both the enum it is for and the properties file. It can then immediately scan the properties file and ensure that there is a mapping for every value in the enum, throwing an exception if one is missing. This helps you detect missing values in the properties file early.
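A rough, self-contained sketch of that translator approach, using an inline ListResourceBundle in place of a real .properties file (the enum, bundle contents, and class names here are made up for illustration):

```java
import java.text.MessageFormat;
import java.util.ListResourceBundle;
import java.util.ResourceBundle;

public class EnumFormatterDemo {
    public enum MyEnum { FOO, BAR }

    // Inline bundle standing in for a .properties file on the classpath.
    public static class Messages extends ListResourceBundle {
        @Override
        protected Object[][] getContents() {
            return new Object[][] {
                { MyEnum.class.getName() + ".FOO", "This is a foo" },
                { MyEnum.class.getName() + ".BAR", "Bar {0} times" },
            };
        }
    }

    private static final ResourceBundle resources = new Messages();

    // getDeclaringClass() (rather than getClass()) stays correct even for
    // enum constants that have a body, which get an anonymous subclass.
    public static <T extends Enum<T>> String translate(T e) {
        return resources.getString(e.getDeclaringClass().getName() + "." + e.name());
    }

    public static <T extends Enum<T>> String format(T e, Object... params) {
        return MessageFormat.format(translate(e), params);
    }
}
```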
I'd like to represent a Class object as JSON. For example, if I have the class definitions as follows:
public class MyClass {
    String myName;
    int myAge;
    MyOtherClass other;
}

public class MyOtherClass {
    double myDouble;
}
I'd like to get the following nested JSON from a Class object of type MyClass:
{
    myName: String,
    myAge: int,
    other: {
        myDouble: double
    }
}
EDIT:
I don't want to serialize instances of these classes, I understand how to do that with GSON. I want to serialize the structure of the class itself, so that given a proprietary class Object I can generate JSON that breaks down the fields of the class recursively into standard objects like String, Double, etc.
With Jettison, you can roll your own mappings from Java to JSON. So in this case, you could get the Class object of the class you want, then map the Java returned by the getFields, getConstructors, getMethods etc. methods to JSON using Jettison.
I would recommend to use Jackson.
You can also take a look at the JSonObjectSerializer class based on Jackson which can be found at oVirt under engine/backend/manager/module/utils (you can git clone the code) and see how we used Jackson there.
Looking to do the same thing, I ended up writing my own method. It does not handle all cases (e.g. if one of the declared fields is a Map, this will break), but it seems alright for most common objects:
@Override
public Map<String, Object> reflectModelAsMap(Class<?> classType) {
    List<Class<?>> mappedTracker = new LinkedList<Class<?>>();
    return reflectModelAsMap(classType, mappedTracker);
}

private Map<String, Object> reflectModelAsMap(Class<?> classType, List<Class<?>> mappedTracker) {
    Map<String, Object> mapModel = new LinkedHashMap<String, Object>();
    mappedTracker.add(classType);

    Field[] fields = classType.getDeclaredFields();
    for (Field field : fields) {
        if (mappedTracker.contains(field.getType()))
            continue;

        if (BeanUtils.isSimpleValueType(field.getType())) {
            mapModel.put(field.getName(), field.getType().toString());
        } else if (Collection.class.isAssignableFrom(field.getType())) {
            Class<?> actualType = (Class<?>) ((ParameterizedType) field.getGenericType()).getActualTypeArguments()[0];
            mapModel.put("Collection", reflectModelAsMap(actualType, mappedTracker));
        } else {
            mapModel.put(field.getName(), reflectModelAsMap(field.getType(), mappedTracker));
        }
    }
    return mapModel;
}
The mapped tracker is there because of how I handle relationships in Hibernate; without it, there is an endlessly recursive relationship between parent and child, e.g. child.getFather().getFirstChild().getFather().getFirstChild().getFather()...
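For anyone without Spring on the classpath, the same idea can be sketched with only the JDK. The SIMPLE set below is a crude stand-in for BeanUtils.isSimpleValueType (covering just a few common types), and the nested demo classes mirror the question's example:

```java
import java.lang.reflect.Field;
import java.lang.reflect.ParameterizedType;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class ClassStructureMapper {
    // Crude stand-in for Spring's BeanUtils.isSimpleValueType.
    private static final Set<Class<?>> SIMPLE = new HashSet<>(Arrays.asList(
            String.class, int.class, Integer.class, long.class, Long.class,
            double.class, Double.class, boolean.class, Boolean.class));

    public static Map<String, Object> describe(Class<?> type) {
        return describe(type, new ArrayList<>());
    }

    private static Map<String, Object> describe(Class<?> type, List<Class<?>> seen) {
        Map<String, Object> model = new LinkedHashMap<>();
        seen.add(type); // track visited classes to break recursive cycles
        for (Field field : type.getDeclaredFields()) {
            Class<?> fieldType = field.getType();
            if (seen.contains(fieldType)) {
                continue;
            }
            if (SIMPLE.contains(fieldType)) {
                model.put(field.getName(), fieldType.getSimpleName());
            } else if (Collection.class.isAssignableFrom(fieldType)) {
                // Describe the collection's element type instead of the collection itself.
                Class<?> element = (Class<?>) ((ParameterizedType) field.getGenericType())
                        .getActualTypeArguments()[0];
                Object description = SIMPLE.contains(element)
                        ? element.getSimpleName() : describe(element, seen);
                model.put(field.getName(), Collections.singletonList(description));
            } else {
                model.put(field.getName(), describe(fieldType, seen));
            }
        }
        return model;
    }

    // The question's example classes, reproduced for the demo.
    public static class MyOtherClass {
        double myDouble;
    }

    public static class MyClass {
        String myName;
        int myAge;
        MyOtherClass other;
    }
}
```

Feeding the resulting map to any JSON library (Gson, Jackson) then yields the nested structure the question asks for.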
I have an object and in several components I need to render two of its properties concatenated together with a delimiter. If one of the properties is null then it should not display the delimiter but just the not null property. If both are null then it should not display at all.
The two properties are accessed thus:
thing.getFoo()
and
thing.getStuff().getBar()
The renderer will be a class with one static method taking an instance of the type of thing and will return a string.
The problem is that it seems ugly to do so much null checking on thing and the result of thing.getStuff() so I was wondering if to use PropertyResolver instead.
The problem is that the Javadoc says it's not part of the Wicket API and to only use it if I know what I'm doing. I presume therefore that there are certain caveats or issues that I should know about? If so, what are they?
I would use a custom read only model in that case. Something like:
private static class ConcatenatingPropertyModel extends AbstractReadOnlyModel<String> {
    private static final String DELIMITER = " | "; // example delimiter

    private final List<PropertyModel<String>> models = new ArrayList<PropertyModel<String>>();

    public ConcatenatingPropertyModel(Object object, String... props) {
        for (String prop : props) {
            models.add(new PropertyModel<String>(object, prop));
        }
    }

    @Override
    public String getObject() {
        // Concatenate the non-null delegate values, separated by the delimiter.
        StringBuilder result = new StringBuilder();
        for (PropertyModel<String> model : models) {
            String value = model.getObject();
            if (value == null) continue;
            if (result.length() > 0) result.append(DELIMITER);
            result.append(value);
        }
        return result.toString();
    }

    @Override
    public void detach() {
        super.detach();
        for (PropertyModel<String> model : models) {
            model.detach();
        }
    }
}
Then you can use the model like this:
new ConcatenatingPropertyModel(thing, "foo", "stuff.bar");
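Independent of the Wicket wiring, the core of getObject() is just a null-skipping join, which satisfies all three of the question's requirements (both present, one null, both null). A plain-Java sketch of that logic (the class and method names are mine, not from the answer):

```java
import java.util.ArrayList;
import java.util.List;

public class NullSafeJoiner {
    // Joins the non-null parts with the delimiter; returns "" when all are null,
    // and omits the delimiter when only one part is present.
    public static String join(String delimiter, String... parts) {
        List<String> present = new ArrayList<>();
        for (String part : parts) {
            if (part != null) {
                present.add(part);
            }
        }
        return String.join(delimiter, present);
    }
}
```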