Maybe I'm just missing something obvious, but I've been struggling with this for a while.
I am writing a networking application that interacts with another library.
The messages that this library produces are of the form Map<Path, Object>. These messages now need to be serialized.
I do not know what type these Objects are. They are only transferred between two objects that can handle them, but for that they need to be serialized.
However, I struggle to understand how to do this. I've already tried Gson but couldn't figure out a solution.
The code looks something like this:
public interface Path extends Serializable{}
public interface Network{
public Map<Path, Object> getSendMessages();
public void receiveMessage(Map<Path, Object> message);
}
public class Main {
public static void main(String[] args) {
Network nw1 = NetworkProvider.getNetwork();
Network nw2 = NetworkProvider.getNetwork();
//I don't know what these actually do with the messages.
while(true) {
Map<Path, Object> message = nw1.getSendMessages();
//__________What to do here?__________________
SerializedMessage serializedMessage = ....
Map<Path, Object> deserializedMessage = ....
//____________________________________________
nw2.receiveMessage(deserializedMessage);
}
}
}
You can easily serialize and deserialize an Object Map by using ObjectMapper from Jackson.
This is an example of serializing:
ObjectMapper mapper = new ObjectMapper();
String jsonResult = mapper.writerWithDefaultPrettyPrinter()
.writeValueAsString(message);
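The reverse direction is the harder part here: without extra type information Jackson can only rebuild generic JSON structures, so a plain round trip gives back a Map<String, Object> rather than a Map<Path, Object>. A minimal sketch (TypeReference comes from com.fasterxml.jackson.core.type; getting the original Path keys and value classes back would additionally need concrete types plus something like @JsonTypeInfo or custom key deserializers, or plain Java serialization, since Path extends Serializable):
ObjectMapper mapper = new ObjectMapper();

// Serialize: map keys are written via their toString() form by default.
String json = mapper.writeValueAsString(message);

// Deserialize into a generic structure: keys come back as Strings and values as
// Maps, Lists, Strings or Numbers depending on the JSON.
Map<String, Object> roundTripped =
        mapper.readValue(json, new TypeReference<Map<String, Object>>() {});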
For more information, check this link:
Map Serialization
I am very new to MapStruct. I am trying to convert a List to a Map. I've searched a lot online, and the answers I found suggest this is not yet implemented in MapStruct. I would be glad if someone could provide an alternative solution.
All I am looking for is a mapping like the one below:
@Mapping
Map<String, Object> toMap(List<MyObj> list)
@Mapping
List<MyObj> toList(Map<String, Object> map)
where MyObj is as below:
class MyObj {
String key; //map key
String value; //map value
String field1;
}
In the above, only the key and value fields from MyObj are used. I've found one solution, but it converts an object to a Map using Jackson:
@Mapper
public interface ModelMapper {
ObjectMapper OBJECT_MAPPER = new ObjectMapper();
default HashMap<String, Object> toMap(Object filter) {
TypeFactory typeFactory = OBJECT_MAPPER.getTypeFactory();
return OBJECT_MAPPER.convertValue(filter, typeFactory.constructMapType(Map.class, String.class, Object.class));
}
}
Is there any way to implement this using MapStruct?
MapStruct doesn't have an implicit conversion for your desired List to Map. You can have a custom mapping method as follows:
@Mapper
public interface FooMapper {
default Map<String, Foo> convertFooListToMap(List<Foo> foos) {
// custom logic using streams or however you like.
}
}
Other options include custom mapper implementations that you write yourself and reference with something like @Mapper(uses = CustomMapper.class).
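Applied to the MyObj class from the question, a minimal sketch could look like the following (this assumes MyObj has the usual getters, setters and a no-arg constructor, which are not shown in the question; only key and value are carried over, and field1 is dropped):
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import org.mapstruct.Mapper;

@Mapper
public interface MyObjMapper {

    default Map<String, Object> toMap(List<MyObj> list) {
        // Keep only key/value; if the same key occurs twice, the later entry wins.
        return list.stream()
                .collect(Collectors.toMap(MyObj::getKey, MyObj::getValue, (a, b) -> b));
    }

    default List<MyObj> toList(Map<String, Object> map) {
        return map.entrySet().stream()
                .map(e -> {
                    MyObj obj = new MyObj();
                    obj.setKey(e.getKey());
                    obj.setValue(String.valueOf(e.getValue()));
                    return obj;
                })
                .collect(Collectors.toList());
    }
}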
In Java, I need to consume JSON (example below), with a series of arbitrary keys, and produce a Map<String, String>. I'd like to use a standard, long-term-supported JSON library for the parsing. My research, however, shows that these libraries are set up for deserialization to Java classes, where you know the fields in advance. I just need to build Maps.
It's actually one step more complicated than that, because the arbitrary keys aren't the top level of JSON; they only occur as a sub-object for prefs. The rest is known and can fit in a pre-defined class.
{
"al" : { "type": "admin", "prefs" : { "arbitrary_key_a":"arbitary_value_a", "arbitrary_key_b":"arbitary_value_b"}},
"bert" : {"type": "user", "prefs" : { "arbitrary_key_x":"arbitary_value_x", "arbitrary_key_y":"arbitary_value_y"}},
...
}
In Java, I want to be able to take that String, and do something like:
people.get("al").get("prefs"); // Returns Map<String, String>
How can I do this? I'd like to use a standard well-supported parser, avoid exceptions, and keep things simple.
UPDATE
@kumensa has pointed out that this is harder than it looks. Being able to do:
people.get("al").getPrefs(); // Returns Map<String, String>
people.get("al").getType(); // Returns String
is just as good.
That should parse the JSON to something like:
public class Person {
public String type;
public HashMap<String, String> prefs;
}
// JSON parsed to:
HashMap<String, Person>
Having your Person class and using Gson, you can simply do:
final Map<String, Person> result = new Gson().fromJson(json, new TypeToken<Map<String, Person>>() {}.getType());
Then, retrieving prefs is achieved with people.get("al").getPrefs();.
But be careful: your json string is not valid. It shouldn't start with "people:".
// Requires jackson-databind; StringUtils.isEmpty is available from e.g. Apache Commons Lang.
public static <T> Map<String, T> readMap(String json) throws IOException {
if (StringUtils.isEmpty(json))
return Collections.emptyMap();
ObjectReader reader = new ObjectMapper().readerFor(Map.class);
MappingIterator<Map<String, T>> it = reader.readValues(json);
if (it.hasNextValue()) {
Map<String, T> res = it.next();
return res.isEmpty() ? Collections.emptyMap() : res;
}
return Collections.emptyMap();
}
All you need to do next is check the type of each Object. If it is a Map, then you have a nested object; otherwise, it is a simple value.
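For the example JSON from the question, that check could look roughly like this (a sketch; readMap is the method above, json is the raw input string, and error handling is omitted):
Map<String, Object> people = readMap(json);

Object al = people.get("al");
if (al instanceof Map) {
    // Nested JSON objects come back as Maps.
    Map<String, Object> alEntry = (Map<String, Object>) al;
    Object prefs = alEntry.get("prefs");   // the sub-object holding the arbitrary keys
    System.out.println(((Map<String, Object>) prefs).get("arbitrary_key_a"));
} else {
    // Simple values come back as Strings, Numbers or Booleans.
    System.out.println("plain value: " + al);
}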
You can use Jackson lib to achieve this.
Put the following in pom.xml.
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.9.8</version>
</dependency>
Refer to the following snippet, which demonstrates this.
ObjectMapper mapper = new ObjectMapper();
HashMap<String, Object> people = mapper.readValue(jsonString, new TypeReference<HashMap<String, Object>>(){});
Now it is deserialized as a Map.
Full example:
import java.io.IOException;
import java.util.HashMap;
import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.ObjectMapper;
public class testMain {
public static void main(String[] args) throws JsonParseException, JsonMappingException, IOException {
String json = "{\"address\":\"3, 43, Cashier Layout, Tavarekere Main Road, 1st Stage, BTM Layout, Ambika Medical, 560029\",\"addressparts\":{\"apartment\":\"Cashier Layout\",\"area\":\"BTM Layout\",\"floor\":\"3\",\"house\":\"43\",\"landmark\":\"Ambika Medical\",\"pincode\":\"560029\",\"street\":\"Tavarekere Main Road\",\"subarea\":\"1st Stage\"}}";
ObjectMapper mapper = new ObjectMapper();
HashMap<String, Object> people = mapper.readValue(json, new TypeReference<HashMap<String, Object>>(){});
System.out.println(((HashMap<String, String>)people.get("addressparts")).get("apartment"));
}
}
Output: Cashier Layout
I am creating an API (written in Java) which I am deploying through serverless, which ports to an AWS Lambda function. All aspects of the API work great except that the responses it returns include the '\' character in front of all quotes.
To put this into perspective, I have a person class which contains instance variables for name (String) and mood (String). I then have my class which uses this class to get and create a Person object, and then Jackson is used to parse this into JSON format. This is what is returned to the handler function (for lambda) and is displayed as the "object body".
public class Person{
String name;
String mood;
//getters and setters and constructor
}
Then, later on there will be something in a different class like
Person person = new Person("bob", "good");
Which would be passed into my method which is supposed to convert things to JSON:
private String convStrToJson(Person person) throws JsonProcessingException {
ObjectMapper mapper = new ObjectMapper();
String json = mapper.writeValueAsString(person);
return json;
}
If I were to print this in the output, I'd get something like:
{"name":"bob","mood":"good"}
Which is what I want and expect. However, when deployed and called via GET request, the result is:
"{\"name\":\"bob\",\"mood\":\"good\"}"
I've tried several strategies, including additions to the parsing method such as:
json = json.replace("\"", "");
Which removes the quotes fully from both outputs, or:
json = json.replace("\\","");
Which has no effect at all. I also tried both of these as replaceAll methods, and that just messed things up even more. I'm not sure what else I can do to get rid of these '\' characters; I understand why they're there, but I don't know how to stop it. Any assistance is appreciated.
Okay, so I figured it out. It turns out serverless not only includes Jackson, but in the layout it creates for handling responses, the "setObjectBody" method will accept any kind of object and use Jackson to serialize it to JSON. This is where I messed up: I assumed it would only accept Strings, which is where the double encoding was occurring. Now, if I pass in the Person object, serverless/Jackson handles it appropriately for me and the expected output is returned. I'll include code snippets below to better demonstrate this solution. Serverless creates a 'handler' class with a template that includes a method called handleRequest. Once filled in, this class looks like this:
public class GetStatusHandler implements RequestHandler<Map<String, Object>, ApiGatewayResponse> {
private static final Logger LOG = Logger.getLogger(GetStatusHandler.class);
@SuppressWarnings("unchecked")
public ApiGatewayResponse handleRequest(Map<String, Object> input, Context context) {
BasicConfigurator.configure();
LOG.info("received: " + input);
try {
Map<String, String> pathParameters = (Map<String, String>) input.get("queryStringParameters");
if(pathParameters == null) {
LOG.info("Getting details for all persons ");
PersonControl control = new PersonControl();
Person[] result = control.myGetHandler(context);
return ApiGatewayResponse.builder()
.setStatusCode(200)
.setObjectBody(result)
.setHeaders(Collections.singletonMap("X-Powered-By", "AWS Lambda & serverless"))
.build();
}else {
String name = pathParameters.get("name");
LOG.info("Getting details for "+name);
PersonControl control = new PersonControl();
Person result = control.myGetHandler(name, context);
return ApiGatewayResponse.builder()
.setStatusCode(200)
.setObjectBody(result)
.setHeaders(Collections.singletonMap("X-Powered-By", "AWS Lambda & serverless"))
.build();
}
}catch(Exception e) {
LOG.error(e, e);
Response responseBody = new Response("Failure getting person", null);
return ApiGatewayResponse.builder()
.setStatusCode(500)
.setObjectBody(responseBody)
.setHeaders(Collections.singletonMap("X-Powered-By", "AWS Lambda & serverless"))
.build();
}
}
}
Note that when returning the ApiGatewayResponse (via the builder), an object ('result') is simply passed to the .setObjectBody method, which serverless automatically converts to JSON for us. That's it! No manual conversion to JSON is necessary in the code.
The response can be a user-defined object, as below:
class Handler implements RequestHandler<SQSEvent, CustomObject> {
public CustomObject handleRequest(SQSEvent event, Context context) {
return new CustomObject();
}
}
Sample code can be found here.
Just use the Google Gson Java library, which can convert Java objects into their JSON representation.
Gson gson = new Gson();
gson.toJson(person);
I'm using Amazon's DynamoDBMapper Java class to save data to a DynamoDB table. This code needs to work for data structured in multiple different ways, so I would like to stay away from writing structure-specific code. For this reason, I store the data as JSON objects in Java -- which are basically glorified HashMaps.
I would like to store these JSON objects into DynamoDB as Dynamo's relatively new JSON Document type.
The way the DynamoDBMapper API works is essentially that you write a Java class (typically a POJO), then add some annotations, then pass your objects of that class into DynamoDBMapper so that it can then put items into the database with the structure of the Java class. This works well for many aspects of what I'm doing, but not with the fact that I want these classes to contain arbitrarily-structured JSON documents. This is the way you're meant to store JSON documents using DynamoDBMapper, and as you can see, it doesn't allow for the structure of the documents to be arbitrary.
I realize I could use Dynamo's putItem() to pass the jsons as Strings into Item objects -- I just wanted to see if what I want to do is possible with DynamoDBMapper before I shift my approach.
You can try using the DynamoDB Java document SDK instead of the object mapper. This allows you to serialize and deserialize JSON strings using the fromJSON and toJSON methods in the Item class. Check out http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/JavaDocumentAPIItemCRUD.html.
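A rough sketch of that approach (the client wiring, table name and key attribute below are placeholders, not taken from the linked documentation):
AmazonDynamoDB client = AmazonDynamoDBClientBuilder.defaultClient();
DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("MyTable");

// Build an item directly from an arbitrary JSON string and attach a key.
Item item = Item.fromJSON(jsonString).withPrimaryKey("id", "123");
table.putItem(item);

// Read the item back and recover the JSON.
Item stored = table.getItem("id", "123");
String roundTripped = stored.toJSON();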
Here's how I came up with my answer of how to store arbitrary Map objects in DynamoDB. This is extremely useful for archiving REST API responses that have been unmarshaled to foreign objects. I'm personally using this to archive REST responses from the PayPal Payment API. I don't care what variables they use in their REST API or the structure of their POJO / beans. I just want to make sure I save everything.
@DynamoDBTable(tableName = "PaymentResponse")
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY)
@JsonSubTypes({
@JsonSubTypes.Type(value = PayPalPaymentResponse.class, name = "PayPalPaymentResponse"),
@JsonSubTypes.Type(value = BatchPayPalPaymentResponse.class, name = "BatchPayPalPaymentResponse")}
)
public abstract class PaymentResponse {
// store any arbitrary REST response data in map form so we don't have to worry about the
// structure or the actual response itself
protected Map<String, String> paymentResponseData = Maps.newHashMap();
public PaymentResponse(PaymentResponseType paymentResponseType) {
this.paymentResponseType = paymentResponseType;
}
public Map<String, String> getPaymentResponseData() { return paymentResponseData; }
public void setPaymentResponseData(Map<String, String> paymentResponseData) { this.paymentResponseData = paymentResponseData; }
@Override
public String toString() {
return Arrays.toString(paymentResponseData.entrySet().toArray());
}
}
public class ConverterUtils {
private static final ObjectMapper objectMapper = new ObjectMapper();
public static BatchPayPalPaymentResponse getBatchPayPalPaymentResponse(PayoutBatch payoutBatch) throws IOException {
//read in the PayoutBatch response data and convert it first to a JSON string and then convert the
//JSON string into a Map<String, String>
Map<String, String> responseData = objectMapper.readValue(objectMapper.writeValueAsString(payoutBatch), new TypeReference<Map<String, String>>() {});
BatchPayPalPaymentResponse batchPayPalPaymentResponse = new BatchPayPalPaymentResponse(responseData);
return batchPayPalPaymentResponse;
}
public static PayPalPaymentResponse getSinglePayPalPaymentResponse(PayoutItemDetails payoutItemDetails) throws IOException {
//read in the paypal PayoutItemDetails response data and convert it first to a JSON string and then convert the
//JSON string into a Map<String, String>
Map<String, String> responseData = objectMapper.readValue(objectMapper.writeValueAsString(payoutItemDetails), new TypeReference<Map<String, String>>() {});
PayPalPaymentResponse payPalPaymentResponse = new PayPalPaymentResponse(responseData);
return payPalPaymentResponse;
}
}
public class BatchPayPalPaymentResponse extends PaymentResponse {
public BatchPayPalPaymentResponse(Map<String, String> responseData) {
super(responseData);
}
....
....
....
}
public class PayPalPaymentResponse extends PaymentResponse {
public PayPalPaymentResponse(Map<String, String> responseData) {
super(responseData);
}
....
....
....
}
Now you can just call mapper.save(instanceOfPaymentResponse). Note that my code also includes how to use a Jackson parser to pick and choose which sub-class of PaymentResponse to unmarshal to. That's because I use a DynamoDBTypeConverter to marshal my class to a string before putting it into the database.
Finally, I'll throw in my converter for completeness so it all hopefully makes sense.
public class PaymentResponseConverter implements DynamoDBTypeConverter<String, PaymentResponse> {
private static final ObjectMapper objectMapper = new ObjectMapper();
static {
objectMapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
}
@Override
public String convert(PaymentResponse object) {
try {
return objectMapper.writeValueAsString(object);
} catch (JsonProcessingException e) {
throw new IllegalArgumentException(String.format("Received invalid instance of PaymentResponse and cannot marshal it to a string (%s)", e.getMessage()));
}
}
@Override
public PaymentResponse unconvert(String object) {
try {
return objectMapper.readValue(object, PaymentResponse.class);
} catch (IOException e) {
throw new IllegalArgumentException(String.format("Unable to convert JSON to instance of PaymentResponse. This is a fatal error. (%s)", e.getMessage()));
}
}
}
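For completeness, here is roughly how such a converter gets attached to the entity that DynamoDBMapper saves; the entity class, table name and attribute below are illustrative placeholders, not from the original code (older SDK versions used the @DynamoDBMarshalling annotation for the same purpose):
@DynamoDBTable(tableName = "PaymentRecord")
public class PaymentRecord {

    private String id;
    private PaymentResponse response;

    @DynamoDBHashKey
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    // Stored in DynamoDB as a single JSON string via the converter above.
    @DynamoDBTypeConverted(converter = PaymentResponseConverter.class)
    public PaymentResponse getResponse() { return response; }
    public void setResponse(PaymentResponse response) { this.response = response; }
}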
I had the same problem and went the route of serializing and deserializing objects to JSON strings myself and then just storing them as strings. The whole Document concept of DynamoDB is, IMHO, just a glorified object serializer. Only if you need to access attributes inside your object in DynamoDB operations (e.g. scans, projections) does it make sense to use the JSON document type. If your data is opaque to DynamoDB, you are better off with strings.
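A minimal sketch of that plain-string route (the entity variable and attribute names here are illustrative): serialize the object yourself with Jackson and keep the result in an ordinary String attribute.
ObjectMapper mapper = new ObjectMapper();

// Before mapper.save(entity): turn the arbitrary map into opaque JSON.
entity.setPayloadJson(mapper.writeValueAsString(arbitraryMap));

// After mapper.load(...): turn the stored JSON back into a map.
Map<String, Object> restored = mapper.readValue(
        entity.getPayloadJson(), new TypeReference<Map<String, Object>>() {});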
I am trying to be able to define the following code:
public class MyObject {
private String name;
... // Other attributes
}
@Path(...)
@Stateless
public class MyRestResource {
@POST
@Consumes(MediaType.APPLICATION_JSON)
public Response create(List<MyObject> myObjects) {
// Do some stuff there
}
}
I know that I need to use:
DeserializationConfig.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true
to configure my ObjectMapper correctly so that it accepts a single value as an array on my REST resources. I succeeded in setting up that part.
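For reference, with the Jackson 1.x API used throughout this question, that feature is typically switched on like this (a sketch; in Jackson 2.x the equivalent flag is DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY):
ObjectMapper mapper = new ObjectMapper();
mapper.configure(DeserializationConfig.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);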
My problem with this approach is that the following content is not differentiable:
{
"name": "a name",
... // other attributes
}
and
[{
"name": "a name",
... // other attributes
}]
will result in a list (List<MyObject>) of size one. Then, in the method create(List<MyObject> myObjects), I will not be able to tell the difference between a list and a single object sent to the REST resource.
My question, then, is how to do something like this: have only one @POST that accepts both arrays and single values.
Ideally, I would get rid of the ObjectMapper configuration so that a single object cannot be supplied in place of an array at other levels of the JSON document. For example, I do not want to allow this:
{
...
"attributes": {
...
}
}
where normally this format should be mandatory:
{
...
"attributes": [{
...
}]
}
Based on that, I tried to put in place an object wrapper around my List to see whether I could tell the difference between the list and the object, with something like this:
public class ObjectWrapper<T> {
private List<T> list;
private T object;
public boolean isObject() {
return list == null;
}
}
with the resource becoming:
@Path(...)
@Stateless
public class MyRestResource {
@POST
@Consumes(MediaType.APPLICATION_JSON)
public Response create(ObjectWrapper<MyObject> myObjects) {
// Do some stuff there
}
}
and tried to put in place the deserialization of my content through the JAX-RS/Jersey/Jackson mechanisms. If I leave the solution as it is now, the deserialization fails because the expected JSON format is the following:
{
"list": [{
"name": "a name",
... // other attributes
}]
}
Then I tried to write a custom deserializer but I am a bit lost in this task. I have something like that:
public class ObjectWrapperDeserializer<T> extends JsonDeserializer<T> {
@Override
public T deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
... // What to put there to deserialize Array or Object
}
}
I just want to deserialize the root level and put the deserialized content into the object wrapper. I also want to keep the feature configured in a class annotated with @ApplicationPath, where the configuration of the different @Provider classes is done.
I hope all this info gives a sufficient picture of what I want to do and what I have already tested.
I am looking for suggestions on how to write a resource that accepts arrays or objects on the same path.
Thanks a lot in advance.
OK, I finally succeeded in putting in place a mechanism that does exactly what I am looking for. But I am not sure whether there are negative consequences, for example for performance.
First, I defined a class that can hold either a List or a single Object:
public class RootWrapper<T> {
private List<T> list;
private T object;
// plus getters/setters and an isObject() helper, which the code below relies on
}
Then, I need a custom deserializer that is able to know which kind of T type to deserialize and to handle the collection or the single object.
public class RootWrapperDeserializer extends JsonDeserializer<RootWrapper<?>> {
private Class contentType;
public RootWrapperDeserializer(Class contentType) {
this.contentType = contentType;
}
@Override
public RootWrapper deserialize(JsonParser jp, DeserializationContext ctxt)
throws IOException, JsonProcessingException {
// Retrieve the object mapper and read the tree.
ObjectMapper mapper = (ObjectMapper) jp.getCodec();
JsonNode root = mapper.readTree(jp);
RootWrapper wrapper = new RootWrapper();
// Check if the root received is an array.
if (root.isArray()) {
List list = new LinkedList();
// Deserialize each node of the array using the type expected.
Iterator<JsonNode> rootIterator = root.getElements();
while (rootIterator.hasNext()) {
list.add(mapper.readValue(rootIterator.next(), contentType));
}
wrapper.setList(list);
}
// Deserialize the single object.
else {
wrapper.setObject(mapper.readValue(root, contentType));
}
return wrapper;
}
}
With this approach, I only deserialize the root level manually and then let Jackson take charge of the remaining operations. I only have to know which concrete type I expect to be present in the wrapper.
At this stage, I need a way to tell Jersey/Jackson which deserializer to use. One way I found is to create a sort of deserializer registry that maps the type to deserialize to the right deserializer. I extended the Deserializers.Base class for that.
public class CustomDeserializers extends Deserializers.Base {
// Deserializers caching
private Map<Class, RootWrapperDeserializer> deserializers = new HashMap<>();
@Override
public JsonDeserializer<?> findBeanDeserializer(JavaType type,
DeserializationConfig config, DeserializerProvider provider,
BeanDescription beanDesc, BeanProperty property) throws JsonMappingException {
// Check if we have to provide a deserializer
if (type.getRawClass() == RootWrapper.class) {
// Check the deserializer cache
if (deserializers.containsKey(type.getRawClass())) {
return deserializers.get(type.getRawClass());
}
else {
// Create the new deserializer and cache it.
RootWrapperDeserializer deserializer =
new RootWrapperDeserializer(type.containedType(0).getRawClass());
deserializers.put(type.getRawClass(), deserializer);
return deserializer;
}
}
return null;
}
}
OK, so now I have my deserializer registry, which creates new deserializers only on demand and keeps them once created. What I am not sure about with this approach is whether there are any concurrency issues. I know that Jackson does a lot of caching and does not call findBeanDeserializer every time once it has been called for a given type in a specific deserialization context.
Now that I have created my different classes, I need to do some plumbing to combine everything. In the provider where I create the ObjectMapper, I can register the deserializer registry on the created ObjectMapper like below:
@Provider
@Produces(MediaType.APPLICATION_JSON)
public class JsonObjectMapper implements ContextResolver<ObjectMapper> {
private ObjectMapper jacksonObjectMapper;
public JsonObjectMapper() {
jacksonObjectMapper = new ObjectMapper();
// Do some custom configuration...
// Configure a new deserializer registry
jacksonObjectMapper.setDeserializerProvider(
jacksonObjectMapper.getDeserializerProvider().withAdditionalDeserializers(
new CustomDeserializers()
)
);
}
@Override
public ObjectMapper getContext(Class<?> arg0) {
return jacksonObjectMapper;
}
}
Then, I can also define my @ApplicationPath class, which is my REST application, like the following:
public abstract class AbstractRestApplication extends Application {
private Set<Class<?>> classes = new HashSet<>();
public AbstractRestApplication() {
classes.add(JacksonFeature.class);
classes.add(JsonObjectMapper.class);
addResources(classes);
}
@Override
public Set<Class<?>> getClasses() {
return classes;
}
@Override
public Set<Object> getSingletons() {
final Set<Object> singletons = new HashSet<>(1);
singletons.add(new JacksonJsonProvider());
return singletons;
}
private void addResources(Set<Class<?>> classes) {
classes.add(SomeRestResource.class);
// ...
}
}
Now, everything is in place and I can write a REST resource method like that:
@POST
@Path("somePath")
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public Response create(RootWrapper<SpecificClass> wrapper) {
if (wrapper.isObject()) {
// Do something for one single object
SpecificClass sc = wrapper.getObject();
// ...
return Response.ok(resultSingleObject).build();
}
else {
// Do something for list of objects
for (SpecificClass sc : wrapper.getList()) {
// ...
}
return Response.ok(resultList).build();
}
}
That's all. Do not hesitate to comment on the solution. Feedback is really welcome, especially around the deserialization process, where I am really not sure that it is safe in terms of performance and concurrency.