I'm trying to fetch schedule data from MongoDB. I created the appropriate aggregation in the mongo shell and am now trying to translate it to Spring Data MongoDB.
db.theaters.aggregate([
{ $match: { 'city_id': <someCityId>, 'theatreRooms.schedules.spectacle_id': <someSpecId> } },
{ $unwind: '$theatreRooms' },
{ $unwind: '$theatreRooms.schedules' },
{ $group: { _id: { name: '$name', room: '$theatreRooms.name' }, schedules: { $addToSet: '$theatreRooms.schedules.time' } } },
{ $group: { _id: '$_id.name', schedules: { $addToSet: { room: '$_id.room', schedules: '$schedules' } } } }
])
I've translated the match and unwind operations correctly, but I'm having a problem with the first group operation. The operation itself seems to be interpreted correctly, yet for some reason I'm not able to map the _id object properly.
Here is my code example:
public class TheaterProject {
private TheaterId _id;
private List<String> schedules;
public TheaterId get_id() {
return _id;
}
public void set_id(TheaterId _id) {
this._id = _id;
}
public List<String> getSchedules() {
return schedules;
}
public void setSchedules(List<String> schedules) {
this.schedules = schedules;
}
}
public class TheaterId {
@Field("name")
private String name;
@Field("room")
private Integer room;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Integer getRoom() {
return room;
}
public void setRoom(Integer room) {
this.room = room;
}
}
public Document getRawSchedules(String cityId, String spectaclesId){
MatchOperation match = Aggregation.match(Criteria.where("city_id").is(cityId).and("theatreRooms.schedules.spectacle_id").is(spectaclesId));
UnwindOperation theaterUnwind = Aggregation.unwind("theatreRooms");
UnwindOperation schedulesUnwind = Aggregation.unwind("theatreRooms.schedules");
GroupOperation firstGroup = Aggregation.group(Fields.from(
Fields.field("name", "name"),
Fields.field("room", "theatreRooms.name")))
.addToSet("theatreRooms.schedules.time").as("schedules");
Aggregation agg = Aggregation.newAggregation(match,theaterUnwind,schedulesUnwind,firstGroup);
Document theaters = mongoTemplate.aggregate(agg, Theater.class, TheaterProject.class).getRawResults();
return theaters;
}
public List<TheaterProject> getSchedules(String cityId, String spectaclesId){
MatchOperation match = Aggregation.match(Criteria.where("city_id").is(cityId).and("theatreRooms.schedules.spectacle_id").is(spectaclesId));
UnwindOperation theaterUnwind = Aggregation.unwind("theatreRooms");
UnwindOperation schedulesUnwind = Aggregation.unwind("theatreRooms.schedules");
GroupOperation firstGroup = Aggregation.group(Fields.from(
Fields.field("name", "name"),
Fields.field("room", "theatreRooms.name")))
.addToSet("theatreRooms.schedules.time").as("schedules");
Aggregation agg = Aggregation.newAggregation(match,theaterUnwind,schedulesUnwind,firstGroup);
List<TheaterProject> theaters = mongoTemplate.aggregate(agg, Theater.class, TheaterProject.class).getMappedResults();
return theaters;
}
When I invoke the getSchedules method, which returns mapped objects, the _id field is null:
[
{
"_id": null,
"schedules": [
"5:15"
]
},
{
"_id": null,
"schedules": [
"6:55",
"4:35",
"10:15"
]
}
]
But when I invoke getRawSchedules, which uses getRawResults, the result looks correct:
{
"results": [
{
"_id": {
"name": "Pinokio",
"room": 2
},
"schedules": [
"5:15"
]
},
{
"_id": {
"name": "Roma",
"room": 1
},
"schedules": [
"6:55",
"4:35",
"10:15"
]
}
]
}
I have no idea why it behaves like that.
I didn't find any information about this problem in the documentation or here, but I have a workaround: rename the field from _id to something else, for example theaterId. I don't know all the requirements of your use case, but you can do this purely at the mapping level.
Fix the mapping
import org.springframework.data.mongodb.core.mapping.Field;
import java.util.List;
public class TheaterProject {
@Field("theaterId")
private TheaterId _id;
private List<String> schedules;
public TheaterId get_id() {
return _id;
}
public void set_id(TheaterId _id) {
this._id = _id;
}
public List<String> getSchedules() {
return schedules;
}
public void setSchedules(List<String> schedules) {
this.schedules = schedules;
}
}
But it requires an additional projection step:
public List<TheaterProject> getSchedules(String cityId, String spectaclesId){
...
GroupOperation firstGroup = Aggregation.group(Fields.from(
Fields.field("name", "name"),
Fields.field("room", "theatreRooms.name")))
.addToSet("theatreRooms.schedules.time").as("schedules");
ProjectionOperation projection = Aggregation.project(Fields.from(
Fields.field("theaterId", "_id"),
Fields.field("schedules", "schedules")));
Aggregation agg = Aggregation.newAggregation( ... ,firstGroup, projection);
List<TheaterProject> theaters = mongoTemplate.aggregate(agg, "collectionName", TheaterProject.class).getMappedResults();
return theaters;
}
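An alternative sketch for the same projection (not from the original answer, so treat it as an untested suggestion): Spring Data MongoDB's previousOperation() references the _id produced by the preceding group stage, so the rename can also be written without spelling out the _id field:
// Projects the group _id into "theaterId" and keeps "schedules";
// equivalent to the explicit Fields.from projection above.
ProjectionOperation projection = Aggregation.project("schedules")
        .and("theaterId").previousOperation();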
Related
I'm trying to fix a problem I have with DynamoDB when creating a StaticTableSchema with Spring Native. In the CloudWatch logs I found this error:
Exception in thread "pool-4-thread-1" java.lang.ClassCastException: byte[] cannot be cast to com.beta80.movidapp.models.Event
DynamoDB table:
{
"id": {
"S": ""
},
"attendees": {
"L": [
{
"S": ""
},
{
"S": ""
}
]
},
"comments": {
"L": [
{
"M": {
"author": {
"S": ""
},
"comment_date": {
"S": ""
},
"comment_id": {
"S": ""
},
"content": {
"S": ""
}
}
}
]
}
}
Comment class:
@Data
@Builder(toBuilder = true)
@NoArgsConstructor
@AllArgsConstructor
public class Comment {
@Getter(onMethod_={@DynamoDbAttribute("comment_id")})
@Setter(onMethod_={@DynamoDbAttribute("comment_id")})
private String commentId;
private String content;
@Getter(onMethod_={@DynamoDbAttribute("comment_date")})
@Setter(onMethod_={@DynamoDbAttribute("comment_date")})
private String commentDate;
private String author;
}
Comment StaticTableSchema:
public class CommentMapper {
private TableSchema<Comment> tableSchema;
public CommentMapper() {
this.tableSchema = StaticTableSchema.builder(Comment.class)
.newItemSupplier(Comment::new)
.addAttribute(String.class, a -> a.name("commentId")
.getter(Comment::getCommentId)
.setter(Comment::setCommentId)
.tags(primaryPartitionKey()))
.addAttribute(String.class, a -> a.name("content")
.getter(Comment::getContent)
.setter(Comment::setContent))
.addAttribute(String.class, a -> a.name("commentDate")
.getter(Comment::getCommentDate)
.setter(Comment::setCommentDate))
.addAttribute(String.class, a -> a.name("author")
.getter(Comment::getAuthor)
.setter(Comment::setAuthor))
.build();
}
public TableSchema<Comment> getTableSchema() {
return tableSchema;
}
}
Event class:
@Data
@Builder(toBuilder = true)
@NoArgsConstructor
@AllArgsConstructor
public class Event {
@Getter(onMethod_={@DynamoDbPartitionKey})
private String id;
private List<String> attendees;
private List<Comment> comments;
}
EventMapper class:
public class EventMapper {
private TableSchema<Event> tableSchema;
public EventMapper() {
CommentMapper commentMapper = new CommentMapper();
this.tableSchema = StaticTableSchema
.builder(Event.class)
.newItemSupplier(Event::new)
.addAttribute(String.class, a -> a.name("id")
.getter(Event::getId)
.setter(Event::setId)
.tags(primaryPartitionKey()))
.addAttribute(EnhancedType.listOf(String.class), a -> a.name("attendees")
.getter(Event::getAttendees)
.setter(Event::setAttendees)
.attributeConverter(
ListAttributeConverter.builder(EnhancedType.listOf(String.class))
.collectionConstructor(ArrayList::new)
.elementConverter(StringAttributeConverter.create())
.build()))
.addAttribute(
EnhancedType.listOf(
EnhancedType.documentOf(Comment.class, commentMapper.getTableSchema())
),
a -> a.name("comments")
.getter(Event::getComments)
.setter(Event::setComments)
)
.build();
}
public TableSchema<Event> getTableSchema() {
return tableSchema;
}
}
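For context, a schema like this is typically handed to the enhanced client roughly as follows (a sketch only; the table name "events", the repository class name, and the injected DynamoDbClient are placeholders, not taken from the question):
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedClient;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbTable;
import software.amazon.awssdk.enhanced.dynamodb.Key;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;

public class EventRepositorySketch {

    private final DynamoDbTable<Event> eventTable;

    public EventRepositorySketch(DynamoDbClient dynamoDbClient) {
        DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
                .dynamoDbClient(dynamoDbClient)
                .build();
        // "events" is a placeholder table name; the real name comes from the project's configuration.
        this.eventTable = enhancedClient.table("events", new EventMapper().getTableSchema());
    }

    public Event findById(String id) {
        // Reads like this one are where the ClassCastException from the logs shows up.
        return eventTable.getItem(Key.builder().partitionValue(id).build());
    }
}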
I have been trying to solve this problem for several days but I cannot find a solution. Please bear with me; I have only recently started developing in Java.
My project has a MinIO server, and I use the MinIO API to upload files from a Spring Boot application. The MinIO server is integrated with Elasticsearch: when I upload a file, MinIO automatically updates the minio_events index (I configured this beforehand). I want to run the following query from my Spring Boot application and map the result into File.java:
POST http://localhost:9200/minio_events/_search
{
"query": {
"bool": {
"must": [
{
"match": {
"Records.s3.object.userMetadata.X-Amz-Meta-Filename": "myfile.txt"
}
},
{
"match": {
"Records.s3.object.userMetadata.X-Amz-Meta-User_id": "40b3c4e0-fea8-4aca-9dec-b4b905f33df0"
}
}
]
}
},
"fields": [
"Records.s3.object.userMetadata.X-Amz-Meta-Filename",
"Records.s3.object.userMetadata.X-Amz-Meta-Description",
"Records.s3.object.userMetadata.X-Amz-Meta-Foldername",
"Records.s3.object.userMetadata.X-Amz-Meta-Tags",
"Records.s3.object.userMetadata.X-Amz-Meta-User_id"
],
"_source": false
}
The query result is:
{
"took": 2,
"timed_out": false,
"_shards": {
"total": 1,
"successful": 1,
"skipped": 0,
"failed": 0
},
"hits": {
"total": {
"value": 1,
"relation": "eq"
},
"max_score": 1.7260925,
"hits": [
{
"_index": "minio_events",
"_type": "_doc",
"_id": "AUuxte4_W8625RK6e6oT7tCJmJkSQJ0L9LGx6eAf0Dw=",
"_score": 1.7260925,
"fields": {
"Records.s3.object.userMetadata.X-Amz-Meta-Foldername": [
"helloworld"
],
"Records.s3.object.userMetadata.X-Amz-Meta-Description": [
"des"
],
"Records.s3.object.userMetadata.X-Amz-Meta-User_id": [
"40b3c4e0-fea8-4aca-9dec-b4b905f33df0"
],
"Records.s3.object.userMetadata.X-Amz-Meta-Filename": [
"MyFile.txt"
],
"Records.s3.object.userMetadata.X-Amz-Meta-Tags": [
"hello,world"
]
}
}
]
}
}
In my Spring Boot app, I wrote the following repository class to fetch the Elasticsearch results.
CustomFileRepository.java:
package com.oktaykcr.fileservice.repository;
import com.oktaykcr.fileservice.model.File;
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.SearchHit;
import org.springframework.data.elasticsearch.core.SearchHits;
import org.springframework.data.elasticsearch.core.query.NativeSearchQuery;
import org.springframework.data.elasticsearch.core.query.Query;
import org.springframework.stereotype.Component;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
@Component
public class CustomFileRepository {
private final ElasticsearchOperations elasticsearchOperations;
private final List<String> fields = List.of(
"Records.s3.object.userMetadata.X-Amz-Meta-Filename",
"Records.s3.object.userMetadata.X-Amz-Meta-Description",
"Records.s3.object.userMetadata.X-Amz-Meta-Foldername",
"Records.s3.object.userMetadata.X-Amz-Meta-Tags",
"Records.s3.object.userMetadata.X-Amz-Meta-User_id"
);
public CustomFileRepository(ElasticsearchOperations elasticsearchOperations) {
this.elasticsearchOperations = elasticsearchOperations;
}
public List<File> findByFileNameAndUserId(String fileName, String userId) {
BoolQueryBuilder queryBuilder = QueryBuilders.boolQuery()
.must(QueryBuilders.matchQuery("Records.s3.object.userMetadata.X-Amz-Meta-Filename", fileName))
.must(QueryBuilders.matchQuery("Records.s3.object.userMetadata.X-Amz-Meta-User_id", userId));
Query query = new NativeSearchQuery(queryBuilder);
query.setFields(fields);
SearchHits<File> result = elasticsearchOperations.search(query, File.class);
if(result.isEmpty()) {
return Collections.emptyList();
}
List<File> files = result.getSearchHits().stream().map(SearchHit::getContent).collect(Collectors.toList());
return files;
}
}
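For reference (not part of the original question), the same query can also be assembled with the builder API from the same package, org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder; a sketch of the body of findByFileNameAndUserId written that way:
// Builder-style construction of the same bool query and stored-fields list.
Query query = new NativeSearchQueryBuilder()
        .withQuery(QueryBuilders.boolQuery()
                .must(QueryBuilders.matchQuery("Records.s3.object.userMetadata.X-Amz-Meta-Filename", fileName))
                .must(QueryBuilders.matchQuery("Records.s3.object.userMetadata.X-Amz-Meta-User_id", userId)))
        .withFields(fields.toArray(new String[0]))
        .build();
SearchHits<File> result = elasticsearchOperations.search(query, File.class);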
File.java:
package com.oktaykcr.fileservice.model;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import java.util.List;
@Document(indexName = "minio_events")
public class File {
@Id
private String id;
@Field(type = FieldType.Object, value = "fields")
private Fields fields;
public File() {
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public Fields getFields() {
return fields;
}
public void setFields(Fields fields) {
this.fields = fields;
}
static class Fields {
@Field(type = FieldType.Nested, value = "Records.s3.object.userMetadata.X-Amz-Meta-Foldername")
public List<String> folderName;
@Field(type = FieldType.Nested, value = "Records.s3.object.userMetadata.X-Amz-Meta-Description")
public List<String> description;
@Field(type = FieldType.Nested, value = "Records.s3.object.userMetadata.X-Amz-Meta-User_id")
public List<String> userId;
@Field(type = FieldType.Nested, value = "Records.s3.object.userMetadata.X-Amz-Meta-Filename")
public List<String> fileName;
@Field(type = FieldType.Nested, value = "Records.s3.object.userMetadata.X-Amz-Meta-Tags")
public List<String> tags;
public Fields() {
}
public List<String> getFolderName() {
return folderName;
}
public void setFolderName(List<String> folderName) {
this.folderName = folderName;
}
public List<String> getDescription() {
return description;
}
public void setDescription(List<String> description) {
this.description = description;
}
public List<String> getUserId() {
return userId;
}
public void setUserId(List<String> userId) {
this.userId = userId;
}
public List<String> getFileName() {
return fileName;
}
public void setFileName(List<String> fileName) {
this.fileName = fileName;
}
public List<String> getTags() {
return tags;
}
public void setTags(List<String> tags) {
this.tags = tags;
}
}
}
However, the result of List<File> files = customFileRepository.findByFileNameAndUserId(fileName, userId); is:
result = {ArrayList#15401} size = 1
0 = {File#15403}
id = "AUuxte4_W8625RK6e6oT7tCJmJkSQJ0L9LGx6eAf0Dw="
fields = null
The id was mapped via the @Document model, but fields was not.
I want to create this JSON using Jackson-annotated POJOs. The issue is that when I create a new class (without a @JsonProperty annotation) to represent the last {"id":"123ccc","role":"dddd"} element, Jackson by default uses the field name and produces something like "customer": {"id": "123ccc", "role": "dddd"}.
The JSON structure I intend to build:
{
"relatedParty": [
{
"contact": [
{
"mediumType": "xxx",
"characteristic": {
"city": "xxx",
"country": "xxx"
}
},
{
"mediumType": "yyy",
"characteristic": {
"emailAddress": "yyy@yy.yyy"
}
}
],
"role": "ccc",
"fullName": "ccc"
},
{
"id": "123ccc",
"role": "dddd"
}
]
}
The JSON I'm actually getting from the code below:
{
"relatedParty": [
{
"contact": [
{
"mediumType": "xxx",
"characteristic": {
"city": "xxx",
"country": "xxx"
}
},
{
"mediumType": "yyy",
"characteristic": {
"emailAddress": "yyy@yy.yyy"
}
}
],
"role": "ccc",
"fullName": "ccc"
},
{
"customer": {
"id": "123ccc",
"role": "dddd"
}
}
]
}
What would be a workaround to get the exact JSON format shown above? My current implementation is below.
import com.fasterxml.jackson.annotation.JsonProperty;
import java.util.List;
public class RelatedParty {
@JsonProperty(value = "contact")
private List<Contact> contact;
@JsonProperty(value = "role")
private String role;
@JsonProperty(value = "fullName")
private String fullName;
private Customer customer;
public List<Contact> getContact() {
return contact;
}
public void setContact(List<Contact> contact) {
this.contact = contact;
}
public String getRole() {
return role;
}
public void setRole(String role) {
this.role = role;
}
public String getFullName() {
return fullName;
}
public void setFullName(String fullName) {
this.fullName = fullName;
}
public Customer getCustomer() {
return customer;
}
public void setCustomer(Customer customer) {
this.customer = customer;
}
}
public class Customer {
@JsonProperty(value = "id")
private String id;
@JsonProperty(value = "role")
private String role;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getRole() {
return role;
}
public void setRole(String role) {
this.role = role;
}
}
You need to create additional, different POJO classes to model your JSON correctly. Basically, JSON arrays are handled as Java lists, and JSON objects are handled as Java classes.
Starting from the inside (most nested level) of the JSON, and working our way out:
NOTE: getters and setters not shown here
Characteristic.java
@JsonInclude(JsonInclude.Include.NON_NULL)
public class Characteristic {
@JsonProperty("city")
private String city;
@JsonProperty("country")
private String country;
@JsonProperty("emailAddress")
private String emailAddress;
}
Contact.java (contains our characteristics):
@JsonInclude(JsonInclude.Include.NON_NULL)
public class Contact {
@JsonProperty("mediumType")
private String mediumType;
@JsonProperty("characteristic")
private Characteristic characteristic;
}
The above two classes handle the innermost objects. If we remove them from your target JSON, that leaves the following:
{
"relatedParty": [{
"contact": [...],
"role": "ccc",
"fullName": "ccc"
}, {
"role": "dddd",
"id": "123ccc"
}]
}
Note that the contact field is a JSON array, not a single object, so we do not create a separate class for the array itself; it is simply represented as a List<Contact>.
To handle the above I create two more classes:
RelatedPartyInner.java (contains a list of contacts)
@JsonInclude(JsonInclude.Include.NON_NULL)
public class RelatedPartyInner {
@JsonProperty("contact")
private List<Contact> contact = null;
@JsonProperty("role")
private String role;
@JsonProperty("fullName")
private String fullName;
@JsonProperty("id")
private String id;
}
RelatedParty.java (wraps everything in an outer object):
@JsonInclude(JsonInclude.Include.NON_NULL)
public class RelatedParty {
@JsonProperty("relatedParty")
private List<RelatedPartyInner> relatedParty = null;
}
To test this I create the following data:
Characteristic chr1 = new Characteristic();
chr1.setCity("xxx");
chr1.setCountry("xxx");
Characteristic chr2 = new Characteristic();
chr2.setEmailAddress("yyy@yy.yyy");
Contact con1 = new Contact();
con1.setMediumType("xxx");
con1.setCharacteristic(chr1);
Contact con2 = new Contact();
con2.setMediumType("yyy");
con2.setCharacteristic(chr2);
List<Contact> cons = new ArrayList<>();
cons.add(con1);
cons.add(con2);
RelatedPartyInner rpi1 = new RelatedPartyInner();
rpi1.setContact(cons);
rpi1.setRole("ccc");
rpi1.setFullName("ccc");
RelatedPartyInner rpi2 = new RelatedPartyInner();
rpi2.setId("123ccc");
rpi2.setRole("dddd");
List<RelatedPartyInner> rpis = new ArrayList<>();
rpis.add(rpi1);
rpis.add(rpi2);
RelatedParty rp = new RelatedParty();
rp.setRelatedParty(rpis);
Finally, we can generate the JSON:
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.writeValue(new File("rp.json"), rp);
The resulting file contains the following:
{
"relatedParty": [{
"contact": [{
"mediumType": "xxx",
"characteristic": {
"city": "xxx",
"country": "xxx"
}
}, {
"mediumType": "yyy",
"characteristic": {
"emailAddress": "yyy@yy.yyy"
}
}],
"role": "ccc",
"fullName": "ccc"
}, {
"role": "dddd",
"id": "123ccc"
}]
}
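One small follow-up: writeValue produces compact JSON, and the indentation shown above is just for readability. If the file itself should be pretty-printed, the same ObjectMapper can do it:
// Writes the same structure, but indented.
objectMapper.writerWithDefaultPrettyPrinter().writeValue(new File("rp.json"), rp);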
I am reading a file from which I need to build a specific JSON structure to pass to the UI, but I am unable to write the code that forms the JSON. I created a Java class and initialized it, and I have tried a lot, but I still haven't found a way to build the structure. Here is my main Java class, where I add the elements like this:
public class DynamicListForming {
public static void main(String[] args) {
NodeInfo cab = new NodeInfo("saurabh", "South");
NodeInfo cab1 = new NodeInfo("South", "ZONE1");
NodeInfo cab2 = new NodeInfo("ZONE1", "Street-1");
NodeInfo cab3 = new NodeInfo("ZONE1", "Street-2");
NodeInfo cab4 = new NodeInfo("ZONE1", "Street-3");
NodeInfo cab5 = new NodeInfo("ZONE1", "Street-4");
List<NodeInfo> nodeInfos = new LinkedList<>();
nodeInfos.add(cab);
nodeInfos.add(cab1);
nodeInfos.add(cab2);
nodeInfos.add(cab3);
nodeInfos.add(cab4);
nodeInfos.add(cab5);
}
}
My NodeInfo class looks like this
public class NodeInfo {
private String nodeName;
private String parentName;
private List<NodeInfo> children;
public NodeInfo(String parentName, String nodeName) {
super();
this.parentName = parentName;
this.nodeName = nodeName;
}
public String getNodeName() {
return nodeName;
}
public void setNodeName(String nodeName) {
this.nodeName = nodeName;
}
public String getParentName() {
return parentName;
}
public void setParentName(String parentName) {
this.parentName = parentName;
}
public List<NodeInfo> getChildren() {
return children;
}
public void setChildren(List<NodeInfo> children) {
this.children = children;
}
}
I need to form a JSON structure like the one below:
"nodeInfo": {
"name": "saurabh",
"children": [
{
"name": "SOUTH",
"children": [
{
"name": "Zone-1",
"children": [
{
"name": "Street-1"
},
{
"name": "Street-2"
},
{
"name": "Street-3"
},
{
"name": "Street-4"
}
]
}
]
}
]
}
Any suggestions on how to form this type of JSON structure? I have struggled a lot and cannot find a way to dynamically create the list and objects that make up this structure.
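One possible direction, shown here only as a minimal sketch (the Node DTO, the TreeBuilderSketch class, and the pairs array are hypothetical and merely mirror the "name"/"children" shape of the target JSON): index the nodes by name, attach each child to its parent, treat the node that never appears as a child as the root, and let Jackson serialize it.
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.*;

public class TreeBuilderSketch {

    // Output DTO matching the target JSON: only "name" and "children", children omitted when null.
    @JsonInclude(JsonInclude.Include.NON_NULL)
    static class Node {
        public String name;
        public List<Node> children;
        Node(String name) { this.name = name; }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical input: the same parent/child pairs as in the question.
        String[][] pairs = {
            {"saurabh", "South"}, {"South", "ZONE1"},
            {"ZONE1", "Street-1"}, {"ZONE1", "Street-2"},
            {"ZONE1", "Street-3"}, {"ZONE1", "Street-4"}
        };

        // Index every node by name, creating it on first sight, and attach children to parents.
        Map<String, Node> byName = new LinkedHashMap<>();
        Set<String> hasParent = new HashSet<>();
        for (String[] pair : pairs) {
            Node parent = byName.computeIfAbsent(pair[0], Node::new);
            Node child = byName.computeIfAbsent(pair[1], Node::new);
            if (parent.children == null) {
                parent.children = new ArrayList<>();
            }
            parent.children.add(child);
            hasParent.add(pair[1]);
        }

        // The root is the only node that never appears as a child ("saurabh" here).
        Node root = byName.values().stream()
                .filter(n -> !hasParent.contains(n.name))
                .findFirst()
                .orElseThrow(IllegalStateException::new);

        // Produces {"name":"saurabh","children":[{"name":"South","children":[...]}]}
        System.out.println(new ObjectMapper().writerWithDefaultPrettyPrinter().writeValueAsString(root));
    }
}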
I am storing a custom Java object called Device in a Mongo collection. This works fine, but when I read the entry back from the database, all nested objects come back with null values instead of the stored values.
This is a device:
public class Device {
public String id;
public String serialNumber;
public String name;
public int status;
public String errorReport;
public List<Sensor> sensors=new ArrayList<>();
public List<Action> actions=new ArrayList<>();
public List<String> filterTags= new ArrayList<>();
public List protocols;
}
This is an example entry from the database; as you can see, the values are saved correctly:
{
"_id": "7_openHabsamsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda",
"actions": [
{
"_id": "samsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda:volume",
"deviceId": "7_openHabsamsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda",
"errorReport": "Value wurde nicht initialisiert",
"name": "Lautstärke",
"state": 0,
"value": 0,
"valueOption": {
"maximum": 0,
"minimum": 0,
"percentage": true
},
"valueable": true
}
],
"filterTags": [],
"name": "[TV] Chine ",
"sensors": [
{
"_id": "samsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda:sourceId",
"name": "Source ID"
},
{
"_id": "samsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda:programTitle",
"name": "Titel"
},
{
"_id": "samsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda:channelName",
"name": "Kanal"
}
],
"status": 0
}
And this is what it looks like when I read it back from the database:
{
"id": "7_openHabsamsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda",
"serialNumber": null,
"name": "[TV] Chine ",
"status": 0,
"errorReport": null,
"sensors": [
{
"id": null,
"name": null,
"errorReport": null
},
{
"id": null,
"name": null,
"errorReport": null
},
{
"id": null,
"name": null,
"errorReport": null
}
],
"actions": [
{
"id": null,
"name": null,
"deviceId": null,
"state": 0,
"states": null,
"valueOption": null,
"value": 0,
"errorReport": "Value wurde nicht initialisiert",
"valueable": false
}
],
"filterTags": [],
"protocols": null
}
So when I pull the entries from my DB collection, the values of the Sensor and Action objects are set to null. This is my Java code for reading a device:
MongoClientURI connectionString = new MongoClientURI(dummy);
MongoClient mongoClient = new MongoClient(connectionString);
CodecRegistry pojoCodecRegistry = fromRegistries(MongoClient.getDefaultCodecRegistry(),
fromProviders(PojoCodecProvider.builder().automatic(true).build()));
MongoCollection<Device> devices = database.withCodecRegistry(pojoCodecRegistry).getCollection("devices", Device.class);
Device device = devices.find().first();
I am using the standard MongoDB Java Driver.
Could anyone tell me what I am missing here? Thanks in advance.
TL;DR: With the default configuration, you need getters and setters.
I wrote this unit test to reproduce the error, but it only reproduces when I tamper with the getters and setters of the Action and Sensor classes. I am using MongoDB 3.4 and Java driver 3.6.
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import org.bson.codecs.configuration.CodecRegistry;
import org.bson.codecs.pojo.PojoCodecProvider;
import org.junit.Test;
import static org.bson.codecs.configuration.CodecRegistries.fromProviders;
import static org.bson.codecs.configuration.CodecRegistries.fromRegistries;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.core.Is.is;
import static org.junit.Assert.assertNotNull;
public class MongoDeser {
@Test
public void testDeser() {
MongoClientURI connectionString = new MongoClientURI("mongodb://localhost:27017");
MongoClient mongoClient = new MongoClient(connectionString);
MongoDatabase database = mongoClient.getDatabase("sotest");
PojoCodecProvider codecProvider = PojoCodecProvider.builder()
.automatic(true)
.build();
CodecRegistry pojoCodecRegistry = fromRegistries(MongoClient.getDefaultCodecRegistry(), fromProviders(codecProvider));
MongoCollection<Device> devices = database.withCodecRegistry(pojoCodecRegistry).getCollection("device", Device.class);
Device device = devices.find().first();
assertNotNull(device.getActions());
assertThat(device.getActions().size(), is(1));
assertThat(device.getActions().get(0).getDeviceId(), is("7_openHabsamsungtv:tv:cbaf7d7d_4e10_41e6_9c1d_864988057bda"));
assertThat(device.getStatus(), is(0));
assertThat(device.getName(), is("[TV] Chine "));
}
}
Do you have missing getters and/or setters in your POJOs?
The MongoDB Java driver uses reflection to map POJOs from BSON, and with the default configuration it needs those getters and setters. If they are missing, it may behave erratically: in my testing it sometimes couldn't find a codec and threw an exception, and sometimes the fields were simply nulled, as in your case. My recommendation would be to use annotations instead and give the Java driver a convention to follow.
Sensor class
import java.io.Serializable;
public class Sensor implements Serializable
{
private String id;
private String name;
private final static long serialVersionUID = 8244091126694748358L;
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
Action class
import java.io.Serializable;
public class Action implements Serializable
{
private String id;
private String deviceId;
private String errorReport;
private String name;
private long state;
private long value;
private ValueOption valueOption;
private boolean valueable;
private final static long serialVersionUID = 3493217442158516855L;
public String getId() {
return id;
}
public String getDeviceId() {
return deviceId;
}
public String getErrorReport() {
return errorReport;
}
public String getName() {
return name;
}
public long getState() {
return state;
}
public long getValue() {
return value;
}
public ValueOption getValueOption() {
return valueOption;
}
public boolean isValueable() {
return valueable;
}
public static long getSerialVersionUID() {
return serialVersionUID;
}
public void setId(String id) {
this.id = id;
}
public void setDeviceId(String deviceId) {
this.deviceId = deviceId;
}
public void setErrorReport(String errorReport) {
this.errorReport = errorReport;
}
public void setName(String name) {
this.name = name;
}
public void setState(long state) {
this.state = state;
}
public void setValue(long value) {
this.value = value;
}
public void setValueOption(ValueOption valueOption) {
this.valueOption = valueOption;
}
public void setValueable(boolean valueable) {
this.valueable = valueable;
}
}
If you do end up writing custom codec providers for the nested types, they can be combined with the automatic POJO provider into a single registry, for example:
CodecRegistry codecRegistry = CodecRegistries.fromRegistries(
CodecRegistries.fromProviders(PojoCodecProvider.builder().automatic(true).build()),
CodecRegistries.fromProviders(new ActionCodecProvider()),
CodecRegistries.fromProviders(new SensorCodecProvider()));
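If you prefer the annotation route mentioned above, here is a minimal sketch of what that looks like on the Sensor class (using the driver's org.bson.codecs.pojo.annotations; getters and setters are still required by the default conventions):
import org.bson.codecs.pojo.annotations.BsonId;
import org.bson.codecs.pojo.annotations.BsonProperty;

public class Sensor {

    @BsonId                // maps the BSON "_id" field
    private String id;

    @BsonProperty("name")  // explicit BSON field name, independent of accessor naming
    private String name;

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}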