I'm using MongoDB, latest version, in Java & Spring.
I want to be able to fill a class of type NotA from a collection which stores class type A.
Class A and Class NotA are exactly the same, with one difference: class NotA's name is, well, not A :)
For the sake of argument, class A looks like this:
public class A {
    String name;
    String domain;
}
And the A collection has objects which look like so:
{ "_id" : "b7990a90-7d95-4879-bb4a-5ec2fd13e262", "_class" : "com.someservice.A", "name" : "Dan", "domain":"global"}
For reasons unrelated to this question I can't read into A and then copy to NotA; I have to read directly into NotA (or into some other object which is not A, and then into NotA, if there's no other choice).
I suppose I can read a DBObject and then manually copy all the fields, but would rather let the default reflection mechanism do its thing.
I've set the @Document annotation of NotA to @Document(collection = "A"), but that's not enough; I need the rest of the way.
I'm guessing there's a simple solution to this problem, I just can't figure it out. Help?
If it's possible on your side, you can update your documents directly in the Mongo collection to adjust the _class value:
{ "_id" : "b7990a90-7d95-4879-bb4a-5ec2fd13e262", "_class" : "com.someservice.NotA", "name" : "Dan", "domain":"global"}
I am trying to create a JSON object in Java. Object structure looks like this:
{
  "a" : {
    "b" : {
      "c" : {
        "d" : [ {
          "value" : false
        } ]
      }
    },
    "id" : "123"
  },
  "type" : "test"
}
I am using com.fasterxml.jackson.databind.ObjectMapper, and usually I just create corresponding domain objects that can be converted to a JSON string using the ObjectMapper.writeValueAsString() method.
The problem is that in order to match the JSON structure above I would have to create a bunch of domain objects with only one field each, and on top of that, creating a complete JSON object would involve a lot of boilerplate code just to set 2 or 3 JSON properties.
I am wondering if there is a better approach that I am not familiar with. Maybe using JSONPath or another library.
Gson (com.google.gson) lets you register custom Serializers, Deserializers, and TypeAdapters, with which you can handle a structure like this without boilerplate classes. This allows for a cleaner conversion from JSON DTOs to your actual domain objects.
See here for an example: https://www.baeldung.com/gson-deserialization-guide
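As a rough illustration of the approach, a sketch of a registered deserializer that digs the nested values out of the tree (the Payload target type and its fields are hypothetical, not from the question):

import java.lang.reflect.Type;
import com.google.gson.*;

// Hypothetical flat target type; the deserializer pulls the nested values out of
// the JSON tree, so no one-field wrapper classes are needed.
class Payload {
    boolean value;
    String id;
    String type;
}

class PayloadDeserializer implements JsonDeserializer<Payload> {
    @Override
    public Payload deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext ctx) {
        JsonObject root = json.getAsJsonObject();
        JsonObject a = root.getAsJsonObject("a");
        Payload p = new Payload();
        p.type = root.get("type").getAsString();
        p.id = a.get("id").getAsString();
        p.value = a.getAsJsonObject("b").getAsJsonObject("c")
                   .getAsJsonArray("d").get(0).getAsJsonObject()
                   .get("value").getAsBoolean();
        return p;
    }
}

// Usage: register the deserializer and parse straight into the flat type.
Gson gson = new GsonBuilder()
        .registerTypeAdapter(Payload.class, new PayloadDeserializer())
        .create();
Payload payload = gson.fromJson(jsonString, Payload.class);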
(I'm also quite interested in additional answers to this question.)
In a Java/Spring ReST application, I'm using swagger-annotations 1.3.7. I have a number of small classes (for example, GenderCode) that I use as properties in my ReST models. These classes have a single public property, called value. Using Jackson, my APIs can accept a simple String s and construct an instance of, say, GenderCode with its value set to s. Similarly, it can serialize a GenderCode as a simple String (which of course represents the value of value).
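(For context, a class like GenderCode typically gets this string-in/string-out behavior from Jackson's @JsonCreator and @JsonValue; the sketch below of its likely shape is an assumption, since the question doesn't show the class.)

import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonValue;

// Hypothetical shape of GenderCode implied by the question: Jackson reads and
// writes it as a bare JSON string.
public class GenderCode {
    public static final String POSSIBLE_VALUES_DISPLAY = "ANY,M,F"; // value assumed from the generated enum

    public String value;

    @JsonCreator
    public GenderCode(String value) {   // accepts a plain JSON string
        this.value = value;
    }

    @JsonValue
    public String asString() {          // serializes back to a plain JSON string
        return value;
    }
}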
I would like my Swagger documentation to represent these objects as simple strings, since that represents what the JSON will look like. Instead, it represents a complex type with a "value" key:
{
  "genderCode": {
    "value": ""
  },
  ...
}
It should look simply like this:
{
  "genderCode": "",
  ...
}
Here's what the Java model would look like:
public class Person {

    @JsonProperty("genderCode")
    @Valid
    @KnownEnumValue
    @ApiModelProperty(value = "GenderCode", dataType = "string", required = false,
            allowableValues = GenderCode.POSSIBLE_VALUES_DISPLAY)
    private GenderCode genderCode;

    ...
}
Here's the definition of that property within the API definition file that Swagger generates:
"genderCode":{"enum":["ANY","M","F"],"description":"GenderCode","required":false,"type":"GenderCode"}
I've tried using an OverrideConverter, but that had no effect. Any thoughts on how this can be done?
I'm using Jackson to serialize a heterogeneous list. My list is declared like this:
List<Base> myList = new LinkedList<>();
I have classes Aggregate and Source in there:
myList.add(new Aggregate("count"));
myList.add(new Aggregate("group"));
myList.add(new Source("reader"));
Those classes both implement the Base interface. Each class has just a single property with a get/set method: Aggregate has "type", and Source has "name".
I use this code to try to serialize the list:
ObjectMapper om = new ObjectMapper();
om.configure(SerializationFeature.INDENT_OUTPUT, true);
StringWriter c = new StringWriter();
om.writeValue(c, myList);
System.out.println(c);
but I find the output JSON doesn't have any indication of what type of object was serialized:
[ {
"type" : "count"
}, {
"type" : "group"
}, {
"name" : "reader"
} ]
As such, I don't think I can possibly de-serialize the stream and have it work as I expect. How can I include class information on the serialized representation of each object in a heterogeneous collection such that the collection can be correctly de-serialized?
This is exactly described in http://wiki.fasterxml.com/JacksonPolymorphicDeserialization. Read it, but here are the most relevant parts.
To include type information for the elements, see section 1. There are two alternatives:
om.enableDefaultTyping() will store the class name for elements stored as Object or abstract types (including interfaces). See documentation for overloads. This will work with collections automatically.
Annotate Base with @JsonTypeInfo (e.g. @JsonTypeInfo(use=JsonTypeInfo.Id.CLASS, include=JsonTypeInfo.As.PROPERTY, property="@class")). To make this work with collections, you'll also need to tell Jackson you want to store a List<Base> (and see section 5):
om.writerWithType(new TypeReference<List<Base>>() {}).writeValue(...);
or
JavaType listBase = om.getTypeFactory().constructCollectionType(List.class, Base.class);
om.writerWithType(listBase).writeValue(...);
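A short round-trip sketch of the second alternative, assuming Aggregate and Source keep the shapes described in the question:

import java.util.Arrays;
import java.util.List;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@class")
interface Base { }

ObjectMapper om = new ObjectMapper();
List<Base> myList = Arrays.asList(new Aggregate("count"), new Source("reader"));

// Serialize with the declared element type so the "@class" property is emitted for
// every element. writerFor is the non-deprecated successor of writerWithType.
String json = om.writerFor(new TypeReference<List<Base>>() {}).writeValueAsString(myList);

// Deserialize back into the polymorphic list using the same type information.
List<Base> restored = om.readValue(json, new TypeReference<List<Base>>() {});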
I've gotten my code working.
At first, I thought I could use WRAP_ROOT_VALUE to get the class information:
om.writer().with(SerializationFeature.WRAP_ROOT_VALUE).writeValue(c, myList);
That didn't work because it only made the root item have class information, and nothing else. I had hoped it would apply recursively.
And so I had to also quit using List<> directly and wrap my List<> object in its own class. Then, the base interface needs a decoration to tell it to write class info:
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "class")
public interface Base {
}
This gets the serializer to write a "class" property with the class value in it.
I have a Java application that connects to a MongoDB database through the Morphia library. The POJO that I store in the database has a String field named _id, annotated with the @Id annotation (com.google.code.morphia.annotations.Id).
I generate a new object (it has a null _id).
I call save(object) on the datastore provided by morphia.
The object gets updated after being stored and now has an _id value.
I call save(object) again and a new entry is created in the database with the same _id.
All consecutive save() operations on the object overwrite the old one and do not produce any new entries in the database.
So for example, after 10 save() calls on the same object my database ends up looking like this:
{ "_id" : { "$oid" : "539ade7ee4b0451f28ba0e2e"} , "className" : "blabla" , blabla ...}
{ "_id" : "539ade7ee4b0451f28ba0e2e" , "className" : "blabla" , blabla ...}
As seen, those two entries have the same _id but with different representations: one has it as an ObjectId, the other as a string. Normally I should have only one entry, shouldn't I?
Do not use a string for the _id. This will fix your problem:
@Id
protected ObjectId id;
While you could use protected String id (this shouldn't create duplicates, IMHO), you'll have problems if you use @Reference and might run into weird edge cases elsewhere, so avoid it if possible.
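A minimal sketch of the entity shape this answer recommends (the class name and extra field are illustrative, not from the question):

import org.bson.types.ObjectId;
import com.google.code.morphia.annotations.Entity;
import com.google.code.morphia.annotations.Id;

@Entity
public class MyPojo {
    @Id
    protected ObjectId id;   // left null; assigned on the first save and reused on later saves

    String name;
}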
I am trying to have a consistent db where the username and email are unique.
http://www.mongodb.org/display/DOCS/Indexes#Indexes-unique%3Atrue
http://code.google.com/p/morphia/wiki/EntityAnnotation
My user class looks like this:
public class User {

    @Indexed(unique = true)
    @Required
    @MinLength(4)
    public String username;

    @Indexed(unique = true)
    @Required
    @Email
    public String email;

    @Required
    @MinLength(6)
    public String password;

    @Valid
    public Profile profile;

    public User() {
...
I used the @Indexed(unique = true) annotation, but it does not work. There are still duplicates in my db.
Any ideas how I can fix this?
Edit:
I read about ensureIndexes, but this seems like the wrong approach; I don't want to insert duplicate data just to find out afterwards that it's really a duplicate.
I want to block it right away.
Something like:
try {
    ds.save(user);
} catch (UniqueException e) {
    ...
}
A unique index cannot be created if there are already duplicates in the column you are trying to index.
I would try running your ensureIndex commands from the mongo shell:
db.user.ensureIndex({'username':1},{unique:true})
db.user.ensureIndex({'email':1},{unique:true})
.. and also check that the indexes are set:
db.user.getIndexes()
Morphia should have WriteConcern.SAFE set by default, which will throw an exception when trying to insert documents that would violate a unique index.
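A rough sketch of what that looks like in code (ds is the Morphia Datastore; the exception type matches the one mentioned further down):

import com.mongodb.MongoException;
import com.mongodb.WriteConcern;

// Make sure writes are acknowledged so unique-index violations surface as exceptions.
ds.setDefaultWriteConcern(WriteConcern.SAFE);

try {
    ds.save(user);
} catch (MongoException.DuplicateKey e) {
    // duplicate username or email; reject the request here
}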
There is a good explanation of unique constraints here: Unique constraint with JPA and Bean Validation. Does this help you at all? What I would do is validate your data at the controller level (or with bean validate()) while checking other errors as well. That will do the job, but it's not as clean as it would be with an annotation.
Edit
Alternatively, see this post, Inserting data to MongoDB - no error, no insert, which describes how MongoDB doesn't raise errors on unique-index violations by default unless you tell it to. Try configuring your MongoDB driver to throw those errors too and see if that gets you to a solution :(
Edit 2
It also crossed my mind that Play 2 has a startup Global class where you could access your database and run your index commands, e.g. db.things.ensureIndex({email:1},{unique:true}). See more at http://www.playframework.org/documentation/2.0/JavaGlobal
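A hedged sketch of that startup hook, assuming a Morphia Datastore is reachable from the Global class (MongoConfig.datastore() is a hypothetical accessor):

import play.Application;
import play.GlobalSettings;

public class Global extends GlobalSettings {
    @Override
    public void onStart(Application app) {
        // Re-apply all indexes declared with @Indexed, including the unique ones.
        MongoConfig.datastore().ensureIndexes();
    }
}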
I had the same issue, with play framework 1.2.6 and morphia 1.2.12.
The solution for the @Indexed(unique = true) annotation is to let Morphia re-create the collection.
So if I already had the "Account" collection in Mongo, annotated the email field, and restarted the Play app, nothing changed in the Account indexes.
If I dropped the Account collection, Morphia re-created it, and now the email field is unique:
> db.Account.drop()
true
After the Play restart (I have a job to create initial accounts...):
> db.Account.getIndexes()
[
    {
        "v" : 1,
        "key" : {
            "_id" : 1
        },
        "ns" : "something.Account",
        "name" : "_id_"
    },
    {
        "v" : 1,
        "key" : {
            "email" : 1
        },
        "unique" : true,
        "ns" : "something.Account",
        "name" : "email_1"
    }
]
Now, after an insert with an already existing email, I get a MongoException.DuplicateKey exception.
To create indexes, the Datastore.ensureIndexes() method needs to be called to apply the indexes to MongoDB. The method should be called after you have registered your entities with Morphia. It will then synchronously create your indexes. This should probably be done each time you start your application.
Morphia m = ...
Datastore ds = ...
m.map(Product.class);
ds.ensureIndexes(); // creates all indexes defined with @Indexed
Morphia will create indexes for the collection named after either the class name or the @Entity annotation value.
For example, if your class name is Author, make sure you have the @Indexed annotation in your entity class and that you have done these two steps:
m.map(Author.class);
ds.ensureIndexes();
Check the indexes in the mongo shell:
db.Author.getIndexes()
I am adding this answer to emphasize that you cannot create indexes this way with a custom collection name (the entity class is Author, but your collection name is different).
This scenario is common when you want to reuse the entity class because the schema is the same.
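For reference, a minimal sketch of such an Author entity (field names are illustrative):

import org.bson.types.ObjectId;
import com.google.code.morphia.annotations.Entity;
import com.google.code.morphia.annotations.Id;
import com.google.code.morphia.annotations.Indexed;

@Entity   // ensureIndexes() builds indexes on the "Author" collection derived from this class
public class Author {
    @Id
    ObjectId id;

    @Indexed(unique = true)
    String email;
}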