I have some data stored in MongoDB that I currently do not want to map onto a POJO.
How can I get all the unstructured fields of a document into a single map?
I found this link
http://www.carfey.com/blog/using-mongodb-with-morphia/
that says you can map all unstructured fields using
// .. our base attributes here
private Map<String, Object> attributes;
but it's not working; I consistently get null.
I checked the Morphia code: it iterates over the entity class's mapped fields, so how can I get all of the DBObject's unmapped fields into the attributes map?
Out of interest: why would you do that? You'll need to map the class, and the one field in which you want to add your data, to Morphia anyway. Or am I misunderstanding you?
How to do it:
You'll need to annotate all fields you want to store in the map with @Transient and add/load them in your custom @PrePersist and @PostLoad methods.
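A minimal sketch of that pattern, assuming the raw document is handed to the lifecycle methods as a plain Map. Morphia's actual lifecycle signatures vary by version, so the annotations are shown as comments and the snippet compiles without Morphia on the classpath:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the @Transient + lifecycle-method pattern: Morphia maps the
// normal fields, while everything else is shuttled in and out of an
// attributes map by the lifecycle callbacks. Treat this as an
// illustration, not a drop-in class.
public class FlexibleEntity {

    // Mapped normally by Morphia.
    private String name;

    // @Transient -- Morphia skips this field entirely.
    private Map<String, Object> attributes = new HashMap<>();

    // @PrePersist -- copy the unmapped attributes into the raw document
    // just before it is written.
    public void prePersist(Map<String, Object> rawDocument) {
        rawDocument.putAll(attributes);
    }

    // @PostLoad -- pull every field Morphia did not map into the
    // attributes map after the document has been loaded.
    public void postLoad(Map<String, Object> rawDocument) {
        for (Map.Entry<String, Object> entry : rawDocument.entrySet()) {
            if (!entry.getKey().equals("name") && !entry.getKey().equals("_id")) {
                attributes.put(entry.getKey(), entry.getValue());
            }
        }
    }

    public Map<String, Object> getAttributes() {
        return attributes;
    }
}
```

The key filtering in postLoad has to skip every field the entity already maps, which is the fragile part of this approach: the list of mapped names lives in two places.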
I have some fields in Solr that I'm currently mapping to Java like so:
@Field("file_*") @Dynamic private final Map<String, List<String>> fileDetails;
this matches a number of dynamic fields like:
file_name_1
file_type_1
file_size_1
...
file_name_2
file_type_2
file_size_2
...
...
I then convert that fileDetails map to a FileAttachments object to be used elsewhere in the code.
That currently results in this class containing some specific code for converting from how the data is stored in Solr into the POJO.
Ideally, I'd want to map it more like so:
@Field("file_*") private final FileAttachments attachments;
or even better (but possibly even harder to accomplish):
@Field("file_*") private final List<FileAttachment> attachments;
Then (aside from annotations) the class knows nothing about how the data is stored in Solr.
However, the problem is that when I try to use a custom converter for the field (as detailed here), the converter doesn't get passed a Map<String, List<String>> instance holding all the fields that matched file_*; instead it simply gets passed a String instance holding the value of the first field that matched the pattern.
Is there any way to get it to pass the Map<String, List<String>> instance, or any other thoughts on how I can get this conversion to happen outside of the class?
In an ideal world I'd change the solr document so it was a child object (well, list of them), but that isn't possible as it's part of a legacy system.
I have a field of type "text" in MySQL and I am storing JSON data in it.
E.g.: ["android_app","iphone_app","windows_app"]
I am interacting with MySQL using Hibernate, and while reading this field I am deserializing it into an ArrayList in Java.
My question is: is this the best and fastest way to handle such cases, or are there better ways of doing it?
If you're able to take advantage of some JPA 2.1 features, you could use an AttributeConverter to handle this for you automatically without having to deal with it in your business code.
public class YourEntity {
    // other stuff
    @Convert(converter = StringArrayToJsonConverter.class)
    List<String> textValues;
}
Then you just define the converter as follows:
public class StringArrayToJsonConverter
        implements AttributeConverter<List<String>, String> {
    @Override
    public String convertToDatabaseColumn(List<String> list) {
        // convert the list to a json string here and return it
    }

    @Override
    public List<String> convertToEntityAttribute(String dbValue) {
        // convert the json string to the array list here and return it
    }
}
The best part is that this becomes a reusable component you can place anywhere you need to represent a JSON array as a List<> in your Java classes while storing it as JSON in a single text field in the database.
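The two conversion methods left as comments above could be filled in along these lines. To stay dependency-free, this sketch hand-rolls the JSON array encoding for plain strings and only escapes quotes and backslashes; in real code you would delegate both directions to Jackson's ObjectMapper or Gson:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the conversion logic inside the AttributeConverter.
// Hand-rolled JSON so it runs without Jackson/Gson on the classpath;
// only quote and backslash escaping is handled, so prefer a real JSON
// library in production code.
public class StringListJson {

    public static String toJson(List<String> list) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < list.size(); i++) {
            if (i > 0) sb.append(",");
            sb.append('"')
              .append(list.get(i).replace("\\", "\\\\").replace("\"", "\\\""))
              .append('"');
        }
        return sb.append("]").toString();
    }

    public static List<String> fromJson(String json) {
        List<String> result = new ArrayList<>();
        // Walk the string, collecting the contents of each quoted element.
        boolean inString = false;
        StringBuilder current = new StringBuilder();
        for (int i = 0; i < json.length(); i++) {
            char c = json.charAt(i);
            if (inString) {
                if (c == '\\') {
                    current.append(json.charAt(++i)); // unescape next char
                } else if (c == '"') {
                    inString = false;
                    result.add(current.toString());
                    current.setLength(0);
                } else {
                    current.append(c);
                }
            } else if (c == '"') {
                inString = true;
            }
        }
        return result;
    }
}
```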
Another alternative would be to avoid storing the data as JSON and instead use a real table, which would allow you to actually query on the values. To do this, you'd rewrite the mapping using JPA's @ElementCollection:
#ElementCollection
private List<String> textValues;
Internally, Hibernate creates a secondary table where it stores the string values of the list along with a reference to the owning entity's primary key; the entity's primary key and the string value together form the PK of this secondary table.
You then handle serializing the List<> as a JSON array in your controller/business code, which avoids mixing persistence with that concern, particularly given that most databases have not yet introduced a real JSON data type :).
Idea:
Convert the POJO into a flat map using ObjectToMapTransformer, build a SolrInputDocument from the map, and store it in Solr.
When retrieving, get the document from Solr, convert it into a map, and convert the map back into the POJO using MapToObjectTransformer.
Problem:
While saving the SolrInputDocument to Solr, a flattened key like A.B[0].c from the POJO gets converted to A.B_0_.c in Solr.
This altered form of storage in Solr makes it difficult to deserialize the SolrDocument back into the POJO.
How can I solve this problem? Alternatively, what is another way of storing a queryable document in Solr that can be serialized and deserialized easily?
You usually annotate the fields in your POJO with the appropriate Solr fields that each one should be indexed into. See Mapping a POJO for Solr.
If you really want to serialize the complete object into Solr, serialize it into a single field; if possible, use a string field (as that will store your object directly). If you also want to search for string values inside the object, you can use a text field instead, but since everything is imported into a single field, that has a few limitations (for example, you can't score different fields or search for values in a single property of the object).
So: use the @Field annotation from SolrJ's beans package to do specific POJO handling, or mangle the object into a single field and search for strings in that field.
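As a sketch of the "mangle it into a single field" option, here is a made-up helper (not SolrJ API) that uses standard Java serialization encoded as Base64, so the value fits in a Solr string field. Note that a Base64 blob is only round-trippable, not text-searchable; serialize to a JSON string instead if you want the text-field searching described above:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.io.UncheckedIOException;
import java.util.Base64;

// Hypothetical helper for the "one string field" approach: the whole
// object is serialized and Base64-encoded so it can be stored verbatim
// in a Solr string field. A JSON mapper would be used the same way and
// keeps the field human-readable.
public class BlobCodec {

    public static String encode(Serializable obj) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return Base64.getEncoder().encodeToString(bytes.toByteArray());
    }

    public static Object decode(String field) {
        byte[] raw = Base64.getDecoder().decode(field);
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(raw))) {
            return in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }
}
```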
I have a MongoDB database that represents snippets of public gene information like so:
{
  _id: 1,
  symbol: "GENEA",
  db_references: {
    "DB A": "DBA000123",
    "DB B": ["ABC123", "DEF456"]
  }
}
I am trying to map this to a POJO class annotated with @Document, like this:
@Document
public class Gene {
    @Id
    private int id;
    private String symbol;
    private Map<String, Object> db_references;
    // getters and setters
}
Because of the nature of MongoDB's schema-less design, the db_references field can contain a long list of possible keys, with values sometimes being arrays or other key-value pairs. My primary concern is the speed at which I can fetch multiple Gene documents and slice up their db_references.
My question: what is the best way to represent this field to optimize fetching performance? Should I define a custom POJO and map this field to it? Should I make it a BasicDBObject? Or would it be best not to map the documents at all with Spring Data and just use the MongoDB Java driver and parse the returned DBObjects?
Sorry to see your question hasn't been answered yet.
If db_references represents an actual concept within your domain, you are much better off capturing this domain knowledge in a class. That is always a good idea, and MongoDB helps with it a lot.
You can then store the list of nested objects inside the MongoDB document and fetch the whole aggregate in a single query. Spring Data should handle the deserialization as well.
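A sketch of what that could look like, mirroring the question's document. The Spring Data annotations (@Document, @Id) are left out so the snippet compiles standalone, and the values are normalized to a list to smooth over the "sometimes a string, sometimes an array" shape in the stored data:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Capture db_references as a typed domain concept instead of a raw
// Map<String, Object>: each reference knows which database it points at
// and carries one or more accession ids.
public class Gene {
    private int id;
    private String symbol;
    private List<DbReference> dbReferences = new ArrayList<>();

    public Gene(int id, String symbol) {
        this.id = id;
        this.symbol = symbol;
    }

    public void addReference(String database, String... accessionIds) {
        dbReferences.add(new DbReference(database, Arrays.asList(accessionIds)));
    }

    public List<DbReference> getDbReferences() { return dbReferences; }
}

class DbReference {
    private final String database;   // e.g. "DB A"
    private final List<String> ids;  // one or more accession ids

    DbReference(String database, List<String> ids) {
        this.database = database;
        this.ids = ids;
    }

    public String getDatabase() { return database; }
    public List<String> getIds() { return ids; }
}
```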
My case is the following: I have a JSF form with three outputtexts and the corresponding inputtexts, let's say:
outputtext1 - inputtext1
outputtext2 - inputtext2
outputtext3 - inputtext3
Currently I use a backing-bean method 'save' in order to store them in the database (a Hibernate entity, let's say table1 with primary key table1.id) in the fields table1.field1, table1.field2, and table1.field3.
So each record in the table has the values entered in the inputtexts.
My question is how am i going to store form data in the database, in a form like the following:
{"outputtext1:inputtext1","outputtext2:inputtext2","outputtext3:inputtext3"}
and then fetch them again, parse them, and rebuild the form. I am thinking of manipulating the form data as a JSON object, but I am new to both Java and JSON, so some guidelines would be really useful to me!
This is an indicative example; the forms are going to be dynamic and created on the fly.
Why would you want to serialize/deserialize JSON to send it directly to the database? Deserialization has its own issues, and multiple deserializations could (not will) be a source of problems.
You should save the fields as attributes of a given entity and then, with the help of libraries like Gson, generate the JSON from the entity.
Update
Since your form is dynamic, you can use some adaptable entity structure to hold your data.
Your entities can either have a Map<String,String> attribute or a collection of, say, FieldRecord entities that each contain a key-value pair.
I suggest this because JSON in the database can lead to complex issues, in particular if you have to query that data later. You'd have to process the JSON strings in order to report on them or to find which records contain a particular field. And this is just one example; things can get more complex.
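A sketch of the Map-based option; FormSubmission and its method names are made up for illustration, and the JPA/Hibernate mapping annotations are left out so the snippet stands alone:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Each dynamic form row becomes one key-value entry on the entity:
// the outputtext label is the key, the inputtext value is the value.
// In a real mapping the fields map would be an @ElementCollection
// (or a collection of FieldRecord entities) rather than a plain map.
public class FormSubmission {
    private final Map<String, String> fields = new LinkedHashMap<>();

    public void put(String label, String value) {
        fields.put(label, value);
    }

    public String get(String label) {
        return fields.get(label);
    }

    public Map<String, String> getFields() {
        return fields;
    }
}
```

Because the labels are stored as data, a dynamically generated form with any number of rows fits the same structure, and individual fields remain queryable without parsing JSON.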
Simple: you need a BLOB-type column in your table to store the JSON, and when you retrieve it in Java you just need to decode the JSON. I recommend json-simple (https://code.google.com/p/json-simple/); it's very simple to use.
Convert the JSONObject into its String form and then store it.
When you read it back, convert it into a JSONObject again like below:
JSONObject obj = new JSONObject(stringRepresentationOfJSON);