JPA: Use String fields, but modify them and save as byte array - java

We are using JSON strings in our application to store a lot of configuration data. Now we want to save this to a database as a BLOB. The JSON string will be converted to a binary representation (BSON) and then we want to store that.
Entity:
@Entity
@Table(name = "TBL_CONFIG")
public class ConfigEntity {
    [...]
    @Column(name = "CONFIG")
    @JSON
    private String config;
    [...]
}
Global EntityListener:
public class JsonBinaryConversionListener {
    @PrePersist
    public void process() {
        // Check @JSON fields and for every field
        [...]
        field.set(objectConversion.toBson(field.get()));
        [...]
        // The field is a String, the conversion gives us a byte array
    }
}
The column CONFIG is defined as a BLOB. Using only the @Lob annotation won't work, because we want to change the value manually.
Is there a way that we can realize this through JPA?

If you are using EclipseLink you can use a Converter, see:
http://www.eclipse.org/eclipselink/documentation/2.4/jpa/extensions/a_converter.htm#CHDEHJEB
Otherwise you can use property access (get/set) methods to convert the data, see:
http://en.wikibooks.org/wiki/Java_Persistence/Basic_Attributes#Conversion
The JPA 2.1 draft has also proposed Converter support.
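A minimal sketch of the property-access variant, assuming the entity keeps the JSON in a String and maps only a byte[] property to the BLOB column (the UTF-8 conversion below is just a stand-in for the real String-to-BSON conversion; names are illustrative):

import java.nio.charset.StandardCharsets;
import javax.persistence.*;

@Entity
@Table(name = "TBL_CONFIG")
@Access(AccessType.PROPERTY)
public class ConfigEntity {

    private Long id;
    private String config; // the JSON string the application works with

    @Id
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    // Only this property is mapped; the conversion happens in get/set.
    @Lob
    @Column(name = "CONFIG")
    public byte[] getConfigBson() {
        // stand-in for the real String -> BSON conversion
        return config == null ? null : config.getBytes(StandardCharsets.UTF_8);
    }

    public void setConfigBson(byte[] bson) {
        // stand-in for the real BSON -> String conversion
        this.config = bson == null ? null : new String(bson, StandardCharsets.UTF_8);
    }

    @Transient
    public String getConfig() { return config; }
    public void setConfig(String config) { this.config = config; }
}

With JPA 2.1 the same conversion could be expressed declaratively with an AttributeConverter<String, byte[]> applied to the config attribute.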

Related

Validating field length by making use of Jooq and JPA's @Column annotation

I am trying to validate the data input by the user by making use of JSR 303 validations. One validation that I am trying to implement is to check that the size of the value entered for each field does not exceed the maximum size of the corresponding column.
In order to map a field to a database column I am making use of JPA's @Column annotation as follows:
@ComplexValidation
public class Person {

    @Column(table = "PERSON_DETAILS", name = "FIRST_NAME")
    private String firstName;
}
The @ComplexValidation annotation on the Person class is a JSR 303 custom constraint validator that I am trying to implement, which basically carries out the following steps:
Retrieves all fields in the class annotated with the @Column annotation
It extracts the table name from the annotation and uses it to load the corresponding JOOQ generated class representing the table
It extracts the field name from the annotation and uses it to load the data type and size for the corresponding column
Is there any way in Jooq where I can retrieve the Jooq-generated class based on the table name? My first attempt can be found below; however, it does not work, since table(tableName) returns a plain SQL table, not a TableImpl object:
Column columnAnnotation = field.getDeclaredAnnotation(Column.class);
if (columnAnnotation != null) {
    String tableName = columnAnnotation.table();
    String fieldName = columnAnnotation.name();
    TableField tableField = (TableField) ((TableImpl) table(tableName)).field(fieldName);
    int columnLength = tableField.getDataType().length();
    if (fieldValue.length() > columnLength) {
        constraintValidatorContext
            .buildConstraintViolationWithTemplate("exceeded maximum length")
            .addPropertyNode(field.getName())
            .addConstraintViolation();
    }
}
Any other suggestions are welcome :)
Assuming you only have one generated schema (e.g. PUBLIC), you can access tables from there:
Table<?> table = PUBLIC.getTable(tableName);
See Schema.getTable(String)
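Applied to the validator from the question, a rough sketch (assuming a generated PUBLIC schema and the same surrounding reflection code) could look like this:

Column columnAnnotation = field.getDeclaredAnnotation(Column.class);
if (columnAnnotation != null) {
    // look the table up in the generated schema instead of calling table(tableName)
    Table<?> table = PUBLIC.getTable(columnAnnotation.table());
    // fully qualified to avoid clashing with java.lang.reflect.Field
    org.jooq.Field<?> column = table.field(columnAnnotation.name());
    int columnLength = column.getDataType().length();
    if (fieldValue.length() > columnLength) {
        constraintValidatorContext
            .buildConstraintViolationWithTemplate("exceeded maximum length")
            .addPropertyNode(field.getName())
            .addConstraintViolation();
    }
}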

MongoItemReader: how to deal with custom fields?

MongoItemReader provided by spring batch has method setFields:
public void setFields(java.lang.String fields)
JSON defining the fields to be returned from the matching documents by
MongoDB.
Parameters:
fields - JSON string that identifies the fields to sort by.
I have a class:
public class Raw {
    private String id;
    private String version;
    private String client;
    private String appName;
    private String os;
    // getters & setters
}
And I have data in mongodb like that:
{
"_id" : ObjectId("58a3373e1e041a1191cd5d6d"),
"Version" : "123",
"Client" : "SomeClient",
"MobilePlatform" : "iphoneos",
"AppName" : "MyAppName",
"Os" : "Windows 10"
}
So, as you can see, all field names start with a capital letter.
Now I need to read the data from Mongo with Spring Batch.
And I need to somehow map the fields in my Raw class to the data in MongoDB so I will be able to fetch it.
I suspect that the setFields method is meant for such cases.
But I am relatively new to Mongo and also to Spring Batch,
so I would like to ask how to do that.
Which JSON should I put into the setFields method?
Or probably there are some other options?
Any help is greatly appreciated.
I found the answer myself in the documentation: http://docs.spring.io/spring-data/data-mongo/docs/1.4.2.RELEASE/reference/html/mapping-chapter.html
The @Field annotation can be used and it works.
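For reference, a hedged sketch of what the mapping could look like with @Field (assuming spring-data-mongodb on the classpath; the document's MobilePlatform field has no counterpart in the class and is simply ignored):

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Field;

public class Raw {

    @Id
    private String id; // mapped to _id by convention

    @Field("Version")
    private String version;

    @Field("Client")
    private String client;

    @Field("AppName")
    private String appName;

    @Field("Os")
    private String os;

    // getters & setters
}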

Indexing enum in compass-lucene

I am using Compass for filtering data out of a DTO object. I mark fields with @SearchableComponent if the field is a POJO object, and with @SearchableProperty if it is a String. That works perfectly: I get the object and the String indexed.
My question is how would I annotate an ENUM data type?
Example of enums I have:
public enum FooBar {
    FOO("foo"),
    BAR("bar");

    private final String value;

    FooBar(String value) {
        this.value = value;
    }

    public String value() {
        return value;
    }
}
Where in this snippet should I put an annotation, and which annotation should I use?
From version 2.1 onward this works out of the box by applying the @SearchableProperty annotation to the field of the enum type, e.g.
@SearchableProperty
FooBar foobar;
The search uses the enum name as the value returned for filtering. Mapping back to the value field from the question has to be handled after the search is made with the names.
See release notes of Compass 2.1.0.

Need to know if each field has changed, how should I model this in Hibernate

So I have a class with three fields that maps to a table using Hibernate:
class Widget {
    String field1;
    String field2;
    String field3;
}
On application startup a number of instances of these widgets will be added to the database from external files, but when I exit the application I need to know which (if any) of these fields have been changed by the user since the application was started, so the changes can be saved back to the files. I also need to store the original value for logging purposes.
I can't work out whether I need a status field in the table or whether there is already a way of doing this using Hibernate/the database.
EDIT: A good solution to the problem was given below. However, the main reason I am using Hibernate is to reduce memory consumption, so storing the original values when they change is not a good solution for me; I want everything stored in the database. So I have created this new question: How do I store a copy of each entity I add to database in Hibernate
Given an entity like the following, you can track changes on one of its fields (while preserving its original value too).
@Entity
@Table(schema = "test", name = "test")
public final class Test {

    private static final int ORIGINAL = 0;
    private static final int CURRENT = 1;

    private Integer id;

    // holds the original and current state of the field
    private final AtomicReferenceArray<String> field = new AtomicReferenceArray<>(2);

    @Id
    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    @Transient
    public String getOriginalField() {
        return field.get(ORIGINAL);
    }

    @Basic
    public String getField() {
        return field.get(CURRENT);
    }

    public void setField(String field) {
        this.field.compareAndSet(ORIGINAL, null, field);
        this.field.set(CURRENT, field);
    }

    @PreUpdate
    public void preUpdate() {
        System.out.format("Original: %s, New: %s\n", getOriginalField(), getField());
    }

    ...
}
If there is a single row in a database like this:
id: 1
field: a
version: 2011-12-02 11:24:00
before the field gets updated (say, from a to b) you'll get the following output:
Original: a, New: b
The original value gets preserved even if the entity is updated multiple times, and both states can be accessed through the corresponding getters (getField and getOriginalField; you can get more creative than me with the naming :).
This way, you can spare yourself from creating version columns in your database and also can hide the implementation details from clients.
Instead of an AtomicReferenceArray you could use arrays, lists, etc., to track all changes in the same way.
The @PreUpdate callback isn't necessary of course, but this way you can be notified of changes in the entity's state and atomically save the updated fields to a file. There are more annotations like these: see the javax.persistence documentation for other annotation types.
If you are using MySQL then you can get a table's last update time from the information_schema database, like:
SELECT UPDATE_TIME FROM `information_schema`.`tables`
WHERE TABLE_SCHEMA = 'dbName' AND TABLE_NAME = 'tableName'
Otherwise, a simple solution would be to add a column for an update timestamp. With this you can even monitor which particular row has been updated.
If you need to synchronize with the files as soon as you save to the database, you can use the Hibernate event mechanism to intercept any save to the database and write it to a file; here's a sample doing that.
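For illustration, a hedged sketch of such an interceptor (using Hibernate's EmptyInterceptor; onFlushDirty exposes both the previous and the current state, and the file-writing part below is only a placeholder):

import java.io.Serializable;
import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

public class ChangeTrackingInterceptor extends EmptyInterceptor {

    @Override
    public boolean onFlushDirty(Object entity, Serializable id,
                                Object[] currentState, Object[] previousState,
                                String[] propertyNames, Type[] types) {
        for (int i = 0; i < propertyNames.length; i++) {
            Object before = previousState == null ? null : previousState[i];
            Object after = currentState[i];
            boolean changed = before == null ? after != null : !before.equals(after);
            if (changed) {
                // placeholder: write the change back to the external file instead
                System.out.printf("%s.%s changed: %s -> %s%n",
                        entity.getClass().getSimpleName(), propertyNames[i], before, after);
            }
        }
        return false; // the entity state itself was not modified here
    }
}

The interceptor would be registered when building the Session/SessionFactory (e.g. via Configuration.setInterceptor in older Hibernate versions).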

Best practice for storing global data in J2EE App using Hibernate

I'm looking for the best solution to store a Java EE application's global data using Hibernate. It will consist of key-value pairs. Example:
projectStarted = "10-11-11"
developerNumber = 3
teamLeader = "John"
As you can see, all of these entries have different types.
For now I see two options:
1) Create a GlobalData entity. Each of its fields will be represented as a unique column in the table and will contain one setting. This way I have no problems with type casting, but I would like to avoid it in case there is a large number of settings.
2) Create a Setting entity. Each will contain two fields, key (primary key) and value, and will be represented as a unique record in the table. This is the preferable solution, but it seems to me that I will get a lot of type casting, because settings can be of any type.
So basically, I'm looking for a way to implement the second solution without getting a lot of trouble from the different types. Can anybody help me?
Thanks.
Edit 1.
Yeah, thanks Christian. I just got a similar idea.
What if I have a Setting entity like this:
@Entity
@Table(name = "settings")
public class Setting {

    @Id
    @Column
    private String key;

    @Column
    private String value;

    @Column
    private String converterClassFullName; // example: by.lugovsky.MyConverter

    // Getters, setters
}
And a GlobalData class:
public class GlobalData {

    private Date projectStarted;
    private int developerNumber;
    private String teamLeader;

    Set<Setting> settings;

    // Getters and setters for all, except settings.
}
So basically my idea is to convert the Setting entity before persisting/updating etc. I can do this in my DAO, but I was wondering if I could annotate the GlobalData class with the @Entity annotation as well, without creating a new table. This way I can put a @OneToMany annotation on the settings set and perform the conversions in the internal @PrePersist etc. methods.
Will Hibernate allow me to do this?
Thanks again
You could store a converter class in the DB and then run a property's value through the given converter before using it. JSF offers a Converter API:
public interface Converter {
    public Object getAsObject(FacesContext fc, UIComponent component, String value) throws ConverterException;
    public String getAsString(FacesContext fc, UIComponent component, Object obj) throws ConverterException;
}
If you have a schema with
name: String
value: String
converter: Class
then you could do something like this:
PropertyEntry pe = // Get from OR-Mapper
Converter c = (Converter) pe.getConverter().newInstance();
Object o = c.getAsObject(null, null, pe.getValue());
// use the object o instead of value
For even more coolness you could also define a non-persisted field in the class to hold the converted value within the object.
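A possible sketch of that last idea, combining the Setting entity from the edit with a non-persisted field that is filled in a @PostLoad callback (the converter lookup and names are illustrative):

@Entity
@Table(name = "settings")
public class Setting {

    @Id
    private String key;

    @Column
    private String value;

    @Column
    private String converterClassFullName; // e.g. by.lugovsky.MyConverter

    @Transient
    private Object convertedValue; // not persisted, holds the converted value

    @PostLoad
    void convert() {
        try {
            Converter c = (Converter) Class.forName(converterClassFullName).newInstance();
            this.convertedValue = c.getAsObject(null, null, value);
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Could not convert setting " + key, e);
        }
    }

    public Object getConvertedValue() {
        return convertedValue;
    }
}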
