In my Oracle database, I have amount values which are stored as VARCHAR. When retrieving records from the database, I wish to map them to a POJO in which the amount values are represented as double. Unfortunately, I cannot use forced types, as everything in the database is stored as VARCHAR and there is no pattern which identifies a column as one containing an amount value.
I was taking a look at the jOOQ converters which seem to be what I want. Therefore, I created a jOOQ converter for this purpose:
public class DoubleConverter implements Converter<String, Double> {

    @Override
    public Double from(String stringValue) {
        // jOOQ passes nulls through converters, so guard against them
        return stringValue == null ? null : Double.valueOf(stringValue);
    }

    @Override
    public String to(Double doubleValue) {
        if (doubleValue == null) {
            return null;
        }
        DecimalFormat df = new DecimalFormat();
        df.setMinimumFractionDigits(0);
        df.setGroupingUsed(false);
        return df.format(doubleValue);
    }

    @Override
    public Class<String> fromType() {
        return String.class;
    }

    @Override
    public Class<Double> toType() {
        return Double.class;
    }
}
However, I want this converter to be applied whenever I map a database record to my POJO using record.into(MyClass.class), and applied in reverse whenever I write back to the database. How do you recommend achieving this?
While I strongly suggest you somehow get a list of all double columns and apply a forced type configuration in your code generation configuration (a sketch of such a configuration follows below), in your particular case, you do not have to do much in order to be able to call record.into(MyClass.class).
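Such a forced type registration could look roughly like this. This is a sketch only: the converter's package and the column-matching expression are assumptions you would have to adapt to your schema:

<forcedType>
    <!-- Hypothetical: match whatever column names identify your amounts -->
    <userType>java.lang.Double</userType>
    <converter>com.example.DoubleConverter</converter>
    <expression>.*\.(AMOUNT|PRICE|TOTAL)</expression>
    <types>VARCHAR2?</types>
</forcedType>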
Your record will contain only String values and your MyClass class will contain matching double attributes. jOOQ will auto-convert from String to double and vice versa. By default, the DefaultRecordMapper and DefaultRecordUnmapper will be applied.
You can override those by specifying a new RecordMapperProvider and a RecordUnmapperProvider as is documented here:
https://www.jooq.org/doc/latest/manual/sql-execution/fetching/pojos-with-recordmapper-provider
It will be a bit of work to do that properly.
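For illustration, a RecordMapperProvider that simply falls back to jOOQ's reflection-based default could be wired up as follows. This is a minimal sketch, not a complete implementation; the types come from org.jooq and org.jooq.impl:

Configuration configuration = new DefaultConfiguration()
    .set(connection)
    .set(SQLDialect.ORACLE)
    .set(new RecordMapperProvider() {
        @Override
        public <R extends Record, E> RecordMapper<R, E> provide(RecordType<R> recordType, Class<? extends E> type) {
            // Hook in custom mapping logic for specific POJO types here;
            // otherwise delegate to the reflection-based default mapper.
            return new DefaultRecordMapper<>(recordType, type);
        }
    });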
For a table, let's say myTable, I have a column myColumn. The default jOOQ generation creates myColumn as BigInteger, but I want it to be generated as BigDecimal. This is the converter I am using:
public class BigIntToBigDecConverter extends AbstractConverter<BigInteger, BigDecimal> {

    public BigIntToBigDecConverter() {
        super(BigInteger.class, BigDecimal.class);
    }

    @Override
    public BigDecimal from(BigInteger databaseObject) {
        return databaseObject == null ? null : new BigDecimal(databaseObject);
    }

    @Override
    public BigInteger to(BigDecimal userObject) {
        // toBigInteger() avoids the overflow of going through intValue()
        return userObject == null ? null : userObject.toBigInteger();
    }
}
What should the forcedType configuration look like in the XML file?
The reason why your columns are generated as BigInteger is that they're of type NUMERIC(n, 0), NUMBER(n, 0), or DECIMAL(n, 0), depending on the dialect you're using. The key here is that the scale is 0 and n is large enough that the values no longer fit in a Long.
The reason why your <forcedType/> configuration hasn't worked is that the type you've listed in <types/> is the Java type BigInteger, not the SQLDataType; see the documentation here. In order to rewrite all zero-scale decimals to produce BigDecimal, just write this:
<forcedType>
    <name>NUMERIC</name>
    <types>(?i:(NUMERIC|NUMBER|DECIMAL)\(\d+,0\))</types>
</forcedType>
You won't need a custom converter for that.
I'm starting to play with Realm.io in an Android app that I'm writing. In one of my data objects, I'm required to store a currency value. Previously I had stored the value internally as a BigDecimal and converted it to and from a double when moving in and out of the database.
I have always been told that it is a bad idea to store currency values in a double, because binary floating-point cannot represent most decimal fractions exactly.
Unfortunately, Realm.io doesn't support the storage and retrieval of BigDecimal objects.
Is the best solution to write my own currency class that extends RealmObject and keep that as a member variable of my data object?
Emanuele from Realm here.
You are right, using floats or doubles for currency is a bad idea.
We don't support BigDecimal for now, and before we do, we will have to see how it plays in relation to all the other language bindings, since we want Realm files to be compatible across all the supported platforms.
Christian's idea is good, but I see the conversion to and from String as a bit slow. If you don't need the arbitrary-precision property of BigDecimal, you could use a long and multiply/divide by the factor your required precision calls for. This would also save a lot of space in terms of the size of the Realm file, since integer values are bit-packed.
That could work, but would probably be suboptimal if you do calculations on your current BigDecimal objects.
You could also use the @Ignore annotation to provide wrapper methods for your custom objects like this:
public class Money extends RealmObject {

    private String dbValue;

    @Ignore
    private BigDecimal value;

    public String getDbValue() {
        return dbValue;
    }

    public void setDbValue(String dbValue) {
        this.dbValue = dbValue;
    }

    public BigDecimal getValue() {
        return new BigDecimal(getDbValue());
    }

    public void setValue(BigDecimal value) {
        setDbValue(value.toString());
    }
}
It is not perfect as you need to expose the *dbValue() methods, but it should work.
I would also suggest going to https://github.com/realm/realm-java/issues and making a feature request for this, as BigDecimal is probably one of those Java classes used by so many people that it could warrant native Realm support, just like Date has.
What I do is store it as a long.
I have defined in my application a constant like so:
public static final BigDecimal MONEY_PRECISION = new BigDecimal(1000);
and when I need to store a BigDecimal it goes like this:
public class MoneyClass extends RealmObject {

    long _price = 0;

    public void set_price(BigDecimal price) {
        // Multiply before truncating to a long, so the fractional part
        // of the price is preserved in the scaled value.
        this._price = price.multiply(App.MONEY_PRECISION).longValue();
    }

    public BigDecimal get_price() {
        // Scale 3 matches a precision factor of 1000.
        return new BigDecimal(_price).divide(App.MONEY_PRECISION, 3, RoundingMode.HALF_UP);
    }
}
In theory this should be faster than saving it as a String, but I haven't really looked at the Realm code much.
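For illustration, with MONEY_PRECISION = 1000 the round trip looks like this (hypothetical usage on an unmanaged object):

MoneyClass m = new MoneyClass();         // unmanaged, not yet persisted
m.set_price(new BigDecimal("19.99"));    // stored internally as the long 19990
BigDecimal price = m.get_price();        // back to 19.990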
My solution:
Define Interface:
public interface RealmConvert {
    void convertToDB();
    void convertToObj();
}
Define Entity:
@Ignore
private BigDecimal balance;
private String balanceStr;

@Override
public void convertToDB() {
    if (getBalance() != null) {
        setBalanceStr(getBalance().toString());
    }
}

@Override
public void convertToObj() {
    if (getBalanceStr() != null) {
        setBalance(new BigDecimal(getBalanceStr()));
    }
}
Before you call copyToRealm, call convertToDB. When you need to use the entity, call convertToObj.
It's not an elegant solution, but it works.
Christian Melchior's answer doesn't work in my app.
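For what it's worth, a hypothetical usage of this pattern, assuming an Account entity that implements RealmConvert and holds the balance fields above (transaction handling simplified):

Account account = new Account();
account.setBalance(new BigDecimal("10.50"));
account.convertToDB();                                  // copies the BigDecimal into balanceStr
realm.executeTransaction(r -> r.copyToRealm(account));

Account loaded = realm.where(Account.class).findFirst();
loaded.convertToObj();                                  // rebuilds the BigDecimal from balanceStr
BigDecimal balance = loaded.getBalance();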
I would like to create a class representation of database tables in Java.
A column is designed as a generic class so that it can handle all the different data types table columns can possibly have.
public class TableColumn<T> {
...
}
A table has 0 ... n TableColumns, so my table class looks like this:
public class Table {
    // Table itself has no type parameter, so the element type must be a wildcard
    protected ArrayList<TableColumn<?>> columns =
        new ArrayList<TableColumn<?>>();
    ...
}
The idea is to add columns in the following way.
Table t = new Table();
t.addColumn(String.class);
t.addColumn(Integer.class);
t.addColumn(Date.class);
t.addColumn(String.class);
And then I can manipulate data roughly like this (pseudocode for the access style I'd like):
String a = t.col(2).row(3);
t.col(2).setRow(3, "b");
But I am losing type safety with my current way of achieving that... My problem is how to implement the columns, given the different data types the columns can potentially have.
Does someone have a clue?
Why not just create a different object for each table you have? Something like:
class Players {
    String name;
    int points;
    int number;
}

class Stadium {
    String location;
    Date dateBuilt;
}

class Team {
    String name;
    ArrayList<Players> roster;
}
Then you could just hold all the values in a List or ArrayList, separated by database table, and not have to guess at which table you are in. You'd have to hold onto more objects, but you would know more about what you're dealing with.
If there is a limited number of type combinations, you could use interfaces to represent those combinations. This would allow you to store the columns in a uniform way, and you wouldn't need any special casting.
t.addColumn(MyInterface.class);
Another method, which still wouldn't be quite as clean as what you want but is kind of unavoidable, is to use a new class that takes over some of the casting and type checking.
Example:
public static class MyWrapper {

    Class<?>[] validPossibleClasses;
    Object o;

    public MyWrapper(Class<?>... classes) {
        this.validPossibleClasses = classes;
    }

    public boolean validateClass(Class<?> clazz) {
        // The value is valid if it is assignable to at least one allowed class
        for (Class<?> c : validPossibleClasses) {
            if (c.isAssignableFrom(clazz))
                return true;
        }
        return false;
    }

    public void set(Object o) throws Exception {
        if (!validateClass(o.getClass()))
            throw new Exception("Bad Cast");
        this.o = o;
    }

    public String getString() {
        return (String) o;
    }

    public Integer getInt() {
        return (Integer) o;
    }

    // ... more specific getters
}
The usage would look like this:
String a = t.col(2).row(3).getString();
t.col(2).row(3).set("b");
Let's say I have some JSON like this in Mongo:
{"n":"5"}
and a Java class like this:
@Entity
public class Example {
    Integer n;
}
This works (I know that the JSON should store the value as an int, not a string, but I don't control that part).
Now, when I have data like this, Morphia throws an exception:
{"n":""}
I'm looking for a workaround (the behavior I'd like is for the empty string to be treated the same as null).
The only workaround I have so far is:
public class Example {
    String n;

    public Integer getN() {
        return NumberUtils.isNumber(n) ? NumberUtils.createInteger(n) : null;
    }
}
But I'm hoping for some way to hang an annotation on the Integer property that customizes the deserialization behavior.
So I asked this on the Morphia Google group and thought I'd share the answer. Using the lifecycle annotation @PreLoad allows me to modify the DBObject before conversion into the POJO takes place. So this should do it:
@PreLoad
void fixup(DBObject obj) {
    // Normalize the raw document: treat an empty string the same as null
    // before Morphia converts the value to an Integer.
    if ("".equals(obj.get("n"))) {
        obj.put("n", null);
    }
}
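In context, the entity then looks like this (a sketch combining the pieces above):

@Entity
public class Example {

    Integer n;

    @PreLoad
    void fixup(DBObject obj) {
        if ("".equals(obj.get("n"))) {
            obj.put("n", null);
        }
    }
}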
I have a database table with a field that I need to read from and write to via Hibernate. It is a string field, but the contents are encrypted. And for various reasons (e.g. a need to sort the plain-text values), the encrypt/decrypt functions are implemented inside the database, not in Java.
The problem I'm struggling with now is finding a way to invoke the encrypt/decrypt functions in the Hibernate-generated SQL everywhere the field is referenced, and in a way that's transparent to my application code. Is this possible? I've looked into Hibernate's support for "derived" properties, but unfortunately that approach doesn't support read-write fields. Any ideas appreciated.
I don't think there's a way to make encryption as you've described it completely transparent to your application. The closest you can get is to make it transparent outside of the entity. In your entity class:
@Entity
@SQLInsert(sql = "INSERT INTO my_table (my_column, id) VALUES (encrypt(?), ?)")
@SQLUpdate(sql = "UPDATE my_table SET my_column = encrypt(?) WHERE id = ?")
public class MyEntity {

    private String myValue;

    // ...

    @Formula("decrypt(my_column)")
    public String getValue() {
        return myValue;
    }

    public void setValue(String value) {
        myValue = value;
    }

    @Column(name = "my_column")
    private String getValueCopy() {
        return myValue;
    }

    private void setValueCopy(String value) {
        // intentionally empty: the column is written via the custom SQL above
    }
}
value is mapped as a derived property; you should be able to use it in queries.
valueCopy is private and is used to get around the derived property being read-only.
SQLInsert and SQLUpdate are black voodoo magic to force encryption on insert/update. Note that the parameter order IS important: you need to find out what order Hibernate would generate the parameters in without the custom insert/update, and then replicate it.
You could have a trigger internal to the database that, on retrieval, decrypts the value and replaces the returned result, and on insert encrypts the value and replaces the stored result with the encrypted value. You could also do this with a view wrapper - i.e. have an insert trigger on the view, and have the view automatically decrypt the value.
To better explain: have a view that decrypts the value, and an on-insert trigger linked to the view that encrypts the value.
Actually, in the end, I went a different route and submitted a patch to Hibernate. It was committed to trunk last week, so I think it will be in the next release after 3.5. Now, in property mappings, you can specify SQL "read" and "write" expressions to call SQL functions or perform some other kind of database-side conversion.
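In later Hibernate versions this feature is exposed through the @ColumnTransformer annotation. A sketch of what the mapping could look like, assuming encrypt/decrypt functions exist in the database:

import org.hibernate.annotations.ColumnTransformer;

@Entity
public class MyEntity {

    @Column(name = "my_column")
    @ColumnTransformer(
        read = "decrypt(my_column)",   // wraps the column in every SELECT
        write = "encrypt(?)"           // wraps the bind parameter on INSERT/UPDATE
    )
    private String value;
}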
Assuming you have access to the encrypt/decrypt algorithm from within Java, I would set up my mapped class something like this:
public class EncryptedTable {

    @Column(name = "encrypted_field")
    private String encryptedValue;

    @Transient
    private String value;

    public String getEncryptedValue() {
        return encryptedValue;
    }

    public String getValue() {
        return value;
    }

    public void setEncryptedValue(String encryptedValue) {
        this.encryptedValue = encryptedValue;
        this.value = decrypt(encryptedValue);
    }

    public void setValue(String value) {
        this.value = value;
        this.encryptedValue = encrypt(value);
    }
}
And then use get/setValue as the accessors within your program, and leave get/setEncryptedValue for Hibernate's use when accessing the database.
Why not just use the SQL Server encryption that seems to already be in place, by calling a stored procedure from Hibernate instead of letting Hibernate generate a query?