JPA: Parameterized instances of AttributeConverter

We are developing an application connected to a legacy database. The database is very "untyped", using strings for almost all data. What is worse, it is far from homogeneous: it uses different patterns for dates and times ('YYDDMM', 'HHMMSS', milliseconds) and for booleans ('Y'/'N', 'X'/' '), for example.
We want to use JPA (EclipseLink) with custom converters. The problem is that @Convert expects a class implementing AttributeConverter, so we would have to write a new class for each pattern. What I'd like is a single BooleanConverter class that can be instantiated with the values 'Y'/'N' or 'X'/' '.
This is obviously outside the JPA spec, but maybe it's possible using EclipseLink annotations/configuration. Looking at its @Convert annotation, a converter can be specified by name. This sounds good to me if I can register a ynBooleanConverter and an xSpaceBooleanConverter:
// Unfortunately, this method does not exist :(
session.addConverter("ynBooleanConverter", new BooleanConverter("Y", "N"));

@Entity
public class MyEntity {
    @Convert("ynBooleanConverter")
    private Boolean myBoolean;
    ...
}
Is it possible? What other options do we have?
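Until something like that exists, one JPA-portable workaround is an abstract, parameterized base converter with trivial concrete subclasses per pattern. You still end up with one class per pattern, but each one is a two-line subclass. A minimal sketch, assuming this design (the AttributeConverter interface below is a local stand-in for javax.persistence.AttributeConverter so the snippet is self-contained):

```java
// Local stand-in for javax.persistence.AttributeConverter, so this sketch compiles alone.
interface AttributeConverter<X, Y> {
    Y convertToDatabaseColumn(X attribute);
    X convertToEntityAttribute(Y dbData);
}

// Parameterized base class: all pattern-specific state lives in the constructor args.
abstract class BooleanConverter implements AttributeConverter<Boolean, String> {
    private final String trueValue;
    private final String falseValue;

    protected BooleanConverter(String trueValue, String falseValue) {
        this.trueValue = trueValue;
        this.falseValue = falseValue;
    }

    @Override
    public String convertToDatabaseColumn(Boolean attribute) {
        return Boolean.TRUE.equals(attribute) ? trueValue : falseValue;
    }

    @Override
    public Boolean convertToEntityAttribute(String dbData) {
        return trueValue.equals(dbData);
    }
}

// One trivial subclass per legacy pattern; these are what @Convert would reference.
class YnBooleanConverter extends BooleanConverter {
    YnBooleanConverter() { super("Y", "N"); }
}

class XSpaceBooleanConverter extends BooleanConverter {
    XSpaceBooleanConverter() { super("X", " "); }
}
```

The class names here are hypothetical; the point is only that the pattern strings are constructor arguments rather than copy-pasted conversion logic.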

Try @ObjectTypeConverter:
@Entity
@ObjectTypeConverters({
    @ObjectTypeConverter(name = "ynBooleanConverter", objectType = Boolean.class, dataType = String.class,
        conversionValues = {
            @ConversionValue(objectValue = "true", dataValue = "Y"),
            @ConversionValue(objectValue = "false", dataValue = "N") }),
    @ObjectTypeConverter(name = "xSpaceBooleanConverter", objectType = Boolean.class, dataType = String.class,
        conversionValues = {
            @ConversionValue(objectValue = "true", dataValue = "X"),
            @ConversionValue(objectValue = "false", dataValue = " ") })
})
public class MyEntity {
    @Convert("ynBooleanConverter")
    private boolean ynBoolean;

    @Convert("xSpaceBooleanConverter")
    private boolean xSpaceBoolean;
}

So your converter behaves differently depending on some state in the context? I would try binding the context info to a ThreadLocal variable that I can read back in the converter implementation.
Do you have access to a CDI implementation? Then it's even more elegant to inject a bean with your context info into your converter implementation. You mentioned that you are missing some Session methods? Maybe a @SessionScoped bean will help you.
Sadly, @Inject is not specified for converter classes. You will need to look up the bean "by hand", as mentioned in this post.
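The ThreadLocal idea can be sketched without any framework. A minimal illustration, assuming this design (the ConversionContext holder and its methods are hypothetical names, not part of any JPA or EclipseLink API):

```java
// Hypothetical context holder: surrounding code sets the active boolean pattern
// on the current thread before conversion runs, and the converter reads it back.
final class ConversionContext {
    private static final ThreadLocal<String[]> BOOLEAN_PATTERN =
            ThreadLocal.withInitial(() -> new String[] { "Y", "N" });

    static void setBooleanPattern(String trueValue, String falseValue) {
        BOOLEAN_PATTERN.set(new String[] { trueValue, falseValue });
    }

    static String trueValue()  { return BOOLEAN_PATTERN.get()[0]; }
    static String falseValue() { return BOOLEAN_PATTERN.get()[1]; }
}

// A single converter whose behavior depends on the thread-bound context.
final class ContextualBooleanConverter {
    String toColumn(Boolean attribute) {
        return Boolean.TRUE.equals(attribute)
                ? ConversionContext.trueValue()
                : ConversionContext.falseValue();
    }

    Boolean toAttribute(String dbData) {
        return ConversionContext.trueValue().equals(dbData);
    }
}
```

The caveat with this approach is that every call site must remember to set (and ideally clear) the thread-local state, which is exactly the bookkeeping the CDI variant would avoid.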

Late to this thread, but here is a blog post which shows how to write JPA converters, with working code for String and LocalDate conversions.


Hibernate mapping between Postgres array of varchar and a Java/Kotlin collection of enum

Basically, everything is in the title.
I have a column in my DB which is a varchar[].
I would really like to map it to a Java/Kotlin enum. We've already got this working to fetch it as a list of Strings (through com.vladmihalcea:hibernate-types and StringArrayType), but not with a mapping to an enum. Do you know if this is possible?
Since we know how to map a varchar to an enum, and a varchar[] to a collection of String, I'm tempted to think this should be possible, but I haven't succeeded yet.
Here is a simple sample of my current configuration:
CREATE TABLE test(my_values varchar[]);
INSERT INTO test(my_values) VALUES ('{VAL1, VAL2}');
@Entity
@Table(name = "test")
data class DbTest(
    @Column(name = "my_values")
    val myValues: List<Values>
)
enum class Values {
    VAL1, VAL2
}
I tried this: https://vladmihalcea.com/map-postgresql-enum-array-jpa-entity-property-hibernate/ which looks pretty good, but you have to define the enum in the DB, and we don't want that.
Thanks!
The previous answer did not work for me, but this does:
@TypeDef(
    name = "enums-array",
    typeClass = ListArrayType::class,
    parameters = [Parameter(name = EnumArrayType.SQL_ARRAY_TYPE, value = "varchar")]
)
I'm posting my solution. I didn't manage to get a List<Values>, but I got an Array<Values>, which was fine with me.
@Entity
@Table(name = "test")
@TypeDef(
    name = "values-array",
    typeClass = EnumArrayType::class,
    defaultForType = Array<Values>::class,
    parameters = [
        Parameter(
            name = EnumArrayType.SQL_ARRAY_TYPE,
            value = "varchar"
        )
    ]
)
data class DbTest(
    @Type(type = "values-array")
    @Column(name = "my_values", columnDefinition = "varchar[]")
    val myValues: Array<Values>
)
enum class Values {
    VAL1, VAL2
}
This works like a charm, and I can convert the array to a list and vice versa quite easily, which is OK.
Hoping this will help someone someday ;)

Json to object deserialization issue in Graphql-spqr

JSON to GraphQLArgument object conversion is failing in graphql-spqr.
I tried adding @GraphQLInterface (with auto-discovery true and a scan package) to the abstract classes,
and @GraphQLType to all concrete classes.
My GraphQL query:
query contactsQuery($searchQuery: QueryInput) { contacts(searchQuery: $searchQuery) { id } }
Variables:
{"searchQuery":{"bool":{"conditions":[{"must":{"matches":[{"singleFieldMatch":{"boost":null,"field":"firstname","value":"siddiq"}}],"bool":null}}]}}
Java code:
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.WRAPPER_OBJECT)
@JsonSubTypes({ @JsonSubTypes.Type(value = Must.class, name = "must"), @JsonSubTypes.Type(value = MustNot.class, name = "mustNot") })
public abstract class Condition

@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.WRAPPER_OBJECT)
@JsonSubTypes({ @JsonSubTypes.Type(value = SingleFieldMatch.class, name = "singleFieldMatch"), @JsonSubTypes.Type(value = MultiFieldMatch.class, name = "multiFieldMatch") })
public abstract class Match

@GraphQLQuery(name = "contacts")
public List getContacts(@GraphQLArgument(name = "searchQuery") Query query)
It still throws an "unknown field" error, and I'm not sure which configuration is missing.
I'm building the GraphQLSchema with AnnotatedResolverBuilder, the base package configured, JacksonValueMapperFactory, and singleton services.
Hi, this may be similar to an issue I ended up having.
Initially I had the following:
@JsonTypeInfo(use = Id.NAME, include = As.PROPERTY, property = "type")
@GraphQLInterface(name = "AbstractClass", implementationAutoDiscovery = true)
public abstract class AbstractClass {
with the following query called
addNewObject(object: {name: "soft2", id: "asdas"})
To get the conversion working, I needed to make the following change:
@JsonTypeInfo(use = Id.NAME, include = As.EXISTING_PROPERTY, property = "type")
@GraphQLInterface(name = "AbstractClass", implementationAutoDiscovery = true)
public abstract class AbstractClass {
    private String type = this.getClass().getSimpleName();

    /**
     * @return the type
     */
    @GraphQLQuery(name = "type", description = "The concrete type of the node. This should match the initialised class, e.g. \"Concrete\", \"DecafCoffee\"")
    public String getType() {
        return type;
    }
with the query now being
addNewConcreteObject(concrete: {name: "soft2", id: "asdas", type: "Concrete"})
Why this worked (I think):
When converting from JSON to objects in my code using the Jackson ObjectMapper, I had previously noticed that the JSON needs to carry knowledge of which class to convert to. Thus the initial use of @JsonTypeInfo(use = Id.NAME, include = As.PROPERTY, property = "type") put a type property in the JSON when it was written to a string.
The @JsonTypeInfo annotation may be picked up by SPQR, which then seems to use a Jackson converter to try to convert your query to the required object.
If I am right, here is the issue.
As the query doesn't contain type, it cannot be converted correctly. Moreover, since the type property was not a member variable of the object but was only added by the ObjectMapper, SPQR didn't pick it up, so it wasn't part of the schema for the object. To get around this, I added type as a member variable which is always equal to the actual class name, then changed my @JsonTypeInfo to look for an existing property.
I appreciate this isn't a direct answer to your question (and definitely isn't a pretty answer), but hopefully it will help you find your solution.

Spring data mongoDB partial index with constraint

I would like to create a very simple annotated Java POJO and save it into MongoDB. Basically, it is:
@Component("vehicle")
@Scope("prototype")
@Document(collection = "vehicle")
@CompoundIndexes({
    @CompoundIndex(name = "plateNumber_idx", def = "{ 'plateNumber' : 1 }", unique = true),
    @CompoundIndex(name = "vin_idx", def = "{ 'vin' : 1 }", unique = true),
    @CompoundIndex(name = "motorNumber_idx", def = "{ 'motorNumber' : 1 }", unique = true)
})
public class Vehicle {
    private String plateNumber;
    private String vin;
    private String motorNumber;
    // ... getters, setters, equals, hashCode, etc. ...
}
It is working properly, but in my case I need to add a partial index on the motorNumber field. The reason: it is not mandatory to fill this field in, so it can be null; on the other hand, two or more equal motorNumber values are not allowed, except when they are null. I can add the partial index(es) to the vehicle collection by hand, but it would be more elegant to do it via annotations. For example, here is my partial filter:
{ "motorNumber" : { "$exists" : true } }
My question is: how can I add this option to @CompoundIndex? Or are there any other options?
I found your question while trying to do much the same thing.
As far as I can tell, neither spring-data-mongodb for spring-boot 1.5.x nor for 2.0.x supports partial indexes via the usual annotations.
However, spring-data-mongodb does allow you to create them programmatically:
Index myIndex = new Index()
    .background()
    .unique()
    .named("my_index_name")
    .on("indexed_field_1", Sort.Direction.ASC)
    .on("indexed_field_2", Sort.Direction.DESC)
    .partial(PartialIndexFilter.of(
        Criteria.where("criteria_field_1").is("BAR")));

DefaultIndexOperations indexOperations = new DefaultIndexOperations(mongoTemplate, "my_collection");
indexOperations.ensureIndex(myIndex);
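For comparison, the "by hand" equivalent for the question's motorNumber case would be roughly the following mongo shell command (the index name is arbitrary); createIndex with partialFilterExpression is available from MongoDB 3.2 onward:

```
db.vehicle.createIndex(
    { "motorNumber": 1 },
    { unique: true, partialFilterExpression: { "motorNumber": { "$exists": true } } }
)
```

With this index, multiple documents may omit motorNumber entirely, while any document that does set it must have a unique value.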

Is there a way to get the declared value of constraint in hibernate validator in another class?

Using the Hibernate Validator, I declare something like this:
public class TestSomething {
    @Length(max = 30, message = "Error Message.")
    private String name;

    // getter and setter here
}
Is it possible to get the maximum number of characters, in this case 30?
Something like:
TestSomething ts = new TestSomething();
int maxValue = ts.getName.getThatMaximumNumberOrSomething
Would Java reflection work in this kind of situation?
You should use the Bean Validation metadata API. Provided you have a Validator instance, you can get hold of a so-called ConstraintDescriptor:
BeanDescriptor beanDescriptor = validator.getConstraintsForClass( TestSomething.class );
PropertyDescriptor propertyDescriptor = beanDescriptor.getConstraintsForProperty( "name" );
Set<ConstraintDescriptor<?>> constraintDescriptors = propertyDescriptor.getConstraintDescriptors();
Once you have the right ConstraintDescriptor, you can call either
constraintDescriptor.getAnnotation();               // to get the actual constraint annotation
constraintDescriptor.getAttributes().get( "max" );  // to retrieve the attribute from the attributes map provided by the descriptor as a convenience
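If you only need that one attribute and don't mind bypassing the metadata API, plain reflection also works: read the annotation directly off the field. A self-contained sketch (the MaxLength annotation below is a local stand-in for Hibernate's @Length, so the snippet compiles without the validator on the classpath):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Local stand-in for org.hibernate.validator.constraints.Length in this sketch.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface MaxLength {
    int max();
}

class TestSomething {
    @MaxLength(max = 30)
    private String name;
}

class ConstraintReader {
    // Reads the 'max' attribute of the annotation on the given field.
    static int maxOf(Class<?> type, String fieldName) {
        try {
            MaxLength length = type.getDeclaredField(fieldName).getAnnotation(MaxLength.class);
            return length.max();
        } catch (NoSuchFieldException e) {
            throw new IllegalArgumentException("No field " + fieldName + " on " + type, e);
        }
    }
}
```

Here ConstraintReader.maxOf(TestSomething.class, "name") yields 30. The metadata API is still preferable when you want to stay implementation-independent or handle composed constraints.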

Parametrizable JSR-303 validation values

I use JSR-303 bean validation with Spring 3, and I need to provide different values for an annotation depending on the use case.
For example, the value of the min parameter in @Size(min=?) must be 1 for one validation case and 5 for another, and I want to read these values from a properties file.
I know the message parameter can be read from the ValidationMessages.properties file if provided as a key, but what about the other parameters?
As outlined by dpb, you can use validation groups to specify the same constraint with different attribute values.
If you're working with Hibernate Validator as the BV implementation, you could additionally use the programmatic API instead of annotations to define your constraints. That way you can set the concrete constraint values at runtime, like this:
int minValue1 = ...; // read from properties file etc.
int minValue2 = ...;

// programmatically define the constraints for the Test type
ConstraintMapping mapping = new ConstraintMapping();
mapping.type( Test.class )
    .property( "prop", FIELD )
    .constraint( new NotNullDef() )
    .constraint( new SizeDef().min( minValue1 ).groups( GroupOne.class ) )
    .constraint( new SizeDef().min( minValue2 ).groups( GroupTwo.class ) );

// retrieve a validator using the programmatic constraint mapping
HibernateValidatorConfiguration config =
    Validation.byProvider( HibernateValidator.class ).configure();
config.addMapping( mapping );
ValidatorFactory factory = config.buildValidatorFactory();
Validator validator = factory.getValidator();
The values for annotation parameters can only be compile-time expressions. This means that for @Size(min=X, max=Z), X and Z must be resolvable at compile time.
Since min and max are declared as int on @Size, you are stuck.
If you need different values for min, I personally see two ways of doing it.
First, you could use grouping on the validators: use one group for min=1 and one group for min=5. For example, let's consider a Test class:
public class Test {

    @NotNull
    @Size.List({
        @Size(min = 1, groups = GroupOne.class),
        @Size(min = 5, groups = GroupTwo.class)
    })
    private String prop;

    public String getProp() {
        return prop;
    }

    public void setProp(String prop) {
        this.prop = prop;
    }
}
You must declare the groups:
public interface GroupOne {}
public interface GroupTwo {}
Then create some testing object, plus the validator to go with it:
Test test = new Test();
test.setProp("XY");

ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
Validator validator = factory.getValidator();
Then validate using the groups:
Set<ConstraintViolation<Test>> resultOne = validator.validate(test, GroupOne.class);
Set<ConstraintViolation<Test>> resultTwo = validator.validate(test, GroupTwo.class);
The first case is valid, since min=1 and "XY".length() == 2, but the second will fail because min=5.
This method involves doing the validation manually, and I don't think you can just rely on @Valid on a @RequestMapping-annotated method to do it (since @Valid is just a trigger for validation, with no way of specifying the required group). Luckily, Spring is very flexible, and it won't be much overhead to call the validator yourself.
The second option I see involves creating your own validation annotation with a custom validator to match. Here you can find a simple example to get you started. With this solution you can declare min and max as string keys that your validator resolves against the bundles prior to validation. This solution is more overhead than the first, though.
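The key-resolution part of that second option can be sketched without any framework: the custom validator would hold string keys and resolve them against a bundle before comparing lengths. A minimal, framework-free illustration (SizeLimits and the property names are hypothetical, not part of JSR-303; a real implementation would live inside a ConstraintValidator):

```java
import java.util.Properties;

// Hypothetical stand-in for the resolution logic a custom ConstraintValidator
// would run in its initialize()/isValid() methods.
final class SizeLimits {
    private final int min;
    private final int max;

    // minKey/maxKey are the string keys the custom annotation would declare;
    // 'bundle' stands in for a loaded properties file.
    SizeLimits(Properties bundle, String minKey, String maxKey) {
        this.min = Integer.parseInt(bundle.getProperty(minKey));
        this.max = Integer.parseInt(bundle.getProperty(maxKey));
    }

    boolean isValid(String value) {
        return value != null && value.length() >= min && value.length() <= max;
    }
}
```

For example, with a bundle containing name.size.min=5 and name.size.max=30, isValid("XY") is false while a six-character value passes; changing the properties file changes the effective constraint without recompiling.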
