I'm having trouble converting between java.sql.Timestamp and java.time.Instant using JOOQ converters.
Here's a simplified version of the code I'm working with.
public class User {
private static final Converter<Timestamp, Instant> MY_CONVERTER = Converter.of(
Timestamp.class,
Instant.class,
t -> t == null ? null : t.toInstant(),
i -> i == null ? null : Timestamp.from(i)
);
public static Table<?> table = DSL.table("user");
public static Field<String> name = DSL.field(DSL.name(table.getName(), "name"), String.class);
public static Field<Instant> created = DSL.field(DSL.name(table.getName(), "created"), SQLDataType.TIMESTAMP.asConvertedDataType(MY_CONVERTER));
}
public class UserDto {
private String name;
private Instant created;
// getters, setters, etc.
}
public class UserWriter {
// constructor with injected DefaultDSLContext etc..
public void create(UserDto user) {
dslContext.insertInto(User.table, User.name, User.created)
.values(user.getName(), user.getCreated())
.execute();
}
}
public class UserReader {
// constructor with injected DefaultDSLContext etc..
public Result<Record> getAll() {
return dslContext.select().from(User.table).fetch();
}
}
public class UserService {
// constructor with injected UserReader etc..
public Collection<UserDto> getAll() {
return userReader
.getAll()
.stream()
.map(Users::from)
.collect(Collectors.toList());
}
}
public class Users {
public static UserDto from(Record record) {
UserDto user = new UserDto();
user.setName(record.get(User.name));
user.setCreated(record.get(User.created));
return user;
}
}
When I create a new User the converter is called and the insertion works fine. However, when I select the Users the converter isn't called and the record.get(User.created) call in the Users::from method returns a Timestamp (and therefore fails as UserDto.setCreated expects an Instant).
Any ideas?
Thanks!
Why the converter isn't applied
From the way you phrased your question (you didn't post the exact SELECT statement that you've tried), I'm assuming you didn't pass all the column expressions explicitly. But then, how would jOOQ be able to find out what columns your table has? You declared some column expressions in some class, but that class isn't following any structure known to jOOQ. The only way to get jOOQ to fetch all known columns is to make them known to jOOQ, using code generation (see below).
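For illustration, here is a minimal sketch (reusing the User class from the question) of what passing the column expressions explicitly looks like; because User.created carries the converted data type, the converter is then applied on fetch as well:
Result<Record2<String, Instant>> result = dslContext
    .select(User.name, User.created)
    .from(User.table)
    .fetch();

// The converter has run: this is an Instant, not a Timestamp
Instant created = result.get(0).get(User.created);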
You could, of course, let User extend the internal org.jooq.impl.TableImpl class and use internal API to register the Field values. But why do that manually, if you can generate this code?
Code generation
I'll repeat the main point of my previous answer, which is: Please use the code generator. I've now written an entire article on why you should do this. Once jOOQ knows all of your meta data via code generation, you can just automatically select all columns like this:
UserRecord user = ctx
.selectFrom(USER)
.where(USER.ID.eq(...))
.fetchOne();
Not just that, you can also configure your data types as INSTANT using a <forcedType>, so you don't need to worry about data type conversion every time.
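For illustration, a hedged sketch of such a <forcedType>, expressed through jOOQ's programmatic code generation API (package names assume jOOQ 3.11+; in the more common Maven setup the same elements appear as XML tags, and the matching expressions below are assumptions about your schema):
import org.jooq.codegen.GenerationTool;
import org.jooq.meta.jaxb.Configuration;
import org.jooq.meta.jaxb.Database;
import org.jooq.meta.jaxb.ForcedType;
import org.jooq.meta.jaxb.Generator;

public class Codegen {
    public static void main(String[] args) throws Exception {
        // JDBC connection and generator target settings omitted for brevity
        GenerationTool.generate(new Configuration()
            .withGenerator(new Generator()
                .withDatabase(new Database()
                    .withForcedTypes(new ForcedType()
                        .withName("INSTANT")                   // jOOQ's built-in Instant binding
                        .withIncludeExpression(".*\\.created") // column matcher (assumption)
                        .withIncludeTypes("TIMESTAMP")))));    // SQL type matcher
    }
}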
I cannot stress this enough, and I'm frequently surprised how many projects try to use jOOQ without code generation, which removes so much of jOOQ's power. The main reason to not use code generation is if your schema is dynamic, but since you have that User class, it obviously isn't dynamic.
I'm experimenting with Spring Data REST and so far it's going relatively well. I'm able to query and manipulate the entities, and I have reached a point where I'd like to filter the retrieved data by a variable number of parameters. For this purpose I've been reading and decided on QueryDSL which is integrated nicely with Spring, and it works (almost) flawlessly when using fields from the entities.
However, my filtering form contains some parameters which have no direct mapping to the entity, leading to this question. For the sake of brevity, I'll be using an over-simplified example, hence my use of a person's age instead of birth date, etc.
Supposing we have the following Person entity:
@Data
@NoArgsConstructor
@Entity
public class Person {
@Id
@GeneratedValue
private UUID id;
private String name;
private String lastName;
private Integer age;
}
... and the appropriate repo
@RepositoryRestResource
public interface PersonRepository extends CrudRepository<Person, UUID>, QuerydslPredicateExecutor<Person>, QuerydslBinderCustomizer<QPerson> {
@RestResource
Page<Person> findAll(@QuerydslPredicate Predicate predicate, Pageable pageable);
@Override
default void customize(QuerydslBindings bindings, QPerson person) {
bindings.bind(String.class).first((SingleValueBinding<StringPath, String>) StringExpression::containsIgnoreCase);
}
}
... one can access and filter persons by name or last name (case insensitive) via http://<server>/persons?name=whatever; so far so good.
Next step, I would like to see only the people that are "pensionable", let's say over 65 years old, so the URL would look like http://<server>/persons?pensionable=true. However, pensionable is not an attribute in the Person entity, so adding it as a request param doesn't do anything.
I've been trying to figure out how this can be achieved or if this is currently a limitation of the framework(s), but my searches haven't been successful so far. Eventually via trial and error, I've come up with something that seems to work but feels more like a hack:
Create a different PersonExtendedFilter bean (not an entity) which includes the extra/arbitrary params:
@Data
@NoArgsConstructor
public class PersonExtendedFilter {
private Boolean pensionable;
}
... create a BooleanPath using the above, and use it to define a binding inside the repo's customize method:
@Override
default void customize(QuerydslBindings bindings, QPerson person) {
bindings.bind(String.class).first((SingleValueBinding<StringPath, String>) StringExpression::containsIgnoreCase);
BooleanPath pensionable = new PathBuilder<>(PersonExtendedFilter.class, "personExtendedFilter").getBoolean("pensionable");
bindings.bind(pensionable).first((path, value) -> new BooleanBuilder().and(value ? person.age.gt(65) : person.age.loe(65)));
}
Bottom line, I'm wondering whether there is an elegant way of doing this, or if I'm missing something, be it from a logical POV, an RTFM one, or something else.
I am considering moving from Hibernate to jOOQ, but I can't find out how to do things like pattern constraints on a String, as in this Hibernate example:
@NotEmpty(message = "Firstname cannot be empty")
@Pattern(regexp = "^[a-zA-Z0-9_]*$", message = "First Name can only contain characters.")
private String firstname;
How would I do that in jOOQ?
The "jOOQ way"
The "jOOQ way" to do such validation would be to create either:
A CHECK constraint in the database.
A trigger in the database.
A domain in the database.
After all, if you want to ensure data integrity, the database is where such constraints and integrity checks belong (possibly in addition to functionally equivalent client-side validation). Imagine a batch job, a Perl script, or even a JDBC statement that bypasses JSR-303 validation. You'll find yourself with corrupt data in no time.
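For the first of those options, a minimal sketch of such a CHECK constraint, written with jOOQ's DDL API (plain SQL DDL works just as well); the table and column names are taken from the Hibernate example above, ctx is a DSLContext, and regular expression support depends on your database dialect:
import static org.jooq.impl.DSL.*;

ctx.alterTable("book")
   .add(constraint("chk_firstname")
       .check(field("firstname", String.class).isNotNull()
           .and(length(field("firstname", String.class)).gt(0))
           .and(field("firstname", String.class).likeRegex("^[a-zA-Z0-9_]*$"))))
   .execute();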
If you do want to implement client-side validation, you can still use JSR-303 on your DTOs, which interact with your UI, for instance. But you will have to perform validation before passing the data to jOOQ for storage (as artbristol explained).
Using a Converter
You could, however, use your own custom type by declaring a Converter on individual columns and by registering such a Converter with the source code generator.
Essentially, a Converter is:
public interface Converter<T, U> extends Serializable {
U from(T databaseObject);
T to(U userObject);
Class<T> fromType();
Class<U> toType();
}
In your case, you could implement your validation as follows:
public class NotEmptyAlphaNumericValidator implements Converter<String, String> {
// Validation
public String to(String userObject) {
assertNotEmpty(userObject);
assertMatches(userObject, "^[a-zA-Z0-9_]*$");
return userObject;
}
// Boilerplate
public String from(String databaseObject) { return databaseObject; }
public Class<String> fromType() { return String.class; }
public Class<String> toType() { return String.class; }
}
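The assertNotEmpty and assertMatches helpers are not library methods; a minimal sketch of what they might look like as private members of the converter class:
private static void assertNotEmpty(String value) {
    if (value == null || value.isEmpty())
        throw new IllegalArgumentException("Value must not be empty");
}

private static void assertMatches(String value, String regex) {
    if (!value.matches(regex))
        throw new IllegalArgumentException("Value must match " + regex);
}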
Note that this is more of a workaround, as Converter hasn't been designed for this use-case, even if it can perfectly implement it.
Using formal client-side validation
There's also a pending feature request #4543 to add more support for client-side validation. As of jOOQ 3.7, this is not yet implemented.
I recommend you don't try to use jOOQ in a "Hibernate/JPA" way. Leave the jOOQ generated classes as they are and map to your own domain classes manually, which you are free to annotate however you like. You can then call a JSR validator before you attempt to persist them.
For example, jOOQ might generate the following class
public class BookRecord extends UpdatableRecordImpl<BookRecord> {
public void setId(Integer value) { /* ... */ }
public Integer getId() { /* ... */ }
public void setFirstname(String value) { /* ... */ }
public String getFirstname() { /* ... */ }
}
You can create your own domain object
public class Book {
@NotEmpty(message = "Firstname cannot be empty")
@Pattern(regexp = "^[a-zA-Z0-9_]*$", message = "First Name can only contain characters.")
private String firstname;
public void setId(Integer value) { /* ... */ }
public Integer getId() { /* ... */ }
public void setFirstname(String value) { /* ... */ }
public String getFirstname() { /* ... */ }
}
and map by hand once you've retrieved a BookRecord, in your DAO layer
Book book = new Book();
book.setId(bookRecord.getId());
book.setFirstname(bookRecord.getFirstname());
This seems quite tedious (an ORM tries to spare you this tedium) but actually it scales quite well to complicated domain objects, in my opinion, and it's always easy to figure out the flow of data in your application.
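For the validation step itself, a minimal sketch using the standard javax.validation bootstrap (the BookValidation class name is illustrative):
import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.ConstraintViolationException;
import javax.validation.Validation;
import javax.validation.Validator;

public class BookValidation {
    private static final Validator VALIDATOR =
        Validation.buildDefaultValidatorFactory().getValidator();

    // Call this on the hand-mapped domain object before copying its
    // values onto the jOOQ record and calling store().
    public static void validate(Book book) {
        Set<ConstraintViolation<Book>> violations = VALIDATOR.validate(book);
        if (!violations.isEmpty()) {
            throw new ConstraintViolationException(violations);
        }
    }
}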
I have a Java API which performs an external resource lookup and then maps the values to a POJO. To do this, the API needs the field names of the POJO as string values, something like:
public <F> F populatePojoFields(String primaryField, String secondaryField);
This works fine; however, passing the POJO field names as strings to the API does not feel right. I was able to change this by writing marker annotations for the POJO, so now it looks like:
public class POJO {
@Primary // custom marker annotation
private int mojo;
@Secondary // custom marker annotation
private String jojo;
}
String primaryField = getFieldNameUsingReflection(POJO.class, Primary.class);
String secondaryField = getFieldNameUsingReflection(POJO.class, Secondary.class);
POJO pojo = populatePojoFields(primaryField, secondaryField);
This way I don't have to keep track of string values; I can just add marker annotations to the POJO fields. This works fine, but I'm worried about performance. Is this a standard way to do things? Keeping hardcoded string values is more efficient than looking up the field names via reflection every time we need to call the API. Is there a better way to do this?
If you call getFieldNameUsingReflection often, you might consider caching the result of this call.
You can use a singleton class with an internal Map, with code like the following:
public class SingletonMapPrimarySecondary {
Map<Class<?>, String> mapPrimary;
Map<Class<?>, String> mapSecondary;
// TODO: Handle mapPrimary and mapSecondary creation and singleton pattern
public String getPrimary(Class<?> clazz) {
String primary = mapPrimary.get(clazz);
if (primary == null) {
primary = getFieldNameUsingReflection(clazz, Primary.class);
mapPrimary.put(clazz, primary);
}
return primary;
}
public String getSecondary(Class<?> clazz) {
// TODO: Similar to getPrimary
}
}
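For completeness, a minimal sketch of what getFieldNameUsingReflection plus a cache might look like (the FieldNameLookup class name is illustrative). Note that the marker annotations must be declared with @Retention(RetentionPolicy.RUNTIME), or reflection won't see them at all:
import java.lang.annotation.Annotation;
import java.lang.reflect.Field;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class FieldNameLookup {
    private static final Map<String, String> CACHE = new ConcurrentHashMap<>();

    public static String getFieldName(Class<?> pojoClass, Class<? extends Annotation> marker) {
        // Cache key combines class and marker, so one map covers all lookups
        return CACHE.computeIfAbsent(pojoClass.getName() + "#" + marker.getName(), key -> {
            for (Field field : pojoClass.getDeclaredFields()) {
                if (field.isAnnotationPresent(marker)) {
                    return field.getName();
                }
            }
            throw new IllegalArgumentException(
                "No field annotated with @" + marker.getSimpleName() + " in " + pojoClass);
        });
    }
}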
I am persisting an object:
@Document
public class PotentialCandidates {
@Id
private String jobid;
@CreatedDate
private DateTime created;
@LastModifiedDate
private DateTime modified;
private DBObject potentialcandidates;
public String getJobid() {
return this.jobid;
}
public void setJobid(String jobid) {
this.jobid = jobid;
}
public DBObject getPotentialcandidates() {
return this.potentialcandidates;
}
public void setPotentialcandidates(DBObject potentialcandidates) {
this.potentialcandidates = potentialcandidates;
}
}
where potentialcandidates is set from a JSON string like so:
potentialCandidatesObj.setPotentialcandidates((DBObject)JSON.parse(valStr));
This persists fine to my mongodb and gives me an object on the DB I can drill down into, however when I try to retrieve my db object:
public PotentialCandidates getPotentialCandidatesByJobid(String jobid) throws NoSuchPotentialCandidatesException, SystemException {
PotentialCandidates Jobid = null;
try {
Query query = new Query();
query.addCriteria(Criteria.where("_id").is(jobid));
Jobid = mongoTemplateJobs.findOne(query, PotentialCandidates.class,
COLLECTION_NAME);
return Jobid;
} catch (Exception ex) {
throw new SystemException(ex);
} finally {
if (Jobid == null) {
throw new NoSuchPotentialCandidatesException("No User with jobid: "
+ jobid + " found.");
}
}
}
I encounter the following error:
org.springframework.core.convert.ConversionFailedException: Failed to convert from type java.util.ArrayList<?> to type com.mongodb.DBObject for value 'myString'; nested exception is org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type java.util.LinkedHashMap<?, ?> to type com.mongodb.DBObject
So it would seem I need some sort of logic to handle retrieves from mongo. I could use a different return class in my findOne query but that seems a little messy. Is there a standard approach to dealing with this?
Your error is probably exactly what it says in your exception: a ConversionFailedException, caused by something trying to convert an ArrayList (and, in the nested exception, a LinkedHashMap) to a DBObject; there is just no fitting converter for that (ConverterNotFoundException).
Where exactly this is happening is impossible to say, since you only posted very little code. I cannot find the string 'myString' in your code, yet it is mentioned in the error.
Is there a standard approach to dealing with this?
Spring Data usually uses converters in its mapping process. To have more control over the mapping process, some people prefer to implement and register a custom converter for their classes.
You can read about converters here:
http://docs.spring.io/spring-data/data-mongo/docs/current/reference/html/mongo.core.html#mongo.custom-converters
and here:
http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert
Maybe this will already be enough for you to fix the error yourself.
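For this particular ConverterNotFoundException (java.util.LinkedHashMap to com.mongodb.DBObject), a hedged sketch of what such a custom reading converter could look like; the class name is illustrative, registration goes through the custom conversions of your Mongo configuration, and the exact setup varies across Spring Data MongoDB versions:
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import java.util.LinkedHashMap;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;

@ReadingConverter
public class LinkedHashMapToDBObjectConverter
        implements Converter<LinkedHashMap<String, Object>, DBObject> {
    @Override
    public DBObject convert(LinkedHashMap<String, Object> source) {
        // BasicDBObject implements DBObject and wraps a Map; nested
        // documents may need recursive handling.
        return new BasicDBObject(source);
    }
}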
Edit: a short comment about this line:
potentialCandidatesObj.setPotentialcandidates((DBObject)JSON.parse(valStr));
You are casting to DBObject before calling the setter, because the setter takes a DBObject. This is bad: you should create another setter that accepts the JSON string and do the casting there, or you will end up repeating that casting operation everywhere in your code; that's not very DRY.
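A small sketch of that suggestion, keeping the cast in one place inside the document class (the setter name is illustrative):
public void setPotentialcandidatesFromJson(String json) {
    this.potentialcandidates = (DBObject) JSON.parse(json);
}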
There is also something called DBRef in Spring Data:
The mapping framework doesn't have to store child objects embedded within the document. You can also store them separately and use a DBRef to refer to that document. When the object is loaded from MongoDB, those references will be eagerly resolved and you will get back a mapped object that looks the same as if it had been stored embedded within your master document.
You might prefer this over an embedded DBObject.
Abstract
I have a working application in Appengine using Java and JDO 3.
I found these arguments (auto_now and auto_now_add), which correspond exactly to what I want to implement in Java. So essentially the question is: how do I convert AppEngine's Python DateTimeProperty to Java JDO?
Constraints
Converting my application to Python is not an option.
Adding two Date properties and manually populating these values whenever a create/update happens is not an option.
I'm looking for a solution which corresponds to what JDO/Appengine/Database authors had in mind for this scenario when they created the APIs.
It would be preferable to have a generic option: say I have 4 entities in classes: C1, C2, C3, C4 and the solution is to add a base class C0, which all 4 entities would extend, so the 4 entities don't even know they're being "audited".
[update] I tried (using a simple entity)
@PersistenceCapable public class MyEntity {
@Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true")
private Long id;
@Persistent private String name;
...
1. @Persistent public Date getLastUpdate() { return new Date(); }
As suggested by the answer below, but it seems to always update the value, even when I just load the entity from the datastore or modify an unrelated field (e.g. the String name).
You can easily enough have a property (setter/getter) on a java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore.
Without the following hack, I can't read the value stored in the datastore [nor with the hack :( ]:
@Persistent public Date getLastUpdate() { return new Date(); }
private Date prevUpdate;
public void setLastUpdate(Date lastUpdate) { this.prevUpdate = lastUpdate; }
public Date getPrevUpdate() { return prevUpdate; }
Is there any way to differentiate whether a persistence operation is in progress or my own code is calling the getter?
2. @Persistent(customValueStrategy = "auto_now_add") private Date lastUpdate;
I modeled auto_now_add after org.datanucleus.store.valuegenerator.TimestampGenerator, replacing Timestamp with java.util.Date.
But it was only populated once, at the first makePersistent call, regardless of how many times I modified other fields and called makePersistent. Also note that it doesn't seem to behave as the documentation says (or my English is rusty):
Please note that by defining a value-strategy for a field then it will, by default, always generate a value for that field on persist. If the field can store nulls and you only want it to generate the value at persist when it is null (i.e you haven't assigned a value yourself) then you can add the extension "strategy-when-notnull" as false
3. preStore using PersistenceManager.addInstanceLifecycleListener
Works as expected, but I couldn't make it work across multiple entities using a base class.
pm.addInstanceLifecycleListener(new StoreLifecycleListener() {
@Override public void preStore(InstanceLifecycleEvent event) {
MyEntity entity = (MyEntity)event.getPersistentInstance();
entity.setLastUpdate(new Date());
}
@Override public void postStore(InstanceLifecycleEvent event) {}
}, MyEntity.class);
4. implements StoreCallback and public void jdoPreStore() { this.setLastUpdate(new Date()); }
Works as expected, but I couldn't make it work across multiple entities using a base class.
To satisfy my 4th constraint (using solutions 3 or 4)
Whatever I do, I can't make the following structure work:
public abstract class Dateable implements StoreCallback {
@Persistent private Date created;
@Persistent private Date lastUpdate;
public Dateable() { created = new Date(); }
public void jdoPreStore() { this.setLastUpdate(new Date()); }
// ... normal get/set properties for the above two
}
@PersistenceCapable public class MyEntity extends Dateable {
@Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true") private Long id;
@Persistent private String name;
The problems when the enhancer runs:
public abstract class Dateable:
DataNucleus.MetaData Registering class "[...].Dateable" as not having MetaData.
public abstract class Dateable with the above log, but running the code anyway:
Creation date changes whenever I create or read the data from datastore.
@PersistenceCapable public abstract class Dateable:
DataNucleus.MetaData Class "[...].MyEntity" has been specified with 1 primary key fields, but this class is using datastore identity and should be application identity.
JDO simply provides persistence of Java classes (and their fields/properties), so I don't see what the design of JDO has to do with it.
You can easily enough have a property (setter/getter) on a java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore. Either that or you use a preStore listener to be able to set things just before persistence so the desired value goes into the datastore.
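For the base-class constraint from the question, a minimal sketch of the listener route: PersistenceManager.addInstanceLifecycleListener accepts a varargs list of classes, so one listener can cover all four entities (C1..C4 and Dateable are the names from the question, and pm is your PersistenceManager):
import java.util.Date;
import javax.jdo.listener.InstanceLifecycleEvent;
import javax.jdo.listener.StoreLifecycleListener;

// Registered once; every store of C1..C4 passes through preStore first.
pm.addInstanceLifecycleListener(new StoreLifecycleListener() {
    @Override public void preStore(InstanceLifecycleEvent event) {
        ((Dateable) event.getPersistentInstance()).setLastUpdate(new Date());
    }
    @Override public void postStore(InstanceLifecycleEvent event) {}
}, C1.class, C2.class, C3.class, C4.class);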