JPA 2.1: Introducing Java 8 Date/Time API

I'd like to add support for the Java 8 Date/Time API (JSR-310) in my JPA-enabled application.
It's clear that JPA 2.1 does not support the Java 8 Date/Time API.
As a workaround, the most common advice is to use an AttributeConverter.
In my existing application, I changed my entities to use LocalDate/LocalDateTime types for the column mapping fields and added legacy setter/getters for java.util.Date to them.
I created corresponding AttributeConverter classes.
My application now fails when using Query.setParameter() with java.util.Date instances (it worked before the transition to the new API). It seems that JPA expects the new date types and does not convert on the fly.
I expected that an argument passed to setParameter() whose type has a registered AttributeConverter would be converted automatically by that converter.
But this does not seem to be the case, at least not with EclipseLink 2.6.2:
java.lang.IllegalArgumentException: You have attempted to set a value of type class java.util.Date for parameter closeDate with expected type of class java.time.LocalDate from query string SELECT obj FROM [...]
at org.eclipse.persistence.internal.jpa.QueryImpl.setParameterInternal(QueryImpl.java:937) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.jpa.EJBQueryImpl.setParameter(EJBQueryImpl.java:593) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.jpa.EJBQueryImpl.setParameter(EJBQueryImpl.java:1) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
[...]
Questions:
1. Is this behavior expected? Did I miss something?
2. Is there a way to use the new date types as fields without breaking existing code?
3. How did you deal with the transition to the new Date/Time API within JPA?
UPDATE:
However, it seems that, at least with EclipseLink, custom types for which an AttributeConverter exists are not fully supported:
Within JPQL queries, neither the actual field type nor the converted database type can be used as a parameter.
When using the converted database type, the exception described above occurs.
When using the actual field type (e.g. LocalDate), it's passed directly to the JDBC driver, which doesn't know this type:
Caused by: java.sql.SQLException: Invalid column type
at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:10495)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:9974)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:10799)
at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:10776)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.setObject(OraclePreparedStatementWrapper.java:241)
at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.setParameterValueInDatabaseCall(DatabasePlatform.java:2506)
I would expect that EclipseLink converts the field type to the java.sql type using the AttributeConverter.
(see also this bug report: https://bugs.eclipse.org/bugs/show_bug.cgi?id=494999 )
Which leads us to the most important question #4:
Is there a workaround/solution to support Java 8 date fields using EclipseLink, including the possibility to use query parameters on such a field?
ADDITIONAL INFO
AttributeConverter used (for LocalDateTime conversion)
Additional information to reproduce exception

Some time ago, I converted a Java EE 7 web app from Java 7 to Java 8, and replaced java.util.Date in entities with LocalDate and LocalDateTime.
Yes, that behavior is expected, because an AttributeConverter only converts between the type used for entity fields and the database type (i.e., java.sql.Date, etc.); it does not convert between an entity field type and a java.util.Date used as a query parameter.
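For illustration, a converter of the kind the question describes, for a LocalDate attribute, might look like this (a representative sketch, not the asker's actual class):
@Converter(autoApply = true)
public class LocalDateConverter implements AttributeConverter<LocalDate, java.sql.Date> {

    @Override
    public java.sql.Date convertToDatabaseColumn(LocalDate attribute) {
        // entity field type -> database type: one of the two directions the contract covers
        return attribute == null ? null : java.sql.Date.valueOf(attribute);
    }

    @Override
    public LocalDate convertToEntityAttribute(java.sql.Date dbData) {
        // database type -> entity field type: a java.util.Date query parameter never passes through here
        return dbData == null ? null : dbData.toLocalDate();
    }
}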
As far as I know, no: there is no way to continue using java.util.Date in existing code after introducing the java.time types into JPA entities.
Besides creating the necessary AttributeConverter implementations, I changed all occurrences of java.util.Date to the appropriate java.time types, not only in entities but also in JPA-QL queries and in business methods.
For item 2, you can of course go some way by using utility methods and getters/setters that convert between java.util and java.time, but that will only take you so far. More importantly, I don't quite see the point of introducing java.time types into JPA entity attributes if you are not willing to convert the remaining code that uses these attributes. After the conversion work I did in that Java EE app, there were no uses of java.util.Date left anywhere (though I also had to create converters for JSF).

With so many bugs in the provider itself, I don't think you have much choice but to use java.util.Date on the mapping level and Java 8 dates on the API level.
Assuming you write a utility class called DateUtils for conversion to/from java.util dates (a sketch of such a class is shown after the query example below), you could define your mappings as follows:
@Entity
public class MyEntity {

    @Column(name = "DATE")
    @Temporal(TemporalType.TIMESTAMP) // map the java.util.Date as a TIMESTAMP column
    private Date date; // java.util.Date

    public void setDate(LocalDateTime date) {
        this.date = DateUtils.convertToDate(date);
    }

    public LocalDateTime getDate() {
        return DateUtils.convertFromDate(date);
    }
}
Then to filter by date in JPQL:
public List<MyEntity> readByDateGreaterThan(LocalDateTime date) {
    TypedQuery<MyEntity> query = em.createQuery(
            "select e from MyEntity e where e.date > :date", MyEntity.class);
    query.setParameter("date", DateUtils.convertToDate(date));
    return query.getResultList();
}
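For completeness, the DateUtils helper assumed above could look roughly like this (a minimal sketch; only the two method names used in the mapping above are taken from that code, the rest is illustrative):
public final class DateUtils {

    private DateUtils() {
    }

    // LocalDateTime -> java.util.Date, interpreting the value in the system default time zone
    public static Date convertToDate(LocalDateTime dateTime) {
        return dateTime == null
                ? null
                : Date.from(dateTime.atZone(ZoneId.systemDefault()).toInstant());
    }

    // java.util.Date -> LocalDateTime, again via the system default time zone
    public static LocalDateTime convertFromDate(Date date) {
        return date == null
                ? null
                : LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault());
    }
}
Whether the system default zone or a fixed zone such as UTC is appropriate depends on how the timestamps in the column are meant to be interpreted.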
So, java.util dates would be used in entities and DAOs (Repositories) internally, while the API exposed by entities and DAOs would take/return Java 8 dates, thus enabling the rest of the application to operate with Java 8 dates only.

I have the following setup:
EclipseLink v2.6.2
H2 Database v1.4.191
Java 8
The entity class is like:
@Entity
public class MeasuringPoint extends BaseEntity {

    @Column(nullable = false)
    private LocalDateTime when;

    public void setWhen(LocalDateTime when) {
        this.when = when;
    }

    public LocalDateTime getWhen() {
        return when;
    }
}
The required converter for JPA 2.1 is:
@Converter(autoApply = true)
public class LocalDateTimeConverter implements AttributeConverter<LocalDateTime, Timestamp> {

    @Override
    public Timestamp convertToDatabaseColumn(LocalDateTime attribute) {
        return attribute == null ? null : Timestamp.valueOf(attribute);
    }

    @Override
    public LocalDateTime convertToEntityAttribute(Timestamp dbData) {
        return dbData == null ? null : dbData.toLocalDateTime();
    }
}
Now I can make a query
List<?> result = em.createQuery(
"SELECT p FROM MeasuringPoint p WHERE p.when = :custDate")
.setParameter("custDate", LocalDateTime.now())
.getResultList();
and it works like a charm. The result contains the expected entities. The conversion to TIMESTAMP is done automatically. When you have queries using the Criteria API, have a look at this answer which shows how to use LocalDateTime within Criteria API queries.
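The answer linked above is not reproduced here, but a minimal Criteria API equivalent of the query above might look like this (a sketch using the MeasuringPoint entity from this setup):
CriteriaBuilder cb = em.getCriteriaBuilder();
CriteriaQuery<MeasuringPoint> cq = cb.createQuery(MeasuringPoint.class);
Root<MeasuringPoint> root = cq.from(MeasuringPoint.class);
// the LocalDateTime attribute is compared directly against a LocalDateTime value
cq.select(root).where(cb.equal(root.get("when"), LocalDateTime.now()));
List<MeasuringPoint> result = em.createQuery(cq).getResultList();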
I wonder why it doesn't work with your code. Maybe the H2 JDBC driver supports something the Oracle driver doesn't.

AttributeConverter is working as designed: the converter is meant to handle the back and forth between your entity type and the database type. Validation verifies that the type of the parameter matches the type within the entity; just because your attribute converter could handle it doesn't mean it fits the contract of the attribute converter types. JPA only says the value will go through the converter before going to the database, and in this case it doesn't go to the database.
If you don't like Rogério's suggestions, you can
1) modify the EclipseLink code to relax the validation to allow it to go through to your converter, or
2) change your attribute type to 'Object' instead so that all parameter types you may pass in will go to your converter.

Related

Spring Data MongoDB Field Converter: Any way to pass a parameter?

When our application is loading a DBObject/Document instance from our MongoDB, we need to convert all of the UTC dates in it into ZonedDateTime values using the zoneIdName, which is stored within the DBObject/Document instance. We have a lot of Date fields in the DBObject, so I'd like to avoid having to implement a DBObject-level Spring Data MongoDB Converter.
Is it possible to use a DBObject field-level Spring Data MongoDB Converter, such as the following, which uses a field in the DBObject (i.e. zoneIdName) to perform the conversion to a ZonedDateTime? If not, I will have to implement an object-level Converter, but it will be tedious...
Thank you for your interest and time.
class ZonedDateTimeReadConverter : Converter<Date, ZonedDateTime> {
    override fun convert(date: Date): ZonedDateTime {
        // Need to replace "America/Los_Angeles" with
        // a value from a field in the object being processed.
        return date.toInstant().atZone(ZoneId.of("America/Los_Angeles"))
    }
}
It seems that a converter for the whole object is the only option.
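To sketch what that would look like in Java (illustrative only: MyDocument, its field names, and the converter name are made up and not taken from the original post; Converter is org.springframework.core.convert.converter.Converter and Document is org.bson.Document), a document-level read converter can pull zoneIdName out of the same Document it is converting:
@ReadingConverter
public class MyDocumentReadConverter implements Converter<Document, MyDocument> {

    @Override
    public MyDocument convert(Document source) {
        // the zone id travels inside the document, so it is available here
        ZoneId zone = ZoneId.of(source.getString("zoneIdName"));

        MyDocument target = new MyDocument();
        Date createdAt = source.getDate("createdAt"); // one of the many Date fields
        if (createdAt != null) {
            target.setCreatedAt(createdAt.toInstant().atZone(zone));
        }
        // ...repeat for the other Date fields, which is exactly the tedious part
        return target;
    }
}
Such a converter would then be registered through the custom conversions of the Mongo configuration, like any other Spring Data converter.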

Neo4j OGM #Properties which entry types are supported?

I am trying to persist the following entity into a Neo4j database with Spring Data Neo4j (SDN). The entity has a property of type java.util.Map<CustomEnum, Instant>.
Check the following example code:
public enum CustomEnum {
    TREE, LEAVE, FLOWER;
}

@NodeEntity
public class ExampleEntity {

    @Id
    @GeneratedValue
    private Long id;

    // omitted simple properties of type String

    @Properties(allowCast = true)
    Map<CustomEnum, Instant> myMapProperty = new HashMap<>();
}
The problem I have is that Neo4j OGM complains that it is not able to persist the Map<CustomEnum, Instant> because of an unsupported type.
org.neo4j.ogm.exception.core.MappingException:
I located the source of the exception to come from the MapCompositeConverter: Link to Github.
If my analysis is correct, the core issue lies in OGM only allowing the default Cypher types as defined in AbstractConfigurableDriver: Link to Github
This is different from the behavior explained in the documentation here, which says that many native Java types (including the temporal types Instant, LocalDate and Period) should be supported.
I would be very happy about a pointer in the right direction.
Thank you in advance for your help.
Spring Data Neo4j supports basic types like String, Integer, Long, and so on.
Some more complex types like Instant and Date are also supported, but only because Spring Data Neo4j uses OGM, which comes with a set of AttributeConverters that implicitly convert Instant and Date to String.
You can also define your own converters and put them on your @Property attributes.
For example you can build the FooToStringConverter as follows:
public class FooToStringConverter implements AttributeConverter<Foo, String> {

    public String convertToDatabaseColumn(Foo foo) {
        return foo.toString();
    }

    public Foo convertToEntityAttribute(String fooString) {
        return Foo.fromString(fooString);
    }
}
And then annotate your entity as:
@Property
@Converter(converter = FooToStringConverter.class)
private Foo foo;
However, the extensive use of converters kind of kills the traverse-the-graph advantage you get with Neo4j, since you now need to use indexes.
It'll work that way, but maybe you should rethink the architecture to use more nodes instead of embedded complex properties.
Hope this helps.

How to upgrade database with newer JPA timestamp fields?

I have a Spring Boot application that uses JPA and Hibernate to automagically manage my entities. When I created this application, I used an older version of JPA that didn't have support for the Java 8 Date/Time API. However, without a lot of knowledge about JPA, I used LocalDateTime in my entities and it worked! Not having to know about the underlying database structure was great!
Until now...
I am upgrading JPA to a version that does support LocalDateTime, and I am facing an error with the way JPA uses this field. It used to save this object as a VARBINARY (tinyblob) field in my MySQL database, but now it is smarter and expects it to be a TIMESTAMP type. This means that when I start my application with the configuration spring.jpa.hibernate.ddl-auto=validate, I get the error:
...
Caused by: org.hibernate.tool.schema.spi.SchemaManagementException:
Schema-validation: wrong column type encountered in column [answer_time] in table [user_answer];
found [tinyblob (Types#VARBINARY)], but expecting [datetime (Types#TIMESTAMP)]
So now I am kinda lost on how to convert these fields to their new timestamp type. I was thinking about using Flyway to write a migration script, but I have no idea how JPA stored the object as a blob. When printing a VARBINARY field as a string, this is what it looks like:
’ sr
java.time.Ser]º"H² xpw ã
!;:;Ö#x
This is what my entity looks like (it was unchanged during the upgrade):
@Entity
@Table(name = "user_answer")
public class UserAnswer {

    private Long id;
    private LocalDateTime answerTime;

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public LocalDateTime getAnswerTime() {
        return answerTime;
    }

    public void setAnswerTime(LocalDateTime answerTime) {
        this.answerTime = answerTime;
    }
}
How can I update my database so it converts the old VARBINARY fields that it used to store LocalDateTime data to TIMESTAMP fields?
What I would try (after backing up the DB!):
1. Keep the old JPA API + implementation (Hibernate) versions.
2. Keep the old LocalDateTime field.
3. Add another field of a directly supported date/time type (e.g. java.sql.Timestamp) to your entity. Make sure to annotate it properly etc. so that Hibernate knows exactly how the column should be defined.
4. For all entities: load each entity, read the LocalDateTime, convert and store it into the new field, and merge().
5. Remove the old LocalDateTime field.
6. Remove the column for the old LocalDateTime from the table.
7. Change the type of the new field to LocalDateTime.
8. Upgrade the JPA API + implementation (Hibernate) versions.
After the upgrade, the JPA implementation (Hibernate) should store the LocalDateTime as TIMESTAMP, and the JDBC driver should be able to read the TIMESTAMP back into a LocalDateTime.
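For concreteness, a sketch of the transitional entity and the migration pass described in steps 3-4 (illustrative only: the new field and column names are made up, and Timestamp is used here as the directly supported type):
@Entity
@Table(name = "user_answer")
public class UserAnswer {

    private Long id;
    private LocalDateTime answerTime; // old field, still stored in the VARBINARY column for now
    private Timestamp answerTimeTs;   // new field, mapped to a real TIMESTAMP column

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public LocalDateTime getAnswerTime() { return answerTime; }
    public void setAnswerTime(LocalDateTime answerTime) { this.answerTime = answerTime; }

    @Column(name = "answer_time_ts")
    public Timestamp getAnswerTimeTs() { return answerTimeTs; }
    public void setAnswerTimeTs(Timestamp answerTimeTs) { this.answerTimeTs = answerTimeTs; }
}

// one-off migration pass (step 4), run while both fields are present
for (UserAnswer a : em.createQuery("select a from UserAnswer a", UserAnswer.class).getResultList()) {
    if (a.getAnswerTime() != null) {
        a.setAnswerTimeTs(Timestamp.valueOf(a.getAnswerTime()));
    }
    em.merge(a);
}
Once every row has been migrated, the old field and its column can be dropped and the remaining field switched to LocalDateTime under the upgraded versions (steps 5-8).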
Also, consider ZonedDateTime rather than LocalDateTime.

Spring optimistic lock for MongoDB document with java.util.Date field

I'm trying to implement optimistic locking for documents in an existing MongoDB database. Currently there is no version field and I would like to avoid adding it because we'll have to stop the application.
But there is a lastModified date field and it seems that I can make use of it like this:
@LastModifiedDate
@Version
private Date lastModified;
But when I marked this field as @Version and tried to save an item, I got the following exception:
No converter found capable of converting from type [java.util.Date] to type [java.lang.Number]
So, I also added Date to Number and Long to Date converters to my configuration:
@Configuration
public class MongoConfig extends AbstractMongoConfiguration {
    ...
    @Override
    public CustomConversions customConversions() {
        return new CustomConversions(CustomConversions.StoreConversions.NONE,
                Arrays.asList(
                        new DateToNumberConverter(),
                        new LongToDateConverter()
                ));
    }
}
This works like a charm for existing documents. But when I try to add a new document, I get:
No converter found capable of converting from type [java.lang.Integer] to type [java.util.Date]
If I then add an Integer to Date converter, the new document is saved in the DB, but all the dates are now NumberLong instead of ISODate, i.e. what used to be "lastModified" : ISODate("2018-10-02T07:30:12.005Z") is now "lastModified" : NumberLong("1538465479364"). This breaks the consistency between existing documents and new ones.
So the questions are:
Is there any possibility to use java.util.Date with #Version so that all dates are stored as ISODate in MongoDB?
Could anyone point to documentation on optimistic locking in Spring Data for MongoDB apart from this: https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mongo-template.optimistic-locking ?
It seems that currently there is no way to use Date as a version field, because it is cast to Number inside MongoTemplate.
I solved the issue by using a custom MongoTemplate that extends the Spring Data MongoTemplate and overrides the required methods. Unfortunately I had to copy-paste a lot of code because the overridden logic is in private methods.

JPQL and date comparison (constraint in the query)

My Application model object contains a date field (time stamp):
@Entity
@Table(name = "MYTABLE")
public class Application {
    private Date timeStamp;
    ...
}
I'm trying to construct a JPQL query that would select all applications that were changed today (i.e. their time stamp was changed anytime today). What is the best way to do this?
There is no perfect way with standard JPQL, since JPQL doesn't provide date arithmetic functions. But your provider might offer extensions (e.g. Hibernate and EclipseLink do), or you can use native SQL.
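One portable workaround (a sketch, not part of the original answer) is to compute the bounds of "today" in Java and pass them as ordinary parameters, so the JPQL itself needs no date arithmetic:
// assumes the Application entity above, with a java.util.Date timeStamp field
LocalDate today = LocalDate.now();
Date startOfDay = Date.from(today.atStartOfDay(ZoneId.systemDefault()).toInstant());
Date startOfNextDay = Date.from(today.plusDays(1).atStartOfDay(ZoneId.systemDefault()).toInstant());

List<Application> changedToday = em.createQuery(
        "SELECT a FROM Application a WHERE a.timeStamp >= :start AND a.timeStamp < :end",
        Application.class)
    .setParameter("start", startOfDay, TemporalType.TIMESTAMP)
    .setParameter("end", startOfNextDay, TemporalType.TIMESTAMP)
    .getResultList();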
