When our application loads a DBObject/Document instance from MongoDB, we need to convert all of the UTC dates in it into ZonedDateTime values using the zoneIdName field, which is stored within the same DBObject/Document instance. We have a lot of Date fields in the DBObject, so I'd like to avoid implementing a DBObject-level Spring Data MongoDB Converter.
Is it possible to use a field-level Spring Data MongoDB Converter, such as the following, that reads a field of the enclosing DBObject (i.e. zoneIdName) in order to perform the conversion to a ZonedDateTime? If not, I will have to implement an object-level Converter, but that would be tedious...
Thank you for your interest and time.
import java.time.ZoneId
import java.time.ZonedDateTime
import java.util.Date
import org.springframework.core.convert.converter.Converter

class ZonedDateTimeReadConverter : Converter<Date, ZonedDateTime> {
    override fun convert(date: Date): ZonedDateTime {
        // Need to replace "America/Los_Angeles" with
        // a value from a field in the object being processed.
        return date.toInstant().atZone(ZoneId.of("America/Los_Angeles"))
    }
}
It seems that a converter for the whole object is the only option.
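If you do end up writing the object-level converter, the per-field logic itself is small. Here is a framework-free sketch of that logic, using a plain Map in place of org.bson.Document and hypothetical field names (zoneIdName, createdAt); it assumes zoneIdName holds a valid zone ID:

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;

public class ZoneAwareConversionSketch {
    // Convert every java.util.Date value in the raw document using the
    // document's own zoneIdName field (field names are hypothetical).
    static Map<String, Object> convertDates(Map<String, Object> rawDoc) {
        ZoneId zone = ZoneId.of((String) rawDoc.get("zoneIdName"));
        Map<String, Object> result = new HashMap<>();
        for (Map.Entry<String, Object> e : rawDoc.entrySet()) {
            if (e.getValue() instanceof Date) {
                result.put(e.getKey(),
                        ((Date) e.getValue()).toInstant().atZone(zone));
            } else {
                result.put(e.getKey(), e.getValue());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new HashMap<>();
        doc.put("zoneIdName", "America/Los_Angeles");
        doc.put("createdAt", new Date(0L));
        ZonedDateTime zdt = (ZonedDateTime) convertDates(doc).get("createdAt");
        System.out.println(zdt.getZone()); // America/Los_Angeles
    }
}
```

In a real object-level Converter you would apply the same loop to the Document handed to you by Spring Data.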
We recently upgraded Spring Data Mongo, and since then some read and update queries have been failing with the following error:
org.springframework.data.mapping.MappingException: Expected to read Document Document{} into type class java.lang.Object but didn't find a PersistentEntity for the latter!
While debugging, we found that the error is thrown when the target type is Object and the document doesn't have a _class field.
It was working fine before 3.2.
When we checked the 3.1 source code, the read method in the MappingMongoConverter class has logic to cast the BSON to Object when the ClassTypeInformation is Object.
In a few of our object models, we store additional information about the document as an Object.
Is it possible to configure Spring to convert the BSON to Object instead of failing?
Thanks in advance.
Spring Data Mongo can take the type information from _class, but if the application cannot rely on that field, it needs to tell the driver what type of data it expects.
My current project uses this for that purpose:
public interface BusinessesRepository extends MongoRepository<SomeApplicationType, String>
(Spring Data will autogenerate a subclass with the proper class selection logic for such an interface.)
Slightly lower-level you have MongoTemplate with its find(Query, Class<T>) function, which accepts something like SomeApplicationType.class as the second parameter and uses that information to instantiate an object to populate the fields with.
I don't know what function your code calls to retrieve and deserialize your data, so I can't be more specific.
From my view I send an async request to the controller with JSON data like the following:
{
  "filters": {
    "someField1": "someValue",
    "someField2": "someValue",
    "someField3": null,
    "someField4": null
  }
}
But the data can differ from request to request.
I have an Order entity that has the same fields, so I can convert the JSON to a POJO.
After that using JPA I can do following:
Example<Order> orderExample = Example.of(orderFromJson);
orderRepository.findAll(orderExample);
But I use spring-data-jdbc, which doesn't support Query by Example. What can replace it?
For cases like this, where no direct support is offered, the correct approach is to have a JdbcTemplate or NamedParameterJdbcTemplate injected and construct the required SQL from your filter information. You can expose this as a custom repository method.
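The SQL-building part can be done without any framework code. Below is a minimal sketch that turns the non-null filter values into a named-parameter WHERE clause; the table name and the idea of feeding the result to NamedParameterJdbcTemplate are assumptions, and null filters are simply skipped:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.StringJoiner;

public class FilterQueryBuilder {
    // Build a named-parameter WHERE clause from the non-null filter values.
    // The resulting SQL and params map can then be passed to a
    // NamedParameterJdbcTemplate (table/column names are hypothetical).
    static String buildSql(Map<String, Object> filters, Map<String, Object> params) {
        StringJoiner where = new StringJoiner(" AND ");
        for (Map.Entry<String, Object> e : filters.entrySet()) {
            if (e.getValue() != null) {
                where.add(e.getKey() + " = :" + e.getKey());
                params.put(e.getKey(), e.getValue());
            }
        }
        String sql = "SELECT * FROM orders";
        return params.isEmpty() ? sql : sql + " WHERE " + where;
    }

    public static void main(String[] args) {
        Map<String, Object> filters = new LinkedHashMap<>();
        filters.put("status", "OPEN");
        filters.put("customerId", null); // null filters are skipped
        Map<String, Object> params = new LinkedHashMap<>();
        System.out.println(buildSql(filters, params));
        // SELECT * FROM orders WHERE status = :status
    }
}
```

Note that binding values as named parameters (rather than concatenating them into the SQL) keeps the query safe from injection; only the column names come from your filter keys, so those should be validated against a known whitelist.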
I'm trying to implement optimistic locking for documents in an existing MongoDB database. Currently there is no version field, and I would like to avoid adding one because we would have to stop the application.
But there is a lastModified date field and it seems that I can make use of it like this:
@LastModifiedDate
@Version
private Date lastModified;
But when I marked this field with @Version and tried to save an item, I got the following exception:
No converter found capable of converting from type [java.lang.Date] to type [java.lang.Number]
So, I also added Date to Number and Long to Date converters to my configuration:
@Configuration
public class MongoConfig extends AbstractMongoConfiguration {
    ...
    @Override
    public CustomConversions customConversions() {
        return new CustomConversions(CustomConversions.StoreConversions.NONE,
                Arrays.asList(
                        new DateToNumberConverter(),
                        new LongToDateConverter()
                ));
    }
}
This works like a charm for existing documents. But when I try to add a new document, I get:
No converter found capable of converting from type [java.lang.Integer] to type [java.util.Date]
If I then add an Integer-to-Date converter, the new document is saved in the DB, but all the dates are now stored as NumberLong instead of ISODate, i.e. what was "lastModified" : ISODate("2018-10-02T07:30:12.005Z") is now "lastModified" : NumberLong("1538465479364"). This breaks the consistency between existing documents and new ones.
So the questions are:
Is there any possibility to use java.util.Date with @Version so that all dates are stored as ISODate in MongoDB?
Could anyone point to documentation on optimistic locking in Spring Data for MongoDB apart from this: https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mongo-template.optimistic-locking ?
It seems that currently there is no possibility to use Date as a version field, because it is cast to Number inside MongoTemplate.
I solved the issue with a custom MongoTemplate that extends the Spring Data MongoTemplate and overrides the required methods. Unfortunately I had to copy-paste a lot of code, because the overridden logic lives in private methods.
I'd like to add support for the Java 8 Date/Time API (JSR-310) in my JPA-enabled application.
It's clear that JPA 2.1 does not support the Java 8 Date/Time API.
As a workaround, the most common advice is to use an AttributeConverter.
In my existing application, I changed my entities to use LocalDate/LocalDateTime types for the column mapping fields and added legacy setter/getters for java.util.Date to them.
I created corresponding AttributeConverter classes.
My application now fails when Query.setParameter() is called with java.util.Date instances (it worked before the transition to the new API). It seems that JPA expects the new date types and does not convert on the fly.
I expected that if passing an argument to setParameter() of a type for which an AttributeConverter has been registered, it would be automatically converted by the converter.
But this seems to be not the case, at least not using EclipseLink 2.6.2:
java.lang.IllegalArgumentException: You have attempted to set a value of type class java.util.Date for parameter closeDate with expected type of class java.time.LocalDate from query string SELECT obj FROM [...]
at org.eclipse.persistence.internal.jpa.QueryImpl.setParameterInternal(QueryImpl.java:937) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.jpa.EJBQueryImpl.setParameter(EJBQueryImpl.java:593) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
at org.eclipse.persistence.internal.jpa.EJBQueryImpl.setParameter(EJBQueryImpl.java:1) ~[eclipselink-2.6.2.jar:2.6.2.v20151217-774c696]
[...]
Questions:
Is this behavior expected? Did I miss something?
Is there a way to use the new date types as fields without breaking existing code?
How did you deal with the transition to the new Date/Time API within JPA?
UPDATE:
However, it seems that, at least with EclipseLink, custom types for which an AttributeConverter exists are not fully supported:
Within JPQL queries, neither the actual field type nor the converted database type can be used as a parameter.
When using the converted database type, the exception described above occurs.
When using the actual field type (e.g. LocalDate), it's passed directly to the JDBC driver, which doesn't know this type:
Caused by: java.sql.SQLException: Invalid column type
at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:10495)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:9974)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:10799)
at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:10776)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.setObject(OraclePreparedStatementWrapper.java:241)
at org.eclipse.persistence.internal.databaseaccess.DatabasePlatform.setParameterValueInDatabaseCall(DatabasePlatform.java:2506)
I would expect that EclipseLink converts the field type to the java.sql type using the AttributeConverter.
(see also this bug report: https://bugs.eclipse.org/bugs/show_bug.cgi?id=494999 )
Which leads us to the most important question #4:
Is there a workaround/solution to support Java 8 date fields using EclipseLink, including the possibility to use query parameters on such a field?
ADDITIONAL INFO
AttributeConverter used (for LocalDateTime conversion)
Additional information to reproduce exception
Some time ago, I converted a Java EE 7 web app from Java 7 to Java 8, and replaced java.util.Date in entities with LocalDate and LocalDateTime.
Yes, that behavior is expected, because an AttributeConverter only converts between the type used for entity fields and the database type (i.e., java.sql.Date, etc.); it does not convert between an entity field type and a java.util.Date used as a query parameter.
As far as I know, no, there is no way to continue using java.util.Date in existing code, after introducing the java.time types into JPA entities.
Besides creating the necessary AttributeConverter implementations, I changed all occurrences of java.util.Date to the appropriate java.time types, not only in entities but also in JPA-QL queries and in business methods.
For item 2, you can of course go some way by using utility methods and getters/setters that convert between java.util and java.time, but it's not going to go all the way. More importantly, I don't quite see the point of introducing java.time types into JPA entity attributes if you are not willing to convert the remaining code that uses these attributes. After the conversion work I did in that Java EE app, there were no uses of java.util.Date left anywhere (though I also had to create converters for JSF).
With so many bugs in the provider itself I don't think you have much choice but to use java.util.Date on the mapping level and java 8 dates on the API level.
Assuming you write a utility class for conversion to/from java.util dates called DateUtils, you could define your mappings as follows:
@Entity
public class MyEntity {
    @Column(name = "DATE")
    private Date date; // java.util.Date
    public void setDate(LocalDateTime date) {
        this.date = DateUtils.convertToDate(date);
    }
    public LocalDateTime getDate() {
        return DateUtils.convertFromDate(date);
    }
}
Then to filter by date in JPQL:
public List<MyEntity> readByDateGreaterThan(LocalDateTime date) {
    Query query = em.createQuery("select e from MyEntity e where e.date > :date");
    query.setParameter("date", DateUtils.convertToDate(date));
    return query.getResultList();
}
So, java.util dates would be used in entities and DAOs (Repositories) internally, while the API exposed by entities and DAOs would take/return java 8 dates, thus enabling the rest of the application to operate with java 8 dates only.
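A DateUtils along those lines can be as small as the sketch below. It pins the conversion to the system default zone, which is an assumption; you may want to pass the ZoneId explicitly instead:

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Date;

public final class DateUtils {
    private DateUtils() {}

    // java.time -> java.util, via the system default zone.
    public static Date convertToDate(LocalDateTime ldt) {
        return ldt == null ? null
                : Date.from(ldt.atZone(ZoneId.systemDefault()).toInstant());
    }

    // java.util -> java.time, via the system default zone.
    public static LocalDateTime convertFromDate(Date date) {
        return date == null ? null
                : LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault());
    }

    public static void main(String[] args) {
        // Round-tripping through java.util.Date preserves the value
        // (Date only has millisecond precision, so drop the nanos first).
        LocalDateTime now = LocalDateTime.now().withNano(0);
        System.out.println(convertFromDate(convertToDate(now)).equals(now));
    }
}
```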
I have the following setup:
EclipseLink v2.6.2
h2 Database v1.4.191
Java 8
The entity class is like:
@Entity
public class MeasuringPoint extends BaseEntity {
    @Column(nullable = false)
    private LocalDateTime when;
    public void setWhen(LocalDateTime when) {
        this.when = when;
    }
    public LocalDateTime getWhen() {
        return when;
    }
}
The required converter for JPA 2.1 is:
@Converter(autoApply = true)
public class LocalDateTimeConverter implements AttributeConverter<LocalDateTime, Timestamp> {
    @Override
    public Timestamp convertToDatabaseColumn(LocalDateTime attribute) {
        return attribute == null ? null : Timestamp.valueOf(attribute);
    }
    @Override
    public LocalDateTime convertToEntityAttribute(Timestamp dbData) {
        return dbData == null ? null : dbData.toLocalDateTime();
    }
}
Now I can make a query
List<?> result = em.createQuery(
"SELECT p FROM MeasuringPoint p WHERE p.when = :custDate")
.setParameter("custDate", LocalDateTime.now())
.getResultList();
and it works like a charm. The result contains the expected entities. The conversion to TIMESTAMP is done automatically. When you have queries using the Criteria API, have a look at this answer which shows how to use LocalDateTime within Criteria API queries.
I wonder why it doesn't work with your code. Maybe the H2 JDBC driver supports something your Oracle driver doesn't.
AttributeConverter is working as designed: the converter handles the back-and-forth between your entity type and the database type. The validation verifies that the type of the parameter matches the type within the entity; just because your attribute converter can handle it doesn't mean it fits the contract of the attribute converter types. JPA only says the value will go through the converter before going to the database, and in this case it doesn't go to the database.
If you don't like Rogério's suggestions, you can
1) modify the EclipseLink code to relax the validation so the value passes through to your converter, or
2) change your attribute type to Object instead, so that all parameter types you may pass in will go to your converter.
I have a jersey web service that takes post data and maps it (using jackson) to a data structure that looks like
public class MyObject {
    String name;
    Object dateOrPrimitive;
}
On the javascript client that calls the web service, it could send an object that could take either of the following forms
{ name : "Jeff", dateOrPrimitive : "someOtherString" }
{ name : "Jeff", dateOrPrimitive : new Date() }
If the dateOrPrimitive field holds a date, it is deserialized into a string representation of that date. What I would like is for it to be deserialized into a Date whenever it is a valid date string.
I wrote a custom deserializer that checks if the value is a valid date and returns a date if it is and a primitive otherwise, but I was wondering if there is already a built in a way to do this.
No, there isn't such a thing in Jackson. Your custom deserializer is the way to go here. Personally I would have two different properties, one a Date and the other a primitive, but I'm guessing you don't have control over the format of the JSON.
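For reference, the core of such a deserializer is just a guarded parse attempt. A framework-free sketch of that decision (the expected date format is an assumption; match it to whatever your JavaScript client actually serializes):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateOrStringSketch {
    // Try to parse the incoming string with an assumed ISO-like pattern;
    // fall back to returning the raw string when parsing fails.
    static Object dateOrPrimitive(String raw) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
        fmt.setLenient(false); // reject almost-dates like "2018-13-99..."
        try {
            return fmt.parse(raw);
        } catch (ParseException e) {
            return raw; // not a date: keep the primitive as-is
        }
    }

    public static void main(String[] args) {
        System.out.println(dateOrPrimitive("2018-10-02T07:30:12") instanceof Date);
        System.out.println(dateOrPrimitive("someOtherString"));
    }
}
```

Inside a Jackson JsonDeserializer you would apply the same logic to the token's text and return the resulting Object.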