Hibernate @Formula about 'convert(date)' - java

I am using SQL Server 2012 and my entity is:
public class Something {

    private Date rq;

    @Temporal(TemporalType.TIMESTAMP)
    @Column(name = "rq")
    @Formula("CONVERT(DATE,rq)")
    public Date getRq() {
        return rq;
    }

    public void setRq(Date rq) {
        this.rq = rq;
    }
}
Hibernate debug log:
Hibernate:
    select
        CONVERT(dnypowergr0_.DATE,
            dnypowergr0_.rq) as formula0_
    from
        db.dbo.something dnypowergr0_
I want the 'convert' on 'rq' to actually be applied, but as the log shows, Hibernate prepends the table alias to the first argument of 'convert' (the DATE type keyword), so the generated SQL is invalid.
Have I written the code incorrectly, or am I misusing '@Formula'?

I'm not sure how to make Hibernate stop inserting the table alias where it is not needed, but there is a workaround.
You can define a transient attribute (something like convertedRq) and do the conversion in Java. In that case rq will hold the raw value of the rq column, and convertedRq will be calculated on the fly.
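A minimal sketch of that workaround, assuming a plain JPA entity and that the intent of CONVERT(DATE, rq) is to drop the time portion; the name convertedRq and the truncation logic are illustrative, not from the original post:

import java.util.Calendar;
import java.util.Date;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
import javax.persistence.Transient;

@Entity
public class Something {

    @Temporal(TemporalType.TIMESTAMP)
    @Column(name = "rq")
    private Date rq;                       // raw value as stored in the column

    @Transient
    public Date getConvertedRq() {
        // emulate CONVERT(DATE, rq) in Java by zeroing the time fields
        if (rq == null) {
            return null;
        }
        Calendar cal = Calendar.getInstance();
        cal.setTime(rq);
        cal.set(Calendar.HOUR_OF_DAY, 0);
        cal.set(Calendar.MINUTE, 0);
        cal.set(Calendar.SECOND, 0);
        cal.set(Calendar.MILLISECOND, 0);
        return cal.getTime();
    }

    public Date getRq() { return rq; }
    public void setRq(Date rq) { this.rq = rq; }
}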
Update: a solution was posted in Hibernate @Formula is not supporting Cast() as int for teradata database:
public class Oracle10gDialectExtended extends Oracle10gDialect {

    public Oracle10gDialectExtended() {
        super();
        /* types for cast: */
        registerKeyword("int");
        // add more reserved words as you need
    }
}
(c) Sergio M C Figueiredo
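For the SQL Server setup in the question, a hedged adaptation of that workaround would be to register DATE as a keyword in a custom dialect, so Hibernate stops prefixing it with the table alias inside the @Formula. The class name is illustrative and the right base dialect depends on your Hibernate version; this has not been verified against SQL Server:

import org.hibernate.dialect.SQLServer2012Dialect;

public class SQLServer2012DialectExtended extends SQLServer2012Dialect {

    public SQLServer2012DialectExtended() {
        super();
        // type names used as the first argument of CONVERT() in @Formula expressions
        registerKeyword("date");
    }
}

Then point the hibernate.dialect property at this class instead of the stock dialect.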

Related

Trouble converting between java.sql.Timestamp & java.time.Instant with JOOQ

I'm having trouble converting between java.sql.Timestamp and java.time.Instant using JOOQ converters.
Here's a simplified version of the code I'm working with.
public class User {

    private static final Converter<Timestamp, Instant> MY_CONVERTER = Converter.of(
        Timestamp.class,
        Instant.class,
        t -> t == null ? null : t.toInstant(),
        i -> i == null ? null : Timestamp.from(i)
    );

    public static Table<?> table = DSL.table("user");
    public static Field<String> name =
        DSL.field(DSL.name(table.getName(), "name"), String.class);
    public static Field<Instant> created =
        DSL.field(DSL.name(table.getName(), "created"), SQLDataType.TIMESTAMP.asConvertedDataType(MY_CONVERTER));
}
private class UserDto {
    private String name;
    private Instant created;
    // getters, setters, etc.
}
public class UserWriter {
    // constructor with injected DefaultDSLContext etc..

    public void create(UserDto user) {
        dslContext.insertInto(User.table, User.name, User.created)
            .values(user.getName(), user.getCreated())
            .execute();
    }
}
public class UserReader {
    // constructor with injected DefaultDSLContext etc..

    public Result<Record> getAll() {
        return dslContext.select().from(User.table).fetch();
    }
}
public class UserService {
    // constructor with injected UserReader etc..

    public Collection<UserDto> getAll() {
        return userReader
            .getAll()
            .stream()
            .map(Users::from)
            .collect(Collectors.toList());
    }
}
public class Users {
    public static UserDto from(Record record) {
        UserDto user = new UserDto();
        user.setName(record.get(User.name));
        user.setCreated(record.get(User.created));
        return user;
    }
}
When I create a new User the converter is called and the insertion works fine. However, when I select the Users the converter isn't called and the record.get(User.created) call in the Users::from method returns a Timestamp (and therefore fails as UserDto.setCreated expects an Instant).
Any ideas?
Thanks!
Why the converter isn't applied
From the way you phrased your question (you didn't post the exact SELECT statement that you've tried), I'm assuming you didn't pass all the column expressions explicitly. But then, how would jOOQ be able to find out what columns your table has? You declared some column expressions in some class, but that class isn't following any structure known to jOOQ. The only way to get jOOQ to fetch all known columns is to make them known to jOOQ, using code generation (see below).
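Short of code generation, with the hand-written declarations from the question you can also pass the column expressions explicitly, which lets the attached converter run on read. A minimal sketch; the method shape just mirrors UserReader/Users from the question and is illustrative:

// selecting the declared fields explicitly, so jOOQ knows their types and converters
public List<UserDto> getAll() {
    return dslContext
        .select(User.name, User.created)
        .from(User.table)
        .fetch(record -> {
            UserDto user = new UserDto();
            user.setName(record.get(User.name));
            user.setCreated(record.get(User.created)); // Instant, converter applied
            return user;
        });
}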
You could, of course, let User extend the internal org.jooq.impl.TableImpl class and use internal API to register the Field values. But why do that manually, if you can generate this code?
Code generation
I'll repeat the main point of my previous answer, which is: Please use the code generator. I've now written an entire article on why you should do this. Once jOOQ knows all of your meta data via code generation, you can just automatically select all columns like this:
UserRecord user = ctx
    .selectFrom(USER)
    .where(USER.ID.eq(...))
    .fetchOne();
Not just that, you can also configure your data types as INSTANT using a <forcedType>, so you don't need to worry about data type conversion every time.
I cannot stress this enough, and I'm frequently surprised how many projects try to use jOOQ without code generation, which removes so much of jOOQ's power. The main reason to not use code generation is if your schema is dynamic, but since you have that User class, it obviously isn't dynamic.

Hibernate set last_updated column to current_timestamp with every update

Following the in-database example in https://docs.jboss.org/hibernate/orm/5.1/topical/html_single/generated/GeneratedValues.html#__valuegenerationtype_meta_annotation, I am able to get hibernate to produce insert statements that look like
insert into my_table (col_1, col_2,..., col_n, last_updated) values (?, ?,... current_timestamp)
However, I also want it to produce update statements that look like
update my_table set col_1=?, col_2=?,..., col_n=?, last_updated = current_timestamp where ...
but instead I seem to just be getting
update my_table set col_1=?, col_2=?,..., col_n=? where ...
Here's a snippet of Java code I'm using:
public class MyEntity {
    ...
    @LastUpdatedColumn
    @Column(name = "last_updated")
    Instant last_updated;

    @ValueGenerationType(generatedBy = LastUpdatedGeneration.class)
    @Retention(RetentionPolicy.RUNTIME)
    public @interface LastUpdatedColumn {
    }

    public static class LastUpdatedGeneration implements
            AnnotationValueGeneration<LastUpdatedColumn> {

        @Override
        public void initialize(LastUpdatedColumn annotation, Class<?> propertyType) {
        }

        @Override
        public GenerationTiming getGenerationTiming() {
            return GenerationTiming.ALWAYS;
        }

        @Override
        public ValueGenerator<?> getValueGenerator() {
            return null;
        }

        @Override
        public boolean referenceColumnInSql() {
            return true;
        }

        @Override
        public String getDatabaseGeneratedReferencedColumnValue() {
            return "current_timestamp";
        }
    }
...
and elsewhere I use Spring Boot/Spring JPA to just do:
myRepository.saveAll(setOfMyEntities)
My main requirement here is to use the insert time from the database (rather than from the application server), but I'd also like to avoid having to use database triggers (though Postgres triggers would be acceptable).
In Hibernate 5.2 and above you can do this:
@Version
@Column(name = "last_updated")
Instant lastUpdated;
This has the additional benefit of rolling back a transaction if the row has been updated by another thread.
IDEs and validators might tell you that this is an invalid type to use the @Version annotation on, because this support is Hibernate-specific and not part of JPA 2.2 or below.
This should work as you can see in the following test: https://github.com/hibernate/hibernate-orm/blob/6c5e1726093a5347551b61d9e84ae9dd9f08ece6/documentation/src/test/java/org/hibernate/userguide/mapping/generated/DatabaseValueGenerationTest.java
Try reproducing the issue with plain Hibernate and the latest Hibernate version. Maybe it's a bug or a problem with Spring Data JPA that you are facing.

JPA Attribute Converter being applied despite autoApply=false and property not annotated with @Convert

The system I'm working on has a bunch of legacy data where boolean values have been stored as 'Y' and 'N'. New tables use a BIT column instead and simply store 0 and 1. No table mixes the two approaches.
To support the legacy tables we have the following converter:
@Converter(autoApply = false)
public class BooleanToStringConverter implements AttributeConverter<Boolean, String> {

    private Logger.ALogger Log = Logger.of(BooleanToStringConverter.class);

    @Override
    public String convertToDatabaseColumn(final Boolean attribute) {
        Log.debug("Converting the boolean value {}", attribute);
        if (attribute == null) {
            return "N";
        }
        return attribute ? "Y" : "N";
    }

    @Override
    public Boolean convertToEntityAttribute(final String dbData) {
        return "Y".equalsIgnoreCase(dbData);
    }
}
As this only needs to apply to certain entities the autoApply property has been set to false.
I'm now creating a brand new entity, with a new table. It has two boolean properties, both using the BIT column style instead of Y/N:
@Entity
@Table(name = "MyEntity")
public class MyEntity {

    @Id
    @Column(name = "MyEntityId")
    private Long id;

    @Column(name = "IsClosed")
    private Boolean closed;

    ...
}
Note that I have not applied the @Convert annotation.
I have a query that needs to filter out any rows where the entity is closed:
query.where().eq(CLOSED, Boolean.FALSE)
It is at this point that my problem arises. Whenever this query is run I see the log message from the BooleanToStringConverter being written to the logs. Indeed, if I dump the actual SQL that was executed against the MySQL database, I can see that the converter did get applied to the boolean property, producing the following SQL fragment:
select <columns>
from MyEntity t0
where <other predicates>
and t0.IsClosed = 'N'
order by <order clause>
This is obviously wrong - the converter shouldn't have been applied: it's not set to auto-apply and the closed property isn't annotated with @Convert.
I tried to work around this by creating a second converter:
@Converter(autoApply = true)
public class BooleanConverter implements AttributeConverter<Boolean, Boolean> {

    private Logger.ALogger Log = Logger.of(BooleanConverter.class);

    @Override
    public Boolean convertToDatabaseColumn(final Boolean attribute) {
        Log.debug("Processing the value {}.", attribute);
        return attribute;
    }

    @Override
    public Boolean convertToEntityAttribute(final Boolean dbData) {
        return dbData;
    }
}
This resulted in both converters being applied to the property and I see both debug statements appearing in the logs.
2019-07-29 14:19:53,994 [dispatcher-69] DEBUG BooleanConverter Processing the value false.
2019-07-29 14:19:53,994 [dispatcher-69] DEBUG BooleanToStringConverter Converting the boolean value false
Next I tried explicitly setting the converter to use on the entity itself (I hoped this might change the order that the converters were getting applied in so that it'd end up as true/false despite the other converter running):
@Entity
@Table(name = "MyEntity")
public class MyEntity {

    @Id
    @Column(name = "MyEntityId")
    private Long id;

    @Convert(converter = BooleanConverter.class)
    @Column(name = "IsClosed")
    private Boolean closed;

    ...
}
With exactly the same result; both converters are applied to the value sequentially, with the BooleanToStringConverter having the last laugh and mangling the predicate.
I would rather keep the BooleanToStringConverter as it makes dealing with the legacy data a bit less painful, but unless I can figure out why it's being applied when it shouldn't it's looking likely that I'll have to delete it.
I'm using Ebean version 4.1.3 and Play! 2.6.21
How can I stop this rogue converter from applying itself to properties that it has no right to be touching?
This is a (now) known limitation of Ebean, as described by Issue 1777 on Ebean's GitHub page. It is not planned to be fixed at the time of writing.

Hibernate fetching deleted java enum types from db

I have an Enum class which has some values.
We've decided to remove one of these values and all of its implementation from the code.
We don't want to delete any records from the DB.
My Enum class is something like this:
public enum CourseType {
    VIDEO("CourseType.VIDEO"),
    PDF("CourseType.PDF"),
    QUIZ("CourseType.QUIZ"),
    SURVEY("CourseType.SURVEY"),
    POWERPOINT("CourseType.POWERPOINT") // *this one will be removed*
    ...
}
My Course Entity:
@Entity
@Table(name = "CRS")
public class Course {

    @Column(name = "COURSE_TYPE")
    @Enumerated(EnumType.STRING)
    private CourseType courseType;

    @Column(name = "AUTHOR")
    private String author;

    ....

    @Override
    public CourseType getCourseType() {
        return courseType;
    }

    @Override
    public void setCourseType(CourseType courseType) {
        this.courseType = courseType;
    }

    ....
}
After I removed the Powerpoint type from the Java Class and tried to fetch some values from the DB,
I get a mapping error for the removed type.
I have a code like this:
Course course = courseService.get(id);
If I pass a course id whose type is 'POWERPOINT' in the database,
the method throws the following error:
java.lang.IllegalArgumentException: Unknown name value [POWERPOINT] for enum class [com.tst.enums.CourseType]
    at org.hibernate.type.EnumType$NamedEnumValueMapper.fromName(EnumType.java:461)
    at org.hibernate.type.EnumType$NamedEnumValueMapper.getValue(EnumType.java:449)
    at org.hibernate.type.EnumType.nullSafeGet(EnumType.java:107)
    at org.hibernate.type.CustomType.nullSafeGet(CustomType.java:127)
    at org.hibernate.type.AbstractType.hydrate(AbstractType.java:106)
    at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:2912)
    at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1673)
Is there any way, when I retrieve a query result from the DB, to have Hibernate skip records whose course_type column doesn't match any of the enum values in the code?
Do I have to use some kind of filter?
You can try using the @Filter annotation:
@Filter(name = "myFilter", condition = "COURSE_TYPE <> 'POWERPOINT'")
(the condition is raw SQL, so it references the COURSE_TYPE column and the stored enum name) and enable it:
session.enableFilter("myFilter");
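A fuller sketch of that approach (the filter name is illustrative, and note that @Filter also needs a matching @FilterDef, which the snippet above leaves out):

import java.util.List;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Table;
import org.hibernate.Session;
import org.hibernate.annotations.Filter;
import org.hibernate.annotations.FilterDef;

@Entity
@Table(name = "CRS")
@FilterDef(name = "excludeRemovedTypes")
@Filter(name = "excludeRemovedTypes", condition = "COURSE_TYPE <> 'POWERPOINT'")
public class Course {
    // ... same fields and accessors as shown above ...
}

class CourseQueries {
    // the filter must be enabled on the underlying Hibernate Session before querying
    List<Course> findAll(EntityManager entityManager) {
        entityManager.unwrap(Session.class).enableFilter("excludeRemovedTypes");
        return entityManager.createQuery("select c from Course c", Course.class).getResultList();
    }
}

Keep in mind that Hibernate filters apply to queries (HQL/JPQL/criteria), not to direct lookups by primary key such as session.get(id) or em.find(id), so loading a POWERPOINT row by id would still fail.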
If you can't use filters,
something like the following should work:
Add POWERPOINT back into the enum.
Add a deleted flag to the POWERPOINT enum value.
After the course list is loaded, remove courses that have a deleted courseType value.
New CourseType enum:
public enum CourseType {

    VIDEO("CourseType.VIDEO", false),
    // ... the other existing values, each with deleted = false ...
    POWERPOINT("CourseType.POWERPOINT", true);

    private final boolean deletedFlag;

    CourseType(String existingParameter, boolean deletedFlagValue) {
        // do whatever you are currently doing with the existing parameter
        this.deletedFlag = deletedFlagValue;
    }

    public boolean isDeleted() {
        return deletedFlag;
    }
}
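Step 3 from the list above could then look something like this (a sketch; courseService.getAll() is an assumed accessor, and isDeleted() is the getter added to the enum):

// after loading, drop courses whose type has been soft-removed from the enum
List<Course> courses = courseService.getAll();
courses.removeIf(course -> course.getCourseType() != null && course.getCourseType().isDeleted());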

How to store created/lastUpdate fields in AppEngine DataStore using Java JDO 3?

Abstract
I have a working application in Appengine using Java and JDO 3.
I found these arguments (auto_now and auto_now_add) which correspond exactly to what I want to implement in Java. So essentially the question is: How to convert AppEngine's Python DateTimeProperty to Java JDO?
Constraints
Converting my application to Python is not an option.
Adding two Date properties and manually populating these values whenever a create/update happens is not an option.
I'm looking for a solution which corresponds to what JDO/Appengine/Database authors had in mind for this scenario when they created the APIs.
It would be preferable to have a generic option: say I have 4 entities in classes: C1, C2, C3, C4 and the solution is to add a base class C0, which all 4 entities would extend, so the 4 entities don't even know they're being "audited".
[update] I tried (using a simple entity)
@PersistenceCapable
public class MyEntity {

    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true")
    private Long id;

    @Persistent
    private String name;
    ...
1. @Persistent public Date getLastUpdate() { return new Date(); }
As suggested by the answer, but it seems to always update the value, even when I just load the entity from the datastore or modify an unrelated field (e.g. String name).
You can easily enough have a property (setter/getter) on a java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore.
Without the following hack I can't read the value stored in the datastore [and not with the hack either :( ]:
@Persistent
public Date getLastUpdate() { return new Date(); }

private Date prevUpdate;
public void setLastUpdate(Date lastUpdate) { this.prevUpdate = lastUpdate; }
public Date getPrevUpdate() { return prevUpdate; }
Is there any way to differentiate if a persistence operation is in progress or my code is calling the getter?
2. @Persistent(customValueStrategy = "auto_now_add") private Date lastUpdate;
I modeled auto_now_add after org.datanucleus.store.valuegenerator.TimestampGenerator replacing Timestamp with java.util.Date.
But it was only populated once at the first makePersistent call, regardless of how many times I modified other fields and called makePersistent. Also note that it doesn't seem to behave as the documentation says (or my English is rusty):
Please note that by defining a value-strategy for a field then it will, by default, always generate a value for that field on persist. If the field can store nulls and you only want it to generate the value at persist when it is null (i.e you haven't assigned a value yourself) then you can add the extension "strategy-when-notnull" as false
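For reference, a hedged sketch of how that "strategy-when-notnull" extension might be attached, assuming DataNucleus reads it from JDO's @Extension annotation on the field (not verified against the App Engine setup in the question):

@Persistent(
    customValueStrategy = "auto_now_add",
    extensions = @Extension(vendorName = "datanucleus",
                            key = "strategy-when-notnull",
                            value = "false"))
private Date lastUpdate;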
3. preStore using PersistenceManager.addInstanceLifecycleListener
Works as expected, but I could not make it work across multiple entities using a base class.
pm.addInstanceLifecycleListener(new StoreLifecycleListener() {
    @Override
    public void preStore(InstanceLifecycleEvent event) {
        MyEntity entity = (MyEntity) event.getPersistentInstance();
        entity.setLastUpdate(new Date());
    }

    @Override
    public void postStore(InstanceLifecycleEvent event) {}
}, MyEntity.class);
4. implements StoreCallback and public void jdoPreStore() { this.setLastUpdate(new Date()); }
Works as expected, but I could not make it work across multiple entities using a base class.
To satisfy my 4th constraint (using solutions 3 or 4):
Whatever I do, I can't make the following structure work:
public abstract class Dateable implements StoreCallback {

    @Persistent private Date created;
    @Persistent private Date lastUpdate;

    public Dateable() { created = new Date(); }

    public void jdoPreStore() { this.setLastUpdate(new Date()); }

    // ... normal get/set properties for the above two
}

@PersistenceCapable
public class MyEntity extends Dateable {

    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, primaryKey = "true")
    private Long id;

    @Persistent private String name;
The problems when the enhancer runs:
public abstract class Dateable:
DataNucleus.MetaData Registering class "[...].Dateable" as not having MetaData.
public abstract class Dateable with the above log, but running the code anyway:
Creation date changes whenever I create or read the data from datastore.
@PersistenceCapable public abstract class Dateable:
DataNucleus.MetaData Class "[...].MyEntity" has been specified with 1 primary key fields, but this class is using datastore identity and should be application identity.
JDO simply provides persistence of Java classes (and their fields/properties), so I don't see what the design of JDO has to do with it.
You can easily enough have a property (setter/getter) on a java class and have the property persistable (rather than the field). Within that getter you can code whatever you want to control what value goes into the datastore. Either that or you use a preStore listener to be able to set things just before persistence so the desired value goes into the datastore.
