Problem persisting a java.util.Date into MySql using Hibernate - java

I've been debugging this problem for the last couple of hours with no success and figured I'd throw it out to SO and see where that goes.
I'm developing a Java program that persists data into a MySql database using Hibernate and the DAO/DTO pattern. In my database, I have a memberprofile table with a firstLoginDate column. In the database, the SQL type of that column is a DateTime. The corresponding section of the Hibernate XML file is
<property name="firstLoginDate" type="timestamp">
<column name="firstLoginDate" sql-type="DATETIME"/>
</property>
However, when I try to save a Date into that table (in Java), the "date" (year/month/day) part is persisted correctly, but the "time of day" part (hours:minutes:seconds) is not. For instance, if I try to save a Java date representing 2009-09-01 14:02:23, what ends up in the database is instead 2009-09-01 00:00:00.
I've already confirmed that my own code isn't stomping on the time component; as far as I can see in the source (while debugging), the time component remains correct. However, after committing the changes, I can examine the relevant row using the MySql Query Browser (or just read it back from the database in my Java code), and indeed the time component is missing. Any ideas?
I did try persisting a java.sql.Timestamp instead of a java.util.Date, but the problem remained. Also, I have a very similar column in another table that does not exhibit this behavior at all.
I expect you guys will have questions, so I'll edit this as needed. Thanks!
Edit for @Nate:
...
MemberProfile mp = ...
Date now = new Date();
mp.setFirstLoginDate(now);
...
MemberProfile is pretty much a wrapper class for the DTO; setting the first login date sets a field of the DTO and then commits the changes.
Edit 2: It seems to only occur on my machine. I've already tried rebuilding the table schema and wiping out all of my local source and re-checking-out from CVS, with no improvement. Now I'm really stumped.
Edit 3: Completely wiping my MySql installation, reinstalling it, and restoring the database from a known good copy also did not fix the problem.

I have a similar setup to yours (except mine works), and my mapping file looks like this:
<property name="firstLoginDate" type="timestamp">
<column name="firstLoginDate" length="19"/>
</property>
My database shows the column definition as datetime.
Edit:
Some more things to check:
Check that the MySQL driver is the same on your local machine as on the working machines.
Try dropping the table and letting Hibernate recreate it for you (a minimal sketch of that configuration follows below). If that works, then there's a problem in the mapping.
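If you want to try the "let Hibernate recreate it" route, here is a minimal sketch of my own (assuming classic hibernate.cfg.xml configuration; not code from this answer):

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class SchemaRebuild {
    public static void main(String[] args) {
        // "create" drops and recreates all mapped tables on startup,
        // so only run this against a throwaway copy of the database.
        Configuration cfg = new Configuration()
                .configure() // reads hibernate.cfg.xml from the classpath
                .setProperty("hibernate.hbm2ddl.auto", "create");
        SessionFactory factory = cfg.buildSessionFactory();
        factory.close();
    }
}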

This may or may not be your problem, but we have had serious problems with date/time info: if your database server is in a different time zone than the machine submitting the data, you can get inconsistencies in the saved data.
Beyond that, with our annotation configuration, it looks something like the following:
@Column(name="COLUMN_NAME", length=11)
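To illustrate the time-zone point, here is a small standalone snippet (mine, not from the answer above) showing how the same instant renders as different wall-clock times depending on the zone, which is exactly the kind of drift that ends up in a DATETIME column:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class TimeZoneDemo {
    public static void main(String[] args) {
        Date now = new Date(); // one single instant in time
        SimpleDateFormat appServer = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        appServer.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        SimpleDateFormat dbServer = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        dbServer.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println("App server sees: " + appServer.format(now));
        System.out.println("DB server sees:  " + dbServer.format(now));
    }
}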

If it is viable for you, consider using the Joda-Time DateTime class, which is much nicer than the built-in classes; you can also persist it using Hibernate through its Hibernate support.
Using it, I mark my fields or getters with the annotation for custom Hibernate types:
@org.hibernate.annotations.Type(type = "org.joda.time.contrib.hibernate.PersistentDateTime")
@Column(name = "date")
This works fine for me in MySQL, and it also generates the correct schema generation SQL.
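For context, a minimal entity field sketch built around the annotations above; the entity and field names are illustrative (borrowed from the question), not from the original answer:

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.joda.time.DateTime;

@Entity
public class MemberProfile {
    @Id
    private Long id;

    // Joda-Time field persisted through the joda-time-hibernate contrib type
    @org.hibernate.annotations.Type(type = "org.joda.time.contrib.hibernate.PersistentDateTime")
    @Column(name = "firstLoginDate")
    private DateTime firstLoginDate;

    public DateTime getFirstLoginDate() { return firstLoginDate; }
    public void setFirstLoginDate(DateTime firstLoginDate) { this.firstLoginDate = firstLoginDate; }
}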

Use TemporalType.TIMESTAMP with your @Temporal annotation. Please check the example below.
@Temporal(TemporalType.TIMESTAMP)
public Date getCreated() {
    return this.created;
}

Related

Java + Postgres: time shift between app and database

One of the fields of my table has the following format:
trackdate TIMESTAMP WITH TIME ZONE NOT NULL,
POJO:
private Timestamp trackDate;
where Timestamp is java.sql.Timestamp.
The problem is that when I have a date, for instance 2017-05-08 22:16:15.551 in the Europe/Kiev time zone, the database adds 3 hours (the Europe/Kiev offset itself) and I end up with 2017-05-09 01:16:27.551+03 there.
hibernate mapping is pretty simple:
<property name="trackDate" type="timestamp">
<column name="TRACKDATE" not-null="true"/>
</property>
There are no additional conversions between the app and the database. Tomcat starts with:
export TOMCAT_TIMEZONE="-Duser.timezone=Europe/Kiev"
The database's time zone is also set to:
timezone = 'Europe/Kiev'
What is the problem? Why do I see an additional three hours?
Seems like your database stores your time as UTC and converts it to Kiev time. Some literature on Hibernate and timestamps: http://in.relation.to/2016/09/12/jdbc-time-zone-configuration-property/ Look at the workarounds at the bottom of the page.
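One of the workarounds that article describes is the hibernate.jdbc.time_zone property, which tells the driver which time zone to use when binding and reading timestamps. A minimal sketch, assuming Hibernate 5.2.3 or later (which may not match the asker's version; with XML configuration the same property can go into hibernate.cfg.xml):

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class UtcJdbcTimeZoneConfig {
    public static SessionFactory build() {
        return new Configuration()
                .configure() // hibernate.cfg.xml
                .setProperty("hibernate.jdbc.time_zone", "UTC") // Hibernate 5.2.3+ only
                .buildSessionFactory();
    }
}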
Your code and database are working correctly. The time in the database is listed in one of the standard PostgreSQL timestamp forms documented here (see section 8.5.1.3, "Time Stamps"; this table might also help clear things up).
In other words, 2017-05-09 01:16:27.551+03 has the same meaning as 2017-05-08 22:16:15.551 Europe/Kiev, with the +03 in 2017-05-09 01:16:27.551+03 indicating the 3-hour offset that the Europe/Kiev time zone has from UTC.
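If you want to double-check what instant a literal like 2017-05-09 01:16:27.551+03 denotes, a small java.time snippet (mine, not part of the original answer) prints its UTC equivalent:

import java.time.OffsetDateTime;
import java.time.ZoneOffset;

public class SameInstantCheck {
    public static void main(String[] args) {
        OffsetDateTime stored = OffsetDateTime.parse("2017-05-09T01:16:27.551+03:00");
        // The same instant expressed at UTC offset
        System.out.println(stored.withOffsetSameInstant(ZoneOffset.UTC)); // 2017-05-08T22:16:27.551Z
    }
}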

JPA CreateNativeQuery returns wrong date

We recently updated a project from using Hibernate 4 to Hibernate 5.2, and with that came the need to update all of our Criteria to use JPA. For the most part things are in working order, but I have one query that is no longer behaving. One of the fields on the table we are querying is of type DATE in the database.
When I query directly on the table I get back the date, say "2017-04-20". However, when I run the same query on our development server, using JPA's createNativeQuery, I get back the date "2017-04-19".
I don't think this is an issue with the query as I run the exact same query both through a mysql terminal and through java and get different results. The query that I run is the one that is logged in my below example. I think it may be a timezone issue as I don't have this problem on my local environment, just on my dev server, but it also wasn't a problem until we updated to the new versions of Hibernate.
public List<ResponseDTO> getDashboardData(String date, Integer page, Integer pageSize, AbstractDashboard dashboard) {
    List<ResponseDTO> processed = new ArrayList<ResponseDTO>();
    String query = getDashboardQuery(date, dashboard);
    logger.info("Dashboard Query: " + query);
    List<Object[]> raw = createNativeQuery(query).getResultList();
    return raw.stream().map(r -> new ResponseDTO(r)).collect(Collectors.toList());
}
And the constructor of my DTO object:
public ResponseDTO(Object[] r) {
    this.date = ((Date) r[0]).toLocalDate();
    System.out.println(this.date.toString()); // This date does not match what is in the db.
    this.type = (String) r[1];
    this.label = (String) r[2];
    this.value = (Double) r[3];
}
Edit:
I think it's actually an issue with the java.sql.Date type, because I tried printing that out on my dev server and it also returns "2017-04-19" instead of the 20th. I don't get why this doesn't match the results when I run the query in a mysql console; it seems like they should be the same to me.
Try another JDBC driver
Ok, I got it working. For what it's worth, this seems like complete madness to me. The "Aha!" moment came while reading this answer to a different question.
Since it is the MySQL driver that dictates how the date is parsed in the system, I rolled back my MySQL driver, as that was one of the packages I had updated. Suddenly everything behaved as expected.
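As an alternative to rolling the driver back (my suggestion, not the answerer's), newer Connector/J versions let you pin the time zone used for date conversion with the serverTimezone URL parameter, so the driver and the JVM agree. A rough sketch, with a hypothetical host and schema:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ServerTimezoneExample {
    public static void main(String[] args) throws Exception {
        // serverTimezone pins the session time zone the driver uses for DATE/DATETIME conversion
        String url = "jdbc:mysql://localhost:3306/mydb?serverTimezone=UTC"; // hypothetical host/schema
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT CURRENT_DATE()")) {
            while (rs.next()) {
                System.out.println(rs.getDate(1).toLocalDate());
            }
        }
    }
}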

IntelliJ IDEA code inspection: HQL custom dialect & registered functions

My question is about
using registered functions for date/time manipulations in Hibernate Query Language and
IntelliJ IDEA's code inspection for these registered functions in HQL.
I'm using Hibernate 4.2.5 with Java 7, SQL Server 2008 R2 as the database, and IntelliJ IDEA 12.1.6.
In an HQL query I need to perform the T-SQL DATEADD function, or the equivalent HQL date operation. This doesn't seem to exist.
Here's what I'd like to achieve:
update MyTable set startTime = GETDATE(), targetTime = DATEADD(HOUR, allocatedTime, GETDATE()), endTime = null where faultReport.faultReportId = :faultReportId and slaTypeId = :slaTypeId
Searching for answers online has been disappointingly unhelpful, and the most common advice (like the comment seen here: https://stackoverflow.com/a/18150333/2753571) seems to be "don't use date manipulation in HQL." I don't see how I can get around performing the operation in the SQL statement in the general case (e.g. when you want to update one column based on the value in another column in multiple rows).
In a similar fashion to the advice in this post: Date operations in HQL, I've subclassed a SQLServerDialect implementation and registered new functions:
registerFunction("get_date", new NoArgSQLFunction("GETDATE", StandardBasicTypes.TIMESTAMP)); // this function is a duplication of "current_timestamp" but is here for testing / illustration
registerFunction("add_hours", new VarArgsSQLFunction(TimestampType.INSTANCE, "DATEADD(HOUR,", ",", ")"));
I also added this property to my persistence.xml:
<property name="hibernate.dialect" value="my.project.dialect.SqlServerDialectExtended" />
and then I'm testing with a simple (meaningless, admitted) query like this:
select x, get_date(), add_hours(1, get_date()) from MyTable x
The functions appear to be successfully registered, and that query seems to be working because the following SQL is generated and the results are correct:
select
faultrepor0_.FaultReportSLATrackingId as col_0_0_,
GETDATE() as col_1_0_,
DATEADD(HOUR,
1,
GETDATE()) as col_2_0_,
... etc.
But I now have this problem with IntelliJ IDEA: where get_date() is used in the HQL, the code inspection complains "<expression> expected, got ')'". This is marked as an error and the file is marked in red as a compilation failure.
Can someone explain how to deal with this, please, or suggest a better approach? Am I using the incorrect SQLFunction template (VarArgsSQLFunction)? If yes, which is the best one to use?
I'd like the usage of the registered function to not be marked as invalid in my IDE. Ideally, if someone can suggest a better way altogether than creating a new dialect subclass, that would be awesome.

ORA-01460: unimplemented or unreasonable conversion requested using Hibernate @Lob

I have a byte[] that I am persisting to a Lob as follows:
@Basic(fetch = FetchType.LAZY)
@Column(name = "ABF", length = Integer.MAX_VALUE)
@Lob
private byte[] abf;
Seems simple enough, but when I attempt to store anything sizable in it (more than 4000 characters) I get the following exception when I try to commit:
java.sql.SQLException: ORA-01460: unimplemented or unreasonable conversion requested
None of the files I am attempting to store are anywhere near 32,000 characters. Is there some other gotcha here?
See this post.
Nutshell:
<property name="hibernate.connection.SetBigStringTryClob">true</property>
<property name="hibernate.jdbc.batch_size">0</property>
It can also be:
an old Oracle JDBC driver (although I think the limit was 2k then);
a driver/DB version mismatch;
the wrong Oracle dialect specified in the Hibernate config.
For DB stuff it's always helpful to supply driver and DB version info :)
I just updated the Oracle driver and it worked fine. The error is mainly due to an Oracle driver mismatch. If you have the right version of the JDBC driver corresponding to your Oracle version, it should not be an issue.
Sometimes it helps to do things in this order (a rough sketch follows below):
insert the new entity with an empty LOB;
commit;
populate the LOB on the newly created entity;
update and commit.
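A rough sketch of that insert-then-populate sequence using JPA; the entity below is hypothetical (only the abf field is borrowed from the question), not code from the original answer:

import javax.persistence.Basic;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Lob;

@Entity
class MyEntity { // hypothetical entity, not the asker's class
    @Id @GeneratedValue
    Long id;

    @Lob @Basic(fetch = FetchType.LAZY)
    byte[] abf;
}

public class TwoStepLobSave {
    public static void save(EntityManager em, byte[] payload) {
        em.getTransaction().begin();
        MyEntity entity = new MyEntity();
        entity.abf = null;            // 1. insert the new entity with an empty LOB
        em.persist(entity);
        em.getTransaction().commit(); // 2. commit

        em.getTransaction().begin();
        entity.abf = payload;         // 3. populate the LOB on the newly created entity
        em.merge(entity);             // 4. update and commit
        em.getTransaction().commit();
    }
}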
@Dave Newton set me on the right path. The answer involved a few things. As Dave pointed out, I added these lines to hibernate.cfg.xml:
<property name="hibernate.connection.SetBigStringTryClob">true</property>
<property name="hibernate.jdbc.batch_size">0</property>
I was previously using hsqldb-2.0.0.jar. I updated this to the current version (hsqldb-2.2.5.jar). I think this was the main culprit, and I swear I've noticed a database performance increase since doing this.
I also updated to the current version of ojdbc14.jar (10.2.0.5). I was previously on some older version, but I don't know exactly which one. It should be noted that even after updating to version 10.2.0.5 the problem did not go away. It wasn't until I updated the hsqldb.jar version that the problem was resolved.

Problem using JPA with Oracle and grouping by hour

I have a problem using JPA with Oracle and grouping by hour.
Here's the scenario:
I have an entity named EventData. This entity has a java.util.Date field named startOfPeriod. This field maps to a table column of datatype DATE.
The query I'm using is something like:
select min(ed.startOfPeriod) as eventDate,
(...)
from
Event e inner join e.eventDatas ed
(...)
group by
year(ed.startOfPeriod),
month(ed.startOfPeriod),
day(ed.startOfPeriod),
hour(ed.startOfPeriod)
order by 1
If I remove the group by "hour(ed.startOfPeriod)" it works fine (it doesn't produce any errors, but it doesn't do what I want).
When I include this group by clause, it throws this exception:
Caused by: java.sql.SQLException: ORA-30076: campo de extração inválido para origem de extração
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:112)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:331)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:288)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:745)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:219)
at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:813)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1049)
at oracle.jdbc.driver.T4CPreparedStatement.executeMaybeDescribe(T4CPreparedStatement.java:854)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1154)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3370)
at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3415)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:92)
at org.hibernate.jdbc.AbstractBatcher.getResultSet(AbstractBatcher.java:208)
at org.hibernate.loader.Loader.getResultSet(Loader.java:1812)
at org.hibernate.loader.Loader.doQuery(Loader.java:697)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:259)
at org.hibernate.loader.Loader.doList(Loader.java:2232)
Analysing the error code, it happens when "The extract source does not contain the specified extract field.". But the source of extraction (the startOfPeriod field) is of datatype DATE (which has an hour part).
The same code works like a charm in SQL Server.
Anyone knows what is going on?
Thanks!
Have you tried TO_CHAR(d, 'HH24') instead? You could also trunc() to hours...
Not quite an answer, but it sounds suspiciously like a driver issue to me (especially as this works fine on SQLServer). What version of the Oracle driver are you using? Have you tried different versions?
I didn't convert all my DATE fields to TIMESTAMP ones.
Instead, I extended the Oracle's dialect to rewrite the hour function.
Like that:
public class Oracle9Dialect extends org.hibernate.dialect.Oracle9Dialect
{
    public Oracle9Dialect()
    {
        super();
        registerFunction("hour", new SQLFunctionTemplate(Hibernate.INTEGER, "to_number(to_char(?1, 'hh24'))"));
    }
}
Using this dialect (org.hibernate.dialect.Dialect), the hour function will not use the ANSI hour function (extract(hour from <FIELD>)). It will use to_number(to_char(<FIELD>, 'hh24')) instead.
The bad thing is that it always uses the custom function (even when it doesn't need to be used, like with a TIMESTAMP field).
This is what I did to solve my problem.
