ClassCastException: Cannot cast java.sql.Date (id=9010) to java.sql.Timestamp - java

I am having a problem with a Java 1.4 app.
I am executing a JDBC query that reads two date fields from an Oracle database (both fields are of type DATE with NULLABLE = false):
SELECT effFrom, effTo FROM myTable WHERE ....
It is executed through a JDBCConnection:
Object[][] result = JDBCConnection.getSqlObjectArray(query, bindVals);
I get the data back in my result and loop through it:
for (int i = 0; i < result.length; i++) {
    java.sql.Timestamp effFrom = (java.sql.Timestamp) result[i][0];
    java.sql.Timestamp effTo = (java.sql.Timestamp) result[i][1];
But the cast to java.sql.Timestamp fails with the error ClassCastException: Cannot cast java.sql.Date (id=9010) to java.sql.Timestamp at
(java.sql.Timestamp) result[i][0]
So I changed the conversion to
java.sql.Date effFrom = (java.sql.Date) result[i][0];
And it is working. All perfect.
But the thing is that neither my code nor the data has changed for at least the last six months.
It worked with the Timestamp cast up to yesterday; today it works only with Date.
I then went back to the source code repo and saw that it was originally java.sql.Date about two years ago, so I have basically rolled it back. Clearly there is something I am missing here.
Is there a way that the JDBC query could misinterpret the data type?
As per Rudi's question, here is what is inside JDBCConnection.getSqlObjectArray().
It has a JdbcDaoSupport instance called dao:
public static Object[][] getSqlObjectArray(String sql, String bind1, String bind2, String bind3) {
    Object[][] a = dao.getSqlObjectArrayImpl(sql, bind1, bind2, bind3);
    return a;
}
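For what it's worth, the Oracle JDBC driver's default mapping for DATE columns has changed across driver versions (the oracle.jdbc.V8Compatible and oracle.jdbc.mapDateToTimestamp connection properties control it), so a swapped driver JAR or a changed connection setting could flip what getObject() returns. Until the root cause is found, a defensive conversion tolerates both types; a minimal sketch, assuming the driver returns java.util.Date subclasses as the stack trace suggests:

for (int i = 0; i < result.length; i++) {
    // Both java.sql.Date and java.sql.Timestamp extend java.util.Date,
    // so this cast succeeds either way and getTime() is always available.
    java.util.Date rawFrom = (java.util.Date) result[i][0];
    java.util.Date rawTo = (java.util.Date) result[i][1];
    java.sql.Timestamp effFrom = new java.sql.Timestamp(rawFrom.getTime());
    java.sql.Timestamp effTo = new java.sql.Timestamp(rawTo.getTime());
    // ... use effFrom / effTo as before
}

Note that if the driver hands back java.sql.Date, any time-of-day portion has already been truncated; this wrapper only normalizes the type.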

Related

Codec not found for requested operation: [TEXT <-> java.time.LocalDate]

This is the code I am using to fill the column in the DB:
DateTimeFormatter dateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSSSS");
JSONObject publishedObj = jsonObject.optJSONObject("created");
if (publishedObj != null) {
    String dateStr = publishedObj.getString("value");
    book.setPublishedDate(LocalDate.parse(dateStr, dateFormat));
}
Below is the instance variable of the column where the data needs to go:
@Column("published_date")
@CassandraType(type = CassandraType.Name.DATE)
private LocalDate publishedDate;
Error message which I am getting:
com.datastax.oss.driver.api.core.type.codec.CodecNotFoundException: Codec not found for requested operation: [TEXT <-> java.time.LocalDate]
Can someone please help?
Thank you!!
I can reproduce that error with your code above. To remedy it, I have ALTERed the book_by_id table with two new columns:
ALTER TABLE book_by_id ADD pubdate2 TEXT;
ALTER TABLE book_by_id ADD pubdate3 DATE;
My BookEntity class for those columns looks like this:
@Column("pubdate2")
@CassandraType(type = CassandraType.Name.TEXT)
private String publishedDate2;

@Column("pubdate3")
@CassandraType(type = CassandraType.Name.DATE)
private LocalDate publishedDate3;
The code to parse and set the date looks like this:
DateTimeFormatter dateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSSSS");
String dateStr = "2022-03-03T09:52:33.235555";
musicBook.setPublishedDate2(LocalDate.parse(dateStr, dateFormat).toString());
musicBook.setPublishedDate3(LocalDate.parse(dateStr, dateFormat));
template.insert(musicBook);
tl;dr;
Redefine published_date as a DATE type, and it will work. Besides, dates/times should be stored in date/time types in databases.
Note that Cassandra won't allow you to modify a column's data type. Also, the process of dropping and adding a column with the same name in quick succession has proven to be problematic with Cassandra in the past. I'd advise adding a newly named column of a DATE type, and reloading its data. Or recreate the table (with the correct data types) and reload the data.

Converting a row in a column to LocalDate in Spark

I'm having a problem when I try to filter data where data < todayData
If I use this code, I get the wrong results
Code:
val todayData = LocalDate.now.format(
  DateTimeFormatter.ofPattern("dd/MM/yyyy")) // 22/09/2021
val filtredDF = sampleData.where(sampleData("data_riferimento_condizioni") < todayData)
One of the results:
+--------+--------+---------------------------+-----------+
|istituto|servizio|data_riferimento_condizioni| stato|
+--------+--------+---------------------------+-----------+
| 62952| 923| 02/12/2022|in progress|
+--------+--------+---------------------------+-----------+
As you can see, I get data where data_riferimento_condizioni > todayData. I want to convert data_riferimento_condizioni to LocalDate so I can use public boolean isBefore(ChronoLocalDate other).
First you need to convert data_riferimento_condizioni to DateType or TimestampType instead of StringType, using the to_date() or to_timestamp() functions, and then filter your data.
For Spark 3 and newer you can filter your rows by comparing them with instances of java.time.LocalDate or java.time.Instant:
val filtredDF = sampleData
  .withColumn("converted", to_date(col("data_riferimento_condizioni"), "dd/MM/yyyy"))
  .where(col("converted") < LocalDate.now)
But if you're using Spark 2, you have to convert your LocalDate or Instant to java.sql.Date or java.sql.Timestamp:
val filtredDF = sampleData
  .withColumn("converted", to_date(col("data_riferimento_condizioni"), "dd/MM/yyyy"))
  .where(col("converted") < Date.valueOf(LocalDate.now))
You can read more about using dates in Spark, and the differences between Spark 2 and Spark 3, here.
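Since the rest of this page is Java, the same idea with Spark's Java Dataset API looks like this (a minimal sketch, assuming sampleData is a Dataset<Row>):

import java.time.LocalDate;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.lit;
import static org.apache.spark.sql.functions.to_date;

// Parse the dd/MM/yyyy string into a real DateType column, then compare it
// against today's date as a java.sql.Date literal (works on Spark 2 and 3).
Dataset<Row> filtered = sampleData
        .withColumn("converted", to_date(col("data_riferimento_condizioni"), "dd/MM/yyyy"))
        .where(col("converted").lt(lit(java.sql.Date.valueOf(LocalDate.now()))));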

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Expected 8 or 0 byte long for date (25)

I am developing an application using Cassandra and Spring Boot.
I have written a Cassandra query in Java:
String userName = "testUser";
String lastUpdatedDate = "2018-11-29 13:00:43.400";
String tenantName = "demo";

Select select = QueryBuilder.select().all()
    .from(tenantName, getGenericClass().getSimpleName())
    .where(QueryBuilder.eq("user_Name", userName))
    .and(QueryBuilder.gt("last_updateddate", lastUpdatedDate))
    .allowFiltering()
    .limit(100);
return (List<T>) cassandraOperations.select(select, getGenericClass());
last_updateddate is a timestamp column in Cassandra. user_Name and last_updateddate form a composite key in the database, and I am using the latest version of Cassandra.
While executing, I get the following error:
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Expected 8 or 0 byte long for date (25)
The issue got resolved after the change below.
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
sdf.setTimeZone(TimeZone.getTimeZone("UTC"));
Date date = sdf.parse(lastUpdatedDate);
long timeInMillis = date.getTime(); // epoch milliseconds, not seconds
Timestamp ts = new Timestamp(timeInMillis);
Date date1 = ts; // Timestamp is a subclass of java.util.Date
Select select = QueryBuilder.select().all()
    .from(tenantName, getGenericClass().getSimpleName())
    .where(QueryBuilder.eq("user_Name", userName))
    .and(QueryBuilder.gt("last_updateddate", date1))
    .allowFiltering()
    .limit(100);
return (List<T>) cassandraOperations.select(select, getGenericClass());
You have to check the data type of the Cassandra table column and use the corresponding Java data type when declaring the variable that carries the column's value in your Java code.
Example: if the column is a Cassandra DATE, define the field as the driver's LocalDate (from the cassandra core package) in Java, instead of Date or LocalDate from the java.util/java.time packages.
Check this type mapping between Cassandra and Java.
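For instance, with the DataStax Java driver 3.x a Cassandra DATE column maps to the driver's own LocalDate class, not to java.util.Date or java.time.LocalDate. A small sketch (the users table and birth_date column are made-up names):

import com.datastax.driver.core.LocalDate;
import com.datastax.driver.core.querybuilder.QueryBuilder;
import com.datastax.driver.core.querybuilder.Select;

// Cassandra's DATE type expects com.datastax.driver.core.LocalDate in driver 3.x.
LocalDate birthDate = LocalDate.fromYearMonthDay(2018, 11, 29);
Select.Where select = QueryBuilder.select().all()
        .from("demo", "users")
        .where(QueryBuilder.eq("birth_date", birthDate));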

How to insert date in mongo db from java

There are many similar questions asked, but none exactly like the issue I am facing. I have seen almost all the questions and answers around it.
So the problem is:
I have to insert a date field into my Mongo collection.
But I can't access the collection directly; I have to use a service. The service takes a string and returns an oid.
So once I construct the BasicDBObject, I call toString on it and pass it on to my service. I even tried inserting it directly into a test collection, and Mongo is complaining.
BasicDBObject document = new BasicDBObject();
long createdAtSinceEpoch = 0;
long expiresAtSinceEpoch = 0;
createdAtSinceEpoch = System.nanoTime();
Date createdAt = new Date(TimeUnit.NANOSECONDS.toMillis(createdAtSinceEpoch));
document.append("createdAt", createdAt);
expiresAtSinceEpoch = createdAtSinceEpoch + TimeUnit.SECONDS.toNanos(30);
Date expiresAt = new Date(TimeUnit.NANOSECONDS.toMillis(expiresAtSinceEpoch));
document.append("expiresAt", expiresAt);
service.storeRecord(document.toString());
and the generated JSON string looks like:
{
    "createdAt": {
        "$date": "2015-09-01T20:05:21.641Z"
    },
    "expiresAt": {
        "$date": "2015-09-01T20:05:51.641Z"
    }
}
and Mongo complains that
Unable to parse JSON : Date expecting integer milliseconds, at (3,17)
So if I pass milliseconds alone instead of a Date object to document.append(), then it DOES NOT recognize this field as a date and treats it as a String, but it does insert into the collection.
I need 2 things:
1) I want the data to be inserted.
2) I am planning to expire that row by adding an index on the expiresAt field, so I want Mongo to recognize that it's a date field.
JSON makes a difference between a numeric field and a text field containing a number. The latter is only recognized as a String; I assume that is what happened when you thought you were giving your service the date as an integer. Unfortunately you didn't show us the relevant code.
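If the service's parser really does expect the legacy extended-JSON form with integer milliseconds (as its error message says), a possible workaround, assuming you control the exact string you send, is to emit $date as a number yourself rather than relying on BasicDBObject.toString():

import java.util.concurrent.TimeUnit;

// Legacy extended JSON uses {"$date": <epoch millis>} rather than an ISO-8601 string.
long createdAtMillis = System.currentTimeMillis();
long expiresAtMillis = createdAtMillis + TimeUnit.SECONDS.toMillis(30);
String json = "{ \"createdAt\": { \"$date\": " + createdAtMillis + " }, "
        + "\"expiresAt\": { \"$date\": " + expiresAtMillis + " } }";
service.storeRecord(json);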
When I save the date info in a non-String format, I annotate the field in my DTO as below. This helps MongoDB know that the field is to be treated as an ISO date, which is then useful for range searches etc.
@DateTimeFormat(iso = ISO.DATE_TIME)
private Date date;
Date date = new Date();
BasicDBObject doc = new BasicDBObject("date", date); // renamed so it doesn't clash with the date variable above
Data.insert(doc);

Oracle DATE and Hibernate/JPA mapping

One of the Oracle tables looks like this; I don't have the option of changing it:
REPORTING_PERIOD | REPORTING_DATE (Oracle DATE type)
-----------------+----------------------------------
            1140 | 01-FEB-12
            1139 | 01-JAN-12
The JPA entity (with Hibernate as the provider) looks like this:
@Column(name="REPORTING_PERIOD")
private long reportingPeriod;

@Temporal(TemporalType.DATE)
@Column(name="REPORT_DATE")
private Date reportDate; // java.util.Date
Now, let us go through my unit tests:
(I am using Spring Data JPA for repositories)
The line below queries the DB by the REPORTING_PERIOD column:
ReportingPeriod period1 = reportingPeriodRepository.findByMaxReportingPeriod();
assertNotNull(period1); // works fine and entity is fetched
System.out.println(period1.getReportDate());
The output of the SOP is 2012-02-01. Notice the automatic conversion from the DB value 01-FEB-12.
Now, if I query directly by date using '01-FEB-12', as I am doing below, I don't get any results:
ReportingPeriod period2 = reportingPeriodRepository.findByReportDate(period1.getReportDate());
assertNotNull(period2);
Notice that I am using the date field from the same entity that I successfully fetched in the previous statement.
Nor does this work:
DateFormat df = new SimpleDateFormat("yyyy-MM-dd");
ReportingPeriod period3 = reportingPeriodRepository.findByReportDate(df.parse("2012-02-01"));
assertNotNull(period3);
Any help on how I can query (HQL is also fine) by REPORTING_DATE as the param, when the value in the DB is 01-FEB-12, is greatly appreciated.
I think there is some explicit date format conversion while obtaining the result in reportingPeriodRepository.findByMaxReportingPeriod().
Hence we can check whether we get the data by using the same format as the database format.
Change
DateFormat df = new SimpleDateFormat("yyyy-MM-dd");
ReportingPeriod period3 = reportingPeriodRepository.findByReportDate(df.parse("2012-02-01"));
to
DateFormat df = new SimpleDateFormat("dd-MMM-yy");
ReportingPeriod period3 = reportingPeriodRepository.findByReportDate(df.parse("01-FEB-12"));
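If changing the parse format alone doesn't help, one more thing worth ruling out (an assumption; nothing in the question confirms it) is a non-midnight time component stored in the Oracle DATE column, which would make an exact-equality match miss the row. A whole-day range query sidesteps that; findByReportDateBetween below is a hypothetical Spring Data JPA derived query:

// Hypothetical repository method; Spring Data JPA derives the query from the name.
List<ReportingPeriod> findByReportDateBetween(Date start, Date end);

// Usage sketch: cover the whole day of 01-FEB-12.
DateFormat df = new SimpleDateFormat("dd-MMM-yy", Locale.ENGLISH);
Date start = df.parse("01-FEB-12");
Date end = new Date(start.getTime() + TimeUnit.DAYS.toMillis(1));
List<ReportingPeriod> periods = reportingPeriodRepository.findByReportDateBetween(start, end);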
