In Java I get the error below when inserting into Cassandra while reading a CSV file. I also tried the timestamp data type. So is the issue the date type, or the Java code? Can you suggest a code change?
Error:
Exception in thread "main" com.datastax.driver.core.exceptions.InvalidQueryException: Invalid STRING constant (2012/11/11) for "fl_date" of type timeuuid
Structure:
fl_date timeuuid,
Code:
System.out.println("Processing CSV file ...");
List<Flight> flightList = ProcessFlightsCSV.processFlights("flights_from_pg.csv");
for (Flight flight : flightList) {
System.out.println(flight);
Insert query = QueryBuilder.insertInto("flights")
.value("id", flight.getId())
.value("year", flight.getYear())
.value("fl_date", flight.getFlDate())
.value("airline_id", flight.getAirlineId())
.value("carrier", flight.getCarrier())
.value("fl_num", flight.getFlNum())
.value("origin_airport_id", flight.getOriginAirportId())
.value("origin", flight.getOrigin())
.value("origin_city_name", flight.getOriginCityName())
.value("origin_state_abr", flight.getOriginStateAbr())
.value("dest", flight.getDest())
.value("day_of_month", flight.getDayOfMonth())
.value("dest_city_name", flight.getDestCityName())
.value("dest_state_abr", flight.getDestStateAbr())
.value("dep_time", flight.getDepTime())
.value("arr_time", flight.getArrTime())
.value("distance", flight.getDistance())
;
session.execute(query.toString());
}
}
}
There is no direct method in Cassandra that converts your date-time string into a TimeUUID, so you need to build the timeuuid from the date-time value yourself. Here is a simple example:
// Some code to get the millisecond time from flight.getFlDate();
// do it yourself, as we don't know what kind of data type it is.
long flDateInMillis = ...;

UUID flDateTimeUUID = UUIDs.endOf(flDateInMillis);
// or
UUID flDateTimeUUID = UUIDs.startOf(flDateInMillis);

...

Insert query = QueryBuilder.insertInto("flights")
        .value("id", flight.getId())
        ......
        .value("fl_date", flDateTimeUUID)
        .value("airline_id", flight.getAirlineId())
        .....

Here we use com.datastax.driver.core.utils.UUIDs, which can be found in the DataStax Cassandra driver.
You need to convert the date from the string YYYY/MM/DD format to a long value in milliseconds and pass that on. You can convert it using Java date utility functions (How to convert a string Date to long milliseconds).
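For instance, a minimal sketch of that conversion, assuming fl_date arrives as a yyyy/MM/dd string like the one in the error message (the helper class and method names are made up):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.UUID;

import com.datastax.driver.core.utils.UUIDs;

public class FlDateToTimeUuid {
    // Parse a yyyy/MM/dd string (e.g. "2012/11/11") into epoch milliseconds
    // and wrap it in a TimeUUID suitable for the timeuuid column.
    static UUID toTimeUuid(String flDate) throws ParseException {
        SimpleDateFormat format = new SimpleDateFormat("yyyy/MM/dd");
        long millis = format.parse(flDate).getTime(); // epoch milliseconds
        return UUIDs.startOf(millis);                 // or UUIDs.endOf(millis)
    }
}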
This is the code which I am using to fill the column in the db.
DateTimeFormatter dateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSSSS");
JSONObject publishedObj = jsonObject.optJSONObject("created");
if (publishedObj != null) {
    String dateStr = publishedObj.getString("value");
    book.setPublishedDate(LocalDate.parse(dateStr, dateFormat));
}
Below is the instance variable of the column where the data needs to go:
#Column("published_date")
#CassandraType(type = CassandraType.Name.DATE)
private LocalDate publishedDate;
Error message which I am getting:
com.datastax.oss.driver.api.core.type.codec.CodecNotFoundException: Codec not found for requested operation: [TEXT <-> java.time.LocalDate]
Can someone please help? Thank you!
I can reproduce that error with your code above. To remedy it, I have ALTERed the book_by_id table with two new columns:
ALTER TABLE book_by_id ADD pubdate2 TEXT;
ALTER TABLE book_by_id ADD pubdate3 DATE;
My BookEntity class for those columns looks like this:
#Column("pubdate2")
#CassandraType(type = CassandraType.Name.TEXT)
private String publishedDate2;
#Column("pubdate3")
#CassandraType(type = CassandraType.Name.DATE)
private LocalDate publishedDate3;
The code to parse and set the date looks like this:
DateTimeFormatter dateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSSSS");
String dateStr = "2022-03-03T09:52:33.235555";
musicBook.setPublishedDate2(LocalDate.parse(dateStr,dateFormat).toString());
musicBook.setPublishedDate3(LocalDate.parse(dateStr,dateFormat));
template.insert(musicBook);
tl;dr;
Redefine published_date as a DATE type, and it will work. Besides, dates/times should be stored in date/time types in databases.
Note that Cassandra won't allow you to modify a column's data type. Also, the process of dropping and adding a column with the same name in quick succession has proven to be problematic with Cassandra in the past. I'd advise adding a newly named column of a DATE type, and reloading its data. Or recreate the table (with the correct data types) and reload the data.
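If it helps, a rough sketch of that approach with the Java driver (the new column name is invented for illustration):

// Add a new DATE column instead of altering the old TEXT one, then backfill.
session.execute("ALTER TABLE book_by_id ADD published_date_new DATE");
// ...reload/backfill published_date_new from the source data, then point the
// entity's @Column mapping at the new DATE column.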
I am developing an application using Cassandra and Spring Boot.
I have written a Cassandra query in Java:
String userName="testUser";
String lastUpdatedDate="2018-11-29 13:00:43.400";
String tenantName="demo";
Select select = QueryBuilder.select().all()
        .from(tenantName, getGenericClass().getSimpleName())
        .where(QueryBuilder.eq("user_Name", userName))
        .and(QueryBuilder.gt("last_updateddate", lastUpdatedDate))
        .allowFiltering()
        .limit(100);
return (List<T>) cassandraOperations.select(select, getGenericClass());
last_updateddate is a timestamp column in Cassandra. The user_Name and last_updateddate columns form a composite key in the database, and I am using the latest version of Cassandra.
While executing, I get the following error:
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Expected 8 or 0 byte long for date (25)
The issue got resolved after the change below.
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
sdf.setTimeZone(TimeZone.getTimeZone("UTC"));
Date date = sdf.parse(lastUpdatedDate);
long timeInMillis = date.getTime();    // epoch milliseconds, not seconds
Timestamp ts = new Timestamp(timeInMillis);
Date date1 = ts;                       // java.sql.Timestamp extends java.util.Date

Select select = QueryBuilder.select().all()
        .from(tenantName, getGenericClass().getSimpleName())
        .where(QueryBuilder.eq("user_Name", userName))
        .and(QueryBuilder.gt("last_updateddate", date1))
        .allowFiltering()
        .limit(100);
return (List<T>) cassandraOperations.select(select, getGenericClass());
You have to check the data type of the Cassandra table column and use the corresponding Java data type when declaring the variable that passes the column's value.
Example: if the Cassandra type is date/LocalDate, define LocalDate from the Cassandra driver's core package in Java instead of Date or LocalDate from the util/time packages.
Check this type mapping between Cassandra and Java.
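For instance, a minimal sketch with the DataStax 3.x driver, assuming a Cassandra date column (the table and column names are made up):

import com.datastax.driver.core.LocalDate;

// Use the driver's own LocalDate for a `date` column, not java.util.Date
// or java.time.LocalDate.
LocalDate birthDate = LocalDate.fromYearMonthDay(2018, 11, 29);
session.execute("INSERT INTO users (user_name, birth_date) VALUES (?, ?)",
        "testUser", birthDate);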
I am new to Spark and was trying to write a DataFrame to a DB2 table. The error I am getting is:
Exception in thread "main" java.lang.IllegalArgumentException: Can't get JDBC type for struct <data:int, day:int, hours:int, minutes:int, month:int, seconds:int, time:bigint, timeZoneOffset:int, year:int>
My database schema is
localId <-- Integer type
effectiveDate <-- Timestamp
activityDate <-- Timestamp
inDate <-- Timestamp
outDate <-- Timestamp
I created a POJO class for my db table which goes like this
public class StowageTable {
    private long localId;
    private Date effectiveDate;
    private Date activityDate;
    private Date inDate;
    private Date outDate;
    // setters and getters
}
I then basically read a csv which has the same schema as the db table as follows:
JavaRDD<String> dataFromCSV = javaSparkContext.textFile(fileURL);
// Then I create a JavaRDD of the POJO type
JavaRDD<StowageTable> dataToPOJO = dataFromCSV.map((Function<String, StowageTable>) line -> {
    String[] fields = line.split(",");
    StowageTable st = createNewStowageTable(fields);
    return st;
});
//converting the RDD to DataFrame
DataFrame stowageTableDF = sqlContext.createDataFrame(dataToPOJO, StowageTable.class);
//call jdbc persister
persistToTable(stowageTableDF);
My persistToTable(DataFrame df) method is as follows:
private void persistToTable(DataFrame df) {
    Class.forName(""); // driver class name here
    // skipping a few lines for brevity
    df.write().mode(SaveMode.Append).jdbc(url, table, connectionProperties);
}
I found a few solutions here: Spark DataFrame write to JDBC - Can't get JDBC type for array<array<int>>
and java.lang.IllegalArgumentException: Can't get JDBC type for array<string>
but could not find any which addresses a date-time data type issue. Please suggest a solution. I am on Spark 1.6.3.
Since I could not find any answer yet and figured out a solution myself in the meantime, here is the basic idea: if the database column type is Timestamp, then you have to use Timestamp in the POJO as well, so that Spark can map the field into its schema correctly instead of an unmappable struct.
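For example, a minimal sketch of the StowageTable POJO from the question with that change applied (only the field types differ):

import java.sql.Timestamp;

// Using java.sql.Timestamp instead of java.util.Date lets
// sqlContext.createDataFrame() map these fields to Spark's TimestampType.
public class StowageTable {
    private long localId;
    private Timestamp effectiveDate;
    private Timestamp activityDate;
    private Timestamp inDate;
    private Timestamp outDate;
    // setters and getters omitted
}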
I'm trying to create an aggregate query using MongoTemplate where there's a grouping by date (i.e. 2016-03-01) instead of datetime (i.e. 2016-03-01 16:40:12).
The dateToString operation exists in the MongoDB documentation; it can be used to extract the date from the datetime using formatting:
https://docs.mongodb.org/manual/reference/operator/aggregation/dateToString/
but I can't get it to work with MongoTemplate - I get a NullPointerException.
(My DB version is 3.2.)
List<AggregationOperation> aggregationOperations = new ArrayList<AggregationOperation>();
aggregationOperations.add(
        Aggregation.project("blabla", ...)
                .andExpression("dateToString('%Y-%m-%d', timeCreated)").as("date"));
aggregationOperations.add(Aggregation.group("date").sum("blabla").as("blabla"));
AggregationResults<?> aggregationResults = this.mongoTemplate.aggregate(
        Aggregation.newAggregation(aggregationOperations),
        collectionName,
        resultClass);
When I use dayOfMonth(timeCreated) to extract the day, there's no exception, but I couldn't find an example of how to make this work with dateToString. I tried without the '' around the date format, and that also didn't work...
This is the exception I get:
java.lang.NullPointerException
at org.bson.BasicBSONEncoder._putObjectField(BasicBSONEncoder.java:226)
at org.bson.BasicBSONEncoder.putObject(BasicBSONEncoder.java:194)
at org.bson.BasicBSONEncoder._putObjectField(BasicBSONEncoder.java:255)
at org.bson.BasicBSONEncoder.putObject(BasicBSONEncoder.java:194)
at org.bson.BasicBSONEncoder._putObjectField(BasicBSONEncoder.java:255)
at org.bson.BasicBSONEncoder.putObject(BasicBSONEncoder.java:194)
at org.bson.BasicBSONEncoder._putObjectField(BasicBSONEncoder.java:255)
at org.bson.BasicBSONEncoder.putIterable(BasicBSONEncoder.java:324)
at org.bson.BasicBSONEncoder._putObjectField(BasicBSONEncoder.java:263)
at org.bson.BasicBSONEncoder.putObject(BasicBSONEncoder.java:194)
at org.bson.BasicBSONEncoder.putObject(BasicBSONEncoder.java:136)
at com.mongodb.DefaultDBEncoder.writeObject(DefaultDBEncoder.java:36)
at com.mongodb.OutMessage.putObject(OutMessage.java:289)
at com.mongodb.OutMessage.writeQuery(OutMessage.java:211)
at com.mongodb.OutMessage.query(OutMessage.java:86)
at com.mongodb.DBCollectionImpl.find(DBCollectionImpl.java:81)
at com.mongodb.DB.command(DB.java:320)
at com.mongodb.DB.command(DB.java:299)
at com.mongodb.DB.command(DB.java:374)
at com.mongodb.DB.command(DB.java:246)
at org.springframework.data.mongodb.core.MongoTemplate$2.doInDB(MongoTemplate.java:357)
at org.springframework.data.mongodb.core.MongoTemplate$2.doInDB(MongoTemplate.java:355)
at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:442)
at org.springframework.data.mongodb.core.MongoTemplate.executeCommand(MongoTemplate.java:355)
at org.springframework.data.mongodb.core.MongoTemplate.aggregate(MongoTemplate.java:1497)
at org.springframework.data.mongodb.core.MongoTemplate.aggregate(MongoTemplate.java:1432)
EDIT:
Eventually we decided on a different solution here than what was suggested below; I'm writing it down in case anyone else finds it useful:
In addition to the "timeCreated" field, which holds the datetime, we saved another field in the document: "date", which holds just the date (as a long).
For example, if "timeCreated" = "2015-12-24 16:36:06.657+02:00", then "date" is "2015-12-24 00:00:00", and we save 1449180000000.
Now we can simply group by "date".
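A minimal sketch of deriving that extra field, assuming Java 8 time types and the offset from the example above:

import java.time.LocalDate;
import java.time.OffsetDateTime;

// Truncate "timeCreated" to midnight and store the result as epoch millis
// in the extra "date" field.
OffsetDateTime timeCreated = OffsetDateTime.parse("2015-12-24T16:36:06.657+02:00");
LocalDate dateOnly = timeCreated.toLocalDate();
long dateMillis = dateOnly.atStartOfDay(timeCreated.getOffset())
        .toInstant().toEpochMilli(); // value stored in the "date" field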
You could try projecting the fields first by using the SpEL andExpression in the projection operation and then group by the new fields in the group operation:
Aggregation agg = newAggregation(
        project()
                .andExpression("year(timeCreated)").as("year")
                .andExpression("month(timeCreated)").as("month")
                .andExpression("dayOfMonth(timeCreated)").as("day"),
        group(fields().and("year").and("month").and("day"))
                .sum("blabla").as("blabla")
);

AggregationResults<BlaBlaModel> result =
        mongoTemplate.aggregate(agg, collectionName, BlaBlaModel.class);
List<BlaBlaModel> resultList = result.getMappedResults();
You could try using the DateOperators.DateToString class:
aggregationOperations.add(
        Aggregation.project("blabla", ...)
                .and(DateOperators.DateToString.dateOf("timeCreated").toString("%Y-%m-%d"))
                .as("date"));
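Put together with the grouping step, a sketch of the full pipeline might look like this (field names kept from the question; ResultClass stands in for whatever mapped type you use):

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.DateOperators;

// Project a "%Y-%m-%d" string out of timeCreated, then group on it.
Aggregation agg = Aggregation.newAggregation(
        Aggregation.project("blabla")
                .and(DateOperators.DateToString.dateOf("timeCreated").toString("%Y-%m-%d"))
                .as("date"),
        Aggregation.group("date").sum("blabla").as("blabla"));

AggregationResults<ResultClass> results =
        mongoTemplate.aggregate(agg, collectionName, ResultClass.class);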
There are many similar questions asked, but none exactly like the issue I am facing. I have seen almost all the questions and answers around it.
So the problem is:
I have to insert a date field into my Mongo collection.
But I can't access the collection directly. I have to use a service. The service takes a string and returns me an oid.
So once I construct the BasicDBObject, I call toString on it and pass it on to my service. I even tried inserting it directly into a test collection and Mongo is complaining.
BasicDBObject document = new BasicDBObject();
long createdAtSinceEpoch = 0;
long expiresAtSinceEpoch = 0;
createdAtSinceEpoch = System.nanoTime();
Date createdAt = new Date(TimeUnit.NANOSECONDS.toMillis(createdAtSinceEpoch));
document.append("createdAt", createdAt);
expiresAtSinceEpoch = createdAtSinceEpoch + TimeUnit.SECONDS.toNanos(30);
Date expiresAt = new Date(TimeUnit.NANOSECONDS.toMillis(expiresAtSinceEpoch));
document.append("expiresAt", expiresAt);
service.storeRecord(document.toString());
and the generated JSON String looks like
{
    "createdAt": {
        "$date": "2015-09-01T20:05:21.641Z"
    },
    "expiresAt": {
        "$date": "2015-09-01T20:05:51.641Z"
    }
}
and Mongo complains that
Unable to parse JSON : Date expecting integer milliseconds, at (3,17)
So if I pass milliseconds alone instead of a Date object to the document.append() method, then Mongo DOES NOT recognize this field as a date and considers it a String, but it does insert it into the collection.
I need 2 things:
1) I want the data to be inserted.
2) I am planning to expire that document by adding an index to the expiresAt field, so I want Mongo to recognize that it's a date field.
JSON makes a difference between a numeric field and a text field containing a number; the latter is only recognized as a String. I assume that this is what happened when you thought you were giving your service the date as an integer. Unfortunately, you didn't show us the relevant code.
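For illustration, a sketch of the difference, assuming the service expects MongoDB's strict extended JSON with integer milliseconds (as the error message suggests):

// A numeric $date (integer milliseconds) is a JSON number; the quoted
// variant is just text and will not be parsed as a date.
long millis = System.currentTimeMillis();
String numericDate = "{ \"expiresAt\": { \"$date\": " + millis + " } }";
String stringDate  = "{ \"expiresAt\": { \"$date\": \"" + millis + "\" } }";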
When I save the date info in a non-String format, I annotate the field in my DTO as below. This helps MongoDB know that the field is to be treated as an ISO date, which is then useful for making range searches etc.
@DateTimeFormat(iso = ISO.DATE_TIME)
private Date date;
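A minimal sketch of such a DTO (the class name is invented for illustration):

import java.util.Date;

import org.springframework.format.annotation.DateTimeFormat;
import org.springframework.format.annotation.DateTimeFormat.ISO;

public class EventDto {
    // Bind/format this field as an ISO date-time.
    @DateTimeFormat(iso = ISO.DATE_TIME)
    private Date date;

    public Date getDate() { return date; }
    public void setDate(Date date) { this.date = date; }
}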
Date date = new Date();
BasicDBObject doc = new BasicDBObject("date", date);
collection.insert(doc); // insert into your DBCollection