DBUnit inserting Timestamp default value in H2 Table parsing error - java

When I try to use DBUnit to insert a record into an H2 table, I get an exception caused by:
Caused by: org.h2.jdbc.JdbcSQLException:
Cannot parse "TIMESTAMP" constant "1970-00-01";
SQL statement: insert into PUBLIC.TABLE (COLX, COLY, COLZ) values (?, ?, ?)
Caused by: java.lang.IllegalArgumentException:
1970-0-1 at org.h2.util.DateTimeUtils.parseDateValue(DateTimeUtils.java:276)
None of the supplied values is for the required timestamp column, so DBUnit seems to try to insert a default value, which causes the problem.
Note: the 1970-0-1 in the exception is not the 1970-00-01 from the table definition (see below).
I am not sure where I could configure this behaviour. Anyway, here is some of my setup which could help identify the error:
Create statement:
create table MYTABLE (
"UUID" binary default random_uuid() not null,
"COL2" varchar(18) not null,
"COL3" varchar(20),
"COL4" timestamp default '1970-00-01',
...
The dataset XML (as said, the three values do not include the timestamp column):
<dataset>
<MYTABLE COLX="text" COLY="1" COLZ="Text"/>
I have also set:
dbConfig.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new H2DataTypeFactory());
I was thinking of two possibilities:
1. Is dbUnit trying to parse the value from the column info?
2. Is there a standard default generator for timestamp/date fields in dbUnit?
So, any ideas what I need to do so that the correct default value is inserted?

There is no month 0. January is month 1. So you need to use '1970-01-01' instead of '1970-00-01'.
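For reference, a minimal sketch of the corrected default, run against an in-memory H2 database over plain JDBC (the table and column names are shortened from the question, and the H2 driver is assumed to be on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class H2TimestampDefaultSketch {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:test");
             Statement st = con.createStatement()) {
            // '1970-01-01' instead of '1970-00-01': there is no month 0.
            st.execute("CREATE TABLE MYTABLE ("
                    + "COL2 VARCHAR(18) NOT NULL, "
                    + "COL4 TIMESTAMP DEFAULT '1970-01-01')");
            // Insert without the timestamp column, as DbUnit does; the default applies.
            st.execute("INSERT INTO MYTABLE (COL2) VALUES ('text')");
            try (ResultSet rs = st.executeQuery("SELECT COL4 FROM MYTABLE")) {
                rs.next();
                System.out.println(rs.getTimestamp("COL4")); // 1970-01-01 00:00:00.0
            }
        }
    }
}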

Related

Use getColumnName() or getColumnLabel() for getString()?

I am facing an issue with the SAS JDBC driver that I have not seen before, and I wondered what the correct JDBC behaviour would be.
Suppose I have some ResultSetMetaData:
metadata.getColumnName(index) -> col1
metadata.getColumnLabel(index) -> Column1
This would be the SQL result when running this query:
SELECT col1 AS Column1
When getting a value from a ResultSet, I expect to use this:
rs.getString("Column1")
But instead, I seem to have to use:
rs.getString("col1")
Is this to be expected and were my assumptions wrong? Or is this driver-specific behaviour?
In JDBC, you retrieve values of a result set by column label (the alias), not by column name. In the code in your question, the proper way to retrieve a value is using rs.getString("Column1") (or, as it is handled case-insensitively, rs.getString("COLUMN1")).
This is documented in the API, as all String based getters have the following documentation:
Parameters:
columnLabel - the label for the column specified with the SQL AS
clause. If the SQL AS clause was not specified, then the label is
the name of the column
Historically, JDBC 3 and earlier did not clearly discern between column labels and column names, which - to this day - results in drivers that require you to get by column name, or allow you to get both by column name or label, or return the column name from ResultSetMetaData.getColumnLabel(int) or the column label from ResultSetMetaData.getColumnName(int), or have configuration options to set which behavior to use.
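A minimal sketch of the label-based lookup, using an in-memory H2 database in place of the SAS driver (the table and data are made up for illustration):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ColumnLabelSketch {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:test");
             Statement st = con.createStatement()) {
            st.execute("CREATE TABLE t (col1 VARCHAR(20))");
            st.execute("INSERT INTO t VALUES ('some value')");
            try (ResultSet rs = st.executeQuery("SELECT col1 AS Column1 FROM t")) {
                rs.next();
                // Retrieval goes by the AS label, and the lookup is case-insensitive.
                System.out.println(rs.getString("Column1")); // some value
                System.out.println(rs.getString("COLUMN1")); // some value
            }
        }
    }
}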
Since you are using an alias, it seems the JDBC driver expects the alias in the column-name argument.
I found a related link for this: DB alias Resultset.
Hope this might be helpful.

You have an error in your SQL syntax;..(in the alter command)

Please tell me what causes the error in the SQL query given below:
ALTER TABLE message ADD COLUMN sent_at timestamptz DEFAULT NOW();
timestamptz is not a MySQL data type. timestamp is a MySQL data type.
You have an error: you don't need the COLUMN keyword:
ALTER TABLE message ADD sent_at timestamptz DEFAULT NOW();
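Following the first answer, a sketch of the statement with a MySQL type, executed over JDBC (the connection URL and credentials are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class AddSentAtColumnSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; adjust for your server.
        String url = "jdbc:mysql://localhost:3306/mydb";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             Statement st = con.createStatement()) {
            // TIMESTAMP instead of timestamptz, which MySQL does not have;
            // CURRENT_TIMESTAMP serves as the NOW() default for a TIMESTAMP column.
            st.executeUpdate("ALTER TABLE message "
                    + "ADD COLUMN sent_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP");
        }
    }
}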

Can't create table in DB2

I can't create the table in DB2 via Eclipse. I've been stuck for a long time and I've searched a lot for the following error:
Error SQLCODE=-204
Below is my code:
CREATE TABLE BaseEntity(
wts Timestamp NOT NULL,
siteID NOT NULL,
oid varchar2(11),
PRIMARY KEY (oid),
AccelerationVector varchar2(8),
DeadReckoningAlgorithm varchar2(8),
Orientation varchar2(8),
WorldLocation varchar2(8),
VelocityVector varchar2(8)
)
com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=NOT, DRIVER=3.63.123
at com.ibm.db2.jcc.am.fd.a(fd.java:679)
at com.ibm.db2.jcc.am.fd.a(fd.java:60)
at com.ibm.db2.jcc.am.fd.a(fd.java:127)
at com.ibm.db2.jcc.am.ho.b(ho.java:2317)
at com.ibm.db2.jcc.am.ho.c(ho.java:2300)
at com.ibm.db2.jcc.t4.cb.l(cb.java:370)
at com.ibm.db2.jcc.t4.cb.a(cb.java:62)
at com.ibm.db2.jcc.t4.q.a(q.java:50)
at com.ibm.db2.jcc.t4.tb.b(tb.java:220)
at com.ibm.db2.jcc.am.io.lc(io.java:3318)
at com.ibm.db2.jcc.am.io.b(io.java:4275)
at com.ibm.db2.jcc.am.io.dc(io.java:759)
at com.ibm.db2.jcc.am.io.executeUpdate(io.java:742)
at testDB.XmlToDBSchema.insertIntoDB(XmlToDBSchema.java:37)
at testDB.XmlToDBSchema.createDBSchma(XmlToDBSchema.java:191)
at testXMLPar.testXML.main(testXML.java:16)
The error -204 refers to an undefined name, which can have several reasons. See here for an overview. In your case the statement has several issues:
siteID does not have a data type,
the primary key clause is in the middle of the statement; it should be moved to the end,
varchar2 can only be used if the database has been enabled for it, else you may get this error message.
To correct the errors you have to rewrite the statement, use data types where needed, and make sure that varchar2 support is enabled (check get db cfg).
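Along those lines, a sketch of a rewritten statement sent through the DB2 JDBC driver (the column lengths are guesses, VARCHAR replaces varchar2 so no compatibility mode is needed, oid is made NOT NULL because DB2 requires primary key columns to be not null, and the connection details are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateBaseEntitySketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details for the type 4 (db2jcc) driver.
        String url = "jdbc:db2://localhost:50000/SAMPLE";
        try (Connection con = DriverManager.getConnection(url, "db2inst1", "password");
             Statement st = con.createStatement()) {
            st.executeUpdate("CREATE TABLE BaseEntity ("
                    + "wts TIMESTAMP NOT NULL, "
                    + "siteID VARCHAR(8) NOT NULL, "       // now has a data type
                    + "oid VARCHAR(11) NOT NULL, "
                    + "AccelerationVector VARCHAR(8), "
                    + "DeadReckoningAlgorithm VARCHAR(8), "
                    + "Orientation VARCHAR(8), "
                    + "WorldLocation VARCHAR(8), "
                    + "VelocityVector VARCHAR(8), "
                    + "PRIMARY KEY (oid))");               // constraint moved to the end
        }
    }
}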

Insert map and other complex types in hive using jdbc

I have a Java Map (Map<String, String>) and a JDBC connection to a Hive server.
The schema of the table on the server contains a column of type Map<String, String>.
Is it possible to insert the Java Map into the Hive table column with a similar datatype using JDBC?
I tried:
"create table test(key string, value Map<String, String>)"
"insert into table test values ('keywer', map('subkey', 'subvalue')) from dummy limit 1;"
ref: Hive inserting values to an array complex type column
but the insert failed with:
"Error: Error while compiling statement: FAILED: ParseException line 1:69 missing EOF at 'from' near ')' (state=42000,code=40000)"
[EDIT]
Hive version is 0.14.0.
Thanks
The manual clearly says you cannot insert into a Map data type using SQL:
"Hive does not support literals for complex types (array, map, struct, union), so it is not possible to use them in INSERT INTO...VALUES clauses. This means that the user cannot insert data into a complex datatype column using the INSERT INTO...VALUES clause.”
The DDL itself is fine once tidied up:
CREATE TABLE test(key STRING, value MAP<STRING, STRING>);
but the INSERT ... VALUES form from the question cannot work: VALUES cannot be combined with a FROM clause (which is what the ParseException complains about), and complex-type literals are not allowed in VALUES at all, as quoted above.
A working method to insert a complex type from a JDBC client is:
insert into table test select "key", map("key1","value1","key2","value2") from dummy limit 1;
where dummy is another table which has at least one row.
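A sketch of that approach through the Hive JDBC driver (the HiveServer2 URL and credentials are placeholders, and dummy is assumed to exist with at least one row, as the answer says):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveMapInsertSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder HiveServer2 URL; requires the hive-jdbc driver on the classpath.
        String url = "jdbc:hive2://localhost:10000/default";
        try (Connection con = DriverManager.getConnection(url, "hive", "");
             Statement st = con.createStatement()) {
            st.execute("CREATE TABLE IF NOT EXISTS test (key STRING, value MAP<STRING, STRING>)");
            // INSERT ... SELECT instead of INSERT ... VALUES, since map() is not a literal.
            st.execute("INSERT INTO TABLE test "
                    + "SELECT 'keywer', map('subkey', 'subvalue') FROM dummy LIMIT 1");
        }
    }
}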

BatchUpdateException in Postgresql database?

I am working with a PostgreSQL database and Java. I am trying to insert into the PostgreSQL database using a prepared statement, so I am using batch inserts as well, inserting 1000 records at a time.
Below is the exception I get when trying to insert into one of my tables in batches.
java.sql.BatchUpdateException: Batch entry 0 INSERT into data_week (name, address, version, type, size, fax, phone) values('helloooooo', '360 S', '5.0beta4', NULL, '-1', '673', '300') was aborted. Call getNextException to see the cause.
Does anyone know what this error means? And also, can I use batch inserts with a PostgreSQL database or not?
EDIT: Below is my schema for the table data_week
CREATE TABLE data_week
(
name character varying(256),
address character varying(256),
version character varying(256),
type integer,
size integer,
fax character varying(256),
phone integer,
created_date timestamp without time zone NOT NULL DEFAULT (now() - '7 days'::interval),
updated_date timestamp without time zone NOT NULL DEFAULT now()
)
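The message itself says to call getNextException to see the real cause; a minimal sketch of doing that around the batch execution (the statement and column names follow the question, everything else is a placeholder):

import java.sql.BatchUpdateException;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

public class BatchInsertDiagnostics {
    // Runs one batch entry and prints the chained cause if the batch is aborted.
    static void runBatch(Connection con) throws SQLException {
        String sql = "INSERT INTO data_week (name, address, version, type, size, fax, phone) "
                + "VALUES (?, ?, ?, ?, ?, ?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, "helloooooo");
            ps.setString(2, "360 S");
            ps.setString(3, "5.0beta4");
            ps.setNull(4, Types.INTEGER);
            ps.setInt(5, -1);
            ps.setString(6, "673");
            ps.setInt(7, 300);
            ps.addBatch();
            ps.executeBatch();
        } catch (BatchUpdateException e) {
            // The actual reason the entry failed is chained behind the batch exception.
            SQLException cause = e.getNextException();
            System.err.println(cause != null ? cause.getMessage() : e.getMessage());
            throw e;
        }
    }
}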
