I want to cache a PostgreSQL table in an Apache Ignite cache, with JSON and JSONB support.
CREATE TABLE public.some_event (
id BIGINT NOT NULL,
name TEXT,
team_squad JSON,
event_type TEXT,
start_datime BIGINT,
end_datime BIGINT,
is_running BOOLEAN,
is_finished BOOLEAN,
recent_matches JSON,
CONSTRAINT event_info_pkey
PRIMARY KEY (id)
);
For example, in the Apache Ignite configuration:
If the "recent_matches" field is configured with JDBC type OTHER and Java type Object, I get a PGobject.
If the field is configured with JDBC type VARCHAR and Java type String, I get escaped JSON like this:
"\"id\"": ..."
If I cast in the SQL with ::text, I get a BufferedStream.
I don't need any special filtering or special SQL for the JSON. I just want to send a string for inserts and updates, and read back the JSON string without double-quote escaping.
As I am new to Apache Ignite, I cannot understand binary marshalling from the documentation, and I am unable to find any complete example.
Can you provide one?
Ignite has no specific support for PostgreSQL's JSON type.
You might need to extend CacheJdbcPojoStore and override its fillParameter() method.
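A rough sketch of that idea follows. It is untested and the exact fillParameter() signature can vary between Ignite versions; the column check and class names are illustrative. It wraps String values bound to columns configured as JDBC type OTHER in a PGobject, so the PostgreSQL driver sends them as json instead of varchar:

```java
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Types;

import org.apache.ignite.cache.store.jdbc.CacheJdbcPojoStore;
import org.apache.ignite.cache.store.jdbc.JdbcTypeField;
import org.postgresql.util.PGobject;

// Sketch only: intercept String values destined for json columns and wrap
// them in a PGobject so the PostgreSQL driver binds them with the json type.
public class JsonAwarePojoStore<K, V> extends CacheJdbcPojoStore<K, V> {
    @Override
    protected void fillParameter(PreparedStatement stmt, int idx,
                                 JdbcTypeField field, Object fieldVal) {
        try {
            // Columns mapped as JDBC type OTHER (e.g. team_squad, recent_matches)
            if (fieldVal instanceof String && field.getDatabaseFieldType() == Types.OTHER) {
                PGobject json = new PGobject();
                json.setType("json");
                json.setValue((String) fieldVal);
                stmt.setObject(idx, json);
                return;
            }
        }
        catch (SQLException e) {
            throw new javax.cache.CacheException(e);
        }
        super.fillParameter(stmt, idx, field, fieldVal);
    }
}
```

Reading back then yields a PGobject, whose getValue() returns the unescaped JSON string.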
Related
I am using the Spark Cassandra connector in Java to insert data. My data has a timeuuid and a timestamp field. I have the following table:
CREATE TABLE abc.log (
time_uuid timeuuid,
session_id text,
event text,
time timestamp,
sequence int,
PRIMARY KEY (time_uuid)
);
I am using this code to insert:
JavaRDD<EventLog> rdd = sc.parallelize(eventLogs);
javaFunctions(rdd)
.writerBuilder("dove", "event_log", mapToRow(EventLog.class))
.saveToCassandra();
How do I insert the timeuuid and timestamp fields? With a normal INSERT I would just use the now() function; how do I do that here?
You may use com.datastax.driver.core.utils.UUIDs for this.
UUIDsTest uses the class like this to create a time-based UUID:
UUID uuid = UUIDs.timeBased();
Note that UUID is java.util.UUID. Not sure if you need it for your use case, but after that you can retrieve the timestamp of the UUID by calling UUIDs.unixTimestamp(uuid).
As for your timestamp, you pass an instance of java.util.Date, as proposed in the docs.
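Putting that together, a minimal sketch of the mapped bean (names and fields are illustrative; the connector's mapper additionally needs getters/setters matching the column names):

```java
import java.io.Serializable;
import java.util.Date;
import java.util.UUID;

import com.datastax.driver.core.utils.UUIDs;

// Illustrative bean: generate the timeuuid and timestamp on the client side
// before handing the objects to saveToCassandra(), since there is no way to
// call now() through the connector's writerBuilder.
public class EventLog implements Serializable {
    private UUID timeUuid = UUIDs.timeBased(); // version-1 (time-based) UUID
    private Date time = new Date();            // maps to the timestamp column
    private String sessionId;
    private String event;
    private int sequence;
    // getters and setters omitted for brevity
}
```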
I'm struggling to insert a JSON object into my PostgreSQL 9.4 database. I have defined the column, called "evtjson", as type json (not jsonb).
I am trying to use a prepared statement in Java (jdk1.8) to insert a Json object (built using JEE javax.json libraries) into the column, but I keep running into SQLException errors.
I create the JSON object using:
JsonObject mbrLogRec = Json.createObjectBuilder().build();
…
mbrLogRec = Json.createObjectBuilder()
.add("New MbrID", newId)
.build();
Then I pass this object as a parameter to another method that writes it to the DB using a prepared statement (along with several other fields):
pStmt.setObject(11, dtlRec);
Using this method, I receive the following error:
org.postgresql.util.PSQLException: No hstore extension installed.
at org.postgresql.jdbc.PgPreparedStatement.setMap(PgPreparedStatement.java:553)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:1036)
I have also tried:
pStmt.setString(11, dtlRec.toString());
pStmt.setObject(11, dtlRec.toString());
Both produce a different error:
Event JSON: {"New MbrID":29}
SQLException: ERROR: column "evtjson" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
But, at least this tells me that the DB is recognizing the column as type JSON.
I did try installing the hstore extension, but then it told me that it was not an hstore object.
The Oracle docs show a number of methods for setting the parameter value on the PreparedStatement, but I'd rather not try them all if someone knows the answer (http://docs.oracle.com/javase/8/docs/api/java/sql/PreparedStatement.html). Some of them also reference an additional parameter, SQLType, but I can't find any reference for those either.
Should I try setAsciiStream? setCharacterStream? A CLOB?
This behaviour is quite annoying, since JSON strings are accepted without problems when used as literal strings in SQL commands.
There is already an issue for this in the PostgreSQL JDBC driver's GitHub repository (even if the problem seems to be the server-side processing).
Besides using a cast (see the answer by a_horse_with_no_name) in the SQL string, the issue author offers two additional solutions:
1. Pass the parameter stringtype=unspecified in the JDBC connection URL/options. This tells PostgreSQL that all text or varchar parameters are actually of unknown type, letting it infer their types more freely.
2. Wrap the parameter in an org.postgresql.util.PGobject:
PGobject jsonObject = new PGobject();
jsonObject.setType("json");
jsonObject.setValue(yourJsonString);
pstmt.setObject(11, jsonObject);
You can do it like this; you just need the JSON string.
Change the query to:
String query = "INSERT INTO table (json_field) VALUES (to_json(?::json))"
And set the parameter as a String.
pStmt.setString(1, json);
You have two options:
1. Use statement.setString(jsonStr) and handle the conversion in the SQL statement:
PreparedStatement statement = con.prepareStatement(
"insert into table (jsonColumn) values (?::json)");
statement.setString(1, jsonStr);
2. Use PGobject to create a custom value wrapper:
PGobject jsonObject = new PGobject();
jsonObject.setType("json");
jsonObject.setValue(jsonStr);
PreparedStatement statement = con.prepareStatement(
"insert into table (jsonColumn) values (?)");
statement.setObject(1, jsonObject);
I personally prefer the latter, as the query is cleaner.
Passing the JSON as a String is the right approach, but as the error message tells you, you need to cast the parameter in the INSERT statement to a JSON value:
insert into the_table
(.., evtjson, ..)
values
(.., cast(? as json), ..)
Then you can use pStmt.setString(11, dtlRec.toString()) to pass the value.
Most answers here describe ways of inserting into a PostgreSQL json field with JDBC in a non-standard, implementation-specific way. If you need to insert a Java string into a PostgreSQL json field with pure JDBC and pure SQL, use:
preparedStatement.setObject(1, "{}", java.sql.Types.OTHER)
This makes the PostgreSQL JDBC driver (tested with org.postgresql:postgresql:42.2.19) convert the Java string to the json type. It also validates the string as a valid JSON representation, something that the answers using implicit string casts do not do, which can result in corrupt persisted JSON data.
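For context, a fuller sketch of that call (table, column, and connection details are placeholders; this needs a live PostgreSQL instance to run):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Types;

public class JsonInsert {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:postgresql://localhost:5432/mydb", "user", "secret");
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO some_table (evtjson) VALUES (?)")) {
            // Types.OTHER lets the driver pass the string through with an
            // unknown type; the server then parses and validates it as json.
            ps.setObject(1, "{\"New MbrID\": 29}", Types.OTHER);
            ps.executeUpdate();
        }
    }
}
```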
As others have mentioned, your SQL string needs to explicitly cast the bind value to the PostgreSQL json or jsonb type:
insert into t (id, j) values (?, ?::json)
Now you can bind the string value. Alternatively, you can use a library that does this for you, for example jOOQ (works out of the box) or Hibernate (using a third-party UserType registration). The benefit of those is that you don't have to think about this every time you bind such a variable (or read it). A jOOQ example:
ctx.insertInto(T)
.columns(T.ID, T.J)
.values(1, JSON.valueOf("[1, 2, 3]"))
.execute();
Behind the scenes, the same cast as above is always generated, whenever you work with this JSON (or JSONB) data type.
(Disclaimer: I work for the company behind jOOQ)
If you're using Spring Boot, adding the following line to application.properties helped:
spring.datasource.hikari.data-source-properties.stringtype=unspecified
As Wero wrote:
This tells PostgreSQL that all text or varchar parameters are actually
of unknown type
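Without Spring Boot, the same setting can go directly on the JDBC URL (host and database names are placeholders):

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class UnspecifiedStringType {
    public static Connection open() throws Exception {
        // stringtype=unspecified: setString() parameters are sent with an
        // unknown type, so the server can coerce them to json where needed.
        return DriverManager.getConnection(
            "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified",
            "user", "secret");
    }
}
```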
Instead of passing the JSON object, pass its string value and cast it to json in the query.
Example:
JSONObject someJsonObject = ..........
String yourJsonString = someJsonObject.toString();
String query = "INSERT INTO table (json_field) VALUES (?::json)";
pStmt.setString(1, yourJsonString);
This worked for me.
In Oracle I have types like:
create or replace TYPE "LOCATION_RECORD" AS OBJECT
( PSTL_ADR ADR_TABLE
,GEO_CDE VARCHAR2(36)
,PRMRY_FLG VARCHAR2(5)
);
create or replace TYPE "LOCATION_TABLE" AS TABLE OF LOCATION_RECORD;
create or replace TYPE "ADR_RECORD" AS OBJECT
( LN_1_TXT VARCHAR2(100)
,LN_2_TXT VARCHAR2(100)
);
create or replace TYPE "ADR_TABLE" AS TABLE OF ADR_RECORD;
Procedure:
PROCEDURE Main(P_LOCATION_TABLE IN LOCATION_TABLE
,P_OUTPUT OUT LOCATION_TABLE );
How can I call this procedure from Java? This is a special case where one Oracle table type contains another Oracle table type.
I tried with an SQLData implementation, but it did not work.
It works from Java only when an Oracle object type has no table type among its attributes.
But when one Oracle object type has another table type as one of its attributes, it does not work.
How do I construct the STRUCT array for LOCATION_TABLE, which also contains an ADR_TABLE? And how do I get the output parameter?
Consider using Spring Data JDBC Extensions.
The convenient class for you is SqlStructArrayValue, where arrayTypeName is your LOCATION_TABLE and structTypeName is LOCATION_RECORD. Your StructMapper implementation should correctly populate a StructDescriptor for your values.
Right: in this case you should use SqlStructArrayValue as a nested object one more time, because your PSTL_ADR is an ARRAY too.
See the source code of that project and its test cases.
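If you'd rather stay on plain JDBC, the nested collection can also be built with the Oracle driver's own (older, driver-specific) descriptor API. A heavily simplified, untested sketch with placeholder attribute values:

```java
import java.sql.Connection;

import oracle.sql.ARRAY;
import oracle.sql.ArrayDescriptor;
import oracle.sql.STRUCT;
import oracle.sql.StructDescriptor;

public class LocationTableBuilder {
    // Build a LOCATION_TABLE value: each LOCATION_RECORD carries a nested
    // ADR_TABLE of ADR_RECORDs. Attribute order must match the type DDL.
    static ARRAY buildLocations(Connection conn) throws Exception {
        // innermost record: ADR_RECORD(LN_1_TXT, LN_2_TXT)
        StructDescriptor adrRec = StructDescriptor.createDescriptor("ADR_RECORD", conn);
        STRUCT adr = new STRUCT(adrRec, conn, new Object[] {"Line 1", "Line 2"});

        // nested collection: ADR_TABLE
        ArrayDescriptor adrTab = ArrayDescriptor.createDescriptor("ADR_TABLE", conn);
        ARRAY adrs = new ARRAY(adrTab, conn, new Object[] {adr});

        // outer record: LOCATION_RECORD(PSTL_ADR, GEO_CDE, PRMRY_FLG)
        StructDescriptor locRec = StructDescriptor.createDescriptor("LOCATION_RECORD", conn);
        STRUCT loc = new STRUCT(locRec, conn, new Object[] {adrs, "GEO1", "Y"});

        // outer collection: LOCATION_TABLE
        ArrayDescriptor locTab = ArrayDescriptor.createDescriptor("LOCATION_TABLE", conn);
        return new ARRAY(locTab, conn, new Object[] {loc});
    }
}
```

The resulting ARRAY can then be bound with CallableStatement.setArray(), and the output parameter registered with registerOutParameter(2, java.sql.Types.ARRAY, "LOCATION_TABLE").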
When you use cqlsh with Cassandra you can make a describe query to get the information of a table for example:
DESCRIBE TABLE emp;
And it will give you something like:
CREATE TABLE emp (
empid int PRIMARY KEY,
deptid int,
description text
) ...
....
So how can I run this query using Astyanax's CQL support? I was able to run simple SELECT statements with this:
OperationResult<CqlResult<String, String>> result
= keyspace.prepareQuery(empColumnFamily)
.withCql("Select * from emp;")
.execute();
But this isn't working for DESCRIBE statements.
PS: I am really doing this to get the data types of the table's columns, parsing the result later to obtain, for example, int, int, text; so if you have a different approach to get those, it would be awesome.
The query select column, validator from system.schema_columns; doesn't work because it doesn't return the composite keys.
DESCRIBE is not part of the CQL spec (neither CQL2 nor CQL3). If you'd like to completely reconstruct DESCRIBE, you could take a look at the cqlsh implementation (look for print_recreate_columnfamily).
You can also get more meta info from system.schema_columnfamilies:
select keyspace_name, columnfamily_name, key_validator from system.schema_columnfamilies;
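That metadata query can be issued through the same withCql() pattern shown in the question (untested sketch; keyspace and column-family setup as in the question):

```java
// Reuses the prepareQuery(...).withCql(...) pattern from the question to read
// schema metadata instead of issuing an unsupported DESCRIBE.
OperationResult<CqlResult<String, String>> meta =
    keyspace.prepareQuery(empColumnFamily)
            .withCql("select keyspace_name, columnfamily_name, key_validator"
                   + " from system.schema_columnfamilies;")
            .execute();
```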
I am using the GeoTools library to extract location information. With it I get an object of type
class com.vividsolutions.jts.geom.MultiPolygon
I now want to store this field in my MySQL table over a JDBC connection.
When I try to insert it directly as
pstmtInsert.setObject(4, geoobject)
I get this error:
Exception in thread "main" com.mysql.jdbc.MysqlDataTruncation: Data truncation: Cannot get geometry object from data you send to the GEOMETRY field
You need to convert your geometry object to well-known text (WKT). You'll find information on how to do that in the Vivid Solutions API documentation.
geoobject.toText();
Then insert/update the data using the MySQL GeomFromText() function:
INSERT INTO geom VALUES (GeomFromText(@g));
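From JDBC that could look like the following (table and column names are placeholders; untested):

```java
// Convert the JTS geometry to WKT on the Java side and let MySQL parse it.
PreparedStatement ps = connection.prepareStatement(
    "INSERT INTO geom (g) VALUES (GeomFromText(?))");
ps.setString(1, geoobject.toText());
ps.executeUpdate();
```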
It can be sent as binary as well, e.g.:
PreparedStatement preparedStatement = connection.prepareStatement
("INSERT INTO table (point, polygon) VALUES (PointFromWKB(?), GeomFromWKB(?))");
WKBWriter writer = new WKBWriter();
preparedStatement.setBytes(1, writer.write(point));
preparedStatement.setBytes(2, writer.write(polygon));
preparedStatement.executeUpdate();
MySQL can't know how to store your GEO object or what its size is. You should not store the object the way you're trying to.
The PreparedStatement#setObject() documentation says:
The JDBC specification specifies a standard mapping from Java Object types to SQL types. The given argument will be converted to the corresponding SQL type before being sent to the database.
[...]
This method throws an exception if there is an ambiguity, for example, if the object is of a class implementing more than one of the interfaces named above.