In our application code we use the JSON_VALUE function to extract a particular value from stored JSON:
SELECT * FROM EMPLOYEE WHERE JSON_VALUE(ADDR_JSON, '$.pinCode') != '390018'
When we run functional test cases against an H2 database, this fails with the following error:
nested exception is java.sql.SQLSyntaxErrorException: user lacks privilege or object not found: JSON_VALUE in statement
Can anyone suggest a solution for this?
H2 does not support accessing JSON attributes. See here for the list of supported functions: https://www.h2database.com/html/functions.html
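One workaround for tests is to register a plain Java method as an H2 alias so that JSON_VALUE resolves during the H2 run. This is only a sketch: the class name is hypothetical, and the deliberately naive regex extractor handles only top-level string properties, which is usually enough for test fixtures like $.pinCode.

```java
// Test-only shim for H2: a static Java method that can be registered as a
// SQL alias named JSON_VALUE. It mimics JSON_VALUE(json, '$.property') for
// flat JSON objects with string values only.
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class JsonFunctions {

    public static String jsonValue(String json, String path) {
        if (json == null || path == null || !path.startsWith("$.")) {
            return null;
        }
        String property = path.substring(2);
        // Match "property": "value" and capture the value.
        Pattern p = Pattern.compile(
                "\"" + Pattern.quote(property) + "\"\\s*:\\s*\"([^\"]*)\"");
        Matcher m = p.matcher(json);
        return m.find() ? m.group(1) : null;
    }
}
```

Register it once per test run, e.g. in an H2 init script: CREATE ALIAS IF NOT EXISTS JSON_VALUE FOR "your.pkg.JsonFunctions.jsonValue" (adjust the package to wherever you place the class).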
I'm struggling to insert a JSON object into my Postgres v9.4 DB. I have defined the column "evtjson" as type json (not jsonb).
I am trying to use a prepared statement in Java (JDK 1.8) to insert a JSON object (built using the JEE javax.json libraries) into that column, but I keep running into SQLException errors.
I create the JSON object using:
JsonObject mbrLogRec = Json.createObjectBuilder().build();
…
mbrLogRec = Json.createObjectBuilder()
.add("New MbrID", newId)
.build();
Then I pass this object, along with several other fields, to another method that writes it to the DB using a prepared statement:
pStmt.setObject(11, dtlRec);
Using this method, I receive the following error:
org.postgresql.util.PSQLException: No hstore extension installed.
at org.postgresql.jdbc.PgPreparedStatement.setMap(PgPreparedStatement.java:553)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:1036)
I have also tried:
pStmt.setString(11, dtlRec.toString());
pStmt.setObject(11, dtlRec.toString());
Both of which produce a different error:
Event JSON: {"New MbrID":29}
SQLException: ERROR: column "evtjson" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
But, at least this tells me that the DB is recognizing the column as type JSON.
I did try installing the hstore extension, but it then told me that it was not an hstore object.
The Oracle Javadoc for PreparedStatement (http://docs.oracle.com/javase/8/docs/api/java/sql/PreparedStatement.html) shows a number of methods for setting the parameter value, but I'd rather not try them all if someone knows the answer. Some of them also take an additional SQLType parameter, but I can't find any guidance on which one to use.
Should I try setAsciiStream? CharacterStream? CLOB?
This behaviour is quite annoying since JSON strings are accepted without problems when used as literal strings in SQL commands.
There is already an issue for this in the Postgres JDBC driver's GitHub repository (even though the problem seems to be in the server-side processing).
Besides using a cast in the SQL string (see the answer by a_horse_with_no_name), the issue author offers two additional solutions:
1. Add the parameter stringtype=unspecified to the JDBC connection URL/options. This tells PostgreSQL that all text or varchar parameters are actually of unknown type, letting it infer their types more freely.
2. Wrap the parameter in an org.postgresql.util.PGobject:
PGobject jsonObject = new PGobject();
jsonObject.setType("json");
jsonObject.setValue(yourJsonString);
pstmt.setObject(11, jsonObject);
You can do it like this, and you just need the JSON string:
Change the query to:
String query = "INSERT INTO table (json_field) VALUES (to_json(?::json))"
And set the parameter as a String.
pStmt.setString(1, json);
You have two options:
Use statement.setString(1, jsonStr) and then handle the conversion in the SQL statement:
PreparedStatement statement = con.prepareStatement(
"insert into table (jsonColumn) values (?::json)");
statement.setString(1, jsonStr);
Another option is to use PGobject as a custom value wrapper:
PGobject jsonObject = new PGobject();
jsonObject.setType("json");
jsonObject.setValue(jsonStr);
PreparedStatement statement = con.prepareStatement(
    "insert into table (jsonColumn) values (?)");
statement.setObject(1, jsonObject);
I personally prefer the latter, as the query is cleaner.
Passing the JSON as a String is the right approach, but as the error message tells you, you need to cast the parameter in the INSERT statement to a JSON value:
insert into the_table
(.., evtjson, ..)
values
(.., cast(? as json), ..)
Then you can use pStmt.setString(11, dtlRec.toString()) to pass the value.
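Putting the cast approach together end to end, a minimal sketch (table and column names are taken from the question; obtaining the Connection is assumed to happen elsewhere):

```java
// Minimal sketch of the cast approach: bind the JSON as a plain string and
// let Postgres coerce it via the explicit cast in the SQL text.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

class EventJsonInsert {

    // The cast makes the server treat the bound varchar parameter as json.
    static final String INSERT_SQL =
            "insert into the_table (evtjson) values (cast(? as json))";

    static void insertEvent(Connection con, String eventJson) throws SQLException {
        try (PreparedStatement pStmt = con.prepareStatement(INSERT_SQL)) {
            pStmt.setString(1, eventJson);
            pStmt.executeUpdate();
        }
    }
}
```

Called as insertEvent(con, dtlRec.toString()), this keeps the conversion entirely on the SQL side.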
Most answers here define ways of inserting into a Postgres json field with JDBC in a non-standard, DB-specific way. If you need to insert a Java string into a Postgres json field with pure JDBC and pure SQL, use:
preparedStatement.setObject(1, "{}", java.sql.Types.OTHER)
This makes the Postgres JDBC driver (tested with org.postgresql:postgresql:42.2.19) convert the Java string to the json type. It also validates the string as a valid JSON representation, something the answers using implicit string casts do not do, leaving open the possibility of persisting corrupt JSON data.
As others have mentioned, your SQL string needs to explicitly cast the bind value to the PostgreSQL json or jsonb type:
insert into t (id, j) values (?, ?::json)
Now you can bind the string value. Alternatively, you can use a library that does this for you, for example jOOQ (works out of the box) or Hibernate (using a third-party UserType registration). The benefit is that you don't have to think about this every time you bind such a variable (or read it). A jOOQ example:
ctx.insertInto(T)
.columns(T.ID, T.J)
.values(1, JSON.valueOf("[1, 2, 3]"))
.execute();
Behind the scenes, the same cast as above is always generated, whenever you work with this JSON (or JSONB) data type.
(Disclaimer: I work for the company behind jOOQ)
If you're using Spring Boot, adding the following line to application.properties helped:
spring.datasource.hikari.data-source-properties.stringtype=unspecified
As Wero wrote:
"This tells PostgreSQL that all text or varchar parameters are actually of unknown type."
Instead of passing the JSON object, pass its string value and cast it to json in the query.
Example:
JSONObject someJsonObject = ..........
String yourJsonString = someJsonObject.toString();
String query = "INSERT INTO table (json_field) VALUES (?::json)";
pStmt.setString(1, yourJsonString);
This worked for me.
In Oracle I have types like:
create or replace TYPE "ADR_RECORD" AS OBJECT
( LN_1_TXT VARCHAR2(100)
,LN_2_TXT VARCHAR2(100)
);
create or replace TYPE "ADR_TABLE" AS TABLE OF ADR_RECORD;
create or replace TYPE "LOCATION_RECORD" AS OBJECT
( PSTL_ADR ADR_TABLE
,GEO_CDE VARCHAR2(36)
,PRMRY_FLG VARCHAR2(5)
);
create or replace TYPE "LOCATION_TABLE" AS TABLE OF LOCATION_RECORD;
Procedure:
PROCEDURE Main(P_LOCATION_TABLE IN LOCATION_TABLE
              ,P_OUTPUT OUT LOCATION_TABLE);
How can I call this procedure from Java? This is a special case where one Oracle table type contains another Oracle table type.
I tried implementing java.sql.SQLData, but it did not work. It only works when the Oracle object type has no table type among its attributes; as soon as an object type contains another table type, it fails.
How do I construct the STRUCT array for LOCATION_TABLE, which also contains ADR_TABLE? And how do I get the output parameter?
Consider using Spring Data JDBC Extensions.
The convenient class for you is SqlStructArrayValue, where arrayTypeName is your LOCATION_TABLE and structTypeName is LOCATION_RECORD. Your StructMapper implementation should correctly populate a StructDescriptor for your values.
Right, in this case you should use SqlStructArrayValue for the nested object one more time, because your PSTL_ADR is an ARRAY too.
See the source code of that project and its test cases.
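With a driver and JDK that support the standard factory methods, the nested ARRAY-of-STRUCT can also be built with plain JDBC. The sketch below uses the type names from the question and assumes P_OUTPUT is meant to be an OUT parameter (the question asks how to read it); the attribute values are made-up examples. Note that some Oracle driver versions do not implement createArrayOf for named collection types, in which case you would unwrap to oracle.jdbc.OracleConnection and call createOracleArray instead.

```java
// Plain-JDBC sketch of the nested ARRAY-of-STRUCT call, using the standard
// java.sql.Connection factory methods. Type names come from the question.
import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Struct;
import java.sql.Types;

class LocationCall {

    // Attribute order must match ADR_RECORD: LN_1_TXT, LN_2_TXT.
    static Object[] adrAttributes(String line1, String line2) {
        return new Object[] { line1, line2 };
    }

    // Attribute order must match LOCATION_RECORD: PSTL_ADR, GEO_CDE, PRMRY_FLG.
    static Object[] locationAttributes(Array pstlAdr, String geoCode, String primaryFlag) {
        return new Object[] { pstlAdr, geoCode, primaryFlag };
    }

    static Array callMain(Connection con) throws SQLException {
        // Build the inner ADR_TABLE first, then nest it inside LOCATION_RECORD.
        Struct adr = con.createStruct("ADR_RECORD",
                adrAttributes("123 Main St", "Suite 4"));
        Array adrTable = con.createArrayOf("ADR_TABLE", new Object[] { adr });
        Struct location = con.createStruct("LOCATION_RECORD",
                locationAttributes(adrTable, "GEO-1", "Y"));
        Array locationTable = con.createArrayOf("LOCATION_TABLE",
                new Object[] { location });

        try (CallableStatement cs = con.prepareCall("{call Main(?, ?)}")) {
            cs.setArray(1, locationTable);
            // Assumes P_OUTPUT is an OUT parameter, as the question implies.
            cs.registerOutParameter(2, Types.ARRAY, "LOCATION_TABLE");
            cs.execute();
            // In real code, materialize the result before the statement closes.
            return cs.getArray(2);
        }
    }
}
```

The key detail is that the attribute arrays passed to createStruct must list values in exactly the order the Oracle type declares them.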
I need to store a public key in my MySQL DB. I have the following:
rsaModulus = rsaPk.getModulus();
The method getModulus() returns a BigInteger, but when I use a PreparedStatement to insert the value into the table, I cannot find an appropriate setter for it (i.e., something analogous to toString or toInt). I need to retrieve this public key later and do some mathematical calculations with it, which is why I don't think storing it as a String would be a good idea. toLong also did not work, even though the field in the DB is defined as BIGINT. Is there any solution for this problem?
Files such as keys and certificates are generally stored as a byte array.
This can be done with a BLOB column.
The MySQL documentation describes the mapping between Java types and MySQL types:
http://dev.mysql.com/doc/refman/5.0/en/connector-j-reference-type-conversions.html
If you can get your public key as a byte[], use it as in this sample code (IOUtils here is from Apache Commons IO):
...
InputStream is = PapelServiceTest.class.getResourceAsStream("MY_FILE_HERE");
byte[] pkBytes = IOUtils.toByteArray(is);
String query = "INSERT INTO my_table (my_key_col) VALUES (?)";
PreparedStatement pstat = conn.prepareStatement(query);
pstat.setBytes(1, pkBytes);
pstat.execute();
...
This thread has an interesting workaround using a BLOB:
Store BigInteger into Mysql
Perhaps you can use it.
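The round trip behind that workaround can be sketched without any DB specifics: BigInteger.toByteArray() produces a two's-complement encoding that the matching BigInteger(byte[]) constructor reverses losslessly, so a BLOB (or VARBINARY) column preserves the modulus exactly for later math.

```java
// Lossless BigInteger <-> byte[] conversion for storing an RSA modulus in a
// BLOB column and reconstructing it later.
import java.math.BigInteger;

class BigIntBlob {

    static byte[] toBytes(BigInteger value) {
        return value.toByteArray(); // two's-complement, big-endian
    }

    static BigInteger fromBytes(byte[] bytes) {
        return new BigInteger(bytes); // exact inverse of toByteArray()
    }
}
```

On insert: pstat.setBytes(1, BigIntBlob.toBytes(rsaModulus)); on read: BigIntBlob.fromBytes(rs.getBytes(1)) (the column index is an assumption).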
PS. Next time, search SO first :)
I am using the GeoTools library to extract location information. From it I get an object of type
com.vividsolutions.jts.geom.MultiPolygon
I now want to store this field in my MySQL table over a JDBC connection.
When I directly try to insert it as
pstmtInsert.setObject(4, geoobject)
I am getting this error
Exception in thread "main" com.mysql.jdbc.MysqlDataTruncation: Data truncation: Cannot get geometry object from data you send to the GEOMETRY field
You need to convert the geometry object you have to well-known text (WKT). You'll find information on how to do that in the vividsolutions API documentation.
geoobject.toText();
Then insert/update the data using MySQL's GeomFromText function, binding the WKT string as the parameter:
INSERT INTO geom VALUES (GeomFromText(?));
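For illustration, here is what the WKT passed to GeomFromText looks like for a simple shape. The helper below is hypothetical and only handles a single closed ring; in real code you would always use JTS's toText() rather than formatting WKT by hand.

```java
// Demo of the WKT format that GeomFromText expects. In production use
// geoobject.toText() (JTS); this hand-rolled formatter exists only to make
// the example self-contained and show the textual shape of the data.
class WktDemo {

    // Formats a single closed ring as a WKT POLYGON,
    // e.g. "POLYGON((0 0, 4 0, 4 4, 0 0))".
    static String polygonWkt(double[][] ring) {
        StringBuilder sb = new StringBuilder("POLYGON((");
        for (int i = 0; i < ring.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append(fmt(ring[i][0])).append(' ').append(fmt(ring[i][1]));
        }
        return sb.append("))").toString();
    }

    // Drops the trailing ".0" on whole numbers so output matches typical WKT.
    private static String fmt(double v) {
        return v == Math.rint(v) ? String.valueOf((long) v) : String.valueOf(v);
    }
}
```

The resulting string is then bound with ps.setString(1, wkt) inside GeomFromText(?).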
It can be done in binary form as well, e.g. using JTS's WKBWriter:
PreparedStatement preparedStatement = connection.prepareStatement
("INSERT INTO table (point, polygon) VALUES (PointFromWKB(?), GeomFromWKB(?))");
WKBWriter writer = new WKBWriter();
preparedStatement.setBytes(1, writer.write(point));
preparedStatement.setBytes(2, writer.write(polygon));
preparedStatement.executeUpdate();
MySQL can't know how to store your geo object, or what its size is. You should not store the object the way you're trying to.
The PreparedStatement#setObject() documentation says:
The JDBC specification specifies a standard mapping from Java Object types to SQL types. The given argument will be converted to the corresponding SQL type before being sent to the database.
[...]
This method throws an exception if there is an ambiguity, for example, if the object is of a class implementing more than one of the interfaces named above.