I need to store a public key in my MySQL DB. I have the following:
rsaModulus=rsaPk.getModulus();
The method getModulus() returns a BigInteger. But when I use a PreparedStatement to insert the value into the table, I cannot find an appropriate method to do that (i.e., something similar to toString or toInt). I need to retrieve this public key later and do some mathematical calculations with it, which is why I don't think storing it as a String would be a good idea. Also, converting it to a long did not work, even though the field in the DB is defined as BIGINT. Is there any solution for this problem?
Files such as keys and certificates are generally stored as a byte array. This can be done with a BLOB column.
The MySQL documentation describes how Java types map to MySQL types:
http://dev.mysql.com/doc/refman/5.0/en/connector-j-reference-type-conversions.html
If you are able to get your public key as a byte[], use something like this sample code:
...
InputStream is = PapelServiceTest.class.getResourceAsStream("MY_FILE_HERE");
byte[] pkBytes = IOUtils.toByteArray(is);
String query = "INSERT INTO my_table (my_key_col) VALUES (?)";
PreparedStatement pstat = conn.prepareStatement(query);
pstat.setBytes(1, pkBytes);
pstat.execute();
...
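For the original question about a BigInteger modulus, a minimal sketch along the same lines (the table pubkeys, the column modulus and the variable keyId are placeholders; rsaPk and conn come from the question's own context) would round-trip the value through BigInteger.toByteArray():
import java.math.BigInteger;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Store the modulus as raw bytes in a BLOB column.
BigInteger rsaModulus = rsaPk.getModulus();
PreparedStatement insert = conn.prepareStatement("INSERT INTO pubkeys (modulus) VALUES (?)");
insert.setBytes(1, rsaModulus.toByteArray());
insert.executeUpdate();

// Later: read the bytes back and rebuild the BigInteger for calculations.
PreparedStatement select = conn.prepareStatement("SELECT modulus FROM pubkeys WHERE id = ?");
select.setLong(1, keyId);
ResultSet rs = select.executeQuery();
if (rs.next()) {
    BigInteger restored = new BigInteger(rs.getBytes(1));
}
Since toByteArray() and the BigInteger(byte[]) constructor both use the same two's-complement representation, the value survives the round trip exactly.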
This thread has an interesting workaround using a BLOB:
Store BigInteger into Mysql
Perhaps you can use it.
PS. Next time, search SO first :)
I want to execute a JSON query using the following statement:
String sql = "SELECT JSON_OBJECT('customerid' VALUE customerID, 'customerutility' VALUE customerutility) FROM customerTABLE";
I need to run this from a Java application and store the results in a file.
statement = connection.createStatement();
ResultSet rs = statement.executeQuery(sql);
I am guessing that executeQuery returns a JSON object, but I am not quite sure how to serialize this object to a file.
Any help with this is greatly appreciated.
Honestly, I have never worked with SQL Server, but I will take the liberty of answering because nobody has done so far.
If I were in your place I would try one of the following options:
1. Read the JSON as if it were a string: rs.getString("json"), where "json" is used as an alias in the query.
2. If the first approach doesn't work, try an explicit string cast on the SQL side.
Both of these solutions assume that, since the JSON will be written to a file, no typing is necessary and a string representation of the JSON is enough (a sketch of this is shown after the code below).
Otherwise, my approach would be to read the fields of the table in the classic way and then turn each row into a JSON object on the Java side, which is very simple:
Json.createObjectBuilder()
.add("customerid", 1)
.add("customerUtility", "something")
.build();
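For the first option above, a minimal sketch of reading the column as a string and writing it out (the "json" alias added to the query and the output file name are assumptions; connection is the Connection from the question):
import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.ResultSet;
import java.sql.Statement;

// Assumes the JSON_OBJECT expression is aliased as "json" in the query.
String sql = "SELECT JSON_OBJECT('customerid' VALUE customerID, 'customerutility' VALUE customerutility) AS json FROM customerTABLE";
try (Statement statement = connection.createStatement();
     ResultSet rs = statement.executeQuery(sql);
     BufferedWriter writer = Files.newBufferedWriter(Paths.get("customers.json"))) {
    while (rs.next()) {
        writer.write(rs.getString("json")); // each row is already a JSON string
        writer.newLine();
    }
}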
I am using the GeoTools library to extract location information. With that I am getting an object of type
class com.vividsolutions.jts.geom.MultiPolygon
I now want to store this field in my MySQL table over a JDBC connection.
When I directly try to insert it as
pstmtInsert.setObject(4, geoobject)
I am getting this error
Exception in thread "main" com.mysql.jdbc.MysqlDataTruncation: Data truncation: Cannot get geometry object from data you send to the GEOMETRY field
You need to convert the geometry object you have to well-known text (WKT). You'll find information on how to do that in the vividsolutions API documentation.
geoobject.toText();
Insert or update the data using the MySQL GeomFromText function:
INSERT INTO geom VALUES (GeomFromText(@g));
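From Java, a rough sketch of that approach could look like this (table and column names are placeholders; geoobject is the MultiPolygon from the question; on newer MySQL versions the function is called ST_GeomFromText):
// Bind the WKT text and let MySQL parse it into a GEOMETRY value.
PreparedStatement pstmtInsert = conn.prepareStatement(
        "INSERT INTO geom_table (name, shape) VALUES (?, GeomFromText(?))");
pstmtInsert.setString(1, "some name");
pstmtInsert.setString(2, geoobject.toText());
pstmtInsert.executeUpdate();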
It can be binary as well, e.g.
PreparedStatement preparedStatement = connection.prepareStatement
("INSERT INTO table (point, polygon) VALUES (PointFromWKB(?), GeomFromWKB(?))");
WKBWriter writer = new WKBWriter();
preparedStatement.setBytes(1, writer.write(point));
preparedStatement.setBytes(2, writer.write(polygon));
preparedStatement.executeUpdate();
MySQL can't know how to store your geometry object or what its size is. You should not store the object the way you're trying to.
The PreparedStatement#setObject() documentation says:
The JDBC specification specifies a standard mapping from Java Object types to SQL types. The given argument will be converted to the corresponding SQL type before being sent to the database.
[...]
This method throws an exception if there is an ambiguity, for example, if the object is of a class implementing more than one of the interfaces named above.
I’m struggling to insert a JSON object into my postgres v9.4 DB. I have defined the column called "evtjson" as type json (not jsonb).
I am trying to use a prepared statement in Java (jdk1.8) to insert a Json object (built using JEE javax.json libraries) into the column, but I keep running into SQLException errors.
I create the JSON object using:
JsonObject mbrLogRec = Json.createObjectBuilder().build();
…
mbrLogRec = Json.createObjectBuilder()
.add("New MbrID", newId)
.build();
Then I pass this object as a parameter to another method that writes it to the DB using a prepared statement (along with several other fields), as:
pStmt.setObject(11, dtlRec);
Using this method, I receive the following error:
org.postgresql.util.PSQLException: No hstore extension installed.
at org.postgresql.jdbc.PgPreparedStatement.setMap(PgPreparedStatement.java:553)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:1036)
I have also tried:
pStmt.setString(11, dtlRec.toString());
pStmt.setObject(11, dtlRec.toString());
Which produce a different error:
Event JSON: {"New MbrID":29}
SQLException: ERROR: column "evtjson" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
But, at least this tells me that the DB is recognizing the column as type JSON.
I did try installing the hstore extension, but it then told me that it was not an hstore object.
The Oracle docs show a number of methods for setting the parameter value in the PreparedStatement, but I'd rather not try them all if someone knows the answer. (http://docs.oracle.com/javase/8/docs/api/java/sql/PreparedStatement.html) Some of these also take an additional SQLType parameter, but I can't find any reference for those either.
Should I try setAsciiStream? CharacterStream? CLOB?
This behaviour is quite annoying since JSON strings are accepted without problems when used as literal strings in SQL commands.
There is already an issue for this in the PostgreSQL JDBC driver GitHub repository (even if the problem seems to be the server-side processing).
Besides using a cast in the SQL string (see the answer of @a_horse_with_no_name), the issue author offers two additional solutions:
1. Use the parameter stringtype=unspecified in the JDBC connection URL/options. This tells PostgreSQL that all text or varchar parameters are actually of unknown type, letting it infer their types more freely (see the URL example after the second option below).
2. Wrap the parameter in an org.postgresql.util.PGobject:
PGobject jsonObject = new PGobject();
jsonObject.setType("json");
jsonObject.setValue(yourJsonString);
pstmt.setObject(11, jsonObject);
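For the first option, the parameter is simply appended to the JDBC URL. A minimal sketch (host, port, database name and credentials here are placeholders):
// Only the stringtype parameter matters here; everything else is an example value.
String url = "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified";
Connection conn = DriverManager.getConnection(url, "user", "password");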
You can do it like this; you just need the JSON string:
Change the query to:
String query = "INSERT INTO table (json_field) VALUES (to_json(?::json))"
And set the parameter as a String.
pStmt.setString(1, json);
You have two options:
Use statement.setString(jsonStr) and then handle the conversion in the sql statement:
PreparedStatement statement = con.prepareStatement(
"insert into table (jsonColumn) values (?::json)");
statement.setString(1, jsonStr);
Another option is to use PGobject to create a custom value wrapper.
PGobject jsonObject = new PGobject();
PreparedStatement statement = con.prepareStatement(
"insert into table (jsonColumn) values (?)");
jsonObject.setType("json");
jsonObject.setValue(jsonStr);
statement.setObject(1, jsonObject);
I personally prefer the latter, as the query is cleaner.
Passing the JSON as a String is the right approach, but as the error message tells you, you need to cast the parameter in the INSERT statement to a JSON value:
insert into the_table
(.., evtjson, ..)
values
(.., cast(? as json), ..)
Then you can use pStmt.setString(11, dtlRec.toString()) to pass the value.
Most answers here define ways of inserting into a Postgres json field with JDBC in a non-standard way, i.e. in a way that is specific to the database implementation. If you need to insert a Java string into a Postgres json field with pure JDBC and pure SQL, use:
preparedStatement.setObject(1, "{}", java.sql.Types.OTHER)
This makes the Postgres JDBC driver (tested with org.postgresql:postgresql:42.2.19) convert the Java string to the json type. It also validates the string as being a valid JSON representation, something that the answers relying on implicit string casts do not do, which leaves open the possibility of corrupt persisted JSON data.
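A minimal sketch of that call in context (table and column names are placeholders; dtlRec is the JsonObject from the question, connection an open Connection):
PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO my_table (evtjson) VALUES (?)");
// Types.OTHER lets the PostgreSQL driver send the string as the column's json type.
ps.setObject(1, dtlRec.toString(), java.sql.Types.OTHER);
ps.executeUpdate();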
As others have mentioned, your SQL string needs to explicitly cast the bind value to the PostgreSQL json or jsonb type:
insert into t (id, j) values (?, ?::json)
Now you can bind the string value. Alternatively, you can use a library that does this for you, for example jOOQ (works out of the box) or Hibernate (using a third-party UserType registration). The benefit is that you don't have to think about this every time you bind such a variable (or read it). A jOOQ example:
ctx.insertInto(T)
.columns(T.ID, T.J)
.values(1, JSON.valueOf("[1, 2, 3]"))
.execute();
Behind the scenes, the same cast as above is always generated, whenever you work with this JSON (or JSONB) data type.
(Disclaimer: I work for the company behind jOOQ)
If you are using Spring Boot, adding the following line to application.properties helped:
spring.datasource.hikari.data-source-properties.stringtype=unspecified
As Wero wrote:
This tells PostgreSQL that all text or varchar parameters are actually
of unknown type
Instead of passing the JSON object, pass its string value and cast it to json in the query.
Example:
JSONObject someJsonObject = ...;
String yourJsonString = someJsonObject.toString();
String query = "INSERT INTO table (json_field) VALUES (to_json(?::json))";
pStmt.setString(1, yourJsonString);
This worked for me.
I'm trying to fill an SQLite database with data in my java program.
The data is read from an excel file using Apache POI. I have no trouble inserting the data into the db using normal methods.
However, when I check the database manually with the shell, the Norwegian characters æ, ø, å are not displayed correctly. Whenever I fill the database manually through the shell, they are displayed as they should be.
Also, when printing out a java string in console containing these characters, they are displayed correctly.
The problem must occur when an action like this is performed:
String sql = "insert into db(name) values (æøå)";
stmt.executeUpdate(sql);
I have tried
byte[] b = sql.getBytes("utf-8");
sql = new String(b, "utf-8");
to no avail.
Any idea how to remedy the situation?
Thanks!
There is a very simple solution for you: Let Java and the SQLite driver do everything for you. You don't have to care about encodings and escaping of parameters.
How that is possible: Use a PreparedStatement:
String name = "æøå"
PreparedStatement prepStmt = conn.prepareStatement("insert into db(name) values (?)");
prepStmt.setString(1, name);
prepStmt.executeUpdate();
Furthermore, this code fragment is safe against SQL injection attacks.
BTW: the second code fragment you posted is useless; it does nothing. Converting a String to a byte[] and back to a String does not change a single bit of the String.
I have an auto generated timestamp that is created each time a record is inserted or updated in a mysql table. Is there a way to return this timestamp in a way similar to how I would use a keyholder to return a newly created id?
KeyHolder keyHolder = new GeneratedKeyHolder();
JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
//Insert Contact
jdbcTemplate.update(new PreparedStatementCreator() {
@Override
public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
PreparedStatement preparedStatement = connection.prepareStatement(SQL_ADD, Statement.RETURN_GENERATED_KEYS);
preparedStatement.setString(1, contact.getFirstName());
preparedStatement.setString(2, contact.getLastName());
preparedStatement.setInt(3, contact.getOrganizationId());
preparedStatement.setString(4, contact.getType());
preparedStatement.setInt(5, contact.getUserId());
return preparedStatement;
}
}, keyHolder);
//Use keyholder to obtain newly created id
contact.setId(keyHolder.getKey().intValue());
Is there some way to also return the new timestamp without having to requery the table? I have been looking for ways to return it along with the id as a key in the keyholder, but it doesn't seem to be returned as a key?
Not very satisfying, but I think "no" is the answer to your question. I don't know the Spring side, but I think this is due to the basic JDBC it is wrapping. See http://docs.oracle.com/javase/6/docs/api/java/sql/Statement.html#getGeneratedKeys%28%29
Your only option would be to create a stored procedure on MySQL that has an OUT parameter and call that. See http://dev.mysql.com/doc/refman/5.0/en/call.html.
There are a few options for solving this issue on the MySQL database server side. You could start by creating a TRIGGER on the table. Since a TRIGGER cannot return data, you can copy the TIMESTAMP value into a user variable:
DELIMITER //
CREATE TRIGGER ai_tbl_name AFTER INSERT ON tbl_name
FOR EACH ROW
BEGIN
SET @TimeStamp = NEW.timestamp_column;
END;//
DELIMITER ;
To retrieve this timestamp value, run the following command:
SELECT @TimeStamp;
Since user variables are stored in memory, there is no need to touch the table again.
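From the Java side, a rough sketch of using this trigger (table and column names are placeholders). Note that MySQL user variables are per-session, so the INSERT and the SELECT must run on the same Connection:
// The AFTER INSERT trigger above populates @TimeStamp for this session.
try (PreparedStatement insert = connection.prepareStatement(
        "INSERT INTO tbl_name (col1, col2, col3) VALUES (?, ?, ?)")) {
    insert.setString(1, "a");
    insert.setString(2, "b");
    insert.setInt(3, 1);
    insert.executeUpdate();
}
try (Statement stmt = connection.createStatement();
     ResultSet rs = stmt.executeQuery("SELECT @TimeStamp")) {
    if (rs.next()) {
        String newTimestamp = rs.getString(1); // returned as a string; parse if needed
    }
}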
You can go even further and create a STORED PROCEDURE in MySQL to automate all of the above (sample code, as I do not know your table's details):
DELIMITER //
DROP PROCEDURE IF EXISTS sp_procedure_name //
CREATE PROCEDURE sp_procedure_name (IN col1_val VARCHAR(25),
IN col2_val VARCHAR(25),
IN col3_val INT)
BEGIN
INSERT INTO tbl_name (col1, col2, col3)
VALUES (col1_val, col2_val, col3_val);
SELECT @TimeStamp;
END; //
DELIMITER ;
You can run this procedure with the following code:
CALL sp_procedure_name(col1_val, col2_val, col3_val);
As I'm not familiar with Java, you'll need to finish it up on your side of the code.
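As a rough sketch of the Java side (assuming the procedure above; the parameter values are placeholders), a CallableStatement can call the procedure and read back the timestamp it selects:
try (CallableStatement cs = connection.prepareCall("{CALL sp_procedure_name(?, ?, ?)}")) {
    cs.setString(1, "value1");
    cs.setString(2, "value2");
    cs.setInt(3, 3);
    boolean hasResultSet = cs.execute();
    // Skip any update counts until the SELECT @TimeStamp result set shows up.
    while (!hasResultSet && cs.getUpdateCount() != -1) {
        hasResultSet = cs.getMoreResults();
    }
    if (hasResultSet) {
        try (ResultSet rs = cs.getResultSet()) {
            if (rs.next()) {
                String newTimestamp = rs.getString(1);
            }
        }
    }
}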
It seems that the variable contact represents the newly inserted record. Since it contains the newly generated id (primary key) value, you can execute a new query to return the timestamp field for that id.
The query may look like this:
select timestamp_field from my_table where id=?
Use a PreparedStatement to bind the new id value and execute it to fetch the required timestamp value.
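A minimal sketch of that follow-up query (field and table names are taken from the example query above; contact.getId() is assumed to exist alongside the setId(...) call shown in the question):
try (PreparedStatement ps = connection.prepareStatement(
        "SELECT timestamp_field FROM my_table WHERE id = ?")) {
    ps.setInt(1, contact.getId());
    try (ResultSet rs = ps.executeQuery()) {
        if (rs.next()) {
            Timestamp createdAt = rs.getTimestamp("timestamp_field");
        }
    }
}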
GeneratedKeyHolder also has two other methods: getKeys(), which returns a Map<String, Object> of the generated fields, and getKeyList(), which produces such a map for every affected row.
See the Javadoc of GeneratedKeyHolder and the Spring documentation on retrieving auto-generated keys.
In addition, Spring's SimpleJdbcInsert has methods for generated key retrieval; see also SimpleJdbcInsert#usingGeneratedKeyColumns (a brief sketch follows below).
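A brief SimpleJdbcInsert sketch (the table and column names are guesses; adjust them to the real schema). With MySQL this still gives you the auto-generated id rather than the timestamp:
import java.util.HashMap;
import java.util.Map;
import org.springframework.jdbc.core.simple.SimpleJdbcInsert;

SimpleJdbcInsert insert = new SimpleJdbcInsert(dataSource)
        .withTableName("contact")
        .usingGeneratedKeyColumns("id");
Map<String, Object> params = new HashMap<>();
params.put("first_name", contact.getFirstName());
params.put("last_name", contact.getLastName());
Number newId = insert.executeAndReturnKey(params);
contact.setId(newId.intValue());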
There are two methods in the java.sql.Connection interface that cause PreparedStatement execution to return selected key columns:
PreparedStatement prepareStatement(String sql,
int[] columnIndexes)
throws SQLException
PreparedStatement prepareStatement(String sql,
String[] columnNames)
throws SQLException
You don't need to use Spring's KeyHolder and JdbcTemplate to do this.
These give hope that you could number or name your timestamp column. But the Javadoc doesn't require or suggest that any JDBC implementation can return non-key columns, so you are probably out of luck with this approach:
Creates a default PreparedStatement object capable of returning the auto-generated keys
designated by the given array. This array contains the names of the columns in the target
table that contain the auto-generated keys that should be returned.
As suggested in another answer, you can switch to a stored procedure that does exactly what you want (a CallableStatement is actually a PreparedStatement subclass that executes stored procedures).
You can also populate the timestamp column within the prepared statement via new Timestamp(System.currentTimeMillis()), but you should have a mechanism in place to sync clocks across your various servers (such as NTP, which is commonly available in both Windows and *nix environments). Your trigger could then set the timestamp only if a value wasn't already provided.
As part of your app and DB design, you need to commit to a philosophy of where certain operations occur. If the DB derives needed data, the app needs to refresh that data: you must pay the price of either separate query executions or a combined stored procedure that inserts and retrieves.