In Java, with JDBC, I am trying to insert a file into a BLOB column in a table of an Oracle database.
Here is how I proceed:
private PreparedStatement getStatement(File file, String fid, Long dex, String uid, int id) throws SQLException, IOException
{
    FileInputStream fis = null;
    PreparedStatement statement;
    try
    {
        statement = connection.prepareStatement("INSERT INTO BLOBTABLE (FID, FDEX, SFILE, UID, ID) VALUES (?, ?, ?, ?, ?)");
        statement.setString(1, fid);
        statement.setLong(2, dex);
        fis = new FileInputStream(file);
        statement.setBinaryStream(3, fis, file.length());
        statement.setString(4, uid);
        statement.setInt(5, id);
    }
    finally
    {
        if (fis != null)
            fis.close();
    }
    return statement;
}
private void insertStuff() throws SQLException, IOException
{
    File f = new File("/home/user/thisFileExists");
    PreparedStatement statement = getStatement(f, "XYZ", 18L, "ABC", 78);
    statement.execute();
}
When the .execute is run, I get an Oracle error:
java.sql.SQLIntegrityConstraintViolationException: ORA-01400: cannot insert NULL into ("ORACLEUSER"."BLOBTABLE"."SFILE")
SFILE is the BLOB column. So this means the database at the end of the chain receives NULL in the query.
How come?
If I replace:
statement.setBinaryStream(3, fis, file.length());
With:
statement.setBinaryStream(3, new ByteArrayInputStream(("RANDOMSTRING".getBytes())));
It works, so it somehow does not like my file stream.
Is it a problem that I close the stream? That is how it is done in all the samples I saw.
You're closing the FileInputStream before you execute the statement, so there's no way for the statement to get the data when it actually needs it. It would be better to pass an InputStream into your method, so you can close it externally after the statement has executed:
private void insertStuff() throws SQLException, IOException {
    File file = new File("/home/user/thisFileExists");
    try (InputStream stream = new FileInputStream(file)) {
        PreparedStatement statement = getStatement(stream, "XYZ", 18L, "ABC", 78);
        statement.execute();
    }
}
... where getStatement would accept an InputStream instead of the File, and use the overload of setBinaryStream which doesn't take a data length. Alternatively, you could pass in the File and it could open the stream, create the statement, execute the statement, then close the stream.
As a side note, you should be closing the statement using a try-with-resources or try/finally statement, too.
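A sketch of what that revised getStatement could look like; the connection field, table and column names are taken from the question, the rest simply follows the suggestion above and is not a definitive implementation:

// Sketch only: assumes the connection field, BLOBTABLE and columns from the question.
private PreparedStatement getStatement(InputStream stream, String fid, Long dex, String uid, int id)
        throws SQLException {
    PreparedStatement statement = connection.prepareStatement(
            "INSERT INTO BLOBTABLE (FID, FDEX, SFILE, UID, ID) VALUES (?, ?, ?, ?, ?)");
    statement.setString(1, fid);
    statement.setLong(2, dex);
    // JDBC 4.0 overload without a length: the driver reads the stream when the statement executes.
    statement.setBinaryStream(3, stream);
    statement.setString(4, uid);
    statement.setInt(5, id);
    return statement;
}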
You are closing the FileInputStream before the database has used it. The JDBC driver is allowed to defer consumption of the stream until the statement is actually executed.
Also note that your test comparison with a fixed string isn't entirely fair: it isn't the same method overload so it might be that one works and the other one doesn't (although that isn't the case here).
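For reference, these are the two overloads involved; fis and file are the question's variables, and the only difference is whether an explicit length is supplied:

// Overload used in the question: PreparedStatement.setBinaryStream(int, InputStream, long)
statement.setBinaryStream(3, fis, file.length());
// JDBC 4.0 overload without a length: PreparedStatement.setBinaryStream(int, InputStream)
statement.setBinaryStream(3, fis);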
Related
I have a rest API that writes the InputStream of the body of a post directly to the database using a PreparedStatement like so:
public void store(String id, InputStream body) throws SQLException, SPSException {
if(id == null || id.length() == 0)
throw new SPSException("get: id is missing: " + id);
Connection conn = null;
PreparedStatement ps = null;
try {
conn = getConnection();
Parameters parameters = Parameters.parse("insert into properties (id, body) values (?,?)");
ps = conn.prepareStatement( parameters.getSQL() );
ps.setString(1, id );
ps.setBlob(2, body);
ps.executeUpdate();
conn.commit();
} finally {
close( ps );
close( conn );
}
}
The setBlob call takes the InputStream. Is there a way I can wrap this InputStream with a JSON-parsing InputStream, so that if the JSON doesn't parse, an exception is thrown?
I don't want to have to write my own or rebuild the stream if I don't have to, but I can't find anything available that would do this without a lot of extra code.
Note - I cannot read the object into memory in its entirety. It has to be a streaming solution.
In many of the try-with-resources examples I have found, Statement and ResultSet are declared separately. As the Java documentation mentions, the close methods of resources are called in the opposite order of their creation.
try (Statement stmt = con.createStatement();
ResultSet rs = stmt.executeQuery(sql) ) {
} catch (Exception e) {
}
But now I have multiple queries in my function.
Can I create the Statement and ResultSet in just one line? My code is like:
try (ResultSet rs = con.createStatement().executeQuery(sql);
ResultSet rs2 = con.createStatement().executeQuery(sql2);
ResultSet rs3 = con.createStatement().executeQuery(sql3)) {
} catch (Exception e) {
}
If I only declare them in one line, does it still close the resources of both the ResultSet and the Statement?
If you take a careful look, you will see that the concept is called try-with-resources.
Note the plural! The whole idea is that you can declare one or more resources in that single statement, and the JVM guarantees proper handling.
In other words: when resources belong together semantically, it is good practice to declare them together.
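Applied to the multi-query case from the question, one way to keep every resource managed is to declare each Statement together with its ResultSet; this is a sketch, with con, sql, sql2 and sql3 taken from the question:

// Each Statement and its ResultSet are declared resources, so all six objects are closed
// automatically, in reverse order of declaration.
try (Statement stmt = con.createStatement();
     ResultSet rs = stmt.executeQuery(sql);
     Statement stmt2 = con.createStatement();
     ResultSet rs2 = stmt2.executeQuery(sql2);
     Statement stmt3 = con.createStatement();
     ResultSet rs3 = stmt3.executeQuery(sql3)) {
    // work with rs, rs2 and rs3 here
} catch (Exception e) {
    e.printStackTrace();
}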
Yes, and it works exactly as you put it in your question: multiple resource declarations separated by semicolons.
You may declare one or more resources in a try-with-resources statement. The following example retrieves the names of the files packaged in the zip file zipFileName and creates a text file that contains the names of these files:
try (
java.util.zip.ZipFile zf =
new java.util.zip.ZipFile(zipFileName);
java.io.BufferedWriter writer =
java.nio.file.Files.newBufferedWriter(outputFilePath, charset)
) {
// Enumerate each entry
for (java.util.Enumeration entries =
zf.entries(); entries.hasMoreElements();) {
// Get the entry name and write it to the output file
String newLine = System.getProperty("line.separator");
String zipEntryName =
((java.util.zip.ZipEntry)entries.nextElement()).getName() +
newLine;
writer.write(zipEntryName, 0, zipEntryName.length());
}
}
https://docs.oracle.com/javase/tutorial/essential/exceptions/tryResourceClose.html
ResultSet implements AutoCloseable, which means a try-with-resources statement will also ensure it is closed when the block finishes.
https://docs.oracle.com/javase/7/docs/api/java/sql/ResultSet.html
I am getting a null value when I read the BLOB data from the database. What might be the issue? Can someone help me with this?
Connection con = null;
PreparedStatement psStmt = null;
ResultSet rs = null;
try {
try {
Class.forName("oracle.jdbc.driver.OracleDriver");
} catch (ClassNotFoundException e) {
e.printStackTrace();
}
con =
DriverManager.getConnection("jdbc:oracle:thin:#MyDatabase:1535:XE","password","password");
System.out.println("connection established"+con);
psStmt = con
.prepareStatement("Select Photo from Person where Firstname=?");
int i = 1;
psStmt.setLong(1, "Nani");
rs = null;
rs = psStmt.executeQuery();
InputStream inputStream = null;
while (rs.next()) {
inputStream = rs.getBinaryStream(1);
//Blob blob = rs.getBlob(1);
//Blob blob1 = (Blob)rs.getObject(1);
//System.out.println("blob length "+blob1);//rs.getString(1);
}
System.out.println("bytessssssss "+inputStream);//here i am getting null value.
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
I believe you didn't use the setString method to assign a value to firstname, which leads to the null.
For example:
ps = con.prepareStatement("Select photo from person where firstname = ?");
ps.setString(1, "kick"); // <----- add this line
System.out.println("bytes " + rs.getBinaryStream(1));
Other suggestions:
There is no need for rs = null; inside the try block, because you already have rs = null; at the beginning of your code.
Also, instead of
InputStream inputStream = null;
declare inputStream inside the loop where it is actually assigned (InputStream is abstract, so it cannot be instantiated directly), or simply get rid of the null initialization.
The most obvious error is using setLong instead of setString.
However, one practice here is fatal: declaring everything in advance. In some other languages that is good style, but in Java one should declare variables as close to their use as possible.
That reduces scope, and it would have exposed the error: inputStream is printed after rs.next() has already returned false, outside the loop, possibly because no records were found at all.
Declaring as near to the use as feasible also works well with try-with-resources, which is used below to automatically close the connection, statement and result set.
try {
    Class.forName("oracle.jdbc.driver.OracleDriver");
    try (Connection con = DriverManager.getConnection(
            "jdbc:oracle:thin:#MyDatabase:1535:XE", "password", "password")) {
        System.out.println("connection established" + con);
        try (PreparedStatement psStmt = con.prepareStatement(
                "Select Photo from Person where Firstname=?")) {
            int i = 1;
            psStmt.setString(1, "Nani");
            try (ResultSet rs = psStmt.executeQuery()) {
                while (rs.next()) {
                    try (InputStream inputStream = rs.getBinaryStream(1)) {
                        //Blob blob = rs.getBlob(1);
                        //Blob blob1 = (Blob)rs.getObject(1);
                        //System.out.println("blob length "+blob1);//rs.getString(1);
                        Files.copy(inputStream, Paths.get("C:/photo-" + i + ".jpg"));
                    }
                    ++i;
                }
                //ERROR System.out.println("bytessssssss "+inputStream);
            } // Closes rs.
        } // Closes psStmt.
    } // Closes con.
} catch (ClassNotFoundException | SQLException | IOException e) {
    e.printStackTrace();
}
1- In your code, when setting a parameter value of the SQL query, be sure to use the data type that matches the field. So here you should use
psStmt.setString(1, "Nani");
instead of
psStmt.setLong(1, "Nani");
2- Make sure that the query is correct (table name, field names).
3- Make sure that the table contains data.
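For point 3, here is a quick, purely illustrative check you could run from JDBC itself, assuming the same con and Person table as above:

// Hypothetical sanity check: count the matching rows before trying to read the BLOB.
try (PreparedStatement check = con.prepareStatement(
        "Select count(*) from Person where Firstname=?")) {
    check.setString(1, "Nani");
    try (ResultSet rs = check.executeQuery()) {
        rs.next();
        System.out.println("matching rows: " + rs.getInt(1));
    }
}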
I'm doing an individual project in Java. I want to insert data into my database. My program runs without any error, but when I submit the data to be inserted it gives this error: java.sql.SQLException: Can not issue data manipulation statements with executeQuery(). This is my code:
What can I do to solve this problem?
private void jButton1ActionPerformed(java.awt.event.ActionEvent evt) {
if (evt.getSource() == jButton1)
{
int x = 0;
String s1 = jTextField1.getText().trim();
String s2 = jTextField2.getText();
char[] s3 = jPasswordField1.getPassword();
char[] s4 = jPasswordField2.getPassword();
String s8 = new String(s3);
String s9 = new String(s4);
String s5 = jTextField5.getText();
String s6 = jTextField6.getText();
String s7 = jTextField7.getText();
if(s8.equals(s9))
{
try{
File image = new File(filename);
FileInputStream fis = new FileInputStream(image);
ByteArrayOutputStream bos = new ByteArrayOutputStream();
byte buf[] = new byte[1024];
for (int readNum; (readNum = fis.read(buf)) != -1;) {
bos.write(buf, 0, readNum);
}
cat_image = bos.toByteArray();
PreparedStatement ps = conn.prepareStatement("insert into reg values(?,?,?,?,?,?,?)");
ps.setString(1,s1);
ps.setString(2,s2);
ps.setString(3,s8);
ps.setString(4,s5);
ps.setString(5,s6);
ps.setString(6,s7);
ps.setBytes(7,cat_image);
rs = ps.executeQuery();
if(rs.next())
{
JOptionPane.showMessageDialog(null,"Data insert Succesfully");
}else
{
JOptionPane.showMessageDialog(null,"Your Password Dosn't match" ,"Acces dinied",JOptionPane.ERROR_MESSAGE);
}
}catch(Exception e)
{
System.out.println(e);
}
}
Use ps.executeUpdate() or ps.execute().
From executeUpdate
Executes the SQL statement in this PreparedStatement object, which must be an SQL Data Manipulation Language (DML) statement, such as
INSERT, UPDATE or DELETE; or an SQL statement that returns nothing,
such as a DDL statement.
From execute
Executes the SQL statement in this PreparedStatement object, which
may be any kind of SQL statement. Some prepared statements return
multiple results; the execute method handles these complex statements
as well as the simpler form of statements handled by the methods
executeQuery and executeUpdate. The execute method returns a boolean to
indicate the form of the first result. You must call either the method
getResultSet or getUpdateCount to retrieve the result; you must call
getMoreResults to move to any subsequent result(s).
Then modify your code accordingly:
int rowsAffected = ps.executeUpdate();
JOptionPane.showMessageDialog(null,"Data Rows Inserted "+ rowsAffected);
Also, you have to close your streams, statements, and connections, either in a finally block or with try-with-resources.
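For example, here is a sketch of the same insert using try-with-resources; conn, filename, s1..s8 and the reg table come from the question, and setBinaryStream is used here instead of buffering the image into a byte array, which is a deliberate simplification rather than the only option:

// Sketch only: conn, filename, the s* fields and the reg table are the question's.
File image = new File(filename);
try (FileInputStream fis = new FileInputStream(image);
     PreparedStatement ps = conn.prepareStatement("insert into reg values(?,?,?,?,?,?,?)")) {
    ps.setString(1, s1);
    ps.setString(2, s2);
    ps.setString(3, s8);
    ps.setString(4, s5);
    ps.setString(5, s6);
    ps.setString(6, s7);
    // Streams the image instead of building a byte[] with ByteArrayOutputStream first.
    ps.setBinaryStream(7, fis, image.length());
    int rowsAffected = ps.executeUpdate();
    JOptionPane.showMessageDialog(null, "Data rows inserted: " + rowsAffected);
} catch (IOException | SQLException e) {
    e.printStackTrace();
}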
SQLException is thrown because of a wrong SQL statement. You may have a syntax error when inserting string and integer values: in the VALUES clause, integer elements should have no quotes around them, while string elements should be wrapped in quotes.
I am currently writing a Java program which loops through a folder of around 4000 XML files.
Using a for loop, it extracts the XML from each file, assigns it to a String 'xmlContent', and uses the PreparedStatement method setString(2,xmlContent) to insert the String into a table stored in my SQL Server.
The column '2' is a column called 'Data' of type XML.
The process works, but it is slow. It inserts about 50 rows into the table every 7 seconds.
Does anyone have any ideas as to how I could speed up this process?
Code:
{ ...declaration, connection etc etc
PreparedStatement ps = con.prepareStatement("INSERT INTO Table(ID,Data) VALUES(?,?)");
for (File current : folder.listFiles()){
if (current.isFile()){
xmlContent = fileRead(current.getAbsoluteFile());
ps.setString(1, current.getAbsolutePath());
ps.setString(2, xmlContent);
ps.addBatch();
if (++count % batchSize == 0){
ps.executeBatch();
}
}
}
ps.executeBatch(); // performs insertion of leftover rows
ps.close();
}
private static String fileRead(File file) throws IOException {
    StringBuilder xmlContent = new StringBuilder();
    try (BufferedReader br = new BufferedReader(new FileReader(file))) {
        br.readLine(); // skips the encoding line; don't need it and it causes problems
        String strLine;
        while ((strLine = br.readLine()) != null) {
            xmlContent.append(strLine);
        }
    }
    return xmlContent.toString();
}
Just from a little reading and a quick test, it looks like you can get a decent speedup by turning off autoCommit on your connection. All of the batch query tutorials I see recommend it as well, such as http://www.tutorialspoint.com/jdbc/jdbc-batch-processing.htm
Turn it off - and then drop an explicit commit where you want (at the end of each batch, at the end of the whole function, etc).
conn.setAutoCommit(false);
PreparedStatement ps = // ... rest of your code
// inside your for loop
if (++count % batchSize == 0)
{
try {
ps.executeBatch();
conn.commit();
}
catch (SQLException e)
{
// .. whatever you want to do
conn.rollback();
}
}
It is best to make the read and the write parallel.
Use one thread to read the files and store them in a buffer.
Use another thread to read from the buffer and execute queries against the database.
You can use more than one thread to write to the database in parallel; that should give you even better performance.
I would suggest you follow this MemoryStreamMultiplexer approach, where you read the XML files in one thread, store them in a buffer, and then use one or more threads to read from the buffer and execute against the database.
http://www.codeproject.com/Articles/345105/Memory-Stream-Multiplexer-write-and-read-from-many
It is a C# implementation, but you get the idea.
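In Java, a minimal sketch of that producer/consumer idea could use a BlockingQueue. The table, column names, batch size and fileRead helper come from the question; the Item holder, queue capacity and poison-pill shutdown are illustrative assumptions, not a definitive implementation:

import java.io.File;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ParallelXmlLoader {

    // Simple value holder handed from the reader thread to the writer thread.
    static final class Item {
        final String id;
        final String xml;
        Item(String id, String xml) { this.id = id; this.xml = xml; }
    }

    // Poison pill that tells the writer to stop.
    static final Item END = new Item(null, null);

    static void load(File folder, Connection con, int batchSize) throws Exception {
        BlockingQueue<Item> queue = new ArrayBlockingQueue<>(100); // assumed capacity

        // Producer: reads files and puts their content on the queue.
        Thread reader = new Thread(() -> {
            try {
                for (File current : folder.listFiles()) {
                    if (current.isFile()) {
                        queue.put(new Item(current.getAbsolutePath(), fileRead(current)));
                    }
                }
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                try { queue.put(END); } catch (InterruptedException ignored) { }
            }
        });
        reader.start();

        // Consumer: batches the inserts on the current thread.
        con.setAutoCommit(false);
        try (PreparedStatement ps = con.prepareStatement("INSERT INTO Table(ID,Data) VALUES(?,?)")) {
            int count = 0;
            Item item;
            while ((item = queue.take()) != END) {
                ps.setString(1, item.id);
                ps.setString(2, item.xml);
                ps.addBatch();
                if (++count % batchSize == 0) {
                    ps.executeBatch();
                    con.commit();
                }
            }
            ps.executeBatch(); // leftover rows
            con.commit();
        }
        reader.join();
    }

    // Assumes the fileRead helper from the question; stubbed here so the sketch is self-contained.
    private static String fileRead(File file) throws Exception {
        return ""; // replace with the question's implementation
    }
}

Whether a second writer thread helps depends on the database and driver; with a single PreparedStatement as above, keep all writes on one thread and scale the readers instead.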