I use WildFly 8.2.0.Final, and there is an Oracle connection pool configured on this server.
Look at the following code:
public ArrayList<HashMap<String, Object>> fetchSome(String query)
        throws OracleQueryProcessorException {
    ArrayList<HashMap<String, Object>> result = new ArrayList<HashMap<String, Object>>();
    try {
        Context initCtx = new InitialContext();
        DataSource ds = (DataSource) initCtx.lookup(driver);
        try (Connection con = ds.getConnection();
                PreparedStatement stmt = con.prepareStatement(query)) {
            try (ResultSet rs = stmt.executeQuery()) {
                ResultSetMetaData rsmd = rs.getMetaData();
                rs.next();
                HashMap<String, Object> row = new HashMap<String, Object>();
                String name = rsmd.getColumnName(1);
                Object value = rs.getObject(1);
                if (value instanceof Blob) {
                    Blob bl = (Blob) value;
                    if (bl.length() > 0)
                        value = bl.getBinaryStream();
                    else
                        value = null;
                }
                row.put(name, value);
                result.add(row);
            }
        } catch (SQLException e) {
            throw new OracleQueryProcessorException();
        }
    } catch (NamingException e) {
        throw new OracleQueryProcessorException();
    }
    return result;
}
And this is how the function is used:
InputStream is = (InputStream) fetchSome("SELECT BLOB_FIELD FROM TEST WHERE ID = 1").get(0).get("BLOB_FIELD");
if (is != null) {
    byte[] a = new byte[3];
    is.read(a);
}
Reading from this stream works! How can that be? The connection is closed (because of the try-with-resources clause), and reading from the stream takes no connection from the pool (all of the pool's connections are available).
fetchSome() opens a Connection, sends the query, and then reads the data back into the resulting ArrayList. Then fetchSome closes the Connection and returns the ArrayList. The code you are curious about then reads from the ArrayList that was returned, not from the Connection that was, as you correctly noticed, closed.
By the time your method returns, all database communication has finished, and all the data has been copied into the returned list, from which it can then be read as often and as late as you want, without needing a Connection again.
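One caveat: for a BLOB column the list actually holds the InputStream returned by bl.getBinaryStream(), not the bytes themselves, so whether a later read works can depend on the driver having buffered the LOB data. A safer pattern is to drain the stream into a byte[] while the connection is still open and hand out a ByteArrayInputStream instead. A minimal sketch (plain in-memory streams stand in for the BLOB stream; the class and method names are only for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamBuffering {

    // Copy everything from the source stream into a byte array.
    // Once this returns, the source (and its connection) can be closed safely.
    public static byte[] toBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] original = {1, 2, 3, 4, 5};
        // Stands in for bl.getBinaryStream() inside the try-with-resources block:
        InputStream fromDb = new ByteArrayInputStream(original);
        byte[] copy = toBytes(fromDb);
        fromDb.close(); // the "connection-backed" stream is now gone...
        InputStream safe = new ByteArrayInputStream(copy); // ...but this one still works
        System.out.println(safe.read()); // prints 1
    }
}
```

In fetchSome() this would mean storing `new ByteArrayInputStream(toBytes(bl.getBinaryStream()))` in the row map, at the cost of holding the whole BLOB in heap memory.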
Does it really work for all BLOB sizes? Good thresholds to test are:
4000 B (the limit below which a BLOB may be stored inline in the row, rather than out of line)
2000 B (the maximum size of RAW; the BLOB may be cast to RAW somewhere)
16 KB, 32 KB
some huge value, bigger than the JVM heap size
AFAIK, at the OCI level (the C client library) LOBs can be "pre-fetched": a smaller portion of the BLOB may be sent to the client even though the client has not requested it yet. This reduces the number of round trips between database and client.
You should also check the v$session view to verify whether the connection really was closed. The cooperation between JDBC and Oracle is sometimes tricky.
For example, temporary LOBs created via Connection.createBlob() are treated differently by the database than other temporary LOBs. I think this is because the Oracle database cannot talk to the JVM's garbage collector and does not know when the Java instance was actually disposed, so these LOBs are kept in the database "forever".
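That prefetch behaviour is also configurable from JDBC. If I remember correctly, the thin driver exposes it as the connection property oracle.jdbc.defaultLobPrefetchSize (verify the name against your driver version). A minimal sketch of setting it, using plain Properties and no live connection:

```java
import java.util.Properties;

public class LobPrefetchConfig {

    // Build connection properties asking the Oracle driver to prefetch
    // the first N bytes of each LOB together with the row data.
    public static Properties withLobPrefetch(String user, String password, int bytes) {
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", password);
        // Property name as documented for the Oracle thin driver;
        // check it against your driver version.
        props.setProperty("oracle.jdbc.defaultLobPrefetchSize", Integer.toString(bytes));
        return props;
    }

    public static void main(String[] args) {
        Properties p = withLobPrefetch("scott", "tiger", 32768);
        System.out.println(p.getProperty("oracle.jdbc.defaultLobPrefetchSize")); // prints 32768
        // DriverManager.getConnection(url, p) would then use these settings.
    }
}
```

With a large enough prefetch size, small BLOBs arrive with the row itself, which would explain reads succeeding after the connection goes back to the pool.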
I am new to Java development, so apologies in advance if I am asking something stupid.
I am trying to retrieve an image and its thumbnail from a SQL database.
I get the data from the ResultSet as a binary stream and then convert it to byte[].
For the thumbnail this works fine, and for the original image I am also able to retrieve the binary stream using getBinaryStream, but when I convert it to byte[], the array stays empty for some reason.
binaryStream = rs.getBinaryStream("image");
thumbBinaryStream = rs.getBinaryStream("thumbnail");
if (binaryStream != null) {
    // Tested on following line and I get empty imageBytes
    byte[] imageBytes = IOUtils.toByteArray(binaryStream);
    thisRecord.put("image", DatatypeConverter.printBase64Binary(imageBytes)); // imageBytes is empty here
}
We probably need more information, especially about the datatypes of the columns, but maybe it helps to retrieve the stream from the BLOB, as in this example:
if (rs.getMetaData().getColumnType(column) == Types.BLOB) {
    in = rs.getBlob(column).getBinaryStream();
} else {
    in = rs.getBinaryStream(column);
}
To be sure: the Statement and ResultSet must be closed eventually, and getBinaryStream must only be used while the ResultSet is still open, like:
try (ResultSet rs = stmt.executeQuery()) {
    while (rs.next()) {
        InputStream binaryStream = rs.getBinaryStream("image");
        InputStream thumbBinaryStream = rs.getBinaryStream("thumbnail");
        if (binaryStream != null) {
            // Tested on following line and I get empty imageBytes
            byte[] imageBytes = IOUtils.toByteArray(binaryStream);
            thisRecord.put("image", DatatypeConverter.printBase64Binary(imageBytes));
            boolean mustGenerateThumbnail = thumbBinaryStream == null;
            if (mustGenerateThumbnail) {
                thumbBinaryStream = generateThumbnail(imageBytes);
            }
            byte[] thumbBytes = IOUtils.toByteArray(thumbBinaryStream);
            thisRecord.put("thumbnail", DatatypeConverter.printBase64Binary(thumbBytes));
Here we are at the error. At this point thumbBinaryStream has been read to the end, so do:
            if (mustGenerateThumbnail) {
                ByteArrayInputStream baIn = new ByteArrayInputStream(thumbBytes);
                saveThumbnailForRecordWithId(baIn, floor_drawing_id);
            }
        }
    }
}
(Here I used try-with-resources to close the ResultSet automatically, even when an exception is thrown.)
Furthermore, there is a more general class for Base64, should you need it in the future. Instead of
DatatypeConverter.printBase64Binary(thumbBytes)
you can use
Base64.getEncoder().encodeToString(thumbBytes)
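For reference, the java.util.Base64 API (available since Java 8) round-trips like this:

```java
import java.util.Base64;

public class Base64Demo {
    public static void main(String[] args) {
        byte[] thumbBytes = {72, 105}; // the bytes of "Hi"
        // Encode for embedding in JSON/HTML...
        String encoded = Base64.getEncoder().encodeToString(thumbBytes);
        System.out.println(encoded); // prints SGk=
        // ...and decode back to the original bytes when needed.
        byte[] decoded = Base64.getDecoder().decode(encoded);
        System.out.println(decoded.length); // prints 2
    }
}
```

Unlike DatatypeConverter (which lives in the JAXB package and was removed from the default classpath in Java 11), java.util.Base64 is part of the core library.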
While sending an ARRAY to the stored proc, we are getting Java-level deadlocks. I am attaching the thread dump.
Found one Java-level deadlock:
=============================
"http-bio-8080-exec-11":
waiting to lock monitor 0x00000000406fb2d8 (object 0x00000000fea1b130, a oracle.jdbc.driver.T4CConnection),
which is held by "http-bio-8080-exec-4"
"http-bio-8080-exec-4":
waiting to lock monitor 0x00000000407d6038 (object 0x00000000fe78b680, a oracle.jdbc.driver.T4CConnection),
which is held by "http-bio-8080-exec-11"
Java stack information for the threads listed above:
===================================================
"http-bio-8080-exec-11":
at oracle.sql.TypeDescriptor.getName(TypeDescriptor.java:682)
- waiting to lock <0x00000000fea1b130> (a oracle.jdbc.driver.T4CConnection)
at oracle.jdbc.oracore.OracleTypeCOLLECTION.isInHierarchyOf(OracleTypeCOLLECTION.java:149)
at oracle.jdbc.driver.OraclePreparedStatement.processCompletedBindRow(OraclePreparedStatement.java:2063)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3579)
at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3685)
- locked <0x00000000fe78b680> (a oracle.jdbc.driver.T4CConnection)
at oracle.jdbc.driver.OracleCallableStatement.execute(OracleCallableStatement.java:4714)
- locked <0x00000000fe78b680> (a oracle.jdbc.driver.T4CConnection)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1376)
at org.springframework.jdbc.core.JdbcTemplate$6.doInCallableStatement(JdbcTemplate.java:1066)
at org.springframework.jdbc.core.JdbcTemplate$6.doInCallableStatement(JdbcTemplate.java:1)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:1014)
at org.springframework.jdbc.core.JdbcTemplate.call(JdbcTemplate.java:1064)
at org.springframework.jdbc.object.StoredProcedure.execute(StoredProcedure.java:144)
How can we avoid this kind of deadlock?
Code:
Class extends org.springframework.jdbc.object.StoredProcedure
Map result;
Map hashMap = new HashMap();
hashMap.put(SOME_IDS_PARAM, getJdbcTemplate().execute(new ConnectionCallback() {
    @Override
    public Object doInConnection(Connection con)
            throws SQLException, DataAccessException {
        Connection connection = new SimpleNativeJdbcExtractor().getNativeConnection(con);
        ArrayDescriptor descriptor = ArrayDescriptor.createDescriptor(schema + ".ARRAY_OF_NUMBER", connection);
        return new oracle.sql.ARRAY(descriptor, connection, someIds);
    }
}));
result = super.execute(hashMap);
I also tried this approach:
OracleConnection connection = null;
DataSource datasource = null;
Map result;
try {
    datasource = getJdbcTemplate().getDataSource();
    connection = (OracleConnection) DataSourceUtils.getConnection(datasource);
    synchronized (connection) {
        Map hashMap = new HashMap();
        hashMap.put(SOME_IDS_PARAM, getArrayOfNumberValue(someIds, schema, connection));
        result = super.execute(hashMap);
    }
} finally {
    if (null != connection) {
        DataSourceUtils.releaseConnection(connection, datasource);
    }
}
Array:
public ARRAY getArrayOfNumberValue(Integer[] array, String schema, OracleConnection connection) throws DataAccessResourceFailureException {
    String arrayOfNumberTypeName = schema + ARRAY_OF_NUMBER;
    ARRAY oracleArray = null;
    ArrayDescriptor descriptor = null;
    try {
        descriptor = (ArrayDescriptor) connection.getDescriptor(arrayOfNumberTypeName);
        if (null == descriptor) {
            descriptor = new ArrayDescriptor(arrayOfNumberTypeName, connection);
            connection.putDescriptor(arrayOfNumberTypeName, descriptor);
        }
        oracleArray = new ARRAY(descriptor, connection, array);
    } catch (SQLException ex) {
        throw new DataAccessResourceFailureException("SQLException encountered while attempting to retrieve Oracle ARRAY", ex);
    }
    return oracleArray;
}
I suspect that when I check out the connection via connection = (OracleConnection) DataSourceUtils.getConnection(datasource), it gives me a logical connection backed by a T4CConnection, but then releases it, and the stored procedure later runs on a different underlying connection.
java.lang.Thread.State: BLOCKED (on object monitor)
at oracle.sql.TypeDescriptor.getName(TypeDescriptor.java:682)
- waiting to lock <0x00000000c1356fc8> (a oracle.jdbc.driver.T4CConnection)
at oracle.jdbc.oracore.OracleTypeCOLLECTION.isInHierarchyOf(OracleTypeCOLLECTION.java:149)
at oracle.jdbc.driver.OraclePreparedStatement.processCompletedBindRow(OraclePreparedStatement.java:2063)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3579)
at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3685)
- locked <0x00000000c14b34f0> (a oracle.jdbc.driver.T4CConnection)
at oracle.jdbc.driver.OracleCallableStatement.execute(OracleCallableStatement.java:4714)
- locked <0x00000000c14b34f0> (a oracle.jdbc.driver.T4CConnection)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1376)
at org.springframework.jdbc.core.JdbcTemplate$6.doInCallableStatement(JdbcTemplate.java:1066)
at org.springframework.jdbc.core.JdbcTemplate$6.doInCallableStatement(JdbcTemplate.java:1)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:1014)
at org.springframework.jdbc.core.JdbcTemplate.call(JdbcTemplate.java:1064)
at org.springframework.jdbc.object.StoredProcedure.execute(StoredProcedure.java:144)
at com.intuit.platform.integration.sdx.da.procedures.subscription.serviceSubscription.LookupRealmSubscriptions.execute(LookupRealmSubscriptions.java:55)
- locked <0x00000000fbd00bc0> (a oracle.jdbc.driver.LogicalConnection)
at com.intuit.platform.integration.sdx.da.ServiceSubscriptionDAOImpl.getRealmServiceSubscriptions(ServiceSubscriptionDAOImpl.java:153)
at com.intuit.platform.integration.sdx.ws.beans.ServiceSubscriptionResourceBean.filterRealmIds(ServiceSubscriptionResourceBean.java:84)
The connection used to create the ARRAY is not the same as the connection on which the stored procedure is executed. You can see this because the T4CConnection that is waiting for a lock (line 3 of the stack trace) has a different ID from the one locked further down.
Use the answer in "How to get current Connection object in Spring JDBC" to get your current Connection, then downcast it to an Oracle connection using https://stackoverflow.com/a/7879073/1395668. You should then be able to create the ARRAY on your current connection, and you shouldn't get the deadlock.
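More generally, the dump shows the textbook deadlock shape: each thread holds one T4CConnection monitor while waiting for the other thread's. Using one and the same connection removes the second monitor entirely; where two locks are genuinely needed, acquiring them in a single global order also prevents the cycle. A minimal, JDBC-free sketch of that rule (all names here are illustrative):

```java
public class LockOrdering {
    private static final Object lockA = new Object();
    private static final Object lockB = new Object();
    private static int counter = 0;

    // Every thread acquires the monitors in the same global order (A, then B),
    // so no thread can hold one lock while waiting for the other held by a peer.
    static void doWork() {
        synchronized (lockA) {
            synchronized (lockB) {
                counter++;
            }
        }
    }

    public static int runTwoThreads() throws InterruptedException {
        counter = 0;
        Thread t1 = new Thread(LockOrdering::doWork);
        Thread t2 = new Thread(LockOrdering::doWork);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        return counter;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runTwoThreads()); // prints 2
    }
}
```

In the Spring case you cannot control the order in which the driver takes its internal monitors, which is why collapsing the two connections into one is the practical fix.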
I have an application that uses PostgreSQL, JSP and the Struts framework.
I want to insert a file into a PostgreSQL table using the OID type, so that it is stored as a large object in the database.
My table definition is this one:
CREATE TABLE mensaje
(
id serial NOT NULL,
file oid,
CONSTRAINT pk_mensaje PRIMARY KEY (id)
)
WITH (
OIDS=TRUE
);
ALTER TABLE mensaje
OWNER TO postgres;
Does anybody know an example of how the Action, the ActionForm and the .jsp should look?
If not, is there any other example that explains how to do it without using the OID type?
Solving this is a two-step process:
File upload using Struts 2
PostgreSQL Java tutorial; check the "Writing images" section.
Additional note: once the file has been received in your Action, you should use the byte array data to save it in your OID field.
From your comment, this should be the way in Struts 1.x
In the JSP
<html:form action="fileUploadAction" method="post" enctype="multipart/form-data">
    File : <html:file property="upload" />
    <br />
    <html:submit />
</html:form>
In your action class
YourForm uploadForm = (YourForm) form;
FormFile file = null;
try {
    file = uploadForm.getFile();
    //FormFile#getFileData() returns the byte array containing the file data
    //You can use it to save the file in your database and other things you want/need
    int id = 9001; //assuming this is a valid id in the mensaje table
    MensajeService mensajeService = new MensajeService();
    mensajeService.saveFile(id, file.getFileData());
} catch (Exception e) {
    //log the errors for maintenance purposes (bugs, fixes, etc)
}
The MensajeService class will connect to your PostgreSQL database and save the file:
public class MensajeService {

    public MensajeService() {
    }

    public void saveFile(int id, byte[] fileData) throws SQLException {
        //this is a very simple skeleton, you have to adapt it to
        //your needs, the way you're connecting to the database, etc...
        Connection con = null;
        PreparedStatement pstmt = null;
        try {
            con = ... //get the connection to your PostgreSQL db
            //Initialize a new transaction
            con.setAutoCommit(false);
            // Get the Large Object Manager to perform operations with
            LargeObjectManager lobj = ((org.postgresql.PGConnection) con)
                .getLargeObjectAPI();
            // Create a new large object
            int oid = lobj.create(LargeObjectManager.READ | LargeObjectManager.WRITE);
            // Open the large object for writing
            LargeObject obj = lobj.open(oid, LargeObjectManager.WRITE);
            //in the provided example, the code shows a way to get the byte array
            //from a file (using the File and FileInputStream classes);
            //you don't need all that because you already have the byte array (good!),
            //so you only write the binary data into your large object
            obj.write(fileData);
            obj.close();
            //create the SQL statement to insert the OID
            String sql = "INSERT INTO mensaje VALUES (?, ?)";
            pstmt = con.prepareStatement(sql);
            pstmt.setInt(1, id);
            pstmt.setInt(2, oid);
            //save the file
            pstmt.executeUpdate();
            //close the transaction successfully
            con.commit();
        } catch (SQLException e) {
            //error in the transaction, start a rollback
            if (con != null) {
                con.rollback();
            }
            throw e;
        } finally {
            //don't forget to free the resources after using them
            if (pstmt != null) {
                pstmt.close();
            }
            if (con != null) {
                con.close();
            }
        }
    }
}
Struts 1 code adapted from: Uploading a file in Struts 1.
PostgreSQL code adapted from here.
We have a web application where users can download a list of data by clicking a link. That fires a stored proc in an MS SQL Server DB, which fetches rows with 14 columns, all Strings. Once we get that and extract it from the result set, we stream it directly down to a CSV file on the client's machine. That way we avoid creating intermediate domain objects (one per returned row) in memory before starting the streaming operation, and we do not wait until the whole result set has been loaded into memory.
However, for clients with e.g. 80,000 instances of such data, it still spikes memory usage by 50 MB; then there is a fall of around 20 MB, and it remains at that level for quite some time. If I do a Perform GC in jconsole, it frees the remaining 30 MB as well. I am not sure what causes that to linger. Also, the 50 MB spike is unacceptable for an application running on 1.2 GB of memory; for bigger clients it shoots up by 400 MB, and the application freezes or an OOM happens.
Any suggestion how we can fix this?
PLEASE note: I have implemented the same thing in another place, and there it downloads a file of the same size but different data (6 columns) in 5 seconds, with a memory spike of only 5 MB. In that case the stored proc runs in only 4 seconds in SQL Server Management Studio. The query for which I am getting the huge spike takes 45 seconds or more to run, as it passes the data through a lot of validation. Can that have an adverse effect? I was hoping not, as we fetch in chunks of 1000 via setFetchSize() on the PreparedStatement.
Here is the snippet of code:
Connection connection = null;
PreparedStatement statement = null;
ResultSet rs = null;
OutputStream outputStream = null;
BufferedWriter bufferedWriter = null;
try {
    response.setContentType("application/save");
    response.setHeader("Content-Disposition", "attachment; filename=" + link.getFileName());
    outputStream = response.getOutputStream();
    bufferedWriter = new BufferedWriter(new OutputStreamWriter(outputStream));
    connection = dataSource.getConnection();
    statement = connection.prepareStatement(link.getQuery());
    statement.setFetchSize(1000);
    statement.setInt(1, form.getSelectedClientId());
    rs = statement.executeQuery();
    while (rs.next()) {
        bufferedWriter.write(getCsvRowString(new String[] {
                rs.getString(1), rs.getString(2), rs.getString(3),
                rs.getString(4), rs.getString(5), rs.getString(6),
                rs.getString(7), rs.getString(8), rs.getString(9),
                rs.getString(10), rs.getString(11), rs.getString(12),
                rs.getString(13), rs.getString(14), rs.getString(15),
                rs.getString(16), rs.getString(17), rs.getString(18) }));
    }
} catch (final Exception e) {
    log.error("Error in downloading extracts " + e.getMessage());
    throw e;
} finally {
    if (bufferedWriter != null) {
        bufferedWriter.flush();
        bufferedWriter.close();
    }
    if (outputStream != null) {
        outputStream.close();
    }
    if (rs != null) {
        rs.close();
    }
    if (statement != null) {
        statement.close();
    }
    if (connection != null) {
        connection.close();
    }
}
The ancient MS SQL JDBC driver is probably ignoring the setFetchSize() hint (see this answer: https://stackoverflow.com/a/1982109/116509). Try using the jTDS driver instead.
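With jTDS, you also want the driver to use server-side cursors so that setFetchSize() actually limits how many rows are buffered at a time. If I remember correctly this is the useCursors connection property from the jTDS FAQ (verify against your jTDS version); a sketch of building such a URL, with hypothetical host/database names:

```java
public class JtdsUrl {

    // Build a jTDS connection URL. useCursors=true asks the driver to use
    // server-side cursors, so rows are fetched in batches instead of the
    // whole result set being buffered in memory.
    // (Property name per the jTDS FAQ; verify against your jTDS version.)
    public static String url(String host, int port, String database) {
        return "jdbc:jtds:sqlserver://" + host + ":" + port + "/" + database
                + ";useCursors=true";
    }

    public static void main(String[] args) {
        System.out.println(url("dbhost", 1433, "reports"));
        // prints jdbc:jtds:sqlserver://dbhost:1433/reports;useCursors=true
    }
}
```

The driver class to register would be net.sourceforge.jtds.jdbc.Driver.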
Please suggest a piece of code for deleting a row from a MySQL database table that contains three columns: problemid, problem and solution.
I want to delete it from a browser, i.e. it is a web application.
You may consider using the JDBC (Java Database Connectivity) API for your problem. I recommend you take a close look at the following simple tutorials about developing Java web applications using a MySQL database:
https://blogs.oracle.com/JavaFundamentals/entry/creating_a_simple_web_application
http://www.javaguicodexample.com/javawebmysqljspjstljsf5.html
Here is a sample servlet.
But please remember this is just to show you how to do it; you SHOULD NOT use this in a production system! It is for demonstration only: look at it and learn how to do it.
This servlet should run fine, but there are some things you have to do.
You should also read these documents, if you haven't already:
http://docs.oracle.com/javaee/5/tutorial/doc/bnadp.html
http://docs.oracle.com/javase/tutorial/jdbc/basics/index.html
Things I have not taken into account while writing this:
Request parameters
If one of the request parameters cannot be found, the servlet will throw an exception. You need a better way of handling this situation.
Connection pooling
This example opens a connection to the database on EVERY request. Opening a connection costs time, which is why everyone uses a connection pool: the library/server opens a specified number of connections to the database, and every time you need database access you fetch a connection from the pool and return it when you're finished.
Security
Someone who knows the address of this servlet could easily use it to delete any row in your table. It is your job to secure it.
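The pooling idea from the note above can be sketched in a few lines. This is a toy illustration only, not a real JDBC pool; in practice use your container's DataSource or a library such as HikariCP:

```java
import java.util.Arrays;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ToyPool<T> {
    private final BlockingQueue<T> idle;

    // Pre-open a fixed set of resources, like a connection pool
    // opening N database connections at startup.
    public ToyPool(Iterable<T> resources, int capacity) {
        idle = new ArrayBlockingQueue<>(capacity);
        for (T r : resources) {
            idle.add(r);
        }
    }

    // Borrow a resource; blocks until one is free.
    public T borrow() throws InterruptedException {
        return idle.take();
    }

    // Hand the resource back so other requests can reuse it.
    public void release(T resource) {
        idle.add(resource);
    }

    public static void main(String[] args) throws InterruptedException {
        ToyPool<String> pool = new ToyPool<>(Arrays.asList("conn-1", "conn-2"), 2);
        String c = pool.borrow(); // take a "connection" from the pool
        System.out.println(c);    // prints conn-1
        pool.release(c);          // return it instead of "closing" it
    }
}
```

Now the sample servlet itself: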
import java.io.*;
import javax.servlet.*;
import javax.servlet.http.*;
import java.sql.*;

/**
 * DON'T USE IN PRODUCTION, JUST FOR LEARNING PURPOSES
 **/
public class MySqlServlet extends HttpServlet {

    public void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        long problemId;
        long problem;
        long solution;
        Object problemIdAsObject = request.getParameter("problemId");
        Object problemAsObject = request.getParameter("problem");
        Object solutionAsObject = request.getParameter("solution");
        if (problemIdAsObject == null) {
            throw new ServletException("problemId has not been specified!");
        }
        if (problemAsObject == null) {
            throw new ServletException("problem has not been specified!");
        }
        if (solutionAsObject == null) {
            throw new ServletException("solution has not been specified!");
        }
        problemId = Long.valueOf((String) problemIdAsObject);
        problem = Long.valueOf((String) problemAsObject);
        solution = Long.valueOf((String) solutionAsObject);
        PreparedStatement statement = null;
        Connection connectionToDatabase = null;
        try {
            connectionToDatabase = getConnection();
            String sql = "DELETE FROM table WHERE problemid = ? and " +
                    "problem = ? and solution = ?";
            statement = connectionToDatabase.prepareStatement(sql);
            statement.setLong(1, problemId);
            statement.setLong(2, problem);
            statement.setLong(3, solution);
            statement.execute();
        } catch (SQLException sqle) {
            throw new ServletException(sqle);
        } catch (ClassNotFoundException cnfe) {
            throw new ServletException(cnfe);
        } finally {
            try {
                if (statement != null) {
                    statement.close();
                }
                if (connectionToDatabase != null) {
                    connectionToDatabase.close();
                }
            } catch (SQLException sqle) {
                throw new ServletException(sqle);
            }
        }
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("<HTML>");
        out.println("<BODY>");
        out.println("OK");
        out.println("</BODY></HTML>");
    }

    private Connection getConnection()
            throws ClassNotFoundException, SQLException {
        String userName = "user";
        String password = "password";
        String databaseName = "database";
        String serverAddress = "localhost";
        String connectionString = "jdbc:mysql://" + serverAddress + "/" + databaseName +
                "?user=" + userName + "&password=" + password;
        //If this line is not working, use this instead:
        //Class.forName("com.mysql.jdbc.Driver").newInstance();
        Class.forName("com.mysql.jdbc.Driver");
        Connection connection = DriverManager.getConnection(connectionString);
        return connection;
    }
}
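Assuming the servlet is mapped to /MySqlServlet in web.xml (both the mapping and the application path here are hypothetical), a delete request from the browser would look like:

```
http://localhost:8080/yourapp/MySqlServlet?problemId=1&problem=2&solution=3
```

Note that all three parameters must match the row for the DELETE to remove it.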
It's quite simple: first, make a link from the JSP to a servlet, passing the id of the record to delete as a parameter. Inside the servlet, write the code that deletes the given row from the database, and then return to the page containing the link you clicked.