The problem is: the user clicks a button in the JSP, which should export the displayed data. So what I am doing is creating a temporary file, writing the contents to it [ resultSet >> xml >> csv ], and then writing the contents to the ServletResponse. After closing the response output stream I try to delete the file, but it always returns false.
Code:
public static void writeFileContentToResponse(HttpServletResponse response, String fileName) throws IOException {
    ServletOutputStream responseoutputStream = response.getOutputStream();
    File file = new File(fileName);
    if (file.exists()) {
        file.deleteOnExit();
        DataInputStream dis = new DataInputStream(new FileInputStream(file));
        response.setContentType("text/csv");
        int size = (int) file.length();
        response.setContentLength(size);
        response.setHeader("Content-Disposition", "attachment; filename=\"" + file.getName() + "\"");
        response.setHeader("Pragma", "public");
        response.setHeader("Cache-control", "must-revalidate");
        if (size > Integer.MAX_VALUE) {
        }
        byte[] bytes = new byte[size];
        dis.read(bytes);
        FileCopyUtils.copy(bytes, responseoutputStream);
    }
    responseoutputStream.flush();
    responseoutputStream.close();
    file.delete();
}
I have used file.deleteOnExit() and file.delete(), but neither of them is working.
file.deleteOnExit() isn't going to produce the result you want here - its purpose is to delete the file when the JVM exits. If this is called from a servlet, that means the file is only deleted when the server shuts down.
As for why file.delete() isn't working - all I see in this code is reading from the file and writing to the servlet's output stream. Is it possible that when you wrote the data to the file you left the stream you used for writing open? Files won't be deleted while they're still in use.
Also, even though your method throws IOException, you still need to clean up if there's an exception while accessing the file - put the file operations in a try block, and put the stream.close() calls in a finally block.
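For example, the method from the question could be restructured roughly like this (a sketch only, keeping the temp-file approach; a plain copy loop stands in for FileCopyUtils so the example is self-contained):

public static void writeFileContentToResponse(HttpServletResponse response, String fileName) throws IOException {
    File file = new File(fileName);
    if (!file.exists()) {
        return;
    }
    response.setContentType("text/csv");
    response.setContentLength((int) file.length());
    response.setHeader("Content-Disposition", "attachment; filename=\"" + file.getName() + "\"");
    ServletOutputStream out = response.getOutputStream();
    FileInputStream in = new FileInputStream(file);
    try {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        out.flush();
    } finally {
        in.close();                 // close the reader first, otherwise delete() keeps returning false
        if (!file.delete()) {
            // log it - the file is apparently still locked by another open stream
        }
    }
}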
Don't create that file.
Write your data directly from your ResultSet to your CSV responseoutputStream.
That saves time, memory, disk space and headaches.
If you really need the file, try the File.createTempFile() method.
Files created that way are deleted when your VM stops normally, if they haven't been deleted before.
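A minimal sketch of that approach (the prefix/suffix are just examples):

// Create a uniquely named temp file; the JVM cleans it up on normal exit as a fallback.
File tempFile = File.createTempFile("export-", ".csv");
tempFile.deleteOnExit();
// ... write the CSV to tempFile, stream it to the response, close all streams, then:
if (!tempFile.delete()) {
    // log it - a stream on the file is probably still open
}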
I'm assuming you have some sort of concurrency issue going on here. Consider making this method non-static, and use a unique name for your temp file (for example, append the current time, or use a GUID for the filename). Chances are that you open the file, then someone else opens it, so the first delete fails.
As I see it, you are not closing the DataInputStream dis - that is why delete() returns false when you try to delete the file. Also, you should handle the streams in a try-catch-finally block and close them in finally. The code is a bit rough, but it is safe:
DataInputStream dis = null;
try
{
    dis = new DataInputStream(new FileInputStream(file));
    ... // your other code
}
catch (FileNotFoundException P_ex)
{
    // catch only Exceptions you want, react to them
}
finally
{
    if (dis != null)
    {
        try
        {
            dis.close();
        }
        catch (IOException P_ex)
        {
            // handle exception, again react only to exceptions that must be reacted on
        }
    }
}
How are you creating the file? You probably need to use File.createTempFile().
You should be able to delete a temporary file just fine (no need for deleteOnExit). Are you sure the file isn't still in use when you try to delete it? You should have one file per user request (which is another reason to avoid temp files and keep everything in memory).
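A sketch of the keep-it-in-memory variant (col1/col2 are placeholder column names, resultSet and response come from the surrounding servlet code, and it assumes the whole export fits comfortably in memory):

// Build the CSV in memory and stream it straight to the response - no temp file to delete.
StringBuilder csv = new StringBuilder();
while (resultSet.next()) {
    csv.append(resultSet.getString("col1")).append(',')
       .append(resultSet.getString("col2")).append("\r\n");
}
byte[] bytes = csv.toString().getBytes(java.nio.charset.StandardCharsets.UTF_8);
response.setContentType("text/csv");
response.setContentLength(bytes.length);
response.getOutputStream().write(bytes);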
You can try a piped input and piped output stream. Those pipes need two threads: one to feed the pipe (the exporter) and another (the servlet) to consume data from the pipe and write it to the response output stream.
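Roughly like this (a sketch only; writeCsvTo() is a hypothetical method that writes the CSV to any OutputStream, and response comes from the servlet):

// A background thread feeds the pipe; the servlet thread drains it into the response.
final PipedOutputStream pipeOut = new PipedOutputStream();
PipedInputStream pipeIn = new PipedInputStream(pipeOut);

new Thread(new Runnable() {
    public void run() {
        try {
            writeCsvTo(pipeOut);          // hypothetical exporter
        } catch (IOException e) {
            // the reading side will see the broken pipe
        } finally {
            try { pipeOut.close(); } catch (IOException ignored) {}
        }
    }
}).start();

byte[] buffer = new byte[8192];
int read;
while ((read = pipeIn.read(buffer)) != -1) {
    response.getOutputStream().write(buffer, 0, read);
}
pipeIn.close();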
You really don't want to create a temporary file for a request. Keep the resulting CSV in memory if at all possible.
You may need to tie the generation of the data directly to the output: parse a row of the result set, write it out to the response stream, parse the next row, and so on. That way you only keep one row in memory at a time. The problem there is that the response could time out.
If you want a shortcut method, take a look at Display tag library. It makes it very easy to show a bunch of results in a table and then add pre-built export options to said table. CSV is one of those options.
You don't need a temporary file. The byte buffer which you're creating there based on the file size may also cause OutOfMemoryError. It's all plain inefficient.
Just write the data of the ResultSet immediately to the HTTP response while iterating over the rows. Basically: writer.write(resultSet.getString("columnname")). This way you don't need to write it to a temporary file or to gobble everything in Java's memory.
Further, most JDBC drivers will by default cache everything in Java's memory before handing anything to ResultSet#next(). This is also inefficient. You'd like it to hand over the data row by row, which you arrange by setting Statement#setFetchSize(). How to do that properly depends on the JDBC driver used. In the case of, for example, MySQL, you can read up on it in its JDBC driver documentation.
Here's a kickoff example, assuming that you're using MySQL:
protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/csv");
    response.setCharacterEncoding("UTF-8");

    Connection connection = null;
    Statement statement = null;
    ResultSet resultSet = null;
    PrintWriter writer = response.getWriter();

    try {
        connection = database.getConnection();
        statement = connection.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        statement.setFetchSize(Integer.MIN_VALUE);
        resultSet = statement.executeQuery("SELECT col1, col2, col3 FROM tbl");
        while (resultSet.next()) {
            writer.append(resultSet.getString("col1")).append(',');
            writer.append(resultSet.getString("col2")).append(',');
            writer.append(resultSet.getString("col3")).println();
            // Note: don't forget to escape quotes/commas as per RFC 4180.
        }
    } catch (SQLException e) {
        throw new ServletException("Retrieving CSV rows from DB failed", e);
    } finally {
        if (resultSet != null) try { resultSet.close(); } catch (SQLException logOrIgnore) {}
        if (statement != null) try { statement.close(); } catch (SQLException logOrIgnore) {}
        if (connection != null) try { connection.close(); } catch (SQLException logOrIgnore) {}
    }
}
That's it. This way effectively only one database row is kept in memory at a time.
I'm basically attempting to send video data and trying to understand how this whole process works; I'm not sure whether I've put this together properly. Any help would be greatly appreciated.
public void OutputStream(BufferedOutputStream out) throws MalformedURLException {
    URL url = new URL("http://www.android.com//");
    HttpURLConnection urlConnection = null;
    try {
        urlConnection = (HttpURLConnection) url.openConnection();
        urlConnection.setDoOutput(true);
        urlConnection.setChunkedStreamingMode(0);
        out = new BufferedOutputStream(urlConnection.getOutputStream());
        out = new BufferedOutputStream(new FileOutputStream(String.valueOf(mVideoUri)), 8 * 1024);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        assert urlConnection != null;
        urlConnection.disconnect();
    }
}
You aren't technically using the output stream at all in this case, merely reassigning it several times. The input parameter out is reassigned within the method but never used before that, which doesn't seem like what you want to do at all: whatever output stream reference was passed to this method is simply discarded.
You reassign out once more and discard the buffered stream of the socket connection on this line:
new BufferedOutputStream(urlConnection.getOutputStream());
which is mostly harmless in that there isn't any resource leak (given that disconnect() is called), but once again doesn't seem like what you want to do.
Your code also leaks a resource on the last out, given that it is not closed anywhere within the try-catch-finally block, which is a serious flaw. Additionally, the assertion used to null-check urlConnection should be promoted to an if statement to handle the very real possibility that it is null after a failed URL resolution/open. Assertion checks can be turned off, in which case you'd get an NPE (and when they're turned on you'd get an AssertionError, neither of which is better).
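Applied to the method above, it could look roughly like this (a sketch only; the method name and the write step are placeholders, and only the connection's stream is kept):

public void upload() {
    HttpURLConnection urlConnection = null;
    BufferedOutputStream out = null;
    try {
        URL url = new URL("http://www.android.com//");
        urlConnection = (HttpURLConnection) url.openConnection();
        urlConnection.setDoOutput(true);
        urlConnection.setChunkedStreamingMode(0);
        out = new BufferedOutputStream(urlConnection.getOutputStream());
        // ... write the video bytes to out here ...
        out.flush();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (out != null) {
            try { out.close(); } catch (IOException ignored) {}
        }
        if (urlConnection != null) {
            urlConnection.disconnect();
        }
    }
}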
Whilst it's hard to anticipate exactly what your project structure is, the general contract of output stream usage can be seen as follows:
public void foo() {
    OutputStream out = null;
    byte[] data = ... // Populated from some data source
    try {
        out = ... // Populated from some source
        out.write(data); // Writes the data to the output destination
    } catch (IOException ex) {
        // Handle exception here
    } finally {
        // Only attempt to close the output stream if it was actually opened successfully
        if (out != null) {
            try {
                out.close();
            } catch (IOException closeEx) {
                // Handle, propagate upwards or log it
            }
        }
    }
}
The output stream is used within the try block such that any exceptions will result in the finally block closing the stream as appropriate, removing the resource leakage. Note the sample write() method in the try block, illustrating in the most basic form how OutputStreams can be used to put data into some destination.
Under Java 7 (and above), the above example is more compact:
public void foo() {
    byte[] data = ... // Populated from some data source
    try (OutputStream out = ...) {
        out.write(data); // Writes the data to the output destination
    } catch (IOException ex) {
        // Handle exception here
    }
}
Utilizing try-with-resources, resource safety is assured thanks to the AutoCloseable interface and Java 7's (and above's) new syntax. There is one small difference: exceptions thrown while closing the stream end up in the same catch block (or are attached to the primary exception as suppressed exceptions) instead of being handled separately as in the first example.
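If you ever need to look at those close-time failures, they can be read back via Throwable#getSuppressed(); a small sketch (the file name is just an example):

try (OutputStream out = new BufferedOutputStream(new FileOutputStream("data.bin"))) {
    out.write(data);
} catch (IOException ex) {
    for (Throwable suppressed : ex.getSuppressed()) {
        System.err.println("Also failed while closing: " + suppressed);
    }
    // Handle the primary exception here
}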
Here is the scenario: I upload a file, and after it is uploaded I try to access that file in the new directory (which I just wrote it to), but I receive this error message:
There was an error opening this document. The file is already open or
in use by another application.
Below is my code.
try {
    conn = this.getConnection();
    String getIP = "SELECT IP FROM TABLE WHERE ID='3'";
    ps = conn.prepareStatement(getIP);
    rs = ps.executeQuery();

    Part file = request.getPart("upload");
    String fileName = extractFileName(request.getPart("upload"));
    String basePath = "//" + ipAdd + "/ns/" + fileName;
    File outputFilePath = new File(basePath + fileName);

    inputStream = file.getInputStream();
    outputStream = new FileOutputStream(outputFilePath);

    int read = 0;
    final byte[] bytes = new byte[1024];
    while ((read = inputStream.read(bytes)) != -1) {
        outputStream.write(bytes, 0, read);
    }
} catch (Exception ex) {
    ex.printStackTrace();
    throw ex;
} finally {
    if (!conn.isClosed()) conn.close();
    if (!ps.isClosed()) ps.close();
    if (!rs.isClosed()) rs.close();
    inputStream.close();
    outputStream.close();
}
Is it because I open the file too quickly after starting the upload? I do realize that after 1/2 minutes I'm able to access the file. Is there any way to solve this?
You're not closing the file. Add
outputStream.close();
after the loop.
EDIT: And do it first, before closing anything else. You should really use try-with-resources here; if you get an exception closing anything, the other closes won't happen.
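A sketch with try-with-resources (reusing the Part and outputFilePath variables from the question):

try (InputStream in = file.getInputStream();
     OutputStream out = new FileOutputStream(outputFilePath)) {
    byte[] buffer = new byte[1024];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
} // both streams are closed here even if a write fails, so the file isn't left locked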
In your code above, if an exception occurs whilst closing the JDBC Connection, then none of the other JDBC objects or Streams are closed. The finally block exits at that point.
Since Java 7, closing Streams and JDBC objects (Connections, Statements, ResultSets etc.) can be done nicely and easily within a proper exception-handling framework, since they all implement the common interface AutoCloseable.
So you can write a single close() method and handle the exception inside:
public void close(AutoCloseable closeable) {
    try {
        closeable.close();
    } catch (Exception e) {
        // Just log the exception. There's not much else you can do, and it probably
        // doesn't matter. Don't re-throw!
    }
}
So when closing your JDBC objects, you can do this in the finally block:
close(conn);
close(ps);
close(rs);
close(inputStream);
close(outputStream);
Now if an exception occurs whilst closing any of the objects, it is handled and the following objects are still closed.
I have the following function:
static void write()
{
    try {
        File file = new File("flip.out");
        BufferedWriter out = new BufferedWriter(new FileWriter(file));
        out.write(sMax);
        System.out.println(sMax); // This works
        out.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
The problem is that my program doesn't write anything in my file.
A few things to rectify -
Why create two different references to the file?
File file = new File ("flip.out");
BufferedWriter out = new BufferedWriter(new FileWriter("flip.out"));
All you need to do is
File file = new File("flip.out");
BufferedWriter out = new BufferedWriter(new FileWriter(file));
Next, put your close() call in a finally block rather than the try block. Why? Because if an IOException occurs, the resource will not be closed, and if the resource does not get closed your changes may not be reflected in the file.
Next, it is good programming practice not to catch runtime exceptions. So do not use Exception as a polymorphic type to catch your exception; catch whatever is actually thrown, such as IOException in your case.
Now, there might be various reasons why nothing is being written to the file. As you are not getting an exception, one reason might be that your static function is not being called, or that the string/object sMax (whatever that is) is empty.
Also, the file (if not already present) will be created in the current directory. So if there are multiple places in your code where you create files with the same name, make sure you are checking the right one.
You have to flush the stream in order for what's in memory to get written to the drive. What you wrote to the BufferedWriter is sitting in a buffer, waiting for the rest of the buffer to be filled up before actually being written to disk. This helps with performance, but it means you have to flush the stream in case you don't fill up that buffer. Here is how you do that:
static void write() throws IOException {
    BufferedWriter out = new BufferedWriter(new FileWriter("flip.out"));
    try {
        out.write(sMax);
        out.flush();
    } catch (Exception e) {
        // Could probably ditch this and just let the exception
        // bubble up and handle it higher up.
        e.printStackTrace();
    } finally {
        out.close();
    }
}
So if it makes it to the flush() we know we wrote everything we wanted to the stream. If we ever get an exception, we still make sure we close the stream, regardless of success or failure. And the stream is constructed outside the try statement because the only exception Writers/OutputStreams throw during construction is one saying the file couldn't be opened (such as FileNotFoundException), which means the file was never opened in the first place, so we don't have to close it.
Can you call out.flush() before closing? That will make sure any content in the buffer is written to the file immediately.
I am creating a file on a network drive and then adding data to it. From time to time, writing to that file fails. Is there a good way of checking that the file is accessible before every time I save data to it, or maybe a way of checking afterwards whether the data was saved?
EDIT:
Right now I am using a try-catch block with a PrintStream in my code:
try
{
    logfile = new File(new File(isic_log), "log_" + production);
    // nasty workaround - we'll have a file the moment we assign an output stream to it
    if (!logfile.exists())
    {
        prodrow = production;
    }
    out = new FileOutputStream(logfile.getPath(), logfile.exists());
    p = new PrintStream(out);
    if (prodrow != "")
    {
        p.println(prodrow);
    }
    p.println(chip_code + ":" + isic_number);
    p.close();
}
catch (Exception e)
{
    logger.info("Got exception while writing to isic production log: " + e.getMessage());
}
So might the PrintStream be the problem? (PrintWriter and PrintStream never throw IOExceptions.)
I would use plain BufferedWriter and add the newlines myself as required.
Normal FileOutputStream operations should throw an IOException if there is an error.
AFAIK, the only exception is PrintWriter (and PrintStream), which do not throw IOExceptions. Instead you need to call checkError(), but that gives you no indication of what the error was or when it occurred.
I suggest you not use PrintWriter in this situation.
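A sketch of the suggested BufferedWriter approach (the field names are taken from the question's code):

try
{
    File logfile = new File(new File(isic_log), "log_" + production);
    BufferedWriter writer = new BufferedWriter(new FileWriter(logfile, true)); // append mode
    try
    {
        writer.write(chip_code + ":" + isic_number);
        writer.newLine(); // add the newline yourself
    }
    finally
    {
        writer.close(); // close() flushes; an IOException here means the write may not have reached the network drive
    }
}
catch (IOException e)
{
    logger.info("Got exception while writing to isic production log: " + e.getMessage());
}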
The only reasonable way to address this is to try to write to the file, and handle any resulting exception in an appropriate manner. It's pretty much impossible to know beforehand whether an I/O operation is going to succeed, due to the unreliable nature of networks.
I have this ArrayList of files:
for (File file : files) {
    InputStream in = new FileInputStream(file);
    // process each file and save it to file
    OutputStream out = new FileOutputStream(file);
    try {
    } finally {
        in.close();
        out.close();
    }
}
The performance is really slow, since on every loop iteration there is an in/out close(). Is there a better way to do this? I tried to put the output stream outside of the loop, but it doesn't work.
Using buffered streams makes a huge difference.
Try this:
for (final File file : files) {
    final InputStream in = new BufferedInputStream(new FileInputStream(file));
    final OutputStream out = new BufferedOutputStream(new FileOutputStream(new File(...)));
    try {
        // Process each file and save it to file
    } finally {
        try {
            in.close();
        } catch (IOException ignored) {}
        try {
            out.close();
        } catch (IOException ignored) {}
    }
}
Note that the IOExceptions that can be thrown when closing the streams must be ignored, or you will lose the potential initial exception.
Another problem is that both streams are on the same file, which doesn't work. So I suppose you're using two different files.
A close() can take up to 20 ms. I doubt this is your problem unless you have thousands of files.
I suspect your performance problem is a lack of buffering the input and output. Can you show your buffering wrappers as well?
You can of course build a queue of OutputStreams and offload it to a background thread that handles the closing of those output streams. Same for InputStreams.
Alternatively you can leave it to the JVM: simply don't close the files and let the GC close them when the objects are finalized.