FileOutputStream sends 0 byte file - java

I am trying to allow a user to download a file (attachment) using Java to serve up the download. I have been partially successful. The file is read, and on the client side there is a prompt for a download. A file is saved successfully, but it has 0 bytes. Here is my server side code:
String stored = "/var/lib/tomcat/webapps/myapp/attachments/" + request.getParameter("stored");
String realname = request.getParameter("realname");
// Open the input and output streams
FileInputStream attachmentFis = new FileInputStream(stored);
FileOutputStream attachmentFos = new FileOutputStream(realname);
try {
// Send the file
byte[] attachmentBuffer = new byte[1024];
int count = 0;
while((count = attachmentFis.read(attachmentBuffer)) != -1) {
attachmentFos.write(attachmentBuffer, 0, count);
}
} catch (IOException e) {
// Exception handling
} finally {
// Close the streams
attachmentFos.flush();
attachmentFos.close();
attachmentFis.close();
}
For context, this is in a servlet. The files have an obfuscated name, which is passed as "stored" here. The actual file name, the name the user will see, is "realname".
What do I need to do to get the actual file to arrive at the client end?
EDIT
Following suggestions in the comments, I changed the write to include the 0, count parameters and put the close stuff in a finally block. However, I am still getting a 0 byte file when I attempt a download.
EDIT 2
Thanks to the logging suggestion from Dave the Dane, I discovered the file was being written locally. A bit of digging and I found I needed to use response.getOutputStream().write instead of a regular FileOutputStream. I have been successful in getting a file to download through this method. Thank you all for your helpful suggestions.
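Roughly, the working version looks like this (a sketch of what I ended up with; the Content-Disposition header and buffer size are my choices, not requirements):
String stored = "/var/lib/tomcat/webapps/myapp/attachments/" + request.getParameter("stored");
String realname = request.getParameter("realname");
// Mark the response as a download and suggest a filename to the browser
response.setContentType("application/octet-stream");
response.setHeader("Content-Disposition", "attachment; filename=\"" + realname + "\"");
try (FileInputStream attachmentFis = new FileInputStream(stored);
     OutputStream out = response.getOutputStream()) {
    byte[] attachmentBuffer = new byte[8192];
    int count;
    while ((count = attachmentFis.read(attachmentBuffer)) != -1) {
        out.write(attachmentBuffer, 0, count);
    }
}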

As others have observed, you'd be better off using try-with-resources & let that handle the closing.
Assuming you have some Logging Framework available, maybe the following would cast light on the matter...
try {
LOG.info ("Requesting....");
final String stored = "/var/lib/tomcat/webapps/myapp/attachments/" + request.getParameter("stored");
LOG.info ("stored.......: {}", stored);
final String realname = request.getParameter("realname");
LOG.info ("realname.....: {}", realname);
final File fileStored = new File(stored);
LOG.info ("fileStored...: {}", fileStored .getCanonicalPath());
final File fileRealname = new File(realname);
LOG.info ("fileRealname.: {}", fileRealname.getCanonicalPath());
try(final InputStream attachmentFis = new FileInputStream (fileStored);
final OutputStream attachmentFos = new FileOutputStream(fileRealname))
{
final byte[] attachmentBuffer = new byte[64 * 1024];
int count;
while((count = attachmentFis.read (attachmentBuffer)) != -1) {
attachmentFos.write(attachmentBuffer, 0, count);
LOG.info ("Written......: {} bytes to {}", count, realname);
}
attachmentFos.flush(); // Probably done automatically in .close()
}
LOG.info ("Done.");
}
catch (final Exception e) {
LOG.error("Problem!.....: {}", request, e);
}

If it never reaches the finally block, you should stop ignoring the IOException that is being thrown:
catch (IOException e) {
// Exception handling
System.err.println(e.getMessage());
}
I'd assume that the realname is just missing an absolute path.
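A quick way to confirm that (a hypothetical diagnostic, not part of the original code): print where the relative name actually resolves. A bare filename is resolved against the JVM's working directory, which for Tomcat is typically the directory it was launched from, not the webapp.
File target = new File(realname);
System.err.println("Would write to: " + target.getAbsolutePath());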

Related

Improve performance when reading file from URL and writing it to disk

I made a program which accesses some URLs and downloads the pdfs from there. The files vary between 2MB and 40MB. The program works with no problems, but is there a way to improve the performance? For the larger files it takes a long time.
The code below is the one used for reading / writing the file. This is called in a for loop with different fileNameURLPath.
@Override
public void downloadFile(String fileNameURLPath, String titleCellValue) throws FileException {
try (BufferedInputStream inputStream
= new BufferedInputStream(new URL(fileNameURLPath).openStream())){
FileOutputStream fileOS = new FileOutputStream(FileConstants.MandatoryDownloadProperties.path + titleCellValue + ".pdf");
byte data[] = new byte[32*1024];
int byteContent;
while((byteContent = inputStream.read(data,0 , data.length)) != -1) {
fileOS.write(data, 0 , byteContent);
}
inputStream.close();
fileOS.close();
} catch (MalformedURLException e) {
throw new FileException("Error while processing url. Make sure it is correct");
} catch (IOException e) {
throw new FileException("Error while downloading file. Make sure the download path is correct");
}
}
I read something about Java NIO, but I couldn't quite comprehend it or tell whether it can help me in this situation.
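For reference, here is a minimal NIO-style sketch of the same download (targetPath is a placeholder for the FileConstants path above; whether this beats a well-buffered stream copy depends on the OS and network, so measure before switching):
// Copy a URL's contents to disk via NIO channels; transferFrom lets the
// runtime move bytes without an explicit Java-side buffer loop.
try (ReadableByteChannel in = Channels.newChannel(new URL(fileNameURLPath).openStream());
     FileChannel out = FileChannel.open(Paths.get(targetPath),
             StandardOpenOption.CREATE, StandardOpenOption.WRITE,
             StandardOpenOption.TRUNCATE_EXISTING)) {
    long pos = 0, n;
    // Loop because a single transferFrom call may stop early on a network stream
    while ((n = out.transferFrom(in, pos, Long.MAX_VALUE)) > 0) {
        pos += n;
    }
}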

Error where FileOutputStream only writes to file after the program has been terminated

I've had this error in the past but never fully understood it. Closing an OutputStream, regardless of the location of the Java file or the manner in which it is called, completely screws up all subsequent runs or attempts to write to another file, even if a different method of writing to a file is used. For this reason I avoid closing streams, even though that is a horrible habit. In my program, I was trying a test case that had a close statement which destroyed all of my previous streams, making it so that for some reason they only write to files after the program has been terminated.
I kept the file location open and it writes the text into the text file at the appropriate time, however the "Preview" panel in Windows does not detect it (which it used to do). Note that this all worked perfectly before the stream was accidentally closed. Is there a way to reset the stream? I've tried flushing it during the process but it still does not run as it did prior.
Here is the method used to create the file:
protected void createFile(String fileName, String content) {
try {
String fileLoc = PATH + fileName + ".txt";
File f = new File(fileLoc);
if(!f.isFile())
f.createNewFile();
FileOutputStream outputStream = new FileOutputStream(fileLoc);
byte[] strToBytes = content.getBytes();
outputStream.write(strToBytes);
} catch (IOException e) {
e.printStackTrace();
return;
}
}
as well as the method used to read the file:
protected String readFile(String fileName) {
try {
StringBuilder sb = new StringBuilder("");
String fileLoc = PATH + fileName + ".txt";
File f = new File(fileLoc);
if(!f.exists())
return "null";
Scanner s = new Scanner(f);
int c = 0;
while(s.hasNext()) {
String str = s.nextLine();
sb.append(str);
if(s.hasNext())
sb.append("\n");
}
return sb.toString();
} catch(Exception e) {
e.printStackTrace();
return "null";
}
}
I'd be happy to answer any clarification questions if needed. Thank you for the assistance.
Without try-with-resources, you need to close the stream in a finally clause (declaring it before the try block so it is still in scope) to make sure nothing leaks, or call flush() on the stream if you need more timely updates:
} catch (IOException e) {
e.printStackTrace();
return;
} finally {
outputStream.close();
}
You need to call flush() on the stream to force the bytes out to the file.
You're currently calling write() by itself, like this:
FileOutputStream outputStream = new FileOutputStream(fileLoc);
outputStream.write(content.getBytes());
What you want to do is this:
FileOutputStream outputStream = new FileOutputStream(fileLoc);
outputStream.write(content.getBytes());
outputStream.flush();
From the Javadoc (https://docs.oracle.com/javase/8/docs/api/java/io/OutputStream.html#flush--) for OutputStream (where FileOutputStream is an OutputStream), this is what it says for flush():
Flushes this output stream and forces any buffered output bytes to be written out. The general contract of flush is that calling it is an indication that, if any bytes previously written have been buffered by the implementation of the output stream, such bytes should immediately be written to their intended destination.
Even better would be to close the stream in a finally block, so that no matter what your code always tries to free up any open resources, like this:
FileOutputStream outputStream = null;
try {
outputStream = new FileOutputStream(fileLoc);
outputStream.write(content.getBytes());
outputStream.flush();
} finally {
if (outputStream != null) {
outputStream.close();
}
}
or use automatic resource management, like this:
try (FileOutputStream outputStream = new FileOutputStream(fileLoc)) {
outputStream.write(content.getBytes());
outputStream.flush();
}

Reading a list of Files as a Java 8 Stream

I have a (possibly long) list of binary files that I want to read lazily. There will be too many files to load into memory. I'm currently reading them as a MappedByteBuffer with FileChannel.map(), but that probably isn't required. I want the method readBinaryFiles(...) to return a Java 8 Stream so I can lazy load the list of files as I access them.
public List<FileDataMetaData> readBinaryFiles(
List<File> files,
int numDataPoints,
int dataPacketSize )
throws
IOException {
List<FileDataMetaData> fmdList = new ArrayList<FileDataMetaData>();
IOException lastException = null;
for (File f: files) {
try {
FileDataMetaData fmd = readRawFile(f, numDataPoints, dataPacketSize);
fmdList.add(fmd);
} catch (IOException e) {
logger.error("", e);
lastException = e;
}
}
if (null != lastException)
throw lastException;
return fmdList;
}
// The List<DataPacket> returned will be in the same order as in the file.
public FileDataMetaData readRawFile(File file, int numDataPoints, int dataPacketSize) throws IOException {
FileDataMetaData fmd;
FileChannel fileChannel = null;
try {
fileChannel = new RandomAccessFile(file, "r").getChannel();
long fileSz = fileChannel.size();
ByteBuffer bbRead = ByteBuffer.allocate((int) fileSz);
MappedByteBuffer buffer = fileChannel.map(FileChannel.MapMode.READ_ONLY, 0, fileSz);
buffer.get(bbRead.array());
List<DataPacket> dataPacketList = new ArrayList<DataPacket>();
while (bbRead.hasRemaining()) {
int channelId = bbRead.getInt();
long timestamp = bbRead.getLong();
int[] data = new int[numDataPoints];
for (int i=0; i<numDataPoints; i++)
data[i] = bbRead.getInt();
DataPacket dp = new DataPacket(channelId, timestamp, data);
dataPacketList.add(dp);
}
fmd = new FileDataMetaData(file.getCanonicalPath(), fileSz, dataPacketList);
} catch (IOException e) {
logger.error("", e);
throw e;
} finally {
if (null != fileChannel) {
try {
fileChannel.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
return fmd;
}
Returning fmdList.stream() from readBinaryFiles(...) won't accomplish this, because the file contents will already have been read into memory, which won't be possible with this many files.
The other approaches to reading the contents of multiple files as a Stream rely on using Files.lines(), but I need to read binary files.
I'm open to doing this in Scala or golang if those languages have better support for this use case than Java.
I'd appreciate any pointers on how to read the contents of multiple binary files lazily.
There is no laziness possible for the reading within a file, as you are reading the entire file to construct an instance of FileDataMetaData. You would need a substantial refactoring of that class to be able to construct an instance of FileDataMetaData without having to read the entire file.
However, there are several things to clean up in that code, some specific to Java 7 rather than Java 8: you don't need the RandomAccessFile detour to open a channel anymore, and there is try-with-resources to ensure proper closing. Note further that your usage of memory mapping makes no sense. When you copy the entire contents into a heap ByteBuffer after mapping the file, there is nothing lazy about it. It's exactly the same as what happens when you call read with a heap ByteBuffer on a channel, except that the JRE can reuse buffers in the read case.
In order to allow the system to manage the pages, you have to read from the mapped byte buffer. Depending on the system, this might still not be better than repeatedly reading small chunks into a heap byte buffer.
public FileDataMetaData readRawFile(
File file, int numDataPoints, int dataPacketSize) throws IOException {
try(FileChannel fileChannel=FileChannel.open(file.toPath(), StandardOpenOption.READ)) {
long fileSz = fileChannel.size();
MappedByteBuffer bbRead=fileChannel.map(FileChannel.MapMode.READ_ONLY, 0, fileSz);
List<DataPacket> dataPacketList = new ArrayList<>();
while(bbRead.hasRemaining()) {
int channelId = bbRead.getInt();
long timestamp = bbRead.getLong();
int[] data = new int[numDataPoints];
for (int i=0; i<numDataPoints; i++)
data[i] = bbRead.getInt();
dataPacketList.add(new DataPacket(channelId, timestamp, data));
}
return new FileDataMetaData(file.getCanonicalPath(), fileSz, dataPacketList);
} catch (IOException e) {
logger.error("", e);
throw e;
}
}
Building a Stream based on this method is straightforward; only the checked exception has to be handled:
public Stream<FileDataMetaData> readBinaryFiles(
List<File> files, int numDataPoints, int dataPacketSize) throws IOException {
return files.stream().map(f -> {
try {
return readRawFile(f, numDataPoints, dataPacketSize);
} catch (IOException e) {
logger.error("", e);
throw new UncheckedIOException(e);
}
});
}
This should be sufficient:
return files.stream().map(f -> readRawFile(f, numDataPoints, dataPacketSize));
…if, that is, you are willing to remove throws IOException from the readRawFile method’s signature. You could have that method catch IOException internally and wrap it in an UncheckedIOException. (The problem with deferred execution is that the exceptions also need to be deferred.)
I don't know how performant this is, but you can use java.io.SequenceInputStream wrapped inside a DataInputStream. This will effectively concatenate your files together. If you create a BufferedInputStream from each file, then the whole thing should be properly buffered.
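A rough sketch of that idea, reusing the DataPacket layout from the question (the EOF handling is the fiddly part, since available() can't be trusted across file boundaries):
// Concatenate all files into one logical stream and read packets off it;
// SequenceInputStream's constructor takes an Enumeration, hence the Vector.
Vector<InputStream> streams = new Vector<>();
for (File f : files) {
    streams.add(new BufferedInputStream(new FileInputStream(f)));
}
try (DataInputStream in = new DataInputStream(new SequenceInputStream(streams.elements()))) {
    while (true) {
        int channelId;
        try {
            channelId = in.readInt();
        } catch (EOFException eof) {
            break; // all files consumed
        }
        long timestamp = in.readLong();
        int[] data = new int[numDataPoints];
        for (int i = 0; i < numDataPoints; i++)
            data[i] = in.readInt();
        // build a DataPacket(channelId, timestamp, data) here
    }
}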
Building on VGR's comment, I think his basic solution of:
return files.stream().map(f -> readRawFile(f, numDataPoints, dataPacketSize))
is correct, in that it will lazily process the files (and stop if a short-circuiting terminal operation is invoked on the result of the map() operation). I would also suggest a slightly different approach to the implementation of readRawFile that leverages try-with-resources and InputStream, which will not load the whole file into memory:
public FileDataMetaData readRawFile(File file, int numDataPoints, int dataPacketSize)
throws DataPacketReadException { // <- Custom unchecked exception, nested for class
FileDataMetadata results = null;
try (FileInputStream fileInput = new FileInputStream(file)) {
String filePath = file.getCanonicalPath();
long fileSize = fileInput.getChannel().size();
DataInputStream dataInput = new DataInputStream(new BufferedInputStream(fileInput));
results = new FileDataMetadata(
filePath,
fileSize,
dataPacketsFrom(dataInput, numDataPoints, dataPacketSize, filePath));
}
return results;
}
private List<DataPacket> dataPacketsFrom(DataInputStream dataInput, int numDataPoints, int dataPacketSize, String filePath)
throws DataPacketReadException {
List<DataPacket> packets = new ArrayList<>();
while (dataInput.available() > 0) {
try {
// Logic to assemble DataPacket
}
catch (EOFException e) {
throw new DataPacketReadException("Unexpected EOF on file: " + filePath, e);
}
catch (IOException e) {
throw new DataPacketReadException("Unexpected I/O exception on file: " + filePath, e);
}
}
return packets;
}
This should reduce the amount of code, and make sure that your files get closed on error.

Writing to an External File on Android --- File Doesn't Register, But Java Can Read

I'm trying to write to an external txt (or csv) file for Android. I can run an app, close it, and run it again, and readData() will read back to my log what I've stored. However, the dirFile (file directory) appears nowhere within my Android files (even if I connect it to a computer and search).
Something interesting, though: if I clear my log (similar to a list of print statements shown within Eclipse) and disconnect my phone from my computer, then reconnect it, the log reappears with everything I've ever written to my file (even if I later overwrote it)...yet the app isn't even running!
Here is my code. Please help me understand why I cannot find my file!
(Note: I've tried appending a "myFile.txt" extension to the directory, but it just causes an EISDIR exception.)
public void writeData(String dirName){
try
{
File root = new File(getExternalFilesDir(null), dirName);
// Writes to file
//
// The "true" argument allows the file to be appended. Without this argument (just root),
// the file will be overwritten (even though we later call append) rather than appended to.
FileWriter writer = new FileWriter(root, true);
writer.append("Append This Text\n");
writer.flush();
writer.close();
// Checks if we actually wrote to file by reading it back in (appears in Log)
//readData(dirName);
}
catch(Exception e)
{
Log.v("2222", "2222 ERROR: " + e.getMessage());
}
}
If you're interested, here's the function I wrote to read in the data:
public void readData(String dirName){
try
{
File root = new File(getExternalFilesDir(null), dirName);
// Checks to see if we are actually writing to file by reading in the file
BufferedReader reader = new BufferedReader(new FileReader(root));
try {
String s = reader.readLine();
while (s != null) {
Log.v("2222", "2222 READ: " + s);
s = reader.readLine();
}
}
catch(Exception e) {
Log.v("2222", "2222 ERROR: " + e.getMessage());
}
finally {
reader.close();
}
}
catch(Exception e) {
Log.v("2222", "2222 ERROR: " + e.getMessage());
}
}
Thanks!
even if I connect it to a computer and search
if I clear my log (similar to a list of print statements shown within Eclipse) and disconnect my phone from my computer, then reconnect it, the log reappears with everything I've ever written to my file (even if I later overwrote it).
What you are seeing on your computer is what is indexed by MediaStore, and possibly a subset of those, depending upon whether your computer caches information it gets from the device in terms of "directory" contents.
To help ensure that MediaStore indexes your file promptly:
Use a FileOutputStream (optionally wrapped in an OutputStreamWriter), not a FileWriter
Call flush(), getFD().sync(), and close() on the FileOutputStream, instead of calling flush() and close() on the FileWriter (sync() will ensure the bytes are written to disk before continuing)
Use MediaScannerConnection and scanFile() to tell MediaStore to index your file
You can then use whatever sort of "reload" or "refresh" or whatever option is in your desktop OS's file manager, and your file should show up.
This blog post has more on all of this.
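Putting those steps together, a rough sketch (assumes this code runs in an Activity, and reuses dirName from the question):
File target = new File(getExternalFilesDir(null), dirName);
try (FileOutputStream fos = new FileOutputStream(target, true)) {
    fos.write("Append This Text\n".getBytes(StandardCharsets.UTF_8));
    fos.flush();
    fos.getFD().sync(); // make sure the bytes are on disk before indexing
}
// Ask MediaStore to index the file so it shows up over MTP
MediaScannerConnection.scanFile(this, new String[] { target.getAbsolutePath() }, null, null);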
public void create(){
folder = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES),"video");
boolean success = true;
if (!folder.exists()) {
success=folder.mkdirs();
}
if (success) {
readfile();
} else {
System.out.println("failed");
}
}
The above code creates the directory at the desired path on the mobile device.
private void readfile() {
// TODO Auto-generated method stub
AssetManager assetManager = getResources().getAssets();
String[] files = null;
try {
files = assetManager.list("clipart");
} catch (Exception e) {
Log.e("read clipart ERROR", e.toString());
e.printStackTrace();
}
for(String filename : files) {
System.out.println("File name => "+filename);
InputStream in = null;
OutputStream out = null;
try {
in = assetManager.open("clipart/" + filename);
out = new FileOutputStream(folder + "/" + filename);
copyFile(in, out);
in.close();
in = null;
out.flush();
out.close();
out = null;
} catch(Exception e) {
Log.e("copy clipart ERROR", e.toString());
e.printStackTrace();
}
}
}
private void copyFile(InputStream in, OutputStream out) throws IOException {
byte[] buffer = new byte[1024];
int read;
while((read = in.read(buffer)) != -1){
out.write(buffer, 0, read);
}
}
This is the code I use to copy files from the project's assets folder to device storage. It can copy files of any type (extension) from the assets folder to the device.
Don't forget to add permission in manifest file
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
and call the above function by
readfile();//this call the function to read and write the file
I hope this may help you.
Thank you.

Why is my image coming out garbled?

I've got some Java code using a servlet and Apache Commons FileUpload to upload a file to a set directory. It's working fine for character data (e.g. text files) but image files are coming out garbled. I can open them but the image doesn't look like it should. Here's my code:
Servlet
protected void doPost(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
try {
String customerPath = "\\leetest\\";
// Check that we have a file upload request
boolean isMultipart = ServletFileUpload.isMultipartContent(request);
if (isMultipart) {
// Create a new file upload handler
ServletFileUpload upload = new ServletFileUpload();
// Parse the request
FileItemIterator iter = upload.getItemIterator(request);
while (iter.hasNext()) {
FileItemStream item = iter.next();
String name = item.getFieldName();
if (item.isFormField()) {
// Form field. Ignore for now
} else {
BufferedInputStream stream = new BufferedInputStream(item
.openStream());
if (stream == null) {
LOGGER
.error("Something went wrong with fetching the stream for field "
+ name);
}
byte[] bytes = StreamUtils.getBytes(stream);
FileManager.createFile(customerPath, item.getName(), bytes);
stream.close();
}
}
}
} catch (Exception e) {
throw new UploadException("An error occured during upload: "
+ e.getMessage());
}
}
StreamUtils.getBytes(stream) looks like:
public static byte[] getBytes(InputStream src, int buffsize)
throws IOException {
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
byte[] buff = new byte[buffsize];
while (true) {
int nBytesRead = src.read(buff);
if (nBytesRead < 0) {
break;
}
byteStream.write(buff);
}
byte[] result = byteStream.toByteArray();
byteStream.close();
return result;
}
And finally FileManager.createFile looks like:
public static void createFile(String customerPath, String filename,
byte[] fileData) throws IOException {
customerPath = getFullPath(customerPath + filename);
File newFile = new File(customerPath);
if (!newFile.getParentFile().exists()) {
newFile.getParentFile().mkdirs();
}
FileOutputStream outputStream = new FileOutputStream(newFile);
outputStream.write(fileData);
outputStream.close();
}
Can anyone spot what I'm doing wrong?
Cheers,
Lee
One thing I don't like is this block from StreamUtils.getBytes():
1 while (true) {
2 int nBytesRead = src.read(buff);
3 if (nBytesRead < 0) {
4 break;
5 }
6 byteStream.write(buff);
7 }
At line 6, it writes the entire buffer, no matter how many bytes were read in, and there is no guarantee the buffer is completely filled each time. It would be more correct like this:
1 while (true) {
2 int nBytesRead = src.read(buff);
3 if (nBytesRead < 0) {
4 break;
5 } else {
6 byteStream.write(buff, 0, nBytesRead);
7 }
8 }
Note the 'else' on line 5, along with the two additional parameters (array index start position and length to copy) on line 6.
I could imagine that for larger files, like images, a read returns before the buffer is filled (maybe the stream is waiting for more data). That means you'd be unintentionally writing old data that was left in the tail end of the buffer. This is almost certainly happening at EOF, assuming a buffer larger than 1 byte, but extra data at EOF is probably not the cause of your corruption; it is just not desirable.
I'd just use Commons IO. Then you could just do an IOUtils.copy(InputStream, OutputStream);
It's got lots of other useful utility methods.
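Applied to the upload servlet above, the whole copy collapses to something like this (a sketch; assumes Commons IO is on the classpath and that writing straight to the target file is acceptable):
try (InputStream in = item.openStream();
     OutputStream out = new FileOutputStream(new File(customerPath, item.getName()))) {
    // IOUtils.copy honours the actual read counts, avoiding the stale-buffer bug
    IOUtils.copy(in, out);
}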
Are you sure that the image isn't coming through garbled, or that you aren't dropping some packets on the way in?
I don't know what difference it makes, but there seems to be a mismatch of method signatures. The getBytes() method called in your doPost() method has only one argument:
byte[] bytes = StreamUtils.getBytes(stream);
while the method source you included has two arguments:
public static byte[] getBytes(InputStream src, int buffsize)
Hope that helps.
Can you perform a checksum on your original file and the uploaded file and see if there are any immediate differences?
If there are, then you can look at performing a diff to determine the exact part(s) of the file that are missing or changed.
Things that come to mind are the beginning or end of the stream, or endianness.
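If it helps, here is a small checksum helper using the JDK's MessageDigest (a sketch; MD5 is plenty for spotting corruption):
// Hash a file so the original and the uploaded copy can be compared
static String checksum(File f) throws IOException, NoSuchAlgorithmException {
    MessageDigest md = MessageDigest.getInstance("MD5");
    try (InputStream in = new DigestInputStream(new FileInputStream(f), md)) {
        byte[] buf = new byte[8192];
        while (in.read(buf) != -1) {
            // reading through the DigestInputStream updates the digest
        }
    }
    return new BigInteger(1, md.digest()).toString(16);
}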
