OOM while uploading large file - java

I need to upload a very large file (a few GB) from my machine to a server.
I tried the approach below, but I keep getting:
Caused by: java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3236)
I could increase the heap, but I don't want to because I'm not sure where my code will run. What I want is to read a few MB/KB, send them to the server, release the memory, and repeat. I have tried other approaches such as the Files utilities or IOUtils.copyLarge, but I get the same problem.
URL serverUrl = new URL(url);
HttpURLConnection urlConnection = (HttpURLConnection) serverUrl.openConnection();
urlConnection.setConnectTimeout(Configs.TIMEOUT);
urlConnection.setReadTimeout(Configs.TIMEOUT);
File fileToUpload = new File(file);
urlConnection.setDoOutput(true);
urlConnection.setRequestMethod("POST");
urlConnection.addRequestProperty("Content-Type", "application/octet-stream");
urlConnection.connect();
OutputStream output = urlConnection.getOutputStream();
FileInputStream input = new FileInputStream(fileToUpload);
upload(input, output);
//..close streams
private static long upload(InputStream input, OutputStream output) throws IOException {
    try (
        ReadableByteChannel inputChannel = Channels.newChannel(input);
        WritableByteChannel outputChannel = Channels.newChannel(output)
    ) {
        ByteBuffer buffer = ByteBuffer.allocateDirect(10240);
        long size = 0;
        while (inputChannel.read(buffer) != -1) {
            buffer.flip();
            size += outputChannel.write(buffer);
            buffer.clear();
        }
        return size;
    }
}
I think the problem is somewhere in here, but I can't figure out what I am doing wrong.
Another approach I tried, with the same issue:
private static long copy(InputStream source, OutputStream sink) throws IOException {
    long nread = 0L;
    byte[] buf = new byte[10240];
    int n;
    int i = 0;
    while ((n = source.read(buf)) > 0) {
        sink.write(buf, 0, n);
        nread += n;
        i++;
        if (i % 10 == 0) {
            log.info("flush");
            sink.flush();
        }
    }
    return nread;
}

Use setFixedLengthStreamingMode as per this answer on the duplicate question Denis Tulskiy linked to:
conn.setFixedLengthStreamingMode((int) fileToUpload.length());
From the docs:
This method is used to enable streaming of a HTTP request body without internal buffering, when the content length is known in advance.
At the moment, your code is attempting to buffer the file into Java's heap memory in order to compute the Content-Length header on the HTTP request.
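A minimal sketch of how that fits together, reusing the names (url, Configs.TIMEOUT, file) from the snippet above; the copy loop and buffer size are illustrative, not a drop-in replacement:

File fileToUpload = new File(file);
HttpURLConnection urlConnection = (HttpURLConnection) new URL(url).openConnection();
urlConnection.setConnectTimeout(Configs.TIMEOUT);
urlConnection.setReadTimeout(Configs.TIMEOUT);
urlConnection.setDoOutput(true);
urlConnection.setRequestMethod("POST");
urlConnection.addRequestProperty("Content-Type", "application/octet-stream");
// The long overload (Java 7+) avoids int overflow on multi-GB files;
// use setChunkedStreamingMode(0) instead if the length is not known in advance.
urlConnection.setFixedLengthStreamingMode(fileToUpload.length());
try (InputStream input = new FileInputStream(fileToUpload);
     OutputStream output = urlConnection.getOutputStream()) {
    byte[] buf = new byte[8192];
    int n;
    while ((n = input.read(buf)) != -1) {
        output.write(buf, 0, n);
    }
}
int status = urlConnection.getResponseCode(); // forces the request to complete

With streaming mode enabled, HttpURLConnection sends each buffer as it is written instead of holding the whole body in memory to compute Content-Length.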

Related

how would i go about downloading something from a java application, then putting it in the desktop?

Hi all, I want to make an app that downloads something from a website and puts it on the desktop.
This code downloads it, but only temporarily. How would I go about saving it?
Here's my code:
private static void grabItem() throws ClassNotFoundException,
        InstantiationException, IllegalAccessException, IOException,
        UnsupportedLookAndFeelException {
    final URL url = new URL("sampleurl");
    final InputStream is = url.openStream();
    final byte[] b = new byte[2048];
    int length;
    final HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    // Specify what portion of file to download.
    connection.setRequestProperty("Range", "bytes=" + downloaded + "-");
    // Connect to server.
    connection.connect();
    // Make sure response code is in the 200 range.
    if ((connection.getResponseCode() / 100) != 2) {
        logger.info("Unable to find file");
        return;
    }
    // set content length.
    size = connection.getContentLength();
    while ((length = is.read(b)) != -1) {
        downloaded += length;
        progressBar.setValue((int) getProgress()); // set progress bar
    }
    is.close();
    setFrameTheme();
}
thanks
You never write any data at all to your computer... but anyway, this is how I download and save a file. It needs to be a direct download link, but it's easy enough to change it to work the way you want.
URL url = new URL("direct link goes here");
URLConnection connection = url.openConnection();
InputStream inputstream = connection.getInputStream();
to get it to save you would then...
BufferedOuputStream bufferedoutputstream = new BufferedOutputStream(new FileOutputStream(new File("location to save downloaded file")));
byte[] buffer = new byte[1024];
int bytesRead = 0;
while((bytesRead = inputstream.read(buffer)))
{
bufferedoutputstream.write(buffer, 0, bytesRead);
}
bufferedoutputstream.flush();
bufferedoutputstream.close();
inputstream.close();
That should download and save the file.
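If the goal is specifically to land the file on the desktop, a minimal sketch using java.nio (Java 7+) could look like the following; the URL is a placeholder, and resolving the desktop as <user.home>/Desktop is an assumption that holds on typical Windows/macOS setups:

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class DesktopDownload {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/file.zip"); // placeholder direct link
        // Assumption: the desktop lives at <user.home>/Desktop; adjust if it does not.
        Path target = Paths.get(System.getProperty("user.home"), "Desktop", "file.zip");
        try (InputStream in = url.openStream()) {
            // Streams the body straight to disk without loading it all into memory.
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}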

Response size limit when using Apache HttpComponents

I am converting some code from the Http Client 3.x library over to the Http Components 4.x library. The old code contains a check to make sure that the response is not over a certain size. This is fairly easy to do in Http Client 3.x since you can get back a stream from the response using the getResponseBodyAsStream() method and determine when the size has been exceeded. I can't find a similar way in Http Components.
Here's the old code as an example of what I'm trying to do:
private static final long RESPONSE_SIZE_LIMIT = 1024 * 1024 * 10;
private static final int READ_BUFFER_SIZE = 16384;

private static ByteArrayOutputStream readResponseBody(HttpMethodBase method)
        throws IOException {
    int len;
    byte buff[] = new byte[READ_BUFFER_SIZE];
    ByteArrayOutputStream out = null;
    InputStream in = null;
    long byteCount = 0;
    in = method.getResponseBodyAsStream();
    out = new ByteArrayOutputStream(READ_BUFFER_SIZE);
    while ((len = in.read(buff)) != -1 && byteCount <= RESPONSE_SIZE_LIMIT) {
        byteCount += len;
        out.write(buff, 0, len);
    }
    if (byteCount >= RESPONSE_SIZE_LIMIT) {
        throw new IOException(
                "Size limited exceeded reading from HTTP input stream");
    }
    return (out);
}
You can use HttpEntity.getContent() to get an InputStream to read from yourself.
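A minimal sketch of how that looks with HttpComponents 4.x, assuming HttpClient 4.3+ (CloseableHttpClient and CloseableHttpResponse from org.apache.http.impl.client and org.apache.http.client.methods); the limit check mirrors the 3.x code above, and the URL is a placeholder:

CloseableHttpClient client = HttpClients.createDefault();
try (CloseableHttpResponse response = client.execute(new HttpGet("http://example.com/resource"))) {
    HttpEntity entity = response.getEntity();
    ByteArrayOutputStream out = new ByteArrayOutputStream(READ_BUFFER_SIZE);
    try (InputStream in = entity.getContent()) {
        byte[] buff = new byte[READ_BUFFER_SIZE];
        long byteCount = 0;
        int len;
        while ((len = in.read(buff)) != -1) {
            byteCount += len;
            if (byteCount > RESPONSE_SIZE_LIMIT) {
                throw new IOException("Size limit exceeded reading from HTTP input stream");
            }
            out.write(buff, 0, len);
        }
    }
}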

Incomplete file when downloading with Java from FTP

In my application I upload a byte[] (a serialized object) to my FTP server, which works perfectly. However, when I try to download it, only the first part of the array (roughly 3000 bytes) is correct; the rest is filled with zeros.
I can't seem to figure out what is wrong, any help would be appreciated.
I am using the package
org.apache.commons.net.*
public static byte[] downloadBoard(String host, int port, String usr, String pwd) throws IOException {
    FTPClient ftpClient = new FTPClient();
    byte[] buf = new byte[20000];
    try {
        ftpClient.connect(host, port);
        ftpClient.login(usr, pwd);
        ftpClient.setFileType(FTP.BINARY_FILE_TYPE);
        InputStream is = ftpClient.retrieveFileStream("asdf.board");
        is.read(buf);
        is.close();
        ftpClient.completePendingCommand();
        ftpClient.logout();
    } finally {
        ftpClient.disconnect();
    }
    return buf;
}
is.read() may not return the full content. You'll need to put read() into a loop similar to this:
int pos = 0;
while (true) {
    int count = is.read(buf, pos, buf.length - pos);
    if (count <= 0) {
        break;
    }
    pos += count;
}
P.S.:
If you know the size of the file, you can use a DataInputStream to read the buffer without a loop:
byte[] buf = new byte[exactFileSize];
DataInputStream dis = new DataInputStream(is);
dis.readFully(buf);
InputStream.read() typically doesn't read the entire stream for you, only some part of it. Note that InputStream.read() returns the number of bytes actually read, which you'll need to check.
The typical pattern is to loop until the InputStream has reported that no more bytes are available.
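When the exact size is not known in advance, a minimal sketch of that pattern is to accumulate into a ByteArrayOutputStream until read() returns -1; is here is the stream returned by ftpClient.retrieveFileStream(...) in the question:

ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] chunk = new byte[8192];
int count;
while ((count = is.read(chunk)) != -1) {
    out.write(chunk, 0, count);
}
byte[] board = out.toByteArray(); // the complete serialized object, whatever its size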

How to download a ZIp file from a URl and store them as Zip file only

I have a URL like the one below:
http://blah.com/download.zip
I want Java code to download this ZIP file from the URL and save it in my server directory as a ZIP file. I would also like to know the most efficient way to do this.
First, your URL is not http:\\blah.com\download.zip. It is http://blah.com/download.zip.
Second, it is simple: perform an HTTP GET request, take the stream, and copy it to a FileOutputStream. Here is a code sample.
URL url = new URL("http://blah.com/download.zip");
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setRequestMethod("GET");
InputStream in = connection.getInputStream();
FileOutputStream out = new FileOutputStream("download.zip");
copy(in, out, 1024);
out.close();
public static void copy(InputStream input, OutputStream output, int bufferSize) throws IOException {
byte[] buf = new byte[bufferSize];
int n = input.read(buf);
while (n >= 0) {
output.write(buf, 0, n);
n = input.read(buf);
}
output.flush();
}
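On Java 7+, a minimal alternative sketch is to let java.nio.file.Files do the copy loop; the URL and target file name are the same placeholders as above:

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class ZipDownload {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://blah.com/download.zip");
        try (InputStream in = url.openStream()) {
            // Streams the response straight into the file; the ZIP bytes are written unchanged.
            Files.copy(in, Paths.get("download.zip"), StandardCopyOption.REPLACE_EXISTING);
        }
    }
}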

Need help optimising bufferedReader output

I am sending a file to the browser from a servlet. The highest JDK I can use is 1.4.2, and I also have to retrieve the file via a URL. I am also trying to use "guessContentTypeFromStream", but I keep getting null, which causes an exception in the code sample below, so for now I have to hard-code or work out the content type myself.
What I would like to know is how I can refactor this code so the file transmission is as fast as possible, and also how to use guessContentTypeFromStream. (Note: "res" is an HttpServletResponse.)
URL servletUrl = new URL(sFileURL);
URLConnection conn = servletUrl.openConnection();
int read;
BufferedInputStream bis = new BufferedInputStream(conn.getInputStream());
String sContentType = conn.guessContentTypeFromStream(conn.getInputStream());
res.setContentType(sContentType);
//res.setContentType("image/jpeg");
PrintWriter os = res.getWriter();
while ((read = bis.read()) != -1) {
    os.write(read);
}
//Clean resources
os.flush();
This is how you normally read and write data:
in = new BufferedInputStream(socket.getInputStream(), BUFFER_SIZE);
byte[] dataBuffer = new byte[1024 * 16];
int size = 0;
while ((size = in.read(dataBuffer)) != -1) {
    out.write(dataBuffer, 0, size);
}
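Applied to the servlet code above, a minimal sketch (kept to pre-Java 5 syntax because of the JDK 1.4.2 constraint) would guess the content type from the same BufferedInputStream it copies from, since URLConnection.guessContentTypeFromStream is static and needs a stream that supports mark/reset, and would write through res.getOutputStream() rather than a PrintWriter; the 16 KB buffer and the octet-stream fallback are arbitrary choices:

URL servletUrl = new URL(sFileURL);
URLConnection conn = servletUrl.openConnection();
// BufferedInputStream supports mark/reset, which guessContentTypeFromStream requires.
BufferedInputStream bis = new BufferedInputStream(conn.getInputStream());
String sContentType = URLConnection.guessContentTypeFromStream(bis);
if (sContentType == null) {
    sContentType = "application/octet-stream"; // fallback when the type cannot be guessed
}
res.setContentType(sContentType);
// Binary data should go through the servlet's OutputStream, not a PrintWriter.
OutputStream os = res.getOutputStream();
byte[] dataBuffer = new byte[1024 * 16];
int size;
while ((size = bis.read(dataBuffer)) != -1) {
    os.write(dataBuffer, 0, size);
}
os.flush();
bis.close();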
