Java Apache Commons IO: file download progress on the console using copyInputStreamToFile

I'm trying to find a way to use
copyInputStreamToFile(InputStream source, File destination)
to make a small progress bar in the console by file size. Is there a way to do this?

The short answer is: you can't. Look at the source code of this method; I tracked its execution path and it ends up in this method in the IOUtils class:
public static long copyLarge(final InputStream input, final OutputStream output, final byte[] buffer)
        throws IOException {
    long count = 0;
    int n = 0;
    while (EOF != (n = input.read(buffer))) {
        output.write(buffer, 0, n);
        count += n;
    }
    return count;
}
So the copy loop is fully encapsulated by the API; there is no hook for reporting progress.
The long answer: you can implement the download method yourself, reusing the relevant parts of IOUtils and FileUtils and adding code that prints the percentage of the downloaded file to the console.
This is a working kick-off example:
package apache.utils.custom;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;

public class Downloader {

    private static final int EOF = -1;
    private static final int DEFAULT_BUFFER_SIZE = 1024 * 4;

    public static void copyInputStreamToFileNew(final InputStream source, final File destination, int fileSize) throws IOException {
        try {
            final FileOutputStream output = FileUtils.openOutputStream(destination);
            try {
                final byte[] buffer = new byte[DEFAULT_BUFFER_SIZE];
                long count = 0;
                int n = 0;
                while (EOF != (n = source.read(buffer))) {
                    output.write(buffer, 0, n);
                    count += n;
                    System.out.println("Completed " + count * 100 / fileSize + "%");
                }
                output.close(); // don't swallow close Exception if copy completes normally
            } finally {
                IOUtils.closeQuietly(output);
            }
        } finally {
            IOUtils.closeQuietly(source);
        }
    }
}
You should pass the expected file size to this method, which you can determine with this code:
URL url = new URL(urlString);
URLConnection urlConnection = url.openConnection();
urlConnection.connect();
int file_size = urlConnection.getContentLength();
Of course, the better idea is to encapsulate the whole functionality in a single method.
Hope this helps.
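Following that suggestion, here is a minimal sketch that wraps the size lookup and the progress-printing copy into a single method. It assumes Java 7+ (try-with-resources, getContentLengthLong); the class and method names are illustrative:

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.net.URLConnection;

public class ProgressDownloader {

    // Downloads urlString to the destination file, printing progress to the console.
    public static void download(String urlString, File destination) throws IOException {
        URLConnection connection = new URL(urlString).openConnection();
        connection.connect();
        long fileSize = connection.getContentLengthLong(); // -1 if the server does not report a size
        try (InputStream in = connection.getInputStream();
             OutputStream out = new FileOutputStream(destination)) {
            byte[] buffer = new byte[4096];
            long count = 0;
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
                count += n;
                if (fileSize > 0) {
                    System.out.println("Completed " + (count * 100 / fileSize) + "%");
                }
            }
        }
    }
}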

Related

Binary file download using OkHttp client gets corrupted

I am trying to download a binary file using OkHttp, with progress reporting.
The file downloads properly when BUFFER_SIZE is 1.
However, the file gets corrupted when I set BUFFER_SIZE to 1024.
With BUFFER_SIZE set to 1 the file takes a long time to download.
Below is the code snippet:
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

import okhttp3.Call;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class DownloadTest {

    public static String url = "https://cdn.pixabay.com/photo/2017/02/06/12/34/reptile-2042906_1280.jpg";

    public static void main(String[] args) throws Exception {
        OkHttpClient client = new OkHttpClient();
        Call call = client.newCall(new Request.Builder().url(url).get().build());
        Response response = call.execute();
        System.out.println("" + response.headers().toString());
        System.out.println("" + response.body().contentLength());
        InputStream inputStream = response.body().byteStream();
        float contentLength = (float) response.body().contentLength();
        OutputStream fileOutputStream = new FileOutputStream(new File("myfile.jpg"));
        System.out.println("writing file " + contentLength);
        float downloaded = 0;
        /**
         * IF BUFFER_SIZE IS 1 file is downloaded properly
         * if BUFFER_SIZE is 1024 file is corrupted
         * open the downloaded image to test
         */
        //byte[] BUFFER_SIZE = new byte[1]; //Proper Download
        byte[] BUFFER_SIZE = new byte[1024]; //File Corrupt
        while (true) {
            int byteRead = inputStream.read(BUFFER_SIZE);
            if (byteRead == -1) {
                break;
            }
            downloaded += byteRead;
            fileOutputStream.write(BUFFER_SIZE);
            System.out.println(" " + downloaded + "/" + contentLength + " = " + ((downloaded / contentLength) * 100));
        }
        fileOutputStream.flush();
        fileOutputStream.close();
        System.out.println("file closed");
    }
}
If a read does not completely fill BUFFER_SIZE (which on a network stream can happen on any read, not just the last one), you will write wrong data to the file:
You have
fileOutputStream.write(BUFFER_SIZE);
you should have:
fileOutputStream.write(BUFFER_SIZE, 0, byteRead);
EDIT1: I would also suggest replacing this part of the code:
while (true) {
    int byteRead = inputStream.read(BUFFER_SIZE);
    if (byteRead == -1) {
        break;
    }
With a better approach:
int byteRead;
while ((byteRead = inputStream.read(BUFFER_SIZE)) > 0) {
Your code is a bit confusing because BUFFER_SIZE looks like a numeric constant when it is actually the buffer itself. Apart from that, I think your problem is with fileOutputStream.write(BUFFER_SIZE). Whenever a read does not completely fill your byte[], you still write the entire content of the array. Use the overload that specifies an offset (0) and the number of bytes to write (byteRead).
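Putting both corrections together, the download loop from the question becomes (a sketch; the array is renamed to buffer to make clear it is a byte array, not a size):

byte[] buffer = new byte[1024];
int byteRead;
while ((byteRead = inputStream.read(buffer)) != -1) {
    downloaded += byteRead;
    fileOutputStream.write(buffer, 0, byteRead); // write only the bytes actually read
    System.out.println(" " + downloaded + "/" + contentLength + " = " + ((downloaded / contentLength) * 100));
}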

Java application exits instantaneously

I am trying to make a Java program that downloads a lot of images off a website. However, when I run the class, it exits instantly, and I can't figure out why. Here is my code:
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.util.HashMap;

public class Main {

    static HashMap<Integer, String> hmap = new HashMap<Integer, String>();

    public static void main(String[] args) throws IOException {
        for (int i = 1; i > 151; i++) {
            for (int i1 = 1; i1 > 151; i1++) {
                if (i == i1) {
                    continue;
                }
                String imageUrl1 = "http://images.alexonsager.net/pokemon/fused/" + i + "/" + i + "." + i1 + ".png";
                String destinationFile1 = hmap.get(i) + " and " + hmap.get(i1);
                saveImage(imageUrl1, destinationFile1);
                System.out.println("Downloaded " + destinationFile1);
            }
        }
    }

    public static void saveImage(String imageUrl, String destinationFile) throws IOException {
        URL url = new URL(imageUrl);
        InputStream is = url.openStream();
        OutputStream os = new FileOutputStream(destinationFile);
        byte[] b = new byte[2048];
        int length;
        while ((length = is.read(b)) != -1) {
            os.write(b, 0, length);
        }
        is.close();
        os.close();
    }

    public static void createHash() {
        //hmap.put(int, string) times 151
    }
}
What I want it to do is download, say, i=1 and i1=2, then i=1 and i1=3, and so on until both hit 151 (they just can't be equal). In all, this will download 22,650 files of approximately 27.6 MB altogether. So, that being said, is it a memory issue with the Java settings themselves (I have 32 GB of RAM, so running out is not really an option), or is it a problem with the code?
I would greatly appreciate it if someone could help me with this.
Thanks!
i starts at 1, so the condition i > 151 is false on the very first check and you never enter your loops.
Solution
for (int i = 1 ; i < 151 ; i++)
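The inner loop has the same inverted condition, so both headers need the fix; a corrected sketch (use <= 151 instead if image number 151 itself should be included):

for (int i = 1; i < 151; i++) {
    for (int i1 = 1; i1 < 151; i1++) {
        if (i == i1) {
            continue;
        }
        // ... download as before
    }
}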

Splitting files into chunks with size bigger than 127

I'm trying to build a simplified HDFS (Hadoop Distributed File System) for a final project in a Distributed Systems course.
So, the first thing I'm trying is to write a program which splits an arbitrary file into blocks (chunks) of an arbitrary size.
I found this useful example, whose code is:
package javabeat.net.io;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

/**
 * Split File Example
 *
 * @author Krishna
 *
 */
public class SplitFileExample {

    private static String FILE_NAME = "TextFile.txt";
    private static byte PART_SIZE = 5;

    public static void main(String[] args) {
        File inputFile = new File(FILE_NAME);
        FileInputStream inputStream;
        String newFileName;
        FileOutputStream filePart;
        int fileSize = (int) inputFile.length();
        int nChunks = 0, read = 0, readLength = PART_SIZE;
        byte[] byteChunkPart;
        try {
            inputStream = new FileInputStream(inputFile);
            while (fileSize > 0) {
                if (fileSize <= 5) {
                    readLength = fileSize;
                }
                byteChunkPart = new byte[readLength];
                read = inputStream.read(byteChunkPart, 0, readLength);
                fileSize -= read;
                assert (read == byteChunkPart.length);
                nChunks++;
                newFileName = FILE_NAME + ".part" + Integer.toString(nChunks - 1);
                filePart = new FileOutputStream(new File(newFileName));
                filePart.write(byteChunkPart);
                filePart.flush();
                filePart.close();
                byteChunkPart = null;
                filePart = null;
            }
            inputStream.close();
        } catch (IOException exception) {
            exception.printStackTrace();
        }
    }
}
But I think there is a big issue: the value of PART_SIZE cannot be greater than 127, otherwise the compiler reports an error: possible loss of precision.
How can I solve this without totally changing the code?
The problem is that PART_SIZE is a byte, a signed 8-bit type; its maximum value is therefore indeed 127.
The code you have at the moment has other problems as well; for one, incorrect resource handling.
Here is a version using java.nio.file:
private static final String FILENAME = "TextFile.txt";
private static final int PART_SIZE = xxx; // HERE

public static void main(final String... args)
        throws IOException {
    final Path file = Paths.get(FILENAME).toRealPath();
    final String filenameBase = file.getFileName().toString();
    final byte[] buf = new byte[PART_SIZE];

    int partNumber = 0;
    Path part;
    int bytesRead;
    byte[] toWrite;

    try (
        final InputStream in = Files.newInputStream(file);
    ) {
        while ((bytesRead = in.read(buf)) != -1) {
            part = file.resolveSibling(filenameBase + ".part" + partNumber);
            toWrite = bytesRead == PART_SIZE ? buf : Arrays.copyOf(buf, bytesRead);
            Files.write(part, toWrite, StandardOpenOption.CREATE_NEW);
            partNumber++;
        }
    }
}
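For the HDFS-style project you will eventually need the reverse operation as well; here is a minimal sketch that joins the .partN files back together, assuming the naming scheme used above (join and its parameters are illustrative names):

// Concatenates file.part0, file.part1, ... into a single output file.
static void join(final Path dir, final String baseName, final Path output) throws IOException {
    try (final OutputStream out = Files.newOutputStream(output, StandardOpenOption.CREATE_NEW)) {
        for (int partNumber = 0; ; partNumber++) {
            final Path part = dir.resolve(baseName + ".part" + partNumber);
            if (!Files.exists(part)) {
                break; // no more chunks
            }
            Files.copy(part, out); // appends this chunk's bytes to the output stream
        }
    }
}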
If what you are splitting is specifically a PDF, PDFBox's Splitter can do the chunking at page granularity:

List<PDDocument> Pages = new ArrayList<PDDocument>();
PDDocument document = PDDocument.load(filePath);
try {
    Splitter splitter = new Splitter();
    splitter.setSplitAtPage(NoOfPagesDocumentWillContain);
    Pages = splitter.split(document);
} catch (Exception e) {
    e.getCause().printStackTrace();
}

Performance of MappedByteBuffer vs ByteBuffer

I'm trying to make a few performance enhancements and am looking at using memory-mapped files for writing data. I did a few tests and, surprisingly, MappedByteBuffer seems slower than allocating direct buffers. I'm not able to clearly understand why this would be the case. Can someone please hint at what could be going on behind the scenes? Below are my test results:
I'm allocating 32 KB buffers. I had already created the files, 3 GB each, before starting the tests, so growing the file isn't the issue.
I'm adding the code that I used for this performance test. Any input / explanation about this behavior is much appreciated.
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.ByteBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.channels.FileChannel.MapMode;

public class MemoryMapFileTest {

    /**
     * @param args
     * @throws IOException
     */
    public static void main(String[] args) throws IOException {
        for (int i = 0; i < 10; i++) {
            runTest();
        }
    }

    private static void runTest() throws IOException {
        FileChannel ch1 = null;
        FileChannel ch2 = null;
        ch1 = new RandomAccessFile(new File("S:\\MMapTest1.txt"), "rw").getChannel();
        ch2 = new RandomAccessFile(new File("S:\\MMapTest2.txt"), "rw").getChannel();
        FileWriter fstream = new FileWriter("S:\\output.csv", true);
        BufferedWriter out = new BufferedWriter(fstream);
        int[] numberofwrites = {1, 10, 100, 1000, 10000, 100000};
        //int n = 10000;
        try {
            for (int j = 0; j < numberofwrites.length; j++) {
                int n = numberofwrites[j];
                long estimatedTime = 0;
                long mappedEstimatedTime = 0;
                for (int i = 0; i < n; i++) {
                    byte b = (byte) Math.random();
                    long allocSize = 1024 * 32;
                    estimatedTime += directAllocationWrite(allocSize, b, ch1);
                    mappedEstimatedTime += mappedAllocationWrite(allocSize, b, i, ch2);
                }
                double avgDirectEstTime = (double) estimatedTime / n;
                double avgMapEstTime = (double) mappedEstimatedTime / n;
                out.write(n + "," + avgDirectEstTime / 1000000 + "," + avgMapEstTime / 1000000);
                out.write("," + ((double) estimatedTime / 1000000) + "," + ((double) mappedEstimatedTime / 1000000));
                out.write("\n");
                System.out.println("Avg Direct alloc and write: " + estimatedTime);
                System.out.println("Avg Mapped alloc and write: " + mappedEstimatedTime);
            }
        } finally {
            out.write("\n\n");
            if (out != null) {
                out.flush();
                out.close();
            }
            if (ch1 != null) {
                ch1.close();
            } else {
                System.out.println("ch1 is null");
            }
            if (ch2 != null) {
                ch2.close();
            } else {
                System.out.println("ch2 is null");
            }
        }
    }

    private static long directAllocationWrite(long allocSize, byte b, FileChannel ch1) throws IOException {
        long directStartTime = System.nanoTime();
        ByteBuffer byteBuf = ByteBuffer.allocateDirect((int) allocSize);
        byteBuf.put(b);
        ch1.write(byteBuf);
        return System.nanoTime() - directStartTime;
    }

    private static long mappedAllocationWrite(long allocSize, byte b, int iteration, FileChannel ch2) throws IOException {
        long mappedStartTime = System.nanoTime();
        MappedByteBuffer mapBuf = ch2.map(MapMode.READ_WRITE, iteration * allocSize, allocSize);
        mapBuf.put(b);
        return System.nanoTime() - mappedStartTime;
    }
}
You're testing the wrong thing; this is not how you would write the code in either case. You should allocate the buffer once and just keep updating its contents. You're including allocation time in the write time, which is not valid.
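To illustrate the point, a sketch of the direct-buffer test with the allocation hoisted out of the timed section (illustrative only, not the poster's code):

// Allocated once, outside the timed loop, and reused for every write.
ByteBuffer byteBuf = ByteBuffer.allocateDirect(32 * 1024);

long directWriteOnly(byte b, FileChannel ch) throws IOException {
    long start = System.nanoTime();
    byteBuf.clear();   // reset position and limit for reuse; contents are not erased
    byteBuf.put(b);
    byteBuf.rewind();  // write the whole 32 KB block
    ch.write(byteBuf);
    return System.nanoTime() - start;
}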
Swapping data to disk is the main reason for MappedByteBuffer to be slower than DirectByteBuffer.
The cost of allocation and deallocation is high with direct buffers, including MappedByteBuffer, and this cost is accrued in both examples; hence the only difference is writing to disk, which happens with MappedByteBuffer but not with the direct ByteBuffer.

Data loss during conversion of video from images in Java

I created .png images from my video bailey.mpg using a Xuggler method. I then read each image into a byte array, appended a hash character as a delimiter followed by text data, and recreated the image from that byte array.
Now I reconstruct a video in .avi format from this set of text-appended images. When I extract the set of .png images back from the new AVI video using Xuggler, read each image into a byte array, and search that byte array for the delimiter, I cannot find the hash delimiter. I think this means the text data is lost during creation of the video.
What should I do?
Code to create images from the video:
package DifferentPackage;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.xuggler.Global;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

/*
 * To change this template, choose Tools | Templates
 * and open the template in the editor.
 */
/**
 *
 * @author pratibha
 */
public class VideoIntoFrames {

    public static final double SECONDS_BETWEEN_FRAMES = 1;
    String inputFilename;
    private static final String outputFilePrefix = "C:\\photo\\image_";
    // The video stream index, used to ensure we display frames from one and
    // only one video stream from the media container.
    private static int mVideoStreamIndex = -1;
    // Time of last frame write
    private static long mLastPtsWrite = Global.NO_PTS;
    public static final long MICRO_SECONDS_BETWEEN_FRAMES = (long) (100 * SECONDS_BETWEEN_FRAMES);
    int FrameNo = 0;

    public VideoIntoFrames(String filepath) {
        inputFilename = filepath;
        IMediaReader mediaReader = ToolFactory.makeReader(inputFilename);
        // stipulate that we want BufferedImages created in BGR 24bit color space
        mediaReader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);
        mediaReader.addListener(new ImageSnapListener());
        // read out the contents of the media file and
        // dispatch events to the attached listener
        while (mediaReader.readPacket() == null) ;
    }

    private class ImageSnapListener extends MediaListenerAdapter {

        public void onVideoPicture(IVideoPictureEvent event) {
            if (event.getStreamIndex() != mVideoStreamIndex) {
                // if the selected video stream id is not yet set, go ahead and
                // select this lucky video stream
                if (mVideoStreamIndex == -1)
                    mVideoStreamIndex = event.getStreamIndex();
                // no need to show frames from this video stream
                else
                    return;
            }
            // if uninitialized, back date mLastPtsWrite to get the very first frame
            if (mLastPtsWrite == Global.NO_PTS)
                mLastPtsWrite = event.getTimeStamp() - MICRO_SECONDS_BETWEEN_FRAMES;
            // if it's time to write the next frame
            if (event.getTimeStamp() - mLastPtsWrite >= MICRO_SECONDS_BETWEEN_FRAMES) {
                ++FrameNo;
                String outputFilename = dumpImageToFile(event.getImage());
                // indicate file written
                double seconds = ((double) event.getTimeStamp()) / Global.DEFAULT_PTS_PER_SECOND;
                System.out.printf(
                        "at elapsed time of %6.3f seconds wrote: %s\n",
                        seconds, outputFilename);
                // update last write time
                mLastPtsWrite += MICRO_SECONDS_BETWEEN_FRAMES;
            }
        }

        private String dumpImageToFile(BufferedImage image) {
            try {
                String outputFilename = outputFilePrefix + FrameNo + ".gif";
                ImageIO.write(image, "jpg", new File(outputFilename));
                return outputFilename;
            } catch (Exception e) {
                e.printStackTrace();
                return null;
            }
        }
    }

    public static void main(String args[]) {
        String path = "D:/bailey.mpg";
        VideoIntoFrames v = new VideoIntoFrames(path);
    }
}
All images are saved at C:/photo. Code to insert the text data into image_1.gif:
try {
    String data = "My Name is ";
    int[] charValue = new int[data.length()];
    for (int rowIndex = 0; rowIndex < charValue.length; rowIndex++) {
        charValue[rowIndex] = data.charAt(rowIndex);
    }
    File videoFile = new File("C:/photo/image_1.gif");
    FileInputStream videoInput = new FileInputStream(videoFile);
    int VideoByte = -1;
    List<Byte> bytes = new ArrayList<Byte>();
    while ((VideoByte = videoInput.read()) != -1) {
        bytes.add((byte) VideoByte);
    }
    byte[] ByteOfVideo = new byte[bytes.size()];
    for (int count = 0; count < ByteOfVideo.length; count++) {
        ByteOfVideo[count] = bytes.get(count);
        // System.out.println(count + " of ByteOfImage " + ByteOfVideo[count]);
    }
    // now append the hash delimiter and the text data after the image bytes
    ByteArrayOutputStream byteOut = new ByteArrayOutputStream();
    for (int i = 0; i < ByteOfVideo.length; i++) {
        byteOut.write(ByteOfVideo[i]);
    }
    byte[] HashArray = "#".getBytes();
    byteOut.write(HashArray);
    byteOut.write(HashArray);
    byteOut.write(HashArray);
    byteOut.write(HashArray);
    byteOut.write(HashArray);
    // String retrievedString = new String(FrameByteArray, "UTF-8");
    for (int i = 0; i < charValue.length; i++) {
        System.out.println(" NameArray in Bytes" + charValue[i]);
        byteOut.write(charValue[i]);
    }
    // write the modified file back out
    String FinalModifiedVideo = "C:\\photo\\image_1.gif";
    File ModifiedFile = new File(FinalModifiedVideo);
    DataOutputStream out = new DataOutputStream(new FileOutputStream(ModifiedFile));
    byteOut.writeTo(out);
    out.close();
    System.out.println("Process End");
} catch (Exception e) {
    e.printStackTrace();
}
Code to create a video from the image set (images in the C:/photo folder); this creates the image.avi video:
/*
 * To change this template, choose Tools | Templates
 * and open the template in the editor.
 */
package Frame;

import test.*;
import ch.randelshofer.media.avi.AVIOutputStream;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.Scanner;

/**
 *
 * @author Inbo
 */
public class MyVideoWriter {

    static ArrayList<String> img = new ArrayList();

    public static void readFiles() {
        String path = "C:\\photo\\";
        String files;
        File folder = new File(path);
        File[] listOfFiles = folder.listFiles();
        int c = 0;
        for (int i = 0; i < 799; i++) {
            img.add(path + "\\image_" + (i + 1) + ".gif");
        }
        // System.out.println(img);
    }

    public MyVideoWriter(String path) {
        readFiles();
        try {
            AVIOutputStream AVIout = new AVIOutputStream(new File(path + ".avi"), AVIOutputStream.VideoFormat.JPG);
            AVIout.setVideoCompressionQuality(1);
            //AVIout.setFrameRate(10);
            AVIout.setVideoDimension(352, 240);
            for (int i = 0; i < img.size(); i++) {
                AVIout.writeFrame(new File(img.get(i)));
            }
            AVIout.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String args[]) {
        String path = "C:\\image";
        MyVideoWriter mv = new MyVideoWriter(path);
    }
}
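One way to see where the appended data goes is to round-trip a single modified image through an ImageIO decode/re-encode, which is essentially what the AVI writer's JPG format does to every frame: decoding stops at the end of the image data, so bytes appended after it never reach the pixels. A minimal sketch, assuming the modified C:/photo/image_1.gif from above exists:

import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import javax.imageio.ImageIO;

public class RoundTripCheck {
    public static void main(String[] args) throws IOException {
        File modified = new File("C:/photo/image_1.gif"); // image with "#####" + text appended
        byte[] original = Files.readAllBytes(modified.toPath());
        BufferedImage img = ImageIO.read(modified);       // decoding ignores the appended bytes
        ByteArrayOutputStream reencoded = new ByteArrayOutputStream();
        ImageIO.write(img, "jpg", reencoded);             // what the AVI JPG writer effectively does
        System.out.println("bytes before re-encode: " + original.length);
        System.out.println("bytes after re-encode:  " + reencoded.size());
        System.out.println("delimiter still present: "
                + new String(reencoded.toByteArray(), "ISO-8859-1").contains("#####"));
    }
}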
