Update data to a file every fixed number of bytes - Java

I want to write my data to a file at every 10 KB of the file.
What I tried:
FileInputStream is;   // input stream on the file (initialization omitted)
FileOutputStream out; // output stream on the file (initialization omitted)
File input = new File(filePath);
long fileLength = input.length(); // note: File.length() returns a long
byte[] buff = new byte[10 * 1024]; // 10 KB read buffer
byte[] data;                       // the data I want to write
long len = 0;
while (len < fileLength) {
    len += is.read(buff);
    // write my data
    out.write(data, 0, data.length);
    // how do I move "is" forward to read the next 10 KB???
}
I wonder, is there any way to move the read cursor forward by a given number of bytes? Or am I missing something?
Update: Thanks to DThought, here is my implementation:
File input = new File(filePath);
long fileLength = input.length();
byte[] data;                            // the data to write at each position
byte[] buff = new byte[data.length];    // buffer for the bytes being replaced
long JUMP_LENGTH = 10 * 1024;           // skip 10 KB between writes
RandomAccessFile raf = new RandomAccessFile(input, "rw");
long step = JUMP_LENGTH + data.length;
for (long i = 0; i < fileLength; i += step) {
    // read the existing bytes at this position into the buffer
    raf.seek(i);
    raf.read(buff);
    raf.seek(i); // move back so the write lands at the same position
    raf.write(data);
}
raf.close();
And it worked well.

Try RandomAccessFile (http://developer.android.com/reference/java/io/RandomAccessFile.html) instead of FileOutputStream.
This will enable you to seek to arbitrary positions:
byte[] data = new byte[1024];
RandomAccessFile file = new RandomAccessFile(new File("name"), "rw");
file.seek(10 * 1024); // jump to the 10 KB offset
file.write(data);

You can write an empty array or spaces to that specific portion, for example, since with a plain output stream you can't jump to a specific position in the file and can't skip over the 10 KB.
For example:
OutputStream os = new FileOutputStream(new File("D:/a.txt"));
byte[] emptyByte = new byte[10 * 1024];
Arrays.fill(emptyByte, " ".getBytes()[0]); // fill the 10 KB block with spaces
os.write(yourData, 0, yourData.length);    // write your data
os.write(emptyByte, 0, emptyByte.length);  // write 10 KB of spaces after each piece of data
NOTE: This is just an example; adjust it to your requirements. I have filled that portion of the file with spaces. With a plain stream you can't jump directly to a specific position, but you can fill the gap with empty data.
I guess the seek method of RandomAccessFile, as suggested by DThought, may also help you here, but note that it is measured from the beginning of the file.
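As a rough sketch of that, assuming raf is an already-open RandomAccessFile, a relative jump can be built from the current file pointer:
// Sketch: seek() takes an offset from the beginning of the file, so a
// relative jump adds the gap to the current file pointer.
long gap = 10 * 1024;                 // 10 KB
raf.seek(raf.getFilePointer() + gap); // continue 10 KB further on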

Related

Trying to use BufferedInputStream and Base64 to Encode a large file in Java

I am new to Java I/O, so please help.
I am trying to process a large file (e.g. a PDF file of 50 MB) using the Apache Commons library.
At first I tried:
byte[] bytes = FileUtils.readFileToByteArray(file);
String encodeBase64String = Base64.encodeBase64String(bytes);
byte[] decoded = Base64.decodeBase64(encodeBase64String);
But knowing that FileUtils.readFileToByteArray in org.apache.commons.io will load the whole file into memory, I tried to use BufferedInputStream to read the file piece by piece:
BufferedInputStream bis = new BufferedInputStream(inputStream);
StringBuilder pdfStringBuilder = new StringBuilder();
int byteArraySize = 10;
byte[] tempByteArray = new byte[byteArraySize];
while (bis.available() > 0) {
    if (bis.available() < byteArraySize) { // reaching the end of the file
        tempByteArray = new byte[bis.available()];
    }
    int len = Math.min(bis.available(), byteArraySize);
    int read = bis.read(tempByteArray, 0, len);
    if (read != -1) {
        pdfStringBuilder.append(Base64.encodeBase64String(tempByteArray));
    } else {
        System.err.println("End of file reached.");
    }
}
byte[] bytes = Base64.decodeBase64(pdfStringBuilder.toString());
byte[] bytes = Base64.decodeBase64(pdfStringBuilder.toString());
However, the two decoded byte arrays don't look quite the same... In fact, the second one only gives 10 bytes, which is my temp array size...
Can anyone please help:
what am I doing wrong when reading the file piece by piece?
why does the decoded byte array only return 10 bytes in the 2nd solution?
Thanks in advance:)
After some digging, it turns out that the byte array's size has to be a multiple of 3 in order to avoid padding in the middle of the encoded output. After using a temp array size that is a multiple of 3, the program goes through.
I simply change
int byteArraySize = 10;
to be
int byteArraySize = 1024 * 3;
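If you are on Java 8 or later, another option (a rough sketch with placeholder file names) is to let java.util.Base64 wrap the output stream and handle the chunk boundaries itself, so the buffer size no longer has to be a multiple of 3:
// Sketch with placeholder file names: the wrapping encoder buffers partial
// 3-byte groups internally, so any buffer size works.
try (InputStream in = new FileInputStream("large.pdf");
     OutputStream out = java.util.Base64.getEncoder().wrap(new FileOutputStream("large.pdf.b64"))) {
    byte[] buffer = new byte[8192];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
}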

IOException when writing byte array

I am getting IOException: Map Failed when trying to write a large byte array. I use the method below to write a byte array to a file
private static void write(byte[] data) throws Exception {
    File file = new File("C:/temp/file.json");
    int length = data.length;
    RandomAccessFile raf = new RandomAccessFile(file, "rw");
    FileChannel fc = raf.getChannel();
    MappedByteBuffer buffer = fc.map(FileChannel.MapMode.READ_WRITE, 0, length);
    for (int i = 0; i < length; i++) {
        buffer.put(data[i]);
    }
}
The byte array is about 270mb.
Can anyone explain what I am doing wrong?
Thanks.
I am not sure why the map failed, but I wouldn't do it the way you have done.
FileOutputStream out = new FileOutputStream(filename);
out.write(data);
out.close();
To do it progressively you can use:
FileOutputStream out = new FileOutputStream(filename);
for (int i = 0; i < data.length; i += 8192)
    out.write(data, i, Math.min(data.length - i, 8192));
out.close();
The map could fail if you have a 32-bit JVM and you call this method repeatedly, e.g. if you run out of virtual memory.
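Closing the RandomAccessFile and FileChannel, for example with try-with-resources, is a rough sketch of one way to at least avoid leaking file handles across repeated calls; note that the mapped region itself is only released once the MappedByteBuffer is garbage-collected.
// Sketch: close the file handles so repeated calls don't leak descriptors.
// The mapping itself is only freed when the MappedByteBuffer is collected.
private static void write(byte[] data) throws Exception {
    File file = new File("C:/temp/file.json");
    try (RandomAccessFile raf = new RandomAccessFile(file, "rw");
         FileChannel fc = raf.getChannel()) {
        MappedByteBuffer buffer = fc.map(FileChannel.MapMode.READ_WRITE, 0, data.length);
        buffer.put(data); // bulk put instead of a byte-by-byte loop
    }
}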
I think the solution is the same as in C with the mmap() function. If you have just created the file, you should seek to offset data.length - 1 and write a byte at that position, so the file has that size before you map it. When I don't do this in C, I get memory corruption while accessing the mapped memory.
Something like this should work:
private static void write(byte[] data) throws Exception {
    File file = new File("C:/temp/file.json");
    int length = data.length;
    RandomAccessFile raf = new RandomAccessFile(file, "rw");
    FileChannel fc = raf.getChannel();
    fc.position(length - 1);
    // wrap() already leaves the buffer ready to be written (position 0, limit 1),
    // so no flip() is needed here
    ByteBuffer bf = ByteBuffer.wrap(new byte[]{0x00});
    fc.write(bf); // write one byte at the end so the file has the full size
    MappedByteBuffer buffer = fc.map(FileChannel.MapMode.READ_WRITE, 0, length);
    for (int i = 0; i < length; i++) {
        buffer.put(data[i]);
    }
}
To make it simple: you can't map more than the file size, which is why you need to increase the file size before mapping it.
This can explain your problem, but I think it's more appropriate to open a FileOutputStream and write your data to it directly. You can still map it afterwards if needed.
Is your JVM max heap size at least 270*2 MB? You would set this on the command line used to start Java:
java ... -Xmx1024m ...

Divide the video into bytes

I want to convert a video to bytes. It gives me a result, but I think the result is not correct, because I tested it with different videos and it gives me the same result.
Can anyone please help me with how to convert a video to bytes?
String filename = "D:/try.avi";
byte[] myByteArray = filename.getBytes();
for (int i = 0; i < myByteArray.length; i++) {
    System.out.println(myByteArray[i]);
}
Any help, please?
String filename = "D:/try.avi";
byte[] myByteArray = filename.getBytes();
That is converting the file name to bytes, not the file content.
As for reading the content of the file, see the Basic I/O lesson of the Java Tutorial.
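For example, on Java 7 or later, a minimal sketch that reads the file's content (using the path from the question) looks like this:
// Sketch: read the bytes of the video file itself, not of its name.
byte[] videoBytes = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("D:/try.avi"));
System.out.println(videoBytes.length + " bytes read");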
Videos in the same container format start with the same bytes. The codec used determines the actual video data.
I suggest you read more about container file formats and codecs first if you plan on developing video applications.
But you have a different problem. As Andrew Thompson correctly pointed out, you are getting the bytes of the filename string.
The correct approach would be:
private static File fl = new File("D:\\video.avi");
byte[] myByteArray = getBytesFromFile(fl);
Please also bear in mind that terminals usually have a fixed buffer size (on Windows, it's several lines), so outputting a big chunk of data will display only the last several lines of it.
Edit: Here's an implementation of getBytesFromFile; a Java expert may offer a more standard approach.
public static byte[] getBytesFromFile(File file) throws IOException {
    InputStream is = new FileInputStream(file);
    // Get the size of the file
    long length = file.length();
    if (length > Integer.MAX_VALUE) {
        // File is too large to fit into a single byte array
        is.close();
        throw new IOException(file.getPath() + " is too big");
    }
    // Create the byte array to hold the data
    byte[] bytes = new byte[(int) length];
    // debug - init array (redundant: Java byte arrays are already zero-initialized)
    for (int i = 0; i < length; i++) {
        bytes[i] = 0x0;
    }
    // Read in the bytes
    int offset = 0;
    int numRead = 0;
    while (offset < bytes.length && (numRead = is.read(bytes, offset, bytes.length - offset)) >= 0) {
        offset += numRead;
    }
    // Ensure all the bytes have been read in
    if (offset < bytes.length) {
        is.close();
        throw new IOException("Could not completely read file " + file.getName());
    }
    // Close the input stream and return bytes
    is.close();
    return bytes;
}
If you want to read the contents of the video file, then use a File:
String filename = "D:/try.avi";
File file = new File(filename);
byte[] myByteArray = new byte[(int) file.length()];
RandomAccessFile raf = new RandomAccessFile(file, "r"); // read-only access is enough
raf.readFully(myByteArray); // read() might not fill the whole array; readFully() does
raf.close();

How do I uncompress a file and read it into a ByteBuffer in Java?

I have a piece of code like so...
FileInputStream fi = new FileInputStream(filein);
GZIPInputStream gzis = new GZIPInputStream(fi);
ByteBuffer bbuffer = ByteBuffer.allocate(115200);
fi.available() is 84300, but gzis.available() is only 1. The file (filein) is already compressed.
I want to read the file, uncompress it, and finally put it into my ByteBuffer bbuffer.
How can I do this?
gzis.available() == 1 doesn't mean that there is a problem; it simply means that only 1 byte is currently guaranteed to be readable from the stream without blocking. You can't expect the entire uncompressed file to be available in a single call.
To read the entire file, you will need to have a loop that continues to read over the InputStream until you have all the data. For example...
int bytesRead = 0;
int bytesAvailable = gzis.available();
while (bytesAvailable > 0 && bytesRead < bbuffer.capacity()) {
    // InputStream.read works on a byte[], so read into the array backing the ByteBuffer
    int read = gzis.read(bbuffer.array(), bytesRead, Math.min(bytesAvailable, bbuffer.capacity() - bytesRead));
    if (read < 0) {
        break; // end of the compressed stream
    }
    bytesRead += read;
    bytesAvailable = gzis.available();
}
bbuffer.position(bytesRead); // record how many bytes were placed in the buffer
Of course, if you aren't sure of the final size of the uncompressed file, you'll need to add in extra code to allow your bbuffer to be resized if you need more room.
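For that case, a rough sketch that reads until end-of-file and only sizes the ByteBuffer at the end (assuming filein is the compressed file from the question) could look like this:
// Sketch: collect the uncompressed chunks in a ByteArrayOutputStream,
// then wrap the result once the final size is known.
try (GZIPInputStream gzis = new GZIPInputStream(new FileInputStream(filein))) {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] chunk = new byte[8192];
    int read;
    while ((read = gzis.read(chunk)) != -1) {
        baos.write(chunk, 0, read);
    }
    ByteBuffer bbuffer = ByteBuffer.wrap(baos.toByteArray());
}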

Reading a binary input stream into a single byte array in Java

The documentation says that one should not use the available() method to determine the size of an InputStream. How can I read the whole content of an InputStream into a byte array?
InputStream in; //assuming already present
byte[] data = new byte[in.available()];
in.read(data);//now data is filled with the whole content of the InputStream
I could read multiple times into a buffer of a fixed size, but then I would have to combine the data I read into a single byte array, which is a problem for me.
The simplest approach IMO is to use Guava and its ByteStreams class:
byte[] bytes = ByteStreams.toByteArray(in);
Or for a file:
byte[] bytes = Files.toByteArray(file);
Alternatively (if you didn't want to use Guava), you could create a ByteArrayOutputStream, and repeatedly read into a byte array and write into the ByteArrayOutputStream (letting that handle resizing), then call ByteArrayOutputStream.toByteArray().
Note that this approach works whether you can tell the length of your input or not - assuming you have enough memory, of course.
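A rough sketch of that non-Guava approach, assuming in is the InputStream from the question:
// Sketch: read fixed-size chunks and let ByteArrayOutputStream handle the
// resizing, then get everything back as one byte[].
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] chunk = new byte[8192];
int read;
while ((read = in.read(chunk)) != -1) {
    baos.write(chunk, 0, read);
}
byte[] data = baos.toByteArray();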
Please keep in mind that the answers here assume that the length of the file is less than or equal to Integer.MAX_VALUE (2147483647).
If you are reading in from a file, you can do something like this:
File file = new File("myFile");
byte[] fileData = new byte[(int) file.length()];
DataInputStream dis = new DataInputStream(new FileInputStream(file));
dis.readFully(fileData);
dis.close();
UPDATE (May 31, 2014):
Java 7 adds some new features in the java.nio.file package that can be used to make this example a few lines shorter. See the readAllBytes() method in the java.nio.file.Files class. Here is a short example:
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
// ...
Path p = FileSystems.getDefault().getPath("", "myFile");
byte [] fileData = Files.readAllBytes(p);
Android has support for this starting in API level 26 (8.0.0, Oreo).
You can use Apache commons-io for this task:
Refer to this method:
public static byte[] readFileToByteArray(File file) throws IOException
Update:
Java 7 way:
byte[] bytes = Files.readAllBytes(Paths.get(filename));
and if it is a text file and you want to convert it to String (change encoding as needed):
StandardCharsets.UTF_8.decode(ByteBuffer.wrap(bytes)).toString()
You can read it in chunks (byte[] buffer = new byte[2048]) and write the chunks to a ByteArrayOutputStream. From the ByteArrayOutputStream you can retrieve the contents as a byte[] without needing to determine its size beforehand.
I believe the buffer length needs to be specified, as memory is finite and you may otherwise run out of it.
Example:
File file = new File(strFileName);
InputStream in = new FileInputStream(file);
long length = file.length();
if (length > Integer.MAX_VALUE) {
    throw new IOException("File is too large!");
}
byte[] bytes = new byte[(int) length];
int offset = 0;
int numRead = 0;
while (offset < bytes.length && (numRead = in.read(bytes, offset, bytes.length - offset)) >= 0) {
    offset += numRead;
}
if (offset < bytes.length) {
    throw new IOException("Could not completely read file " + file.getName());
}
in.close();
The maximum size of an array is Integer.MAX_VALUE, which is around 2G entries (2^31 - 1 = 2 147 483 647).
Your input stream can be bigger than 2 GB, so you have to process the data in chunks, sorry.
InputStream is; // assuming already present
final byte[] buffer = new byte[512 * 1024 * 1024]; // 512 MB
while (true) {
    final int read = is.read(buffer);
    if (read < 0) {
        break;
    }
    // process buffer[0 .. read-1]
}
