I have a simple Java server and client. On the server, a file is broken into several chunks of bytes, and each chunk has to be sent through an ObjectOutputStream. If I allocate a new array to load the file data each time, it works perfectly. But if I reuse the same array to load the file data (which I need to do to stay memory efficient), the client receives the same (first) byte array every time.
networkUtil read and write
public Object read() {
    Object o = null;
    try {
        o = ois.readObject();
    } catch (Exception e) {
        //System.out.println("Reading Error in network : " + e.toString());
    }
    return o;
}

public void write(Object o) {
    try {
        oos.writeObject(o);
    } catch (IOException e) {
        System.out.println("Writing Error in network : " + e.toString());
    }
}
Server writing portion
public void run() {
    try {
        byte[] b = new byte[1000];
        int num = 5;
        long i = 0;
        for (int j = 0; j < num; j++) {
            File f = new File("G:\\photography\\DSC01020.JPG");
            RandomAccessFile file1 = new RandomAccessFile(f, "r");
            long l = file1.length();
            num = (int) Math.ceil((double) l / (double) 1000);
            //System.out.println("it is num "+num);
            System.out.println("seeking from " + i + " left " + (l - (j * 1000)));
            file1.seek(i);
            file1.read(b);
            file1.close();
            System.out.println("it is first " + b[0] + " it is second " + b[1]);
            nc.write(b); //network util
            i += 1000;
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Client reading portion
try {
    FileOutputStream fos = new FileOutputStream("C:\\Temp\\test.jpg");
    byte[] a;
    for (int j = 0; j < 225; j++) {
        Object o = nc.read(); //network util
        if (o != null) {
            if (o instanceof Data) {
                Data obj = (Data) o;
                //System.out.println(obj.getElement());
            }
            if (o instanceof byte[]) {
                a = (byte[]) o;
                System.out.println("it is first " + a[0] + " it is second " + a[1]);
                if (j == 224) { // hard coded for this file; I have to change this for other files
                    fos.write(a, 0, 203);
                } else {
                    fos.write(a);
                }
            }
        }
    }
    fos.close();
} catch (IOException e) {
    e.printStackTrace();
}
Do you have memory problems? In Java, objects and arrays that are no longer referenced are garbage collected. See Deleting An Entire Array. I don't think that you will encounter any problems by reallocating each time.
Edit:
Since the reallocation is the problem, maybe a ByteBuffer (http://docs.oracle.com/javase/7/docs/api/java/nio/ByteBuffer.html) can solve this.
You can try to use java.nio instead with a FileChannel and a ByteBuffer. See http://www.java2s.com/Tutorial/Java/0180__File/UseFileChannelandByteBuffertoCopyFile.htm and FileChannel ByteBuffer and Hashing Files for examples.
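For example, here is a minimal sketch of the FileChannel/ByteBuffer copy pattern those links describe, reusing the question's paths and 1000-byte chunk size; one ByteBuffer is allocated once and reused for every chunk:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class ChannelCopy {
    public static void main(String[] args) throws IOException {
        try (FileChannel in = FileChannel.open(Paths.get("G:\\photography\\DSC01020.JPG"),
                     StandardOpenOption.READ);
             FileChannel out = FileChannel.open(Paths.get("C:\\Temp\\test.jpg"),
                     StandardOpenOption.CREATE, StandardOpenOption.WRITE,
                     StandardOpenOption.TRUNCATE_EXISTING)) {
            ByteBuffer buffer = ByteBuffer.allocate(1000); // same chunk size as in the question
            while (in.read(buffer) != -1) {
                buffer.flip();   // switch the buffer from filling to draining
                out.write(buffer);
                buffer.clear();  // reuse the same buffer for the next chunk
            }
        }
    }
}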
I am struggling to find a solution for writing my byte array to a playable AAC audio file.
From my Flutter.io front end, I encode my .aac audio files as a list of UInt8List and send them to my Spring Boot server. There I am able to convert them to a proper byte array, which I then attempt to write back to a .aac file, as seen below:
public void writeToAudioFile(ArrayList<Double> audioData) {
    byte[] byteArray = new byte[1024];
    Iterator<Double> iterator = audioData.iterator();
    System.out.println(byteArray);
    while (iterator.hasNext()) {
        // for some reason my list came in as a list of doubles
        // so I am making sure to get these values back to an int
        Integer i = iterator.next().intValue();
        byteArray[i] = i.byteValue();
    }
    try {
        File someFile = new File("test.aac");
        FileOutputStream fos = new FileOutputStream(someFile);
        fos.write(byteArray);
        fos.flush();
        fos.close();
        System.out.println("File created");
    } catch (Exception e) {
        // TODO: handle exception
        System.out.println("Error: " + e);
    }
}
I am able to write my byte array back to an audio file; however, it is unplayable. So I am wondering if this approach is possible and if my issue does lie in Java.
I have been doing extensive research, and I think that I may need to indicate that this file is a specific type of media file? Or maybe the encoded audio file is corrupt when reaching my server?
Your conversion loop
while (iterator.hasNext()) {
    // for some reason my list came in as a list of doubles
    // so I am making sure to get these values back to an int
    Integer i = iterator.next().intValue();
    byteArray[i] = i.byteValue();
}
gets the value i from the iterator, and then tries to write it at position i in byteArray, which jumbles your audio bytes in a weird way: the values are used as indices, so the data gets dropped and reordered, and the rest of the fixed 1024-byte array stays zero.
A working function that converts List<Double> to byte[] would look something like this
byte[] inputToBytes(List<Double> audioData) {
    byte[] result = new byte[audioData.size()];
    for (int i = 0; i < audioData.size(); i++) {
        result[i] = audioData.get(i).byteValue();
    }
    return result;
}
then you could use it in the writeToAudioFile():
void writeToAudioFile(ArrayList<Double> audioData) {
    try (FileOutputStream fos = new FileOutputStream("test.aac")) {
        fos.write(inputToBytes(audioData));
        System.out.println("File created");
    } catch (Exception e) {
        // TODO: handle exception
        System.out.println("Error: " + e);
    }
}
This certainly produces a playable file if you have valid bytes in audioData. The contents and the extension should be enough for the OS/player to recognize the format.
If this doesn’t work, I would look into the data received to see if it is correct.
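If the bytes are supposed to be raw AAC in an ADTS container, one cheap check is the ADTS syncword: the stream starts with twelve 1-bits (0xFF F). A minimal sketch of such a check, as a hypothetical helper (AAC wrapped in another container such as M4A will not match this):

static boolean looksLikeAdtsAac(byte[] data) {
    // ADTS frames begin with the 12-bit syncword 0xFFF:
    // first byte 0xFF, upper nibble of the second byte 0xF.
    return data.length >= 2
            && (data[0] & 0xFF) == 0xFF
            && (data[1] & 0xF0) == 0xF0;
}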
I looked at some previous threads about binary files and I am using a DataInputStream as they suggest, but I am not sure why mine isn't working, since I think I am doing the same thing the threads describe. My goal is a method that takes in the name of a .bin file along with a shift integer, and writes a new .bin file with the characters shifted; only upper- and lower-case letters are shifted. I don't know the length of the binary file being read in, so I need to go through all of the characters; the file will only have one line, though. I have a method that gives me the number of characters on that line and a method that creates a file, and I know the program does create the file correctly. Anyways, what happens is it creates the file, then gives me an EOFException on the line: char currentChar=data.readChar();
Here is my code:
private static void cipherShifter(String file, int shift) {
    String newFile = file + "_cipher";
    createFile(newFile);
    int numChar;
    try {
        FileInputStream stream = new FileInputStream(file);
        DataInputStream data = new DataInputStream(stream);
        FileOutputStream streamOut = new FileOutputStream(newFile);
        DataOutputStream dataOut = new DataOutputStream(streamOut);
        numChar = readAllInts(data);
        for (int i = 0; i < numChar; ++i) {
            char currentChar = data.readChar();
            if (((currentChar >= 'A') && (currentChar <= 'Z')) || ((currentChar >= 'a') && (currentChar <= 'z'))) {
                currentChar += shift;
                dataOut.writeChar(currentChar);
            } else {
                dataOut.writeChar(currentChar);
            }
        }
        data.close();
        dataOut.flush();
        dataOut.close();
    } catch (IOException error) {
        error.printStackTrace();
    }
}

private static void createFile(String fileName) {
    File file = new File(fileName);
    if (file.exists()) {
        //Do nothing
    } else {
        try {
            file.createNewFile();
        } catch (IOException e) {
            //Do nothing
        }
    }
}

private static int readAllInts(DataInputStream din) throws IOException {
    int count = 0;
    while (true) {
        try {
            din.readInt();
            ++count;
        } catch (EOFException e) {
            return count;
        }
    }
}
I don't think the error should be happening, because I do have the correct data type and I am telling it to read just one character. Any help would be great. Thanks in advance.
Based on the description above, your error is reported at the data.readChar() invocation and not inside the readAllInts method. I simulated the code near your error and got the same exception on a text file at the same location.
I used the readByte method to read one byte at a time, since you are mainly interested in ASCII bytes. I also changed readAllInts to readAllBytes, so I work with a total byte count. Note that the stream is closed and reopened after counting, so the shifting loop starts again from the beginning of the file instead of at the end, where the counting pass left off.
private static void cipherShifter(String file, int shift) {
    String newFile = file + "_cipher";
    createFile(newFile);
    int numBytes;
    try {
        FileInputStream stream = new FileInputStream(file);
        DataInputStream data = new DataInputStream(stream);
        FileOutputStream streamOut = new FileOutputStream(newFile);
        DataOutputStream dataOut = new DataOutputStream(streamOut);
        numBytes = readAllBytes(data);
        stream.close();
        data.close();
        stream = new FileInputStream(file);
        data = new DataInputStream(stream);
        for (int i = 0; i < numBytes; ++i) {
            byte currentByte = data.readByte();
            if (((currentByte >= 65) && (currentByte <= 90)) || ((currentByte >= 97) && (currentByte <= 122))) {
                currentByte += shift; // need to ensure no overflow beyond a byte
                dataOut.writeByte(currentByte);
            } else {
                dataOut.writeByte(currentByte);
            }
        }
        data.close();
        dataOut.flush();
        dataOut.close();
    } catch (IOException error) {
        error.printStackTrace();
    }
}

private static void createFile(String fileName) {
    File file = new File(fileName);
    if (file.exists()) {
        //Do nothing
    } else {
        try {
            file.createNewFile();
        } catch (IOException e) {
            //Do nothing
        }
    }
}

private static int readAllBytes(DataInputStream din) throws IOException {
    int count = 0;
    while (true) {
        try {
            din.readByte();
            ++count;
        } catch (EOFException e) {
            return count;
        }
    }
}
It looks like you're getting the EOFException because you're passing the DataInputStream object to your readAllInts method, reading through the stream, then trying to read from it again inside your for loop. The problem there is that the pointer that keeps track of where you are in the stream is already near the end of the stream (or at the end of it) when readAllInts returns. I suspect it's near the end, rather than at it since the readChar() method is throwing the EOFException immediately, which it does when it only reads one of the two bytes it expects to be able to read before hitting the EOF.
To solve that problem, you could call data.mark(readlimit) before passing the stream to the readAllInts method, then call data.reset() after that method returns; that repositions the pointer at the beginning of the stream. (This assumes data.markSupported() is true.)
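A minimal sketch of that idea, reusing the question's readAllInts. Note that FileInputStream itself does not support mark, so a BufferedInputStream is inserted here; the readlimit also means everything read before reset() stays buffered in memory:

DataInputStream data = new DataInputStream(
        new BufferedInputStream(new FileInputStream(file)));
data.mark(Integer.MAX_VALUE);    // remember the start; the limit must cover everything read before reset()
int numChar = readAllInts(data); // consumes the stream while counting
data.reset();                    // rewind to the marked position before the shifting loop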
You also have the problem we talked about above that your counter is reading in four bytes at a time, and your character reader is reading in two at a time. Your suggested method of doubling the return value of readAllInts would help (you could also use readChar() instead of readInt().)
You still need to think about how you're going to handle the case of binary files that are odd-numbered bytes long. There are a variety of ways you could handle that one. I'm too beat to write up a code sample tonight, but if you're still stuck tomorrow, add a comment and I'll see what I can do to help.
I have a client-server system where the server is written in C++ and the client is written in Java (an Android application).
The server reads an image from a local directory through an ifstream, using its read method.
The reading is done inside a loop, where the program reads a part of the image on each iteration. Every time a part of the image is read, it is sent over a socket to the client, which collects all the pieces inside a ByteBuffer; once all the bytes of the image have been transferred, the client attempts to turn that array of bytes (after calling the byteBuffer.array() method) into a Bitmap.
This is where the problem begins - I've tried a few methods but it seems that I'm unable to turn this array of bytes into a Bitmap.
From what I understood, this byte array is probably a raw representation of the image, which can't be decoded using methods like BitmapFactory.decodeByteArray() since it wasn't encoded in the first place.
Ultimately, my question is: how can I process this array of bytes so that I'll be able to set the image as the source of an ImageView?
Note: I've already made sure that all the data is sent over the socket correctly and the pieces are collected in the right order.
Client code:
byte[] image_bytes;
byte[] response_bytes;

private void receive_image(final String protocol, final int image_size, final int buffer_size)
{
    if (image_size <= 0 || buffer_size <= 0)
        return;

    Thread image_receiver = new Thread(new Runnable() {
        @Override
        public void run() {
            ByteBuffer byteBuffer = ByteBuffer.allocate(image_size);
            byte[] buffer = new byte[buffer_size];
            int bytesReadSum = 0;
            try {
                while (bytesReadSum != image_size) {
                    activeReader.read(buffer);
                    String message = new String(buffer);
                    if (TextUtils.substring(message, 0, len_of_protocol_number).equals(protocol)) {
                        int bytesToRead = Integer.parseInt(TextUtils.substring(message,
                                len_of_protocol_number,
                                len_of_protocol_number + len_of_data_len));
                        byteBuffer.put(Arrays.copyOfRange(buffer,
                                len_of_protocol_number + len_of_data_len,
                                bytesToRead + len_of_protocol_number + len_of_data_len));
                        bytesReadSum += bytesToRead;
                    } else {
                        response_bytes = null;
                        break;
                    }
                }
                if (bytesReadSum == image_size) {
                    image_bytes = byteBuffer.array();
                    if (image_bytes.length > 0)
                        response_bytes = image_bytes;
                    else
                        response_bytes = null;
                }
            } catch (IOException e) {
                response_bytes = null;
            }
        }
    });
    image_receiver.start();
    try {
        image_receiver.join();
    } catch (InterruptedException e) {
        response_bytes = null;
    }
    if (response_bytes != null)
    {
        final ImageView imageIV = (ImageView) findViewById(R.id.imageIV);
        File image_file = new File(Environment.getExternalStorageDirectory(), "image_file_jpg");
        try
        {
            FileOutputStream stream = new FileOutputStream(image_file);
            stream.write(image_bytes);
        }
        catch (FileNotFoundException e)
        {
            e.printStackTrace();
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
        //Here the method returns null
        final Bitmap image_bitmap = BitmapFactory.decodeFile(image_file.getAbsolutePath());
        main.this.runOnUiThread(new Runnable() {
            @Override
            public void run() {
                imageIV.setImageBitmap(image_bitmap);
                imageIV.invalidate();
            }
        });
    }
}
Whenever you exchange data between two machines of different architectures via sockets, you need to know the endianness (big-endian/little-endian) of each machine. If they differ, you will need to convert bytes to correct the data. Perhaps that's your issue. Here's a link with sample code: Converting Little Endian to Big Endian. You should be able to easily find more articles explaining the concept.
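Here is a minimal, self-contained sketch of what the correction looks like on the Java side; the byte values are made up for illustration:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    public static void main(String[] args) {
        // Four bytes as a little-endian C++ sender might emit for the int 1024.
        byte[] raw = { 0x00, 0x04, 0x00, 0x00 };

        // Java's ByteBuffer defaults to big-endian; set the order explicitly
        // to match the sender before reading multi-byte values.
        int wrong = ByteBuffer.wrap(raw).getInt();                                // 262144
        int right = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN).getInt(); // 1024

        System.out.println("big-endian read: " + wrong + ", little-endian read: " + right);
    }
}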
It turned out that something was wrong with my sending protocol.
After patching it up a bit it actually worked.
Thanks for the help.
I have a (possibly long) list of binary files that I want to read lazily. There will be too many files to load into memory. I'm currently reading them as a MappedByteBuffer with FileChannel.map(), but that probably isn't required. I want the method readBinaryFiles(...) to return a Java 8 Stream so I can lazy load the list of files as I access them.
public List<FileDataMetaData> readBinaryFiles(
        List<File> files, int numDataPoints, int dataPacketSize) throws IOException {

    List<FileDataMetaData> fmdList = new ArrayList<FileDataMetaData>();
    IOException lastException = null;
    for (File f : files) {
        try {
            FileDataMetaData fmd = readRawFile(f, numDataPoints, dataPacketSize);
            fmdList.add(fmd);
        } catch (IOException e) {
            logger.error("", e);
            lastException = e;
        }
    }

    if (null != lastException)
        throw lastException;

    return fmdList;
}
// The List<DataPacket> returned will be in the same order as in the file.
public FileDataMetaData readRawFile(File file, int numDataPoints, int dataPacketSize) throws IOException {
    FileDataMetaData fmd;
    FileChannel fileChannel = null;
    try {
        fileChannel = new RandomAccessFile(file, "r").getChannel();
        long fileSz = fileChannel.size();
        ByteBuffer bbRead = ByteBuffer.allocate((int) fileSz);
        MappedByteBuffer buffer = fileChannel.map(FileChannel.MapMode.READ_ONLY, 0, fileSz);
        buffer.get(bbRead.array());
        List<DataPacket> dataPacketList = new ArrayList<DataPacket>();
        while (bbRead.hasRemaining()) {
            int channelId = bbRead.getInt();
            long timestamp = bbRead.getLong();
            int[] data = new int[numDataPoints];
            for (int i = 0; i < numDataPoints; i++)
                data[i] = bbRead.getInt();
            DataPacket dp = new DataPacket(channelId, timestamp, data);
            dataPacketList.add(dp);
        }
        fmd = new FileDataMetaData(file.getCanonicalPath(), fileSz, dataPacketList);
    } catch (IOException e) {
        logger.error("", e);
        throw e;
    } finally {
        if (null != fileChannel) {
            try {
                fileChannel.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return fmd;
}
Returning fmdList.stream() from readBinaryFiles(...) won't accomplish this, because by then the file contents will already have been read into memory, which is exactly what I can't afford to do.
The other approaches to reading the contents of multiple files as a Stream rely on using Files.lines(), but I need to read binary files.
I'm open to doing this in Scala or golang if those languages have better support for this use case than Java.
I'd appreciate any pointers on how to read the contents of multiple binary files lazily.
There is no laziness possible for the reading within a file, as you are reading the entire file to construct an instance of FileDataMetaData. You would need a substantial refactoring of that class to be able to construct an instance of FileDataMetaData without having to read the entire file.
However, there are several things to clean up in that code, some of them specific to Java 7 rather than Java 8: you don't need the RandomAccessFile detour to open a channel anymore, and there is try-with-resources to ensure proper closing. Note further that your usage of memory mapping makes no sense. When you copy the entire contents into a heap ByteBuffer after mapping the file, there is nothing lazy about it; it's exactly what happens when you call read with a heap ByteBuffer on the channel, except that the JRE can reuse buffers in the read case.
In order to allow the system to manage the pages, you have to read from the mapped byte buffer. Depending on the system, this might still not be better than repeatedly reading small chunks into a heap byte buffer.
public FileDataMetaData readRawFile(
        File file, int numDataPoints, int dataPacketSize) throws IOException {
    try (FileChannel fileChannel = FileChannel.open(file.toPath(), StandardOpenOption.READ)) {
        long fileSz = fileChannel.size();
        MappedByteBuffer bbRead = fileChannel.map(FileChannel.MapMode.READ_ONLY, 0, fileSz);
        List<DataPacket> dataPacketList = new ArrayList<>();
        while (bbRead.hasRemaining()) {
            int channelId = bbRead.getInt();
            long timestamp = bbRead.getLong();
            int[] data = new int[numDataPoints];
            for (int i = 0; i < numDataPoints; i++)
                data[i] = bbRead.getInt();
            dataPacketList.add(new DataPacket(channelId, timestamp, data));
        }
        return new FileDataMetaData(file.getCanonicalPath(), fileSz, dataPacketList);
    } catch (IOException e) {
        logger.error("", e);
        throw e;
    }
}
Building a Stream based on this method is straightforward; only the checked exception has to be handled:
public Stream<FileDataMetaData> readBinaryFiles(
        List<File> files, int numDataPoints, int dataPacketSize) throws IOException {
    return files.stream().map(f -> {
        try {
            return readRawFile(f, numDataPoints, dataPacketSize);
        } catch (IOException e) {
            logger.error("", e);
            throw new UncheckedIOException(e);
        }
    });
}
This should be sufficient:
return files.stream().map(f -> readRawFile(f, numDataPoints, dataPacketSize));
…if, that is, you are willing to remove throws IOException from the readRawFile method’s signature. You could have that method catch IOException internally and wrap it in an UncheckedIOException. (The problem with deferred execution is that the exceptions also need to be deferred.)
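A minimal sketch of that variant; readRawFileChecked here is a hypothetical rename of the original method that still declares throws IOException:

public FileDataMetaData readRawFile(File file, int numDataPoints, int dataPacketSize) {
    try {
        // Delegate to the original method and defer the checked exception
        // to whoever consumes the stream.
        return readRawFileChecked(file, numDataPoints, dataPacketSize);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}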
I don't know how performant this is, but you can use java.io.SequenceInputStream wrapped inside of DataInputStream. This will effectively concatenate your files together. If you create a BufferedInputStream from each file, then the whole thing should be properly buffered.
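A minimal sketch of that setup, assuming the List<File> from the question (imports from java.io and java.util):

// Concatenates all files into one logical stream; each file gets its own
// BufferedInputStream, and SequenceInputStream moves on to the next file
// whenever the current one is exhausted.
static DataInputStream concatenate(List<File> files) throws FileNotFoundException {
    Vector<InputStream> streams = new Vector<>();
    for (File f : files) {
        streams.add(new BufferedInputStream(new FileInputStream(f)));
    }
    return new DataInputStream(new SequenceInputStream(streams.elements()));
}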
Building on VGR's comment, I think his basic solution of:
return files.stream().map(f -> readRawFile(f, numDataPoints, dataPacketSize))
is correct, in that it will lazily process the files (and stop if a short-circuiting terminal operation is invoked on the result of the map() operation). I would also suggest a slightly different approach to the implementation of readRawFile that leverages try-with-resources and InputStream, which will not load the whole file into memory:
public FileDataMetaData readRawFile(File file, int numDataPoints, int dataPacketSize)
        throws DataPacketReadException { // <- custom unchecked exception, nested in the class

    FileDataMetadata results = null;

    try (FileInputStream fileInput = new FileInputStream(file)) {
        String filePath = file.getCanonicalPath();
        long fileSize = fileInput.getChannel().size();
        DataInputStream dataInput = new DataInputStream(new BufferedInputStream(fileInput));

        results = new FileDataMetadata(
                filePath,
                fileSize,
                dataPacketsFrom(dataInput, numDataPoints, dataPacketSize, filePath));
    } catch (IOException e) {
        throw new DataPacketReadException("Unexpected I/O exception on file: " + file, e);
    }

    return results;
}
private List<DataPacket> dataPacketsFrom(
        DataInputStream dataInput, int numDataPoints, int dataPacketSize, String filePath)
        throws DataPacketReadException {

    List<DataPacket> packets = new ArrayList<>();

    while (dataInput.available() > 0) {
        try {
            // Logic to assemble a DataPacket, mirroring the file format above:
            // an int channel id, a long timestamp, then numDataPoints ints.
            int channelId = dataInput.readInt();
            long timestamp = dataInput.readLong();
            int[] data = new int[numDataPoints];
            for (int i = 0; i < numDataPoints; i++)
                data[i] = dataInput.readInt();
            packets.add(new DataPacket(channelId, timestamp, data));
        }
        catch (EOFException e) {
            throw new DataPacketReadException("Unexpected EOF on file: " + filePath, e);
        }
        catch (IOException e) {
            throw new DataPacketReadException("Unexpected I/O exception on file: " + filePath, e);
        }
    }

    return packets;
}
This should reduce the amount of code, and make sure that your files get closed on error.
I want to save raw data chunks to a file, and later read those chunks one by one. This is no big deal, except for the following doubt:
What exact bytes should I use as a delimiter, i.e. to identify the end of one chunk and the beginning of the next? The chunk data might also contain such a sequence of bytes by random chance.
Notes: the chunks are of variable size and contain arbitrary data; they are actually JPEG images.
You could first write the length of the chunk to the file as a fixed-size value, e.g. a 4 bytes integer, followed by the data itself:
public void appendChunk(byte[] data, File file) throws IOException {
    DataOutputStream stream = null;
    try {
        stream = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(file, true)));
        stream.writeInt(data.length);
        stream.write(data);
    } finally {
        if (stream != null) {
            try {
                stream.close();
            } catch (IOException e) {
                // ignore
            }
        }
    }
}
If you later have to read the chunks back from that file, you start by reading the length of the first chunk. You now can decide whether to read the chunk data, or whether to skip it and continue with the next chunk.
public void processChunks(File file) throws IOException {
    DataInputStream stream = null;
    try {
        stream = new DataInputStream(new BufferedInputStream(new FileInputStream(file)));
        while (true) {
            try {
                int length = stream.readInt();
                byte[] data = new byte[length];
                stream.readFully(data);
                // todo: do something with the data
            } catch (EOFException e) {
                // end of file reached
                break;
            }
        }
    } finally {
        if (stream != null) {
            try {
                stream.close();
            } catch (IOException e) {
                // ignore
            }
        }
    }
}
You can also add other meta-data about the chunks, like writing the original name of the file with stream.writeUTF(...). You only have to make sure that you write and read the same data in the same order.
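The skipping mentioned above can be done with skipBytes(...); here is a minimal sketch as a hypothetical helper (skipBytes may skip fewer bytes than requested, hence the loop):

public void skipChunk(DataInputStream stream) throws IOException {
    int length = stream.readInt(); // length prefix of the chunk to skip
    int remaining = length;
    while (remaining > 0) {
        int skipped = stream.skipBytes(remaining); // may skip fewer bytes than asked
        if (skipped <= 0) {
            throw new EOFException("Unexpected end of file while skipping a chunk");
        }
        remaining -= skipped;
    }
}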
Create a second file in which you save the byte ranges of your chunks in the chunk file, or add that information to a header of your chunk file. I did something similar once; don't forget that the byte ranges then have the additional offset of the length of the header.
int startByte = 0;
int lastByte = 0;
int chunkCount = 0;
File chunkFile;
File structureFile;

for (every chunk) {
    append chunk to chunkFile
    lastByte = startByte + chunk.sizeInBytes() - 1   // inclusive end of this chunk
    append to structureFile: chunkCount startByte lastByte
    chunkCount++;
    startByte = lastByte + 1
}
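For reference, here is a runnable Java sketch of the same idea; the method name and the index line format "<chunk number> <first byte> <last byte>" are my assumptions:

import java.io.File;
import java.io.FileOutputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.List;

public class ChunkIndexWriter {
    // Appends every chunk to chunkFile and records one line per chunk in
    // structureFile: "<chunk number> <first byte> <last byte>" (inclusive range).
    public static void writeChunks(List<byte[]> chunks, File chunkFile, File structureFile)
            throws IOException {
        try (FileOutputStream data = new FileOutputStream(chunkFile);
             PrintWriter index = new PrintWriter(new FileWriter(structureFile))) {
            long startByte = 0;
            int chunkCount = 0;
            for (byte[] chunk : chunks) {
                data.write(chunk);
                long lastByte = startByte + chunk.length - 1;
                index.println(chunkCount + " " + startByte + " " + lastByte);
                chunkCount++;
                startByte = lastByte + 1;
            }
        }
    }
}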