I would like to perform an XOR operation in my code, but I am getting strange behavior in the output: sometimes the result is right and sometimes it is not.
Here's the situation:
I have a file which I already split into two parts, and then I created one parity file by XOR-ing the two parts (the source files). So now I have three files. Then I deleted one of the source files. I would like to reconstruct the missing file by XOR-ing the parity file with the remaining source file. I am using a hash function to check whether the output is correct. If the function is called only once, everything is fine, but when I run many such reconstruction operations in a row, the function sometimes generates a wrong result.
When it generates a wrong result, it is always the same file. But if I add a Thread.sleep of one second, the result is always correct, even with more than 1000 operations.
Could somebody help me spot which part of my code is broken?
private byte[] parityByte(byte[] firstByte, byte[] secondByte) {
    int size1 = firstByte.length;
    int size2 = secondByte.length;
    byte[] parity = new byte[size1];
    // XOR the region covered by both buffers
    for (int i = 0; i < size2; i++) {
        parity[i] = (byte) (firstByte[i] ^ secondByte[i]);
    }
    // copy the tail of the longer (first) buffer unchanged
    for (int i = size2; i < size1; i++) {
        parity[i] = firstByte[i];
    }
    return parity;
}
/**
* get original chunks
*/
public Chunk getOriginal(Chunk parity, Chunk compare, String orig) throws FileNotFoundException, IOException {
    File par = new File(parity.getHash());
    InputStream parityStream = new BufferedInputStream(new FileInputStream(parity.getHash()));
    InputStream source = new BufferedInputStream(new FileInputStream(compare.getHash()));
    int size = (int) par.length();
    int bufferSize = size;
    byte[] firstBuffer = new byte[size];
    byte[] secondBuffer = new byte[size];
    long remainSize;
    byte[] destByte = new byte[1];
    parityStream.read(destByte, 0, 1);
    Integer dest = new Integer(destByte[0]);
    remainSize = size - 1 - dest;
    OutputStream originalChunk = new FileOutputStream(orig);
    while (remainSize > 0) {
        if (remainSize > bufferSize) {
            remainSize -= bufferSize;
        } else {
            bufferSize = (int) remainSize;
            firstBuffer = new byte[bufferSize];
            secondBuffer = new byte[bufferSize];
            remainSize = 0;
        }
        parityStream.read(firstBuffer, 0, bufferSize);
        source.read(secondBuffer, 0, bufferSize);
        originalChunk.write(parityByte(firstBuffer, secondBuffer));
    }
    originalChunk.flush();
    parityStream.close();
    source.close();
    originalChunk.close();
    Chunk tempChunk = Chunk.newChunk(orig);
    return tempChunk;
}
Thank you, and sorry for my bad English.
You are assuming that all the reads fill the buffer. Check the Javadoc. The read(byte[] ...) method returns a value, and it does so for a reason.
Have a look at DataInputStream.readFully() for a simple solution.
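For example, here is a minimal sketch of the read side using readFully (the stream and size variables are placeholders, not names from your code):
DataInputStream parityIn = new DataInputStream(new BufferedInputStream(new FileInputStream(parityFile)));
DataInputStream sourceIn = new DataInputStream(new BufferedInputStream(new FileInputStream(sourceFile)));
byte[] firstBuffer = new byte[bufferSize];
byte[] secondBuffer = new byte[bufferSize];
// readFully() loops internally until the buffer is completely filled (or throws EOFException),
// so a short read from the underlying stream can no longer corrupt the XOR.
parityIn.readFully(firstBuffer);
sourceIn.readFully(secondBuffer);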
Related
I cannot mix two WAV audio files. Here is my work:
byte[] bufData1 = null;
byte[] bufData2 = null;
ArrayList<Byte> bufData3 = new ArrayList<Byte>();
Creating two arrays with raw audio data
public void bootloadInputData(String p1, String p2) throws IOException {
bufData1 = bootloadReadFileByte(p1);
bufData2 = bootloadReadFileByte(p2);
System.arraycopy(bufData1, 44, bufData1, 0, (bufData1.length - 44));
System.arraycopy(bufData2, 44, bufData2, 0, (bufData2.length - 44));
}
public byte[] bootloadReadFileByte(String path) throws IOException{
ByteArrayOutputStream out = null;
InputStream input = null;
try{
out = new ByteArrayOutputStream();
input = new BufferedInputStream(new FileInputStream(path));
int data = 0;
while((data = input.read()) != -1){
out.write(data);
}
}
finally{
if(null != input){
input.close();
}
if(null != out){
out.close();
}
}
return out.toByteArray();
}
Mixing the bytes of raw audio data
public void bootloadOutputData() throws IOException {
for(int i = 0; i < ((bufData1.length + bufData2.length) / 4); i += 4) {
if(i < bufData1.length){
bufData3.add(bufData1[i]);
bufData3.add(bufData1[i+1]);
bufData3.add(bufData1[i+2]);
bufData3.add(bufData1[i+3]);
}
if(i < bufData2.length){
bufData3.add(bufData2[i]);
bufData3.add(bufData2[i+1]);
bufData3.add(bufData2[i+2]);
bufData3.add(bufData2[i+3]);
}
}
}
Create a new file, fill in the header and raw audio data.
private void bootloadCreateWaveMix(String p1, String p2, String p3) throws IOException {
int size1 = 0;
int size2 = 0;
FileInputStream fis1 = null;
FileInputStream fis2 = null;
try {
fis1 = new FileInputStream(p1);
fis2 = new FileInputStream(p2);
size1 = fis1.available();
size2 = fis2.available();
} finally {
if(fis1 != null){
fis1.close();
}
if(fis2 != null){
fis2.close();
}
}
int mNumBytes = (size1 + size2);
DataOutputStream out = null;
try {
out = new DataOutputStream(new FileOutputStream(p3));
writeId(out, "RIFF");
writeInt(out, 36 + mNumBytes);
writeId(out, "WAVE");
writeId(out, "fmt ");
writeInt(out, 16);
writeShort(out, (short) 1);
writeShort(out, (short) 4);
writeInt(out, (int) 44100);
writeInt(out, 2 * 44100 * 16 / 8);
writeShort(out, (short)(2 * 16 / 8));
writeShort(out, (short) 16);
writeId(out, "data");
writeInt(out, mNumBytes);
out.write(toByteArray(bufData3));
} finally {
if(out != null){
out.close();
}
}
}
private static void writeId(OutputStream out, String id) throws IOException {
for (int i = 0; i < id.length(); i++) out.write(id.charAt(i));
}
private static void writeInt(OutputStream out, int val) throws IOException {
out.write(val >> 0);
out.write(val >> 8);
out.write(val >> 16);
out.write(val >> 24);
}
private static void writeShort(OutputStream out, short val) throws IOException {
out.write(val >> 0);
out.write(val >> 8);
}
public static byte[] toByteArray(ArrayList<Byte> in) {
byte[] data = new byte[in.size()];
for (int i = 0; i < data.length; i++) {
data[i] = (byte) in.get(i);
}
return data;
}
Question:
This code does not create the file correctly: the computer cannot play it, but the device can. Playback quality is bad, and there is some kind of interference at the end of the merged file. Also, playback stops when the first file ends, even if the second file is longer than the first. There is another problem with the channels: the inputs are two stereo files, yet in the header I write 4 channels instead of 2. The files will always be 44100 Hz / 16-bit / stereo.
If I understand correctly, you want to do the following:
Given 2 input WAV files, mix them together to a single WAV file.
The contents of the output will be the input files played at the same time, not one after the other.
The length of the new file will be the length of the longest of the input files.
All files, input and output, are 16 bit, stereo 44100Hz.
If that's the case, here are (some of) your mistakes:
You need to parse the incoming files so that you don't read their headers as audio data (Do not skip this step just because you already know the format of the audio. You need to read the headers to confirm the data format and accurately determine the number of samples in your input. Also, note that 2/16/44100 WAV files can have different size headers because they can contain various chunks, so you can't just skip over X bytes and then read the file -- you must parse the header!).
If the WAV files are all 16-bit, you need to convert the incoming data from bytes to shorts (note, this is not a simple typecasting -- you must pack 2 bytes into each short. I believe you can use a DataInputStream for this, but be sure to take endianness into account -- WAV files are little-endian and Java is big-endian). Once you've got the shorts representing your samples, average the shorts from the separate files to do the mixing. Your averaged values must then be converted back to bytes (DataOutputStream) to save the resulting file. When you've run out of data from one file, substitute zero.
Your calculation of numBytes is incorrect -- it is not the sum of raw bytes in both files, but a somewhat more complex calculation. In your case, you want it to be equal to something like this:
n1 = number of samples in file 1
n2 = number of samples in file 2
n = MAX(n1, n2)
numBytes = n * (number of channels) * (number of bytes per channel) = n * 2 * 2
I strongly urge you to consider using a library like JMF to tackle 1 & 2.
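To make point 2 concrete, here is a rough sketch of the mixing step only (the method name is mine, and it assumes the WAV headers have already been parsed away, so pcm1 and pcm2 hold nothing but 16-bit little-endian sample data):
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

static byte[] mixPcm16(byte[] pcm1, byte[] pcm2) {
    ByteBuffer in1 = ByteBuffer.wrap(pcm1).order(ByteOrder.LITTLE_ENDIAN);
    ByteBuffer in2 = ByteBuffer.wrap(pcm2).order(ByteOrder.LITTLE_ENDIAN);
    ByteBuffer out = ByteBuffer.allocate(Math.max(pcm1.length, pcm2.length)).order(ByteOrder.LITTLE_ENDIAN);
    while (out.hasRemaining()) {
        // when one file runs out of samples, substitute silence (zero)
        int s1 = in1.remaining() >= 2 ? in1.getShort() : 0;
        int s2 = in2.remaining() >= 2 ? in2.getShort() : 0;
        // average the two samples to mix them
        out.putShort((short) ((s1 + s2) / 2));
    }
    return out.array();
}
The result is still raw PCM; you would then write a correct header for 2 channels (not 4, as in the code above) in front of it.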
I'm trying to get a BufferedInputStream from an uploaded CSV file.
I'm working with a Multipart derived from the CSV file.
When I first get the Multipart, it's a BufferedInputStream, but the buffer is all null.
But if I look deeper down, there's another buffer in the CoyoteInputStream and that has data.
How can I get at this second buffer? My code is below.
And of course it's throwing a NullPointerException when it gets to
while ((multiPartDataPos = stream.read(buffer)) >= 0)
What am I doing wrong? Am I mistaken that the CoyoteInputStream is the data I want?
public byte[] handleUploadedFile(Multipart multiPart) throws EOFException {
Multipart multiPartData = null;
BufferedInputStream stream = null;
int basicBufferSize = 0x2000;
byte[] buffer = new byte[basicBufferSize];
int bufferPos = 0;
try {
while (multiPart.hasNext()) {
int multiPartDataPos = bufferPos;
multiPartData = (Multipart) multiPart.next();
stream = new BufferedInputStream(multiPartData.getInputStream());
while ((multiPartDataPos = stream.read(buffer)) >= 0) {
int len = stream.read(buffer, multiPartDataPos, buffer.length - multiPartDataPos);
multiPartDataPos += len;
}
bufferPos = bufferPos + multiPartDataPos;
}
} ...
Your code doesn't make any sense.
while ((multiPartDataPos = stream.read(buffer)) >= 0) {
At this point you have read multiPartDataPos bytes into buffer, so that buffer[0..multiPartDataPos-1] contains the data just read.
int len = stream.read(buffer, multiPartDataPos, buffer.length - multiPartDataPos);
At this point you are doing another read, which could return -1, or else will deposit some more data into buffer at positions multiPartDataPos to multiPartDataPos+len-1.
multiPartDataPos += len;
This step is only valid if len > 0.
And you are doing nothing with the buffer; and next time around the loop you will clobber whatever you just read.
The correct way to read any stream in Java is as follows:
while ((count = in.read(buffer)) > 0)
{
// use buffer[0..count-1], for example out.write(buffer, 0, count);
}
I don't understand why you think access to an underlying stream is required or what it's going to give you that you don't already have.
Turns out the better solution was to move the data from the InputStream to a ByteArrayOutputStream and then return ByteArrayOutputStream.toByteArray():
Multipart multiPartData = null;
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int read;
byte[] input = new byte[4096];
InputStream is;
try {
multiPartData = (Multipart)multipart.next();
is = multiPartData.getInputStream();
while ((read = is.read(input, 0, input.length)) != -1) {
buffer.write(input, 0, read);
}
buffer.flush();
return buffer.toByteArray(); // just a test right now
}
OK, so I'm trying to send files over sockets. This is the code I have on the server side. The problem is that if I let this code run I get strange output, like missing bytes. I tried to add Thread.sleep() inside the if statement; this seemed to help somewhat and made the problem appear only when handling larger files. I also tried to place a breakpoint on the line buffer = setbuffersize(binaryData, i, in); so it stopped after each file, and that also seemed to help. So I guess the problem is that the code continues to read even when there is nothing to read yet (the client hasn't had time to send it). I'm not really sure how to fix this. I tried to check whether BUFFER_SIZE was as big as in.available(), but it seems to stop sending after a while and never reaches BUFFER_SIZE.
So how do I get the code to wait for the data to be transmitted before it tries to read it?
while (byteRead != -1) {
commandlengh = msg.length();
binaryData = new byte[Integer.parseInt(ParameterValues.get(ParameterValues.size() - 1))];
in.read();
byte[] buffer = setbuffersize(binaryData, i, in);
while (in.read(buffer) != -1) {
for (int j = 0; j < buffer.length; j++) {
binaryData[i] = buffer[j];
i++;
}
buffer = setbuffersize(binaryData, i, in);
if(buffer.length == 0)
{
Parameters.clear();
ParameterValues.clear();
i = 0;
commandlengh = 0;
break;
}
}
byteRead = in.read();
}
private byte[] setbuffersize(byte[] binaryData, int i, InputStream in) throws IOException
{
int BUFFER_SIZE = 65536;
if(binaryData.length - i < BUFFER_SIZE)
{
BUFFER_SIZE = binaryData.length - i;
}
else
{
BUFFER_SIZE = 65536;
}
byte[] buffer = new byte[BUFFER_SIZE];
return buffer;
}
You are ignoring the count returned by read() and assuming that it filled the buffer. Check the Javadoc. It isn't required to do that.
I can't make much sense of your code. You don't need all that to read from a socket. Just call read, check the result for -1; if so, stop; otherwise process bytes 0..count-1 of the buffer.
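If the sender tells you the payload size up front (which your code seems to assume, since binaryData is sized from a parameter value), the simplest fix is to let DataInputStream do the looping for you. A sketch only; the length-prefix convention here is an assumption:
DataInputStream in = new DataInputStream(socket.getInputStream());
int payloadLength = in.readInt();   // however your protocol communicates the file size
byte[] binaryData = new byte[payloadLength];
in.readFully(binaryData);           // blocks until all payloadLength bytes have arrived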
I am porting an Android app to iPhone (more like improving the iPhone app based on the Android version) and I need to split and combine large uncompressed audio files.
Currently, I load all the files into memory and split them and combine them in separate functions. It crashes with 100MB+ files.
This is the new process needed to do it:
I have two recordings (file1 and file2) and a split position where I want file2 to be inserted inside file1.
-create the input streams for file1 and file2 and the output stream for the outputfile.
-rewrite the new CAF header
-read the data from inputStream1 until it reaches the split point and write all that data to the output file.
-read all data from inputStream2 and write it to output file.
-read remaining data from inputStream1 and write it to output file.
Here is my Android code for the process:
File file1File = new File(file1);
File file2File = new File(file2);
long file1Length = file1File.length();
long file2Length = file2File.length();
FileInputStream file1ByteStream = new FileInputStream(file1);
FileInputStream file2ByteStream = new FileInputStream(file2);
FileOutputStream outputFileByteStream = new FileOutputStream(outputFile);
// time = fileLength / (Sample Rate * Channels * Bits per sample / 8)
// convert position to number of bytes for this function
long sampleRate = eRecorder.RECORDER_SAMPLERATE;
int channels = 1;
long bitsPerSample = eRecorder.RECORDER_BPP;
long bytePositionLength = (position * (sampleRate * channels * bitsPerSample / 8)) / 1000;
//calculate total data size
int dataSize = 0;
dataSize = (int)(file1Length + file2Length);
WriteWaveFileHeaderForMerge(outputFileByteStream, dataSize,
dataSize + 36,
eRecorder.RECORDER_SAMPLERATE, 1,
2 * eRecorder.RECORDER_SAMPLERATE);
long bytesWritten = 0;
int length = 0;
//set limit for bytes read, and write file1 bytes to outputfile until split position reached
int limit = (int)bytePositionLength;
//read bytes to limit
writeBytesToLimit(file1ByteStream, outputFileByteStream, limit);
file1ByteStream.close();
file2ByteStream.skip(44);//skip wav file header
writeBytesToLimit(file2ByteStream, outputFileByteStream, (int)file2Length);
file2ByteStream.close();
//calculate length of remaining file1 bytes to be written
long file1offset = bytePositionLength;
//reinitialize file1 input stream
file1ByteStream = new FileInputStream(file1);
file1ByteStream.skip(file1offset);
writeBytesToLimit(file1ByteStream, outputFileByteStream, (int)file1Length);
file1ByteStream.close();
outputFileByteStream.close();
And this is my writeBytesToLimit function:
private void writeBytesToLimit(FileInputStream inputStream, FileOutputStream outputStream, int byteLimit) throws IOException
{
int bytesRead = 0;
int chunkSize = 65536;
int length = 0;
byte[] buffer = new byte[chunkSize];
while((length = inputStream.read(buffer)) != -1)
{
bytesRead += length;
if(bytesRead >= byteLimit)
{
int leftoverBytes = byteLimit % chunkSize;
byte[] smallBuffer = new byte[leftoverBytes];
System.arraycopy(buffer, 0, smallBuffer, 0, leftoverBytes);
outputStream.write(smallBuffer);
break;
}
if(length == chunkSize)
outputStream.write(buffer);
else
{
byte[] smallBuffer = new byte[length];
System.arraycopy(buffer, 0, smallBuffer, 0, length);
outputStream.write(smallBuffer);
}
}
}
How do I do this in iOS? Using the same delegate for two NSInputStreams and an NSOutputStream looks like it will get very messy.
Has anyone seen an example of how to do this (and do it clean)?
I ended up using NSFileHandle. For example, this is the first part of what I am doing.
NSData *readData = [[NSData alloc] init];
NSFileHandle *reader1 = [NSFileHandle fileHandleForReadingAtPath:file1Path];
NSFileHandle *writer = [NSFileHandle fileHandleForWritingAtPath:outputFilePath];
//start reading data from file1 to split point and writing it to file
long bytesRead = 0;
while(bytesRead < splitPointInBytes)
{
//read a chunk of data
readData = [reader1 readDataOfLength:chunkSize];
if(readData.length == 0)break;
//trim data if too much was read
if(bytesRead + readData.length > splitPointInBytes)
{
//get difference of read bytes and byte limit
long difference = bytesRead + readData.length - splitPointInBytes;
//trim data
NSMutableData *readDataMutable = [NSMutableData dataWithData:readData];
[readDataMutable setLength:readDataMutable.length - difference];
readData = [NSData dataWithData:readDataMutable];
NSLog(@"Too much data read, trimming");
}
//write data to output file
[writer writeData:readData];
//update byte counter
bytesRead += readData.length;
}
long file1BytesWritten = bytesRead;
I have a bin file that I need to convert to a byte array. Can anyone tell me how to do this?
Here is what I have so far:
File f = new File("notification.bin");
InputStream is = new FileInputStream(f);
long length = f.length();
/*if (length > Integer.MAX_VALUE) {
// File is too large
}*/
// Create the byte array to hold the data
byte[] bytes = new byte[(int)length];
// Read in the bytes
int offset = 0;
int numRead = 0;
while (offset < bytes.length && (numRead=is.read(bytes, offset, bytes.length-offset)) >= 0) {
offset += numRead;
}
// Ensure all the bytes have been read in
if (offset < bytes.length) {
throw new IOException("Could not completely read file "+f.getName());
}
But it's not working...
Kaddy
Try using this:
public byte[] readFromStream(InputStream inputStream) throws Exception
{
ByteArrayOutputStream baos = new ByteArrayOutputStream();
DataOutputStream dos = new DataOutputStream(baos);
byte[] data = new byte[4096];
int count = inputStream.read(data);
while(count != -1)
{
dos.write(data, 0, count);
count = inputStream.read(data);
}
return baos.toByteArray();
}
By the way, do you want Java or C++ code? Seeing the code in your question, I assumed it was Java and gave a Java answer.
You're probably better off using a memory mapped file. See this question
In Java, a simple solution is:
InputStream is = ...
ByteArrayOutputStream os = new ByteArrayOutputStream();
byte[] data = new byte[4096]; // A larger buffer size would probably help
int count;
while ((count = is.read(data)) != -1) {
os.write(data, 0, count);
}
byte[] result = os.toByteArray();
If the input is a file, we can preallocate a byte array of the right size:
File f = ...
long fileSize = f.length();
if (fileSize > Integer.MAX_VALUE) {
// file too big
}
InputStream is = new FileInputStream(f);
byte[] data = new byte[(int) fileSize];
if (is.read(data) != data.length) {
// file truncated while we were reading it???
}
However, there is probably a more efficient way to do this task using NIO.
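For example, on Java 7 and later the whole task collapses to a single call (assuming the file fits in memory):
byte[] data = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("notification.bin"));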
Unless you really need to do it just that way, maybe simplify what you're doing.
Doing everything in the for loop may seem like a very slick way of doing it, but it's shooting yourself in the foot when you need to debug and don't immediately see the solution.
In this answer I read from a URL.
You could modify it so the InputStream is from a File instead of a URLConnection.
Something like:
FileInputStream inputStream = new FileInputStream("your.binary.file");
ByteArrayOutputStream output = new ByteArrayOutputStream();
byte [] buffer = new byte[ 1024 ];
int n = 0;
while (-1 != (n = inputStream.read(buffer))) {
output.write(buffer, 0, n);
}
inputStream.close();
etc
Try the open source library Apache Commons IO:
IOUtils.toByteArray(inputStream)
You are not the first and will not be the last developer who needs to read a file into a byte array; there is no need to reinvent it each time.
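For example, a minimal usage sketch (inside a method that declares throws IOException):
import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.commons.io.IOUtils;

try (InputStream in = new FileInputStream("notification.bin")) {
    byte[] bytes = IOUtils.toByteArray(in);
}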