Transfer type double[] over network - java

I want to transfer the type double[] over a network, and then somehow convert it back into a double[] on the receiving side. I am not entirely sure how to do this. I tried converting the received String to a char[] and then parsing the chars into a double[]. However, this did not work: the doubles came out with different data. I need this to build a network protocol for OpenCV, to transfer Mats easily.
So this is how the data gets sent:
private void send_info(int row, int col, double[] data) {
    // Convert data to String, separated by : to indicate change
    // char[] sendit = data.toString().toCharArray();
    out.println("INF:ROW:" + row + ":COL" + ":" + col + ":" + data);
}
And this is how it is received:
private void setInfo(String input) {
    input = input.trim();
    input = input.replace("INF:", "");
    String[] inputs = input.split(":");
    System.out.println(inputs[1]);
    int row = Integer.parseInt(inputs[1]);
    int col = Integer.parseInt(inputs[3]);
    // double[] data = magic(inputs[4]);
    // What I need ^
    frame.put(row, col, data);
}

Don't convert them at all. Waste of time and space. Just do it directly. To send double[] doubles:
DataOutputStream dos = new DataOutputStream(new BufferedOutputStream(socket.getOutputStream()));
dos.writeInt(doubles.length); // send the array length
for (double d : doubles) {
    dos.writeDouble(d);
}
dos.flush();
To read:
DataInputStream dis = new DataInputStream(new BufferedInputStream(socket.getInputStream()));
double[] doubles = new double[dis.readInt()];
for (int i = 0; i < doubles.length; i++) {
    doubles[i] = dis.readDouble();
}
Or you can use ObjectOutputStream.writeObject() and ObjectInputStream.readObject() to write and read the entire array at once. Or you can use NIO and DoubleBuffer: left as an exercise for the reader.
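For illustration, here is a minimal sketch of that ObjectOutputStream/ObjectInputStream variant; it round-trips through a byte array instead of a socket so it runs standalone (with a real socket you would wrap the socket streams instead):

```java
import java.io.*;
import java.util.Arrays;

public class ObjectStreamExample {
    public static void main(String[] args) throws Exception {
        double[] doubles = {1.5, -2.25, 3.75};

        // "Send": serialize the whole array in one call.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(doubles);
        }

        // "Receive": deserialize it back; the cast restores the array type.
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            double[] result = (double[]) ois.readObject();
            System.out.println(Arrays.equals(doubles, result)); // prints true
        }
    }
}
```

Note that object serialization adds a stream header and class metadata, so the wire format is larger than the raw DataOutputStream encoding.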

As an addendum to EJP’s answer, here is an NIO solution:
sending
try (SocketChannel ch = SocketChannel.open(
        new InetSocketAddress(InetAddress.getLocalHost(), 12345))) {
    ByteBuffer buf = ByteBuffer.allocateDirect(doubles.length*Double.BYTES + Integer.BYTES);
    buf.putInt(doubles.length).asDoubleBuffer().put(doubles);
    buf.clear();
    while (buf.hasRemaining()) ch.write(buf);
}
receiving
final int DEFAULT_BUFFER_SIZE = 4096;
try (ServerSocketChannel ss = ServerSocketChannel.open()
         .bind(new InetSocketAddress(InetAddress.getLocalHost(), 12345));
     SocketChannel ch = ss.accept()) {
    ByteBuffer bb = ByteBuffer.allocateDirect(DEFAULT_BUFFER_SIZE);
    bb.limit(Integer.BYTES);
    while (bb.hasRemaining()) if (ch.read(bb) < 0) throw new EOFException();
    bb.flip();
    int size = bb.getInt(), byteSize = size*Double.BYTES;
    if (bb.capacity() < byteSize) bb = ByteBuffer.allocateDirect(byteSize);
    else bb.clear().limit(byteSize);
    while (bb.hasRemaining()) if (ch.read(bb) < 0) throw new EOFException();
    double[] doubles = new double[size];
    bb.flip();
    bb.asDoubleBuffer().get(doubles);
    return doubles;
}
It’s obvious that the buffer management gets more complicated on the receiving side, because the length of the double array is not known beforehand.
If we want to reduce the number of transfers, i.e. avoid a distinct I/O operation just for the first four bytes, the method gets even more complicated:
final int DEFAULT_BUFFER_SIZE = 4096;
try (ServerSocketChannel ss = ServerSocketChannel.open()
         .bind(new InetSocketAddress(InetAddress.getLocalHost(), 12345));
     SocketChannel ch = ss.accept()) {
    ByteBuffer bb = ByteBuffer.allocateDirect(DEFAULT_BUFFER_SIZE);
    while (bb.position() < 4) if (ch.read(bb) < 0) throw new EOFException();
    bb.flip();
    int size = bb.getInt(), byteSize = size*Double.BYTES;
    if (bb.remaining() < byteSize) {
        if (bb.capacity() < byteSize) bb = ByteBuffer.allocateDirect(byteSize).put(bb);
        else bb.compact().limit(byteSize);
        while (bb.hasRemaining()) if (ch.read(bb) < 0) throw new EOFException();
        bb.flip();
    }
    else bb.limit(bb.position() + byteSize);
    double[] doubles = new double[size];
    bb.asDoubleBuffer().get(doubles);
    return doubles;
}
But note that the format is identical to the one created with the DataOutputStream in EJP’s solution, so you could combine, e.g. the NIO sending code with the old I/O receiving code…
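To illustrate that interoperability, the following sketch encodes the length-prefixed doubles with a ByteBuffer (big-endian by default, matching DataOutputStream) and decodes them with a DataInputStream; a byte array stands in for the socket so it runs standalone:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Arrays;

public class InteropExample {
    public static void main(String[] args) throws IOException {
        double[] doubles = {1.0, 2.5, -3.0};

        // NIO-style encoding: length prefix followed by the raw doubles,
        // big-endian by default, exactly like DataOutputStream would produce.
        ByteBuffer buf = ByteBuffer.allocate(Integer.BYTES + doubles.length * Double.BYTES);
        buf.putInt(doubles.length);
        for (double d : doubles) buf.putDouble(d);

        // Old-I/O-style decoding of the very same bytes.
        DataInputStream dis = new DataInputStream(new ByteArrayInputStream(buf.array()));
        double[] decoded = new double[dis.readInt()];
        for (int i = 0; i < decoded.length; i++) decoded[i] = dis.readDouble();

        System.out.println(Arrays.equals(doubles, decoded)); // prints true
    }
}
```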

Related

InputStream audio mixing (MODE_STREAM)

I'm making a drum sequencer in Android...
I'm writing to an AudioTrack in MODE_STREAM, so that I can achieve synchronized audio playback with all InputStreams (available via a list of 'active' InputStreams, activeStreams in the code below)
The audio is always: PCM (WAV), 16bit Stereo 44100 Hz.
Obviously, I can't composite audio in real time on the UI thread, so I'm using an AsyncTask to queue up all the audio buffering.
I got buffered playback working, but when it comes to merging the buffers of two (or more) InputStream's, the internet seems to be in some kind of debate of what to do next. "Convert the byte[] to short[]!", "No, do the bit mixing on-the-fly!", "But if you don't use shorts the byte Endianness is ignored!", "It gets ignored anyway!" - I don't even know any more.
How do I mix the buffers of two or more InputStreams? I don't understand why my current implementation is failing.
I've tried like, 4 different StackOverflow solutions to convert the byte[] to short[] so I can add the samples together, but the conversion always instantly crashes Java with some cryptic error message that I can't get my head around. So now I give up. Here's my code implementing one such StackOverflow solution...
protected Long doInBackground(Object ... Object) {
    int bytesWritten = 0;
    InputStream inputStream;
    int si = 0, i = 0;
    // The combined buffers. The 'composition'
    short[] cBuffer = new short[Synth.AUDIO_BUFFER_SIZE];
    // The 'current buffer', the segment of inputStream audio.
    byte[] bBuffer = new byte[Synth.AUDIO_BUFFER_SIZE];
    // The 'current buffer', converted to short?
    short[] sBuffer = new short[Synth.AUDIO_BUFFER_SIZE];
    int curStreamNum;
    int numStreams = activeStreams.size();
    short mix;
    // Start with an empty 'composition'
    cBuffer = new short[Synth.AUDIO_BUFFER_SIZE];
    boolean bufferEmpty = false;
    try {
        while (true) { // keep going forever, until stopped or paused.
            for (curStreamNum = 0; curStreamNum < numStreams; curStreamNum++) {
                inputStream = activeStreams.get(curStreamNum);
                i = inputStream.read(bBuffer);
                bufferEmpty = i <= -1;
                if (bufferEmpty) {
                    // Input stream buffer was empty. It's out of audio. Close and remove the stream.
                    inputStream.close();
                    activeStreams.remove(curStreamNum);
                    curStreamNum--; numStreams--; continue; // hard continue.
                } else {
                    // Take the now-read buffer, and convert to shorts.
                    ByteBuffer.wrap(bBuffer).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(sBuffer);
                    // Take the short buffer, merge into composition buffer.
                    // TODO: Optimize by making the 'first layer' of the composition the first buffer, on its own.
                    for (si = 0; si < Synth.AUDIO_BUFFER_SIZE; si++) {
                        mix = (short) (sBuffer[si] + cBuffer[si]);
                        // This part is probably completely wrong too. I'm not up to here yet to evaluate what's needed...
                        if (mix >= 32767) {
                            mix = 32767;
                        } else if (mix <= -32768) {
                            mix = -32768;
                        }
                        cBuffer[si] = mix;
                    }
                }
            }
            track.write(sBuffer, 0, i);
            // It's always full; full buffer of silence, or of composited audio.
            totalBytesWritten += Synth.AUDIO_BUFFER_SIZE;
            // .. queueNewInputStreams ..
            publishProgress(totalBytesWritten);
            if (isCancelled()) break;
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return Long.valueOf(totalBytesWritten);
}
I'm currently getting a BufferUnderflowException on this line: ByteBuffer.wrap(bBuffer).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(sBuffer);.
How is it possible to have buffer underrun? I'm only converting a byte[] to a short[].
Please help!
I've posted my whole function in the hopes that this more complete code sample and fairly adaptable usage can help other people out there.
(P.S. the byte[] to short[] conversion is followed by some flimsy hard clipping which I'm not even up to debugging yet, but advice there would also be appreciated)
Your solution is almost right; I see two issues and a potential one:
the length of the short array: it MUST be half the length of the byte array, otherwise you get the underflow
the sum of the shorts must be an average of the shorts, not just the sum, or you'll get nothing but noise
(potential issue) the length of the array you read from the InputStream cannot be completely arbitrary: every sample is 2 bytes for each InputStream (so the array length must be even), and you should take care of mono vs. stereo audio files (if stereo, there are 2 bytes for the left channel and 2 bytes for the right channel, interleaved)
Here you can find a snippet that I would use to sum two WAV arrays (16-bit, mono):
Random random = new Random();
int bufferLength = 20;
byte[] is1 = new byte[bufferLength];
byte[] is2 = new byte[bufferLength];
byte[] average = new byte[bufferLength];
random.nextBytes(is1);
random.nextBytes(is2);
short[] shorts1 = new short[bufferLength/2];
ByteBuffer.wrap(is1).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts1);
short[] shorts2 = new short[bufferLength/2];
ByteBuffer.wrap(is2).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts2);
short[] result = new short[bufferLength/2];
for (int i = 0; i < result.length; i++) {
    result[i] = (short) ((shorts1[i] + shorts2[i]) / 2);
}
ByteBuffer.wrap(average).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(result);
For 32-bit stereo, the solution could be:
Random random = new Random();
int bufferLength = 8 * 50;
byte[] is1 = new byte[bufferLength];
byte[] is2 = new byte[bufferLength];
byte[] average = new byte[bufferLength];
random.nextBytes(is1);
random.nextBytes(is2);
System.out.println(bytesToHex(is1));
System.out.println(bytesToHex(is2));
int[] ints1 = new int[bufferLength/4];
ByteBuffer.wrap(is1).order(ByteOrder.LITTLE_ENDIAN).asIntBuffer().get(ints1);
int[] ints2 = new int[bufferLength/4];
ByteBuffer.wrap(is2).order(ByteOrder.LITTLE_ENDIAN).asIntBuffer().get(ints2);
int[] result = new int[bufferLength/4];
for (int i = 0; i < result.length; i++) {
    result[i] = (ints1[i] + ints2[i]) / 2;
}
ByteBuffer.wrap(average).order(ByteOrder.LITTLE_ENDIAN).asIntBuffer().put(result);

Add ByteArray to integer

In the following Java code snippet you'll see this line: packetLengthMax += bytes.toByteArray()[43];
My question is: How does this work?
byte[] dataBuffer = new byte[265];
int packetLength = 0;
int packetLengthMax = 44;
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
DataOutputStream outMessage = new DataOutputStream(bytes);
/* Client = Socket */
DataInputStream clientIn = new DataInputStream(Client.getInputStream());
while (packetLength < packetLengthMax) {
    packetLength += clientIn.read(dataBuffer);
    outMessage.write(dataBuffer);
    if (packetLength >= 43) {
        packetLengthMax += bytes.toByteArray()[43];
    }
}
My explanation:
First a socket (Client) is passed to the code. Then it does the setup of all variables. In the while loop, it reads all data that comes from the socket. Then it also writes this data to the DataOutputStream.
But in the if statement it adds a byte array to an integer.
How does it work? I don't get that point. Thank you for helping!
It's not adding the whole byte array, it's just adding the byte at position 43 (i.e. the 44th byte in the array), which is widened to an int before the addition.
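To see this in isolation: a Java byte is a signed 8-bit value that is promoted to int before the +=, so values above 127 come out negative. A toy example (the array contents are chosen purely for illustration):

```java
public class ByteAddExample {
    public static void main(String[] args) {
        byte[] bytes = new byte[44];
        bytes[43] = 100;                      // a length byte in the signed range
        int packetLengthMax = 44;
        packetLengthMax += bytes[43];         // byte widened to int, then added
        System.out.println(packetLengthMax);  // prints 144

        // Caveat: byte is signed, so raw values above 127 wrap negative.
        bytes[43] = (byte) 200;               // stored as -56
        System.out.println(44 + bytes[43]);   // prints -12
    }
}
```

If the protocol treats that byte as unsigned, the code would need `bytes[43] & 0xFF` instead.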

can't work with BufferedInputStream and BufferedReader together

I'm trying to read the first line from a socket stream with a BufferedReader wrapped around a BufferedInputStream. It reads the first line (1), which is the size of some content (2); inside that content is the size of another piece of content (3).
(1) reads correctly (with the BufferedReader, _bin.readLine())
(2) reads correctly too (with _in.read(byte[] b))
(3) won't read; it seems there is more content than the size I read in (2)
I think the problem is that I'm trying to read using the BufferedReader and then the BufferedInputStream... can anyone help me?
public HashMap<String, byte[]> readHead() throws IOException {
    JSONObject json;
    try {
        HashMap<String, byte[]> map = new HashMap<>();
        System.out.println("reading header");
        int headersize = Integer.parseInt(_bin.readLine());
        byte[] parsable = new byte[headersize];
        _in.read(parsable);
        json = new JSONObject(new String(parsable));
        map.put("id", lTob(json.getLong(SagConstants.KEY_ID)));
        map.put("length", iTob(json.getInt(SagConstants.KEY_SIZE)));
        map.put("type", new byte[]{(byte) json.getInt(SagConstants.KEY_TYPE)});
        return map;
    } catch (SocketException | JSONException e) {
        _exception = e.getMessage();
        _error_code = SagConstants.ERROR_OCCOURED_EXCEPTION;
        return null;
    }
}
Sorry for the bad English and the bad explanation; I tried to describe my problem, I hope you understand.
The file format is:
size1
{json, length is the given size1, contains size2}
{second json, length is size2}
_in is a BufferedInputStream;
_bin is a BufferedReader(_in).
With _bin, I read the first line (size1) and convert it to an integer.
With _in, I read the next data, which contains size2; its length is size1.
Then I try to read the last data, whose size is size2.
something like this:
byte[] b = new byte[secondSize];
_in.read(b);
and nothing happens here, the program just hangs...
can't work with BufferedInputStream and BufferedReader together
That's correct. If you use any buffered stream or reader on a socket [or indeed any data source], you can't use any other stream or reader with it whatsoever. Data will get 'lost', that is to say read-ahead, in the buffer of the buffered stream or reader, and will not be available to the other stream/reader.
You need to rethink your design.
You create one BufferedReader _bin and one BufferedInputStream _in and read the file with both of them, but their positions in the stream differ, so the second read starts from the wrong place because you use two objects to read the same stream. You should read size1 with _in too:
int headersize = Integer.parseInt(readLine(_in, ""));
byte[] parsable = new byte[headersize];
_in.read(parsable);
Use the readLine below to read all the data with the BufferedInputStream.
private final static byte NL = 10;  // new line
private final static byte EOF = -1; // end of file
private final static byte EOL = 0;  // end of line

private static String readLine(BufferedInputStream reader,
        String accumulator) throws IOException {
    byte[] container = new byte[1];
    reader.read(container);
    byte byteRead = container[0];
    if (byteRead == NL || byteRead == EOL || byteRead == EOF) {
        return accumulator;
    }
    String input = new String(container, 0, 1);
    accumulator = accumulator + input;
    return readLine(reader, accumulator);
}

Example code for java.util.zip.Deflater that uses batch modes for both input and output

The example code in the javadoc for java.util.zip.Deflater is too optimistic, and assumes that you have a byte array containing all input, and a byte array that is large enough for the output. >:(
Is there an example somewhere that calls both needsInput() to add input in batches, and finished() to get output in batches? I can't seem to find one, and the docs are a little hazy about what is the right order of operations.
Example:
I have a ByteBuffer that is a 100MB memory-mapped file
I'm writing the output to a stream
I have these byte arrays of length 1024 for batches of input and output:
byte[] inbatch = new byte[1024];
byte[] outbatch = new byte[1024];
It seems like something like this might work, but I'm not sure, and I suspect there may be some subtle edge cases...
Deflater deflater = ...;
byte[] inbatch = new byte[1024];
byte[] outbatch = new byte[1024];
boolean inputDone = false;
while (true)
{
    if (!inputDone && deflater.needsInput())
    {
        int n = get_more_input(..., inbatch);
        if (n == 0)
            inputDone = true;
        else
            deflater.setInput(inbatch, 0, n);
    }
    if (deflater.finished())
    {
        if (inputDone)
            break;
    }
    else
    {
        int n = deflater.deflate(outbatch);
        handle_output(outbatch, n);
    }
}
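For what it's worth, here is a minimal self-contained variant of that loop that does complete the round trip as I understand the API: deflate() is called unconditionally (it may legitimately return 0 bytes while the deflater buffers input), and Deflater.finish() is called exactly once when the input runs out; without finish() the deflater never reports finished(). In-memory streams stand in for the memory-mapped file and the output stream:

```java
import java.io.*;
import java.util.Arrays;
import java.util.zip.Deflater;
import java.util.zip.InflaterInputStream;

public class DeflaterBatchExample {
    public static void main(String[] args) throws IOException {
        byte[] original = new byte[10_000];
        Arrays.fill(original, (byte) 'a');          // highly compressible input
        InputStream in = new ByteArrayInputStream(original);
        ByteArrayOutputStream out = new ByteArrayOutputStream();

        Deflater deflater = new Deflater();
        byte[] inbatch = new byte[1024];
        byte[] outbatch = new byte[1024];
        boolean inputDone = false;

        while (!deflater.finished()) {
            if (!inputDone && deflater.needsInput()) {
                int n = in.read(inbatch);
                if (n < 0) {
                    inputDone = true;
                    deflater.finish();              // signal end of input, exactly once
                } else {
                    deflater.setInput(inbatch, 0, n);
                }
            }
            int produced = deflater.deflate(outbatch);
            if (produced > 0) out.write(outbatch, 0, produced);
        }
        deflater.end();                             // release native resources

        // Round-trip check via InflaterInputStream.
        byte[] back = new InflaterInputStream(
            new ByteArrayInputStream(out.toByteArray())).readAllBytes();
        System.out.println(Arrays.equals(original, back)); // prints true
    }
}
```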

Reading binary stream until "\r\n" is encountered

I'm working on a Java application which will stream video from an IP Camera. The video streams from the IP Camera in MJPEG format. The protocol is the following...
--ipcamera (\r\n)
Content-Type: image/jpeg (\r\n)
Content-Length: {length of frame} (\r\n)
(\r\n)
{frame}
(\r\n)
--ipcamera (\r\n)
etc.
I've tried using classes such as BufferedReader and Scanner to read until the "\r\n"; however, those are meant for text rather than binary data, so the data becomes corrupt. Is there any way to read the binary stream until it encounters a "\r\n"? Here is my current (broken) code.
EDIT: I've gotten it to work. I updated the code below. However, it's really slow in doing so. I'm not sure if it has anything to do with the ArrayList or not, but it could be the culprit. Any pointers to speed up the code? It's currently taking 500ms to 900ms for a single frame.
public void run() {
    long startTime = System.currentTimeMillis();
    try {
        URLConnection urlConn = url.openConnection();
        urlConn.setReadTimeout(15000);
        urlConn.connect();
        urlStream = urlConn.getInputStream();
        DataInputStream dis = new DataInputStream(urlStream);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ArrayList<Byte> bytes = new ArrayList<Byte>();
        byte cur;
        int curi;
        byte[] curBytes;
        int length = 0;
        while ((curi = dis.read()) != -1) {
            cur = (byte) curi;
            bytes.add(cur);
            curBytes = getPrimativeArray(bytes);
            String curBytesString = new String(curBytes, "UTF-8");
            if (curBytesString.equals("--ipcamera\r\n")) {
                bytes.clear();
                continue;
            } else if (curBytesString.equals("Content-Type: image/jpeg\r\n")) {
                bytes.clear();
                continue;
            } else if (curBytesString.matches("^Content-Length: ([0-9]+)\r\n$")) {
                length = Integer.parseInt(curBytesString.replace("Content-Length: ", "").trim());
                bytes.clear();
                continue;
            } else if (curBytesString.equals("\r\n")) {
                if (length == 0) {
                    continue;
                }
                byte[] frame = new byte[length];
                dis.readFully(frame, 0, length);
                writeFrame(frame);
                bytes.clear();
                break;
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    long curTime = System.currentTimeMillis() - startTime;
    System.out.println(curTime);
}

private byte[] getPrimativeArray(ArrayList<Byte> array) {
    byte[] bytes = new byte[array.size()];
    for (int i = 0; i < array.size(); i++) {
        bytes[i] = array.get(i).byteValue();
    }
    return bytes;
}

private void writeFrame(byte[] bytes) throws IOException {
    File file = new File("C:\\test.jpg");
    FileOutputStream fos = new FileOutputStream(file);
    fos.write(bytes);
    fos.close();
    System.out.println("done");
}
Currently you do not handle the case where data is read while you are inside the frame part.
A rough guess:
Current version:
else if (line.equals("") && length != 0)
Probably more correct version:
else if (!line.equals("") && length != 0)
You cannot use a BufferedReader to read binary data; it will corrupt it. If you want to keep things simple, use DataInputStream.readLine(). Though not ideal, it may be the simplest approach in your case.
Other than using some bad practices, and assuming that your URLConnection delivers the data correctly, the example you posted seems to work if you reset the length to zero after reading the frame data.
} else if (line.equals("") && length != 0) {
    char[] buf = new char[length];
    reader.read(buf, 0, length);
    baos.write(new String(buf).getBytes());
    //break;
    length = 0; // <-- reset length
}
Please note this way all the frame data are written in the same ByteArrayOutputStream consecutively. If you don't want that, you should create a new ByteArrayOutputStream for every new frame you encounter.
You can't use a BufferedReader for part of the transmission and then some other stream for the rest of it. The BufferedReader will fill its buffer and steal some of the data you want to read with the other stream. Use DataInputStream.readLine(), noting that it's deprecated, or else roll your own line-reading code, using the input stream provided by the URLConnection.
Surely you don't have to? URLConnection reads the headers for you. If you want the content-length, use the API to get it. The stuff you get to read starts at the body of the transmission.
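To illustrate the roll-your-own line reading suggested above, here is a sketch (the readLine helper and the in-memory sample data are illustrative, not from the original code) that scans byte-by-byte for "\r\n", parses the Content-Length header, and then uses readFully for the binary frame, avoiding any Reader on the binary stream:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class MjpegLineReader {
    // Reads bytes up to and including "\r\n"; returns the line without the terminator.
    static String readLine(DataInputStream in) throws IOException {
        ByteArrayOutputStream line = new ByteArrayOutputStream();
        int prev = -1, cur;
        while ((cur = in.read()) != -1) {
            if (prev == '\r' && cur == '\n') {
                byte[] b = line.toByteArray();
                return new String(b, 0, b.length - 1, StandardCharsets.US_ASCII);
            }
            line.write(cur);
            prev = cur;
        }
        throw new EOFException("stream ended mid-line");
    }

    public static void main(String[] args) throws IOException {
        // Build one fake MJPEG part in memory; a real program would use the URL stream.
        byte[] frame = {(byte) 0xFF, (byte) 0xD8, 0x00, (byte) 0xD9};
        ByteArrayOutputStream part = new ByteArrayOutputStream();
        part.write(("--ipcamera\r\nContent-Type: image/jpeg\r\n"
                + "Content-Length: " + frame.length + "\r\n\r\n")
                .getBytes(StandardCharsets.US_ASCII));
        part.write(frame);

        DataInputStream in = new DataInputStream(new ByteArrayInputStream(part.toByteArray()));
        int length = -1;
        String line;
        while (!(line = readLine(in)).isEmpty()) {   // headers end at the blank line
            if (line.startsWith("Content-Length: "))
                length = Integer.parseInt(line.substring("Content-Length: ".length()));
        }
        byte[] read = new byte[length];
        in.readFully(read);                          // binary-safe bulk read of the frame
        System.out.println(java.util.Arrays.equals(frame, read)); // prints true
    }
}
```

This also avoids the per-byte ArrayList-to-array conversion in the question, which is the likely cause of the 500-900ms per frame.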
