C# equivalent of this stream-related code (Java)

I'm more of a Java developer than a C# developer, but I work with both languages. Unfortunately I'm nowhere near the level of most people in either language, which is why I'm constantly asking questions and reading to expand my knowledge.
Currently I've been working on a server/client in Java, which works wonderfully. I've written a test client in Java for a game that I've been working on in Unity3D. Honestly, I would just use Java for the entire game if the community were there to easily find level designers and so on.
In this code, I'm using a BufferedInputStream in Java, and the setup looks like this:
DataInputStream dataIn = new DataInputStream(new BufferedInputStream(socket.getInputStream()));
Throughout this code, I perform the following check to make sure all of the data for a given packet has arrived:
if (dataIn.available() < 4) {
    continue;
}
dataIn.mark(4);
int length = dataIn.readInt();
System.out.println("Packet length is " + length);
if (dataIn.available() < length) {
    System.out.println("Only read " + dataIn.available() + "/" + length + " bytes.");
    dataIn.reset();
    continue;
}
I've been struggling to find an equivalent to this in C#. Another issue I've noticed is that a value written by Java's DataOutputStream is not always read back identically by C#'s BinaryReader, but that's a separate problem (Java writes multi-byte values big-endian, while BinaryReader assumes little-endian).
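For what it's worth, the available()/mark()/reset() polling in the snippet above can usually be replaced by plain blocking reads, which is also the pattern that maps most directly onto C#'s BinaryReader. A minimal sketch, shown here against an in-memory stream rather than a socket (the class and method names are illustrative, not from the original code):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class BlockingPacketRead {
    // Reads one length-prefixed packet: a 4-byte big-endian length, then the payload.
    // readInt() and readFully() simply block until enough bytes have arrived,
    // so no available()/mark()/reset() bookkeeping is needed.
    static byte[] readPacket(DataInputStream in) throws IOException {
        int length = in.readInt();
        byte[] payload = new byte[length];
        in.readFully(payload);
        return payload;
    }

    public static void main(String[] args) throws IOException {
        // Build a fake "network" packet in memory for demonstration.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        byte[] body = "hello".getBytes("UTF-8");
        out.writeInt(body.length);
        out.write(body);

        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bos.toByteArray()));
        byte[] packet = readPacket(in);
        System.out.println(new String(packet, "UTF-8")); // prints "hello"
    }
}
```

The trade-off is that a blocking read ties up the thread, so this belongs on a dedicated reader thread rather than a polling loop.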

Something like this reads all the expected data into a MemoryStream. Further processing of the received data is possible by using the MemoryStream as a stream, or by getting its bytes with memoryStream.ToArray().
using (var ns = new NetworkStream(socket))
{
    int dataLength = 0;
    // read the 4-byte, big-endian data length
    for (int i = 0; i < 4; i++)
    {
        while (!ns.DataAvailable)
        {
            System.Threading.Thread.Sleep(20);
        }
        dataLength = (dataLength << 8) + ns.ReadByte();
    }
    // read the data itself
    byte[] buffer = new byte[1024];
    int bytesRead;
    using (var memoryStream = new MemoryStream())
    {
        while (memoryStream.Length < dataLength)
        {
            while (!ns.DataAvailable)
            {
                System.Threading.Thread.Sleep(20);
            }
            // cap the read so we never consume bytes belonging to the next packet
            int toRead = (int)Math.Min(buffer.Length, dataLength - memoryStream.Length);
            bytesRead = ns.Read(buffer, 0, toRead);
            memoryStream.Write(buffer, 0, bytesRead);
        }
    }
}
Edit: a minimalistic approach
Beware of socket.ReceiveBufferSize when using this approach! If it's smaller than the data size, you're in for a long sleep.
Socket socket = listener.AcceptSocket();
while (socket.Available < 4)
{
    System.Threading.Thread.Sleep(20);
}
byte[] lengthBuffer = new byte[4];
socket.Receive(lengthBuffer);
if (BitConverter.IsLittleEndian) Array.Reverse(lengthBuffer);
int dataLength = BitConverter.ToInt32(lengthBuffer, 0);
while (socket.Available < dataLength)
{
    System.Threading.Thread.Sleep(20);
}
byte[] dataBuffer = new byte[dataLength];
socket.Receive(dataBuffer);
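On the Java side, the matching sender is just a DataOutputStream, which always writes multi-byte values in big-endian (network) order; that is exactly why the C# snippet above reverses the length bytes on little-endian machines. A small sketch of the framing (the class and method names are illustrative, not from the original code):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class LengthPrefixedSend {
    // Frames a payload the way the Java client does: a 4-byte length,
    // written most significant byte first, followed by the payload itself.
    static byte[] frame(byte[] payload) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeInt(payload.length); // DataOutputStream is always big-endian
        out.write(payload);
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] wire = frame(new byte[]{10, 20, 30});
        // The length prefix appears on the wire as 00 00 00 03, most significant
        // byte first, which is why the C# reader must Array.Reverse it before
        // BitConverter.ToInt32 on a little-endian machine.
        for (byte b : wire) System.out.printf("%02x ", b);
        System.out.println();
    }
}
```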

Related

InputStream audio mixing (MODE_STREAM)

I'm making a drum sequencer in Android...
I'm writing to an AudioTrack in MODE_STREAM, so that I can achieve synchronized audio playback of all the InputStreams (available via a list of 'active' InputStreams, activeStreams in the code below).
The audio is always PCM (WAV): 16-bit stereo, 44100 Hz.
Obviously, I can't composite audio in real time on the UI thread, so I'm using an AsyncTask to queue up all the audio buffering.
I got buffered playback working, but when it comes to merging the buffers of two (or more) InputStreams, the internet seems to be in some kind of debate about what to do next. "Convert the byte[] to short[]!", "No, do the bit mixing on-the-fly!", "But if you don't use shorts the byte endianness is ignored!", "It gets ignored anyway!" - I don't even know any more.
How do I mix the buffers of two or more InputStreams? I don't understand why my current implementation is failing.
I've tried something like 4 different StackOverflow solutions to convert the byte[] to short[] so I can add the samples together, but the conversion always instantly crashes Java with a cryptic error message that I can't get my head around. So now I give up. Here's my code implementing one such StackOverflow solution...
protected Long doInBackground(Object... Object) {
    int bytesWritten = 0;
    InputStream inputStream;
    int si = 0, i = 0;
    // The combined buffers. The 'composition'.
    short[] cBuffer = new short[Synth.AUDIO_BUFFER_SIZE];
    // The 'current buffer', the segment of inputStream audio.
    byte[] bBuffer = new byte[Synth.AUDIO_BUFFER_SIZE];
    // The 'current buffer', converted to short?
    short[] sBuffer = new short[Synth.AUDIO_BUFFER_SIZE];
    int curStreamNum;
    int numStreams = activeStreams.size();
    short mix;
    // Start with an empty 'composition'.
    cBuffer = new short[Synth.AUDIO_BUFFER_SIZE];
    boolean bufferEmpty = false;
    try {
        while (true) { // keep going forever, until stopped or paused
            for (curStreamNum = 0; curStreamNum < numStreams; curStreamNum++) {
                inputStream = activeStreams.get(curStreamNum);
                i = inputStream.read(bBuffer);
                bufferEmpty = i <= -1;
                if (bufferEmpty) {
                    // Input stream buffer was empty. It's out of audio. Close and remove the stream.
                    inputStream.close();
                    activeStreams.remove(curStreamNum);
                    curStreamNum--; numStreams--; continue; // hard continue
                } else {
                    // Take the now-read buffer, and convert to shorts.
                    ByteBuffer.wrap(bBuffer).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(sBuffer);
                    // Take the short buffer, merge into composition buffer.
                    // TODO: Optimize by making the 'first layer' of the composition the first buffer, on its own.
                    for (si = 0; si < Synth.AUDIO_BUFFER_SIZE; si++) {
                        mix = (short) (sBuffer[si] + cBuffer[si]);
                        // This part is probably completely wrong too. I'm not up to here yet to evaluate what's needed...
                        if (mix >= 32767) {
                            mix = 32767;
                        } else if (mix <= -32768) {
                            mix = -32768;
                        }
                        cBuffer[si] = mix;
                    }
                }
            }
            track.write(sBuffer, 0, i);
            // It's always full; a full buffer of silence, or of composited audio.
            totalBytesWritten += Synth.AUDIO_BUFFER_SIZE;
            //.. queueNewInputStreams ..
            publishProgress(totalBytesWritten);
            if (isCancelled()) break;
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return Long.valueOf(totalBytesWritten);
}
I'm currently getting a BufferUnderflowException on this line: ByteBuffer.wrap(bBuffer).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(sBuffer);.
How is it possible to have buffer underrun? I'm only converting a byte[] to a short[].
Please help!
I've posted my whole function in the hopes that this more complete code sample and fairly adaptable usage can help other people out there.
(P.S. the byte[] to short[] conversion is followed by some flimsy hard clipping which I'm not even up to debugging yet, but advice there would also be appreciated)
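To make the failure concrete: a ShortBuffer view of an N-byte array holds only N/2 shorts, so asking get() to fill an N-element short[] underflows. A minimal reproduction and fix, assuming nothing beyond the JDK (the class name is illustrative):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class UnderflowDemo {
    // Correct conversion: the short array must be half the byte array's length.
    static short[] toShorts(byte[] bytes) {
        short[] shorts = new short[bytes.length / 2];
        ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts);
        return shorts;
    }

    public static void main(String[] args) {
        byte[] bytes = new byte[8]; // 8 bytes = only 4 shorts

        // Wrong: asking for 8 shorts from a buffer that contains 4.
        try {
            short[] tooBig = new short[8];
            ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(tooBig);
        } catch (RuntimeException e) {
            System.out.println(e.getClass().getSimpleName()); // prints "BufferUnderflowException"
        }

        // Right:
        System.out.println(toShorts(bytes).length); // prints "4"
    }
}
```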
Your solution seems almost good, I see two issues and a potential one:
the length of the short array: it MUST be half the length of the byte array, otherwise you get the underflow
the combination of the shorts should be the average of the shorts, not just the sum, or you'll get nothing but noise
(potential issue) the length of the array you read from an InputStream cannot be totally free, since you consume 2 bytes per sample for every InputStream (so the buffer length must be even), and you should take care of mono vs. stereo audio files (if stereo, you have 2 bytes for the left channel and 2 bytes for the right channel, interleaved)
Here is a snippet that I would use to sum two WAV arrays (16-bit, mono):
Random random = new Random();
int bufferLength = 20;
byte[] is1 = new byte[bufferLength];
byte[] is2 = new byte[bufferLength];
byte[] average = new byte[bufferLength];
random.nextBytes(is1);
random.nextBytes(is2);
short[] shorts1 = new short[bufferLength / 2];
ByteBuffer.wrap(is1).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts1);
short[] shorts2 = new short[bufferLength / 2];
ByteBuffer.wrap(is2).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts2);
short[] result = new short[bufferLength / 2];
for (int i = 0; i < result.length; i++) {
    result[i] = (short) ((shorts1[i] + shorts2[i]) / 2);
}
ByteBuffer.wrap(average).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().put(result);
For 32-bit stereo, the solution could be:
Random random = new Random();
int bufferLength = 8 * 50;
byte[] is1 = new byte[bufferLength];
byte[] is2 = new byte[bufferLength];
byte[] average = new byte[bufferLength];
random.nextBytes(is1);
random.nextBytes(is2);
System.out.println(bytesToHex(is1));
System.out.println(bytesToHex(is2));
int[] ints1 = new int[bufferLength / 4];
ByteBuffer.wrap(is1).order(ByteOrder.LITTLE_ENDIAN).asIntBuffer().get(ints1);
int[] ints2 = new int[bufferLength / 4];
ByteBuffer.wrap(is2).order(ByteOrder.LITTLE_ENDIAN).asIntBuffer().get(ints2);
int[] result = new int[bufferLength / 4];
for (int i = 0; i < result.length; i++) {
    result[i] = ((ints1[i] + ints2[i]) / 2);
}
ByteBuffer.wrap(average).order(ByteOrder.LITTLE_ENDIAN).asIntBuffer().put(result);

Download size larger than available stream/file size. (Asha 501, J2ME)

I'm trying to download a song file. The following code (well, the original code, this is just an example of what I'm doing) is working perfectly on an Asha 310 device. However, on the newer Asha 501 devices, the resulting downloaded file is much larger than the actual file size.
A 2.455.870 byte file ends up downloading 2.505.215 bytes if I use a 512-byte buffer, and it doesn't load either. Using a 4096-byte buffer, the file ends up being 3.342.335 bytes in size!!
What could be the reason for this happening? It's working perfectly on the other phone, and I'm using very reasonable buffers.
downloadedFile = (FileConnection) Connector.open(saveLocation + "testing.m4a", Connector.READ_WRITE);
if (!downloadedFile.exists()) {
    downloadedFile.create();
}
ops = downloadedFile.openOutputStream();
hc = (HttpConnection) Connector.open(url);
hc.setRequestMethod(HttpsConnection.POST);
hc.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
String postData = "sid=" + session.sid + "&fileid=" + file.getId();
byte[] request_body = postData.getBytes();
DataOutputStream dos = null;
dos = hc.openDataOutputStream();
for (int i = 0; i < request_body.length; i++) {
    dos.writeByte(request_body[i]);
}
byte[] buf = new byte[512];
dis = hc.openInputStream();
int downloadedSize = 0;
while (dis.read(buf) != -1) {
    ops.write(buf, 0, buf.length);
    downloadedSize += buf.length;
}
Turns out the buffer isn't always being completely filled, so the remainder of each partially filled buffer is junk. Which explains why, with a bigger buffer, the file is bigger: it has more junk.
http://developer.nokia.com/Community/Discussion/showthread.php/244179-Download-size-larger-than-available-stream-file-size-(Asha-501)
int len;
while ((len = dis.read(buf)) != -1) {
    ops.write(buf, 0, len);
    downloadedSize += len;
}
Edit: It was working on the older phones because they always filled the entire buffer with actual data. The newer devices don't.
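The corrected loop generalizes to any InputStream: read() returns the number of bytes it actually placed in the buffer, which may be fewer than the buffer's length, and only that many bytes are valid. A self-contained sketch of the copy idiom (the class name is illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyStream {
    // Copies in to out, writing only the bytes each read() actually returned.
    static int copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[512];
        int len, total = 0;
        while ((len = in.read(buf)) != -1) {
            out.write(buf, 0, len); // never write buf.length blindly
            total += len;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] src = new byte[1000]; // not a multiple of 512, so the last read is partial
        ByteArrayOutputStream dst = new ByteArrayOutputStream();
        int copied = copy(new ByteArrayInputStream(src), dst);
        System.out.println(copied + " bytes copied"); // prints "1000 bytes copied"
    }
}
```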

Send file length with OutputStream and receive length and byte[] with InputStream, for streaming frames from one device to the other (Android/Java)

I have searched and searched, and everything I have found has been helpful, but I keep getting an out-of-memory error. The images I send are .06 MB, so I know the problem isn't from decoding the byte[] into a bitmap. When I remove the while loops this works like a charm for one frame, but I want multiple frames. I am getting a byte[] and sending it to a different device using sockets, but I am at a loss how to do this. My problem is that I don't send and receive the correct byte[] length. This is what I am doing currently.
while (count != -1) {
    // first send the byte[] length
    dataOutputStream.writeInt(sendPackage.length);
    // pass a byte array
    publishProgress("sending file to client");
    showMyToastOnUiThread(String.valueOf(sendPackage.length));
    outputStream.write(sendPackage, 0, sendPackage.length);
    outputStream.flush();
}
Receive byte[] on different device:
int count = inputStream.read();
while (count != -1) {
    int byteArrayLength = dataInputStream.readInt();
    Log.i(MainActivity.TAG, "Starting convert to byte array");
    byte[] receivedBytes = convertInputStreamToByteArray(inputStream, byteArrayLength);
    Bitmap bitmap = BitmapFactory.decodeByteArray(receivedBytes, 0, receivedBytes.length);
    publishProgress(bitmap);
}
// convert InputStream to byte[]
public byte[] convertInputStreamToByteArray(InputStream inputStream, int readLength) {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    byte[] data = new byte[readLength];
    try {
        Log.i(MainActivity.TAG, "Starting convert to byte array while loop");
        int readTotal = 0;
        int count = 0;
        while (count >= 0 && readTotal < readLength) {
            count = inputStream.read(data, readTotal, readLength - readTotal);
            if (readLength > 0) {
                readTotal += count;
            }
        }
        Log.i(MainActivity.TAG, "Finished convert to byte array while loop");
    } catch (IOException e) {
        Log.e(MainActivity.TAG, "error: " + e.getMessage());
        e.printStackTrace();
    }
    return data;
}
This is the problem:
int count = inputStream.read();
while (count != -1) {
You're consuming a byte and then ignoring it. That means the next value you read (the size) will be incorrect. You need a different way of telling whether you're at the end of the stream. Some options:
Send a -1 when you're finished; that way you can stop as soon as readInt returns -1
If you know it, send the number of images you're going to send before you start sending them
Use mark(1), then read(), then reset(), if your stream supports marking. I don't know whether it will or not. You could always wrap it in a BufferedInputStream if not.
Reimplement DataInputStream.readInt yourself in a way which detects the end of the stream as being an expected possibility instead of throwing an exception
Just catch an exception in readInt (not nice - getting to the end of the stream isn't really exceptional)
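The fourth option above could be sketched like this: read the first byte separately so a clean end of stream can be reported, then require the remaining three bytes (the class and method names are illustrative, not from the original code):

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class ReadIntOrEof {
    // Returns the next big-endian int, or null if the stream ended cleanly
    // before the first byte. A stream that ends mid-int is still an error.
    static Integer readIntOrNull(InputStream in) throws IOException {
        int b1 = in.read();
        if (b1 == -1) return null; // clean end of stream: no more frames
        int b2 = in.read(), b3 = in.read(), b4 = in.read();
        if ((b2 | b3 | b4) < 0) throw new EOFException("stream ended mid-int");
        return (b1 << 24) | (b2 << 16) | (b3 << 8) | b4;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = {0, 0, 1, 2}; // one int: 258
        InputStream in = new ByteArrayInputStream(data);
        System.out.println(readIntOrNull(in)); // prints "258"
        System.out.println(readIntOrNull(in)); // prints "null"
    }
}
```

The receive loop then becomes: read a length, stop on null, otherwise read exactly that many payload bytes.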

Reading binary stream until "\r\n" is encountered

I'm working on a Java application which will stream video from an IP Camera. The video streams from the IP Camera in MJPEG format. The protocol is the following...
--ipcamera (\r\n)
Content-Type: image/jpeg (\r\n)
Content-Length: {length of frame} (\r\n)
(\r\n)
{frame}
(\r\n)
--ipcamera (\r\n)
etc.
I've tried using classes such as BufferedReader and Scanner to read until the "\r\n", however those are meant for text, not binary data, so the data becomes corrupt. Is there any way to read the binary stream until it encounters a "\r\n"? Here is my current (broken) code.
EDIT: I've gotten it to work. I updated the code below. However, it's really slow in doing so. I'm not sure if it has anything to do with the ArrayList or not, but it could be the culprit. Any pointers to speed up the code? It's currently taking 500ms to 900ms for a single frame.
public void run() {
    long startTime = System.currentTimeMillis();
    try {
        URLConnection urlConn = url.openConnection();
        urlConn.setReadTimeout(15000);
        urlConn.connect();
        urlStream = urlConn.getInputStream();
        DataInputStream dis = new DataInputStream(urlStream);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ArrayList<Byte> bytes = new ArrayList<Byte>();
        byte cur;
        int curi;
        byte[] curBytes;
        int length = 0;
        while ((curi = dis.read()) != -1) {
            cur = (byte) curi;
            bytes.add(cur);
            curBytes = getPrimativeArray(bytes);
            String curBytesString = new String(curBytes, "UTF-8");
            if (curBytesString.equals("--ipcamera\r\n")) {
                bytes.clear();
                continue;
            } else if (curBytesString.equals("Content-Type: image/jpeg\r\n")) {
                bytes.clear();
                continue;
            } else if (curBytesString.matches("^Content-Length: ([0-9]+)\r\n$")) {
                length = Integer.parseInt(curBytesString.replace("Content-Length: ", "").trim());
                bytes.clear();
                continue;
            } else if (curBytesString.equals("\r\n")) {
                if (length == 0) {
                    continue;
                }
                byte[] frame = new byte[length];
                dis.readFully(frame, 0, length);
                writeFrame(frame);
                bytes.clear();
                break;
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    long curTime = System.currentTimeMillis() - startTime;
    System.out.println(curTime);
}

private byte[] getPrimativeArray(ArrayList<Byte> array) {
    byte[] bytes = new byte[array.size()];
    for (int i = 0; i < array.size(); i++) {
        bytes[i] = array.get(i).byteValue();
    }
    return bytes;
}

private void writeFrame(byte[] bytes) throws IOException {
    File file = new File("C:\\test.jpg");
    FileOutputStream fos = new FileOutputStream(file);
    fos.write(bytes);
    fos.close();
    System.out.println("done");
}
Currently you do not handle the case where data is read inside the frame part.
A rough assumption is:
Current version:
else if (line.equals("") && length != 0)
Probably more correct version:
else if (!line.equals("") && length != 0)
You cannot use a BufferedReader to read binary data; it will corrupt it. If you want to keep things simple, use DataInputStream.readLine(). Though not ideal, it may be the simplest option in your case.
Other than using some bad practices and assuming that your URLConnection correctly delivers the data, the example you posted seems to work if you reset the length to zero after reading the frame data.
} else if (line.equals("") && length != 0) {
    char[] buf = new char[length];
    reader.read(buf, 0, length);
    baos.write(new String(buf).getBytes());
    //break;
    length = 0; // <-- reset length
}
Please note that this way all the frame data is written into the same ByteArrayOutputStream consecutively. If you don't want that, you should create a new ByteArrayOutputStream for every new frame you encounter.
You can't use a BufferedReader for part of the transmission and then some other stream for the rest of it. The BufferedReader will fill its buffer and steal some of the data you want to read with the other stream. Use DataInputStream.readLine(), noting that it's deprecated, or else roll your own line-reading code, using the input stream provided by the URLConnection.
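Rolling your own binary-safe line reader is only a few lines: accumulate bytes into a ByteArrayOutputStream until the \r\n pair is seen, and return the raw bytes. A minimal sketch (the class name is illustrative); it also avoids re-encoding the entire accumulated buffer on every byte, which is where most of the 500-900 ms per frame in the question's code is likely going:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class BinaryLineReader {
    // Reads bytes up to and including "\r\n"; returns null at end of stream.
    // Safe for binary data because no character decoding is involved.
    static byte[] readLine(InputStream in) throws IOException {
        ByteArrayOutputStream line = new ByteArrayOutputStream();
        int prev = -1, cur;
        while ((cur = in.read()) != -1) {
            line.write(cur);
            if (prev == '\r' && cur == '\n') return line.toByteArray();
            prev = cur;
        }
        return line.size() > 0 ? line.toByteArray() : null;
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("--ipcamera\r\nContent-Length: 5\r\n".getBytes());
        System.out.println(new String(readLine(in)).trim()); // prints "--ipcamera"
        System.out.println(new String(readLine(in)).trim()); // prints "Content-Length: 5"
    }
}
```

Wrap the URL stream in a BufferedInputStream first so the byte-at-a-time reads don't hit the network directly.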
Surely you don't have to? URLConnection reads the headers for you. If you want the content-length, use the API to get it. The stuff you get to read starts at the body of the transmission.

Sending data using a python client to a java server

I have been searching the web for a while now looking for an answer to this.
The python code for sending a file:
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
with open(path, mode='rb') as f:
    s.sendall(f.read())
The java code for receiving the data:
private BufferedInputStream bis = new BufferedInputStream(socket.getInputStream());
try (BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(path))) {
    while (true) {
        int size = bis.read(by);
        if (size > 0) {
            bos.write(by, 0, size);
            bos.flush();
            total += size;
            System.out.println(size + "(" + total + ")");
            if (total == length) {
                break;
            }
        }
    }
}.....
I see that the data is being sent, and I know the data is in the stream at the Java end. However, the bis.read(by) call hangs and refuses to read the data until the connection dies.
I suspect it has something to do with the "flush" stuff in Java, but I cannot find any way to do a "flush" using Python.
Any clues why this might happen?
I figured it out: for some reason a race condition is occurring; putting a sleep(0.2) in the Python code made the Java server able to respond.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sleep(0.2)
with open(path, mode='rb') as f:
    s.sendall(f.read())
