I'm having a problem sending datagram packets in Java. Part of my code is below.
Sender:
String str = "abcdefghijk.txt";
byte[] data = new byte[1000];
ByteBuffer buf = ByteBuffer.wrap(data);
buf.put(str.getBytes());
//data = str.getBytes(); line 1
//checksum
crc.reset();
crc.update(data, 8, data.length-8);
checksum = crc.getValue();
buf.rewind();
buf.putLong(checksum);
packet = new DatagramPacket(data, data.length, address);
Receiver:
packet.setLength(data.length);
socket.receive(packet);
data = packet.getData();
str = new String(data);
str = str.trim();
buf.rewind();
checksum = buf.getLong();
crc.reset();
crc.update(data, 8, packet.getLength()-8);
I then do a check using checksum == crc.getValue(). If I run the code as it is, the checksum is valid, but the received str looks like this -> ##$%ijk.txt (garbage values in front). The first 8 characters are gone in this case, which I think has something to do with the getLong().
However, if I use line 1 in my code, the received str is correct (abcdefghijk.txt), but the checksum is wrong.
Note that this is not the entire code, only the part affecting the output. Any help is appreciated.
I believe the problem is that you assume your packet will arrive in one chunk, but streams can cut the data into slices.
On the output side, you have to frame your data so you know where it starts and where it stops.
On the input side, you have to rebuild your buffer chunk by chunk until you find that 'end tag'.
Are you using ObjectStreams? If so, be aware that they send and receive their own identifiers through the streams, which could explain the missing 8 bytes.
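For reference, since the question notes that the first 8 characters vanish around the putLong()/getLong() calls: a minimal sketch of one way to lay the buffer out so the 8-byte checksum slot and the payload don't overlap (variable names mirror the question; CRC32 is java.util.zip.CRC32):
import java.nio.ByteBuffer;
import java.util.zip.CRC32;

String str = "abcdefghijk.txt";
byte[] data = new byte[1000];
ByteBuffer buf = ByteBuffer.wrap(data);
buf.position(8);                       // reserve bytes 0..7 for the checksum
buf.put(str.getBytes());               // payload now starts at offset 8
CRC32 crc = new CRC32();
crc.reset();
crc.update(data, 8, data.length - 8);  // checksum covers the payload only
buf.rewind();
buf.putLong(crc.getValue());           // lands in the reserved slot, not on the payload
// Receiver side, sketched as comments:
// long checksum = ByteBuffer.wrap(packet.getData()).getLong();
// String str = new String(packet.getData(), 8, packet.getLength() - 8).trim();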
Related
I have a situation where I have a String and three binary literals that I need to add to a byte array to send to a server.
The Client:
String arbitrary = "/AN/ARBITRARY/STRING";
int b_f32b = 0b00000000000000000000000011111111;
int b_seconds = 0b00000000000000001111111111111111;
int b_fraction = 0b01000000000000000000000000000000;
ByteBuffer bb = ByteBuffer.allocate(1024);
bb.put(arbitrary.getBytes());
bb.putInt(b_f32b);
bb.putInt(b_seconds);
bb.putInt(b_fraction);
bb.clear();
byte[] sendDataBytes = new byte[bb.capacity()];
bb.get(sendDataBytes, 0, sendDataBytes.length);
DatagramPacket sendPacket = new DatagramPacket(sendDataBytes, sendDataBytes.length, IPAddress, 7000);
clientSocket.send(sendPacket);
On the Server:
DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
serverSocket.receive(receivePacket);
String sentence = new String( receivePacket.getData());
The result is:
RECEIVED: /AN/ARBITRARY/STRING ÿ ÿÿ#
The String works fine, but the binary values do not. Is there something fundamental I am missing?
You're missing the fundamental distinction between binary and character data.
Strings are character data, meaning they're displayable. Binary data is not necessarily displayable (especially in this case, since you know you're not writing character data).
The putInt() method puts 4 bytes in the buffer. Read them back with getInt() and you'll get a value you can display.
Edit:
You're also creating a 1024-byte array with byte[] sendDataBytes = new byte[bb.capacity()]; and sending all of it over the wire. You'll never know how much of the 1024 bytes is actual data and how much is just empty garbage.
Yes, you're missing a lot of fundamentals here and it's too broad to address everything in an answer.
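As an illustration, a minimal sketch of how the server could read those values back, assuming both sides know the string's byte length (20 bytes for "/AN/ARBITRARY/STRING" in ASCII) and receivePacket is filled as in the question:
ByteBuffer rb = ByteBuffer.wrap(receivePacket.getData(), 0, receivePacket.getLength());
byte[] strBytes = new byte[20];            // "/AN/ARBITRARY/STRING" is 20 ASCII bytes
rb.get(strBytes);
String arbitrary = new String(strBytes, java.nio.charset.StandardCharsets.US_ASCII);
int b_f32b = rb.getInt();                  // each getInt() reads back the 4 bytes
int b_seconds = rb.getInt();               // written by the matching putInt()
int b_fraction = rb.getInt();
On the sending side, calling bb.flip() after the last putInt() and building the packet from bb.remaining() bytes would also avoid shipping the unused tail of the 1024-byte buffer.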
This is the subsequent question of my previous one:
Java UDP send - receive packet one by one
As I indicated there, basically, I want to receive packets one by one, exactly as they arrive, via UDP.
Here's an example code:
ds = new DatagramSocket(localPort);
byte[] buffer1 = new byte[1024];
DatagramPacket packet = new DatagramPacket(buffer1, buffer1.length);
ds.receive(packet);
Log.d("UDP-receiver", packet.getLength()
+ " bytes of the actual packet received");
Here, the actual packet size is, say, 300 bytes, but buffer1 is allocated as 1024 bytes, and it seems wrong to work with buffer1 directly.
How do I obtain a byte[] array of the actual packet size from here?
And, more fundamentally, why do we need to preallocate a buffer to receive a UDP packet in Java like this? (Node.js doesn't do this.)
Is there any way to receive the UDP packet directly, without pre-allocating a buffer?
Thanks for your thoughts.
You've answered your own question. packet.getLength() returns the actual number of bytes in the received datagram, so you just have to use buffer1 from index 0 to index packet.getLength()-1.
Note that this means that if you're calling receive() in a loop, you have to recreate the DatagramPacket each time around the loop, or reset its length to the maximum before the receive. Otherwise getLength() keeps shrinking to the size of the smallest datagram received so far.
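A minimal sketch of that pattern, with the length reset before every receive (exception handling omitted; ds is the DatagramSocket from the question):
byte[] buffer1 = new byte[1024];
DatagramPacket packet = new DatagramPacket(buffer1, buffer1.length);
while (true) {
    packet.setLength(buffer1.length);  // restore full capacity before each receive
    ds.receive(packet);
    byte[] data = java.util.Arrays.copyOfRange(packet.getData(), packet.getOffset(),
            packet.getOffset() + packet.getLength());
    // data now holds exactly the bytes of this datagram
}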
Self answer. I did it as follows:
int len = 1024;
byte[] buffer2 = new byte[len];
DatagramPacket packet;
byte[] data;
while (isPlaying)
{
    try
    {
        packet = new DatagramPacket(buffer2, len);
        ds.receive(packet);
        // copy only the bytes that actually arrived
        data = new byte[packet.getLength()];
        System.arraycopy(packet.getData(), packet.getOffset(), data, 0, packet.getLength());
        Log.d("UDPserver", data.length + " bytes received");
    }
    catch (IOException e)
    {
        // handle the error from receive()
    }
}
I'm using an array of bytes to store a data packet received from another computer.
receivedData = new byte[1024];
receivedPacket = new DatagramPacket(receivedData, receivedData.length);
socket.receive(receivedPacket);
receivedData = receivedPacket.getData();
String res = new String(receivedData); // PROBLEM HERE
The problem is at the last line: because I declared receivedData as a byte array of length 1024, the last line always creates a new string from the whole array, even though it doesn't know how many of the bytes were actually received. So I get a frustrating result: res is not what I expect, because the real bytes I received don't fill the whole array.
So, my question is: how can I fix this? How can I know how many bytes I really received, so I can convert them to a string?
Try using DatagramPacket.getLength().
receivedData = new byte[1024];
receivedPacket = new DatagramPacket(receivedData, receivedData.length);
socket.receive(receivedPacket);
receivedData = receivedPacket.getData();
String charsetName = "US-ASCII"; // set to desired charset
String res = new String(receivedData, 0, receivedPacket.getLength(), charsetName);
Edited to add charset. Thanks, parsifal.
From the javadoc for DatagramSocket.receive():
The length field of the datagram packet object contains the length of
the received message
You can then construct your String using the constructor that takes a byte array, an offset, and a length.
Call DatagramPacket.getLength() to find out how many bytes were actually received.
And when you construct the String from those bytes, be sure to specify the encoding (as it is, you're using the JDK default encoding, which may differ from the server's encoding).
I'm trying to convert a received DatagramPacket to a String, but I have a small problem and am not sure of the best way to go about it.
The data I'll be receiving is mostly of unknown length, hence I have a buffer[1024] set up on my receiving side. The problem is, suppose I send the string "abc" and then do the following on my receiving side...
buffer = new byte[1024];
packet = new DatagramPacket(buffer, buffer.length);
socket.receive(packet);
buffer = packet.getData();
System.out.println("Received: "+new String(buffer));
I get the following output: abc[][][][]][][][]..... all the way to the buffer length.
I'm guessing all the junk/null at the end should've been ignored, so I must be doing something wrong. I know buffer.length is the problem because if I change it to 3 (for this example), my output comes out just fine.
Thanks.
new String(buffer, 0, packet.getLength())
Use this code instead:
String msg = new String(packet.getData(), packet.getOffset(), packet.getLength());
The DatagramPacket's length field gives the length of the actual packet received. Refer to the javadoc for DatagramSocket.receive for more details.
So you simply need to use a different String constructor, passing the byte array and the actual received byte count.
See #jtahlborn or #GiangPhanThanhGiang's answers for example.
However, that still leaves the problem of which character encoding should be used when decoding the bytes into a (UTF-16) Java String. For your particular example it probably doesn't matter, but if you are passing data that could include non-ASCII characters, then you need to decode using the correct charset. If you get that wrong, you are liable to get garbled characters in your String values.
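For example (StandardCharsets.UTF_8 is an assumption here; it has to match whatever charset the sender used):
import java.nio.charset.StandardCharsets;

String msg = new String(packet.getData(), packet.getOffset(),
        packet.getLength(), StandardCharsets.UTF_8);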
As I understand it, the DatagramPacket just has a bunch of junk at the end. As Stephen C. suggests, you might be able to find the actual length received. In that case, use:
int realSize = packet.getLength(); // method suggested by Stephen C.
byte[] realPacket = new byte[realSize];
System.arraycopy(buffer, 0, realPacket, 0, realSize);
As for finding the length, I don't know.
Try
System.out.println("Received: "+new String(buffer).trim());
or
String sentence = new String(packet.getData()).trim();
System.out.println("Received: "+sentence);
Use this code instead:
buffer = new byte[1024];
packet = new DatagramPacket(buffer, buffer.length);
socket.receive(packet);
String data = new String(packet.getData(), 0, packet.getLength());
System.out.println("Received: "+data);
I have a server-client application that uses a datagram socket to exchange messages. I initially set the buffer size to 1024 bytes because I don't know the length of the messages. When I send something shorter than 1024 bytes, the rest of my string is displayed as weird characters (null characters, or whatever they are called).
Client code:
byte[] buf = ("This is another packet.\n").getBytes();
DatagramPacket packet = new DatagramPacket(buf, buf.length, inetAddress, serverport);
socket.send(packet);
Server code:
byte[] buf = new byte[1024];
DatagramPacket packet = new DatagramPacket(buf, buf.length);
socket.receive(packet);
byte[] data = new byte[packet.getLength()];
System.arraycopy(packet.getData(), packet.getOffset(), data, 0, packet.getLength());
DatagramPacket.getLength() returns the actual length of the received packet. Unless you created the packet with a non-zero offset, that means the data is at {0..getLength()-1}.
Note that this means the original length you created the DatagramPacket with is lost, which in turn implies that you must either use a new DatagramPacket per receive, or at least re-initialize its data buffer via setData(). Otherwise the DatagramPacket will keep shrinking to the size of the smallest packet received.
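A short sketch of the setData() variant mentioned above:
packet.setData(buf, 0, buf.length);  // restore the full buffer before reusing the packet
socket.receive(packet);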
You have to check packet.getOffset() to find where in the buffer the received data starts and packet.getLength() to get the length of the data (in number of bytes).
You should also consider that if the received packet is too large to fit in the provided buffer (in your case, more than 1024 bytes), the extra data is simply discarded. Unless you have to be very careful about memory usage, use a larger buffer to make sure the entire packet fits. For UDP, the maximum packet size is 64 KB.
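For instance, sizing the buffer to the worst case guarantees nothing is truncated (65507 bytes is the largest possible UDP payload over IPv4: 65535 minus the 8-byte UDP header and the 20-byte IP header):
byte[] buf = new byte[65507];  // maximum UDP-over-IPv4 payload
DatagramPacket packet = new DatagramPacket(buf, buf.length);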
OK, so I came up with a solution that worked for me:
public String getRidOfAnnoyingChar(DatagramPacket packet){
    String result = new String(packet.getData());
    char[] annoyingchar = new char[1];   // holds '\u0000', the character padding the buffer
    char[] charresult = result.toCharArray();
    result = "";
    for(int i = 0; i < charresult.length; i++){
        if(charresult[i] == annoyingchar[0]){
            break;   // stop at the first null character
        }
        result += charresult[i];
    }
    return result;
}
EDIT:
There exists a better solution using ByteArrayOutputStream which can be found here: How to reinitialize the buffer of a packet?
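For completeness, a sketch of what that ByteArrayOutputStream approach might look like (the charset is an assumption and must match the sender's):
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

ByteArrayOutputStream out = new ByteArrayOutputStream();
out.write(packet.getData(), packet.getOffset(), packet.getLength());
String result = new String(out.toByteArray(), StandardCharsets.UTF_8);  // only the bytes received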