I have a client-server app. The client is in C++, the server in Java.
I am sending a byte stream from the client to the server, and from the server to the client.
Can you please tell me: when I send char(-1) from C++, what value does it correspond to in Java?
And what value must I send from Java to C++ to get char(-1) in the C++ code?
As you are writing through a byte stream, your char(-1) arrives as 255, since byte streams normally transmit unsigned bytes.
The -1 that is returned when you read past the end of a stream cannot be sent explicitly; it can only be produced by closing the stream.
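To make that concrete, here is a minimal Java sketch (assuming in and out are the socket's raw streams; the class and method are purely illustrative): InputStream.read() returns an unsigned value 0-255, so the C++ char(-1) shows up as 255, and writing 0xFF back is what the C++ side reads as char(-1).
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ByteValueDemo {
    // 'in' and 'out' stand for the socket's streams (assumption for this sketch).
    static void roundTrip(InputStream in, OutputStream out) throws IOException {
        int b = in.read();       // C++ char(-1) arrives as 255; read() returns -1 only at end of stream
        byte asByte = (byte) b;  // narrowing 255 to a Java byte yields -1 again
        out.write(0xFF);         // writing 0xFF (i.e. (byte) -1) is read as char(-1) by the C++ client
    }
}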
There's no single answer; it depends on how C++ encodes the data and how Java interprets it. The most common encoding of char(-1) is the number 255. Note that this isn't defined by C++; a one's-complement system might encode it as 254. But also note that there are innumerable ways to encode data across the wire: Elias coding, various ASN.1 encodings, decimal digits, hex, etc.
At the Java end, even assuming a simple char-to-byte encoding, it depends on how you de-serialise the byte and into what type.
I am creating an easy-to-use client-server model with an extensible protocol, where the server is in Java and clients can be Java, C#, what have you.
I ran into this issue: Java data streams write strings with a short designating the length, followed by the data.
C# lets me specify the encoding I want, but it only reads one byte for the length. (Actually, it says "7 bits at a time"... this is odd, and might be part of my problem?)
Here is my setup: The server sends a string to the client once it connects. It's a short string, so the first byte is 0 and the second byte is 9; the string is 9 bytes long.
//...
_socket.Connect(host, port);
var stream = new NetworkStream(_socket);
_in = new BinaryReader(stream, Encoding.UTF8);
Console.WriteLine(_in.ReadString()); //outputs nothing
Reading a single byte before reading the string of course outputs the expected string. But, how can I set up my stream reader to read a string using two bytes as the length, not one? Do I need to subclass BinaryReader and override ReadString()?
The C# BinaryWriter/BinaryReader string format uses, if I recall correctly, the 8th bit of each length byte to signal whether another length byte follows. This lets counts up to 127 fit in a single byte while still allowing much larger counts (up to 2^31-1); it's a bit like UTF-8 in that respect.
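If you would rather keep C#'s ReadString() on the client, the Java server could emit the length in that 7-bit format itself. A minimal sketch (the class and method names are just illustrative):
import java.io.IOException;
import java.io.OutputStream;

final class SevenBitLength {
    // .NET-style length prefix: low 7 bits per byte, high bit set while more bytes follow.
    static void write(OutputStream out, int value) throws IOException {
        while ((value & ~0x7F) != 0) {
            out.write((value & 0x7F) | 0x80); // continuation bit set
            value >>>= 7;
        }
        out.write(value); // last byte, high bit clear
    }
}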
For your own purposes, note that you are writing the whole protocol (presumably), so you have complete control over both ends. Both behaviors you describe, in C# and Java, are implemented by what are essentially helper classes in each language. There's nothing saying that you have to use them, and both languages offer a way to simply encode text directly into an array of bytes which you can send however you like.
If you do want to stick with the Java-based protocol, you can read those two length bytes explicitly and assemble the count yourself. Just bear in mind that Java writes the short in big-endian order, while BitConverter uses the machine's native (usually little-endian) order, so a straight BitConverter.ToInt16 would give the wrong value on most hardware. For example:
_in = new BinaryReader(stream, Encoding.UTF8);
byte[] header = _in.ReadBytes(2);
// Java writes the length big-endian, so assemble it manually instead of trusting native byte order.
short count = (short)((header[0] << 8) | header[1]);
byte[] data = _in.ReadBytes(count);
string text = Encoding.UTF8.GetString(data);
Console.WriteLine(text); // outputs the expected string
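On the Java side, if you would rather make the framing explicit instead of relying on writeUTF, a minimal sketch (assuming a DataOutputStream wrapped around the socket's output stream; names are illustrative) might be:
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

final class StringSender {
    // Writes a two-byte big-endian length followed by the UTF-8 bytes
    // (assumes the encoded string is shorter than 65536 bytes).
    static void send(OutputStream rawOut, String text) throws IOException {
        byte[] data = text.getBytes(StandardCharsets.UTF_8);
        DataOutputStream out = new DataOutputStream(rawOut);
        out.writeShort(data.length); // same two-byte header the C# code above reads
        out.write(data);
        out.flush();
    }
}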
I have a multi-threaded client-server application that uses Vector<String> as a queue of messages to send.
I need, however, to send a file using this application. In C++ I would not really worry, but in Java I'm a little confused about converting arbitrary data to a string.
Java has 2-byte characters. When you look at a Java string in hex, it usually looks like:
00XX 00XX 00XX 00XX
unless some non-ASCII characters are present.
Java also uses big-endian byte order.
These facts make me unsure whether (and if so, how) to add the file to the queue. The preferred format for the file would be:
-- Headers --
2 bytes Size of the block (excluding the headers, i.e. the first four bytes)
2 bytes Data type (text message/file)
-- End of headers --
2 bytes Internal file ID (to avoid referring by filenames)
2 bytes Length of filename
X bytes Filename
X bytes Data
You can see I'm already using 2 bytes for all numbers, to avoid the horrible operations that would be required to get two numbers out of one char.
But I really have no idea how to add the file data correctly. For the numbers, I assume this would do:
StringBuilder packetData = new StringBuilder();
packetData.append((char) packetSize);
packetData.append((char) PacketType.BINARY.ordinal()); //Just convert enum constant to number
But the file itself is a real problem. If I have also described anything wrongly regarding the Java data types, please correct me - I'm a beginner.
Does it have to send only Strings? If it does, then you really need to encode the file using Base64 or similar (a sketch follows below). The best approach overall would probably be to send the file as raw bytes. Depending on how difficult it would be to refactor your code to support byte arrays instead of just Strings, that may be worth doing.
To answer the String question I just saw pop up in the comments: there's a getBytes method on String.
For the socket question, see:
Java sending and receiving file (byte[]) over sockets
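If the String queue has to stay, here is a minimal sketch of the Base64 route mentioned above (the file path and queue are just placeholders):
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class FileToString {
    public static void main(String[] args) throws Exception {
        // Read the raw bytes of the file to send (placeholder path).
        byte[] raw = Files.readAllBytes(Paths.get("example.bin"));

        // Base64 turns arbitrary bytes into plain ASCII text that survives the String queue.
        String encoded = Base64.getEncoder().encodeToString(raw);
        // messageQueue.add(encoded);  // receiver: Base64.getDecoder().decode(encoded)
    }
}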
I am trying to implement a binary protocol in a Java Android application. The variables in this protocol are unsigned and are either uint32, uint16 or uint8.
I am having trouble with sending integer values. For example, when I try to send a short with a value of 1, the server (written in C++) receives a value of 256.
After searching a bit, I saw some posts talking about endianness, but they don't really give me an answer.
How can I get the bytes of my Java variables laid out in the same order as they are in C++?
Thanks
C++ does not define the order in which bytes in multibyte integers are stored. You should pick one standard and make sure that everyone uses it.
The standard API in Java has many classes that use big-endian byte order, so you might as well use that as the standard. To receive these correctly in C++ you can use the ntohl and ntohs functions for the conversion, for example.
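For illustration, a minimal Java sketch of that convention (DataOutputStream always writes multi-byte values big-endian, i.e. in network order, which is what ntohs/ntohl undo on the C++ side):
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class NetworkOrderDemo {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream raw = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(raw);

        out.writeShort(1);                // written big-endian
        byte[] wire = raw.toByteArray();  // {0x00, 0x01}: ntohs() on the C++ side yields 1
        System.out.printf("%02x %02x%n", wire[0], wire[1]);
    }
}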
I solved it! It was an endianness problem; I fixed it using the order() method of ByteBuffer.
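For completeness, a minimal sketch of that ByteBuffer approach (assuming the C++ side expects little-endian values, which the 1-versus-256 symptom suggests):
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ByteOrderDemo {
    public static void main(String[] args) {
        short value = 1;

        // ByteBuffer is big-endian by default; order() switches it explicitly.
        ByteBuffer buf = ByteBuffer.allocate(2).order(ByteOrder.LITTLE_ENDIAN);
        buf.putShort(value);

        byte[] wire = buf.array();  // {0x01, 0x00} with little-endian order
        System.out.printf("%02x %02x%n", wire[0], wire[1]);
    }
}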
I have a client in Python that sends data (preceded by a data length message):
s = socket.socket()
s.connect((host, port))
data = 'hello world'
s.sendall('%16s' % len(data)) #send data length
s.sendall(data) #send data
s.close()
And a server in Java that receives the data. The server uses DataInputStream.readInt() to read the data length before reading the data. However, I seem to be getting really large numbers back from readInt(). What is the problem?
Java expects the binary representation of your integer, not text. You can use the struct module to generate binary representations.
In your case, this would be:
import struct
s.sendall(struct.pack('!i', len(data)))
The '!' in the format string forces network byte order (big-endian) and a standard 4-byte int, which is what Java's DataInputStream.readInt expects.
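On the Java side, the matching read (assuming the 4-byte length is immediately followed by that many bytes of payload; names are illustrative) would look roughly like:
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

final class LengthPrefixedReader {
    // Reads a 4-byte big-endian length, then exactly that many payload bytes.
    static byte[] readMessage(InputStream rawIn) throws IOException {
        DataInputStream in = new DataInputStream(rawIn);
        int length = in.readInt();         // big-endian, matches struct.pack('!i', ...)
        byte[] payload = new byte[length];
        in.readFully(payload);             // blocks until all bytes have arrived
        return payload;
    }
}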
As #mensi says, absent any other processing Java expects to receive the binary representation of the data, which differs from what Python is sending. A common solution to this sort of issue is to serialize your data -- that is, to translate it into a format more suitable for network transmission, and reconstitute the data on the receiving side.
A common serialization format for which both Python and Java have support is JSON. Recent versions of Python have the json module as part of the standard library.
I have a Java client and a C++ server. All values are sent as byte arrays. The numeric values are received fine, but the string values, when stored in a char array in C++, have stray characters such as form feeds or line feeds at the end of the value. Can someone suggest a solution to this problem?
Yes - use Google Protocol Buffers for serialization/deserialization. It's an open-source, stable, easy-to-use, cross-platform package.
How are you serialising/deserialising? You should decide on an encoding (for example ASCII), then write the length of the string first as an int; that way the server can read the int and will know how many bytes of string to read.
Once it has read those bytes, it just needs to append a '\0' to the char array to terminate the string.
Depending on what you are using to write the string in Java (for example a DataOutputStream named out), you would do something like:
byte[] bytes = string.getBytes("ASCII");
out.writeInt(bytes.length);   // length in bytes, not characters
out.write(bytes);
and in your C++ server you would do the reverse.
1) Make sure the server code is complying with your protocol at the byte level.
2) Make sure the client code is complying with your protocol at the byte level.
3) If you have done 1 and 2 and you still have problems, your protocol is broken. Most likely, it fails to specify properly how the server marks where each string ends and how the client is supposed to detect that.