I am trying to make a TCP request in Java to an existing TCP server. The interface specification is:
Field Length Type
Length 2 bytes 16-bits binary
Message ID 1 byte 8-bits binary
MSGTYPE 1 byte 8-bits binary
Variable1 4 bytes 32-bits binary
Variable2 30 bytes ASCII
Variable3 1 byte 8-bits binary
I understand how to convert a String to binary using BigInteger:
String testing = "Test Binary";
byte[] bytes = testing.getBytes();
BigInteger bi = new BigInteger(bytes);
System.out.println(bi.toString(2));
My understanding is that to make a TCP request I would first need to convert each field to a binary string and append the values to a StringBuffer. Unfortunately my understanding is limited, so I wanted some advice on creating the TCP request correctly.
I wouldn't use String (as you have binary data), StringBuffer (ever), or BigInteger (as this is not what it's designed for).
Assuming you have a Big Endian data stream I would use DataOutputStream
DataOutputStream out = new DataOutputStream(new BufferedOutputStream(socket.getOutputStream()));
out.writeShort(length);
out.write(messageId);
out.write(msgtype);
out.writeInt(var1);                                        // 4 bytes, big endian
out.write(Arrays.copyOf(var2.getBytes("ISO-8859-1"), 30)); // pad to exactly 30 bytes
out.write(var3);
out.flush(); // optionally.
If you have a little-endian protocol, you need to use ByteBuffer, in which case I would use a blocking NIO SocketChannel.
BTW I would use ISO-8859-1 (8-bit bytes) rather than US-ASCII (7-bit bytes).
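For the little-endian case, a minimal sketch of building the same 39-byte message with ByteBuffer (the class and method names here are placeholders, and the field values are made up for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class LittleEndianMessage {
    // Builds the 39-byte message (2 + 1 + 1 + 4 + 30 + 1) in little-endian order.
    static byte[] build(short length, byte messageId, byte msgType,
                        int var1, String var2, byte var3) {
        ByteBuffer buf = ByteBuffer.allocate(39).order(ByteOrder.LITTLE_ENDIAN);
        buf.putShort(length);
        buf.put(messageId);
        buf.put(msgType);
        buf.putInt(var1);
        // Pad/truncate var2 to exactly 30 bytes of 8-bit text.
        buf.put(Arrays.copyOf(var2.getBytes(StandardCharsets.ISO_8859_1), 30));
        buf.put(var3);
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] msg = build((short) 39, (byte) 1, (byte) 2, 42, "HELLO", (byte) 0);
        System.out.println(msg.length); // 39
    }
}
```

The resulting array can then be written to the socket (or SocketChannel) in one call.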
You are going in the wrong direction. The fact that the message specification states that, for example, the first field is 16-bit binary doesn't mean that you will send a binary string. You will just send an (unsigned?) 16-bit number, which will, as a matter of fact, be encoded in binary, since its internal representation can only be that.
When writing onto a socket through a DataOutputStream, as in
int value = 123456;
out.writeInt(value);
you are already writing it in binary.
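A self-contained round trip showing this, using in-memory streams in place of a socket:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class BinaryRoundTrip {
    // Writes an int as 4 big-endian bytes and reads it back.
    static int roundTrip(int value) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        new DataOutputStream(sink).writeInt(value); // binary on the wire, not text

        return new DataInputStream(
                new ByteArrayInputStream(sink.toByteArray())).readInt();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip(123456)); // 123456
    }
}
```

Only 4 bytes are transmitted, regardless of how many decimal digits the number has.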
Related
I want to convert a String of any length to 32 bytes (byte32) in Java.
Code
String s="9c46267273a4999031c1d0f7e40b2a59233ce59427c4b9678d6c3a4de49b6052e71f6325296c4bddf71ea9e00da4e88c4d4fcbf241859d6aeb41e1714a0e";
//Convert into byte32
From the comments it became clear that you want to reduce the storage space of that string to 32 bytes.
The given string can easily be compressed from 124 bytes to 62 bytes by doing a hexadecimal conversion.
However, there is no algorithm and there will not be an algorithm that can compress any data to 32 bytes. Imagine that would be possible: it would have been implemented and you would be able to get ZIP files of just 32 bytes for any file you compress.
So, unfortunately, the answer is: it's not possible.
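The hexadecimal conversion mentioned above can be sketched as follows (every two hex digits become one byte, so the 124-character string packs into 62 bytes; `pack` is a hypothetical helper name):

```java
public class HexPack {
    // Packs a hex string into raw bytes: two hex digits -> one byte,
    // halving the storage size. Assumes an even-length, valid hex string.
    static byte[] pack(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(pack("9c46267273a49990").length); // 8
    }
}
```

Halving is the best a general re-encoding can guarantee here; no scheme can squeeze arbitrary data into 32 bytes.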
You cannot convert a string of any length to a byte array of length 32.
Java uses UTF-16 as its string encoding, so to store 100% of the string, 1:1, as a fixed-length byte array, you would at a surface glance be limited to 16 characters.
If you are willing to live with the limit of 16 characters, byte[] bytes = s.getBytes(); gives you a variable-length byte array, but it's best to specify an explicit encoding, e.g. byte[] array2 = str.getBytes("UTF-16");
This doesn't completely solve your problem. You will now likely have to check that the byte array doesn't exceed 32 bytes, and come up with strategies for padding and possibly null termination (which may eat into your character budget).
Now, if you don't need the entire UTF-16 string space that Java uses for strings by default, you can get away with longer strings, by using other encodings.
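For example, if the strings are known to fit in an 8-bit encoding such as ISO-8859-1 (one byte per character), up to 32 characters fit in 32 bytes. A sketch with a hypothetical helper that zero-pads to a fixed 32 bytes:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class FixedBytes {
    // Encodes s as ISO-8859-1 (one byte per char), zero-padded to 32 bytes.
    // Rejects strings that encode to more than 32 bytes rather than truncating.
    static byte[] toFixed32(String s) {
        byte[] encoded = s.getBytes(StandardCharsets.ISO_8859_1);
        if (encoded.length > 32) {
            throw new IllegalArgumentException("encodes to more than 32 bytes");
        }
        return Arrays.copyOf(encoded, 32); // zero-padded on the right
    }

    public static void main(String[] args) {
        System.out.println(toFixed32("hello").length); // 32
    }
}
```

Note the trade-off: the zero padding doubles as a terminator, so embedded NUL characters in the input would be ambiguous.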
If this is to be used with some other standard (I see references to Ethereum being thrown around), then you will need to follow that standard's rules.
Unless you are writing your own library for dealing with it directly, I highly recommend using a library that already exists and appears to be well tested and widely used.
You can achieve this with the following:
byte[] bytes = s.getBytes();
I have a Java SocketServer that sends doubles to a C# client. The server sends the doubles with DataOutputStream.writeDouble() and the client reads them with BinaryReader.ReadDouble().
When I send dos.writeDouble(0.123456789); and flush it from the server, the client reads and outputs 3.1463026401691E+151, which is different from what I sent.
Are the C# and Java doubles each encoded differently?
In Java, DataOutputStream.writeDouble() converts the double to a long before sending, writing it high byte first (big endian).
However, C#'s BinaryReader.ReadDouble() reads in little-endian format.
In other words: The byte order is different, and changing one of them should fix your problem.
The easiest way to change the byte order in Java from big to little endian is to use a ByteBuffer, where you can specify the endianness, e.g.:
ByteBuffer buffer = ByteBuffer.allocate(8); // 8 bytes per double
buffer.order(ByteOrder.LITTLE_ENDIAN);
// add stuff to the buffer, e.g. buffer.putDouble(value);
byte[] bytes = buffer.array();
Then, send the array with DataOutputStream.write(bytes).
The issue is in fact with the encoding, specifically endianness. Java uses big-endian format, which is the standard network endianness, while your C# client uses little-endian format.
So here is what happened: 0.123456789 is stored in the IEEE754 Double precision format as 0x3FBF9ADD3739635F. When this is read in C#, the byte order is switched, so it is stored as 0x5F633937DD9ABF3F. This corresponds to the decimal number 3.14630264016909969143315814746e151.
Take a look at this question to see how to reverse the byte order on the C# client side.
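Alternatively, if you fix it on the Java side, a sketch of encoding the double as little-endian bytes with ByteBuffer, so that C#'s BinaryReader can read it directly:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class LittleEndianDouble {
    // Encodes a double as 8 little-endian bytes, matching what
    // C#'s BinaryReader.ReadDouble() expects.
    static byte[] encode(double d) {
        return ByteBuffer.allocate(8)
                .order(ByteOrder.LITTLE_ENDIAN)
                .putDouble(d)
                .array();
    }

    public static void main(String[] args) {
        byte[] b = encode(0.123456789);
        // IEEE754 bits are 0x3FBF9ADD3739635F; little endian puts 0x5F first.
        System.out.printf("%02X %02X%n", b[0] & 0xFF, b[7] & 0xFF); // 5F 3F
    }
}
```

Write the resulting array with your existing output stream instead of calling writeDouble() directly.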
I have this code in Java:
socket = new Socket("127.0.0.1", 10);
OutputStream os = socket.getOutputStream();
int data = 50000;
os.write(ByteBuffer.allocate(4).putInt(data).array());
And in the C# :
byte[] ba = readint(networkStream);
networkStream.Flush();
if (BitConverter.IsLittleEndian) Array.Reverse(ba);
int i = BitConverter.ToInt32(ba, 0); //50000
It's all working fine but :
I saw this image :
But the part that interested me was the "Network Order - under Big Endian"
I read in wiki that
Many IETF RFCs use the term network order, meaning the order of
transmission for bits and bytes over the wire in network protocols.
Among others, the historic RFC 1700 (also known as Internet standard
STD 2) has defined its network order to be big endian, though not all
protocols do.
Question
If Java uses big endian, and TCP also uses network order (big endian),
then why, in my C#, did I have to check whether the platform is little endian?
I mean I could do :
Array.Reverse(ba);
Without checking : if (BitConverter.IsLittleEndian) Array.Reverse(ba);
Am I right ?
If so, what about the case where some unknown source sends me data and I don't know whether it was sent big- or little-endian? It would have to send me a first byte to indicate the order, right? But the first byte is also subject to endianness... Where is my misunderstanding?
You are assuming that this is a 32-bit int in big endian. You could also assume that BitConverter defaults to little endian (or change the sender to write little endian and not reverse it in the first place). BTW, ByteBuffer supports little endian too.
You could also send little endian
os.write(ByteBuffer.allocate(4)
.order(ByteOrder.LITTLE_ENDIAN)
.putInt(data)
.array());
What about the case where some unknown source sends me data and I don't know if it was sent big or little endian?
Then you don't know how to decode it, for sure. You can guess if you have enough data.
He would have to send me a first byte to indicate right ?
She could, but you would have to know that the first byte tells you the order, and how to interpret that first byte. It's simpler to assume a given byte order.
but the first byte is also subject to endianness
So imagine a byte is written in little or big endian. How would it be any different? ;)
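A single byte looks the same in either order, which is exactly why a byte-order mark works: send a known two-byte mark such as 0xFEFF (the same trick UTF-16 uses) and let the reader infer the sender's order from which byte arrives first. A sketch of the reading side (the marker convention is an assumption, not part of TCP):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BomCheck {
    // Infers the sender's byte order from a two-byte mark 0xFEFF:
    // bytes FE,FF mean big endian; FF,FE mean little endian.
    static ByteOrder detect(byte[] mark) {
        if ((mark[0] & 0xFF) == 0xFE && (mark[1] & 0xFF) == 0xFF) return ByteOrder.BIG_ENDIAN;
        if ((mark[0] & 0xFF) == 0xFF && (mark[1] & 0xFF) == 0xFE) return ByteOrder.LITTLE_ENDIAN;
        throw new IllegalArgumentException("no byte-order mark");
    }

    public static void main(String[] args) {
        // A big-endian sender writes the mark as FE then FF.
        byte[] big = ByteBuffer.allocate(2).order(ByteOrder.BIG_ENDIAN).putChar('\uFEFF').array();
        System.out.println(detect(big)); // BIG_ENDIAN
    }
}
```

In practice most protocols skip this and simply specify one byte order up front.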
I am sending a byte over a TCP connection; when I send a single negative number (like -30 in this example) I get three bytes:
Client Side:
PrintWriter out = new PrintWriter(new BufferedWriter(new OutputStreamWriter(socket.getOutputStream())));
out.write((byte)-30);
out.flush();
out.close();
Server Side:
is = new DataInputStream(clientSocket.getInputStream());
is.readFully(bbb);
for (int i=0;i<bbb.length;i++)
System.out.println(i+":"+bbb[i]);
What I get is:
0:-17
1:-65
2:-94
but I sent just -30.
You're using a writer, and you're calling Writer.write(int):
Writes a single character. The character to be written is contained in the 16 low-order bits of the given integer value; the 16 high-order bits are ignored.
So you've got a conversion to int, then the bottom 16 bits of that int are taken. So you're actually writing Unicode character 65506 (U+FFE2) in your platform default encoding (which appears to be UTF-8). That's not what you want to write, but that's what you are writing.
If you only want to write binary data, you shouldn't be using a Writer at all. Just use OutputStream - wrap it in DataOutputStream if you want, but don't use a Writer. The Writer classes are for text.
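A minimal sketch of doing it with a stream instead (an in-memory sink stands in for the socket here):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class RawByte {
    // Writes the value as one raw byte via DataOutputStream - no character encoding.
    static byte[] send(byte value) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(sink);
        out.writeByte(value); // exactly one byte on the wire
        out.flush();
        return sink.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] sent = send((byte) -30);
        System.out.println(sent.length + ":" + sent[0]); // 1:-30
    }
}
```

The server's readFully() will now see the single byte -30 (0xE2) instead of its three-byte UTF-8 encoding.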
I'm reading a file from the serial port using the X-modem protocol with 133-byte packets. In each packet:
byte 1 is SOH
byte 2 is the packet number
byte 3 is the negative (complement) of the packet number
the next 128 bytes are data
the last 2 bytes are the CRC sent from the other side.
I have to calculate the CRC of the 128 data bytes, combine the two received CRC bytes into a single value, and compare it with my calculated CRC. How can I do this in Java?
Try using Jacksum.
Sun's JDK 1.6 contains sun.misc.CRC16, but there is a possibility this is not the CRC-16 you're looking for, since there are several different polynomials in use.
Here is my C code, which is trivial to port to Java - you are free to use it in any way you like. The references to word are to a 16-bit unsigned value - you should be able to use a char instead in Java.
It's been too long since I worked with 16-bit CRCs, so I don't recall whether there are variations based on seeding. I am pretty sure I used this code in a C implementation of X-modem way back when.
The source is posted on tech.dolhub.com.
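For reference, a sketch of the X-modem CRC-16 in Java (polynomial 0x1021, zero seed, no reflection - the variant commonly called CRC-16/XMODEM; verify against your device, since other CRC-16 variants use different parameters):

```java
public class Crc16Xmodem {
    // CRC-16/XMODEM: polynomial 0x1021, initial value 0x0000,
    // no input/output reflection, no final XOR.
    static int crc16(byte[] data) {
        int crc = 0x0000;
        for (byte b : data) {
            crc ^= (b & 0xFF) << 8;
            for (int i = 0; i < 8; i++) {
                crc = ((crc & 0x8000) != 0) ? (crc << 1) ^ 0x1021 : crc << 1;
                crc &= 0xFFFF; // keep it a 16-bit value
            }
        }
        return crc;
    }

    public static void main(String[] args) {
        // Standard check string; CRC-16/XMODEM of "123456789" is 0x31C3.
        System.out.printf("%04X%n", crc16("123456789".getBytes()));
    }
}
```

To check a packet, compute crc16 over the 128 data bytes and compare it against the received pair combined high byte first: `((high & 0xFF) << 8) | (low & 0xFF)`.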