Java equivalent of struct.unpack('d', s.decode('hex'))[0]

I am reading a file that stores binary data.
In Python I can decode the file easily:
>>> s = '0000000000A0A240'
>>> s.decode('hex')
'\x00\x00\x00\x00\x00\xa0\xa2@'
>>> import struct
>>> struct.unpack('d', s.decode('hex'))[0]
2384.0
Now I want to do the same decoding in Java. Is there anything similar?

Since those bytes are in little-endian order (presumably written by C code on an Intel processor), use a ByteBuffer to help flip the bytes:
String s = "0000000000A0A240";
double d = ByteBuffer.allocate(8)
        .putLong(Long.parseUnsignedLong(s, 16))
        .flip() // note: chaining flip() like this compiles on Java 9+, where flip() returns ByteBuffer
        .order(ByteOrder.LITTLE_ENDIAN)
        .getDouble();
System.out.println(d); // prints 2384.0
Here I'm using Long.parseUnsignedLong(s, 16) as a quick way to do the decode('hex') for 8 bytes.
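For what it's worth, if you would rather avoid the buffer entirely, a small sketch of the same idea (just reversing the byte order numerically) is:
long bits = Long.reverseBytes(Long.parseUnsignedLong("0000000000A0A240", 16));
double d = Double.longBitsToDouble(bits);
System.out.println(d); // prints 2384.0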
If data is already a byte array, do this:
byte[] b = { 0x00, 0x00, 0x00, 0x00, 0x00, (byte) 0xA0, (byte) 0xA2, 0x40 };
double d = ByteBuffer.wrap(b)
        .order(ByteOrder.LITTLE_ENDIAN)
        .getDouble();
System.out.println(d); // prints 2384.0
Imports for the above are:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
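If the payload is longer than 8 bytes, so it no longer fits in a single long, a small helper can stand in for Python's decode('hex'). This is a sketch; the name hexToBytes is just for illustration:
static byte[] hexToBytes(String s) {
    byte[] out = new byte[s.length() / 2];
    for (int i = 0; i < out.length; i++) {
        // each pair of hex digits becomes one byte
        out[i] = (byte) Integer.parseInt(s.substring(2 * i, 2 * i + 2), 16);
    }
    return out;
}
Usage, reading the same value as above:
double d = ByteBuffer.wrap(hexToBytes("0000000000A0A240"))
        .order(ByteOrder.LITTLE_ENDIAN)
        .getDouble(); // 2384.0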

Related

LengthFieldBasedFrameDecoder not parsing correctly when buffer size is less than frame size

I am unit testing a Netty pipeline that uses the length-field-based frame decoder. It looks like the framing is incorrect if I use a buffer size that is smaller than the largest frame. I am testing with a file that contains two messages. The length field is the second word and includes the length of the entire message, including the length field and the word before it.
new LengthFieldBasedFrameDecoder(65536, 4, 4, -8, 0)
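For reference, those arguments map onto Netty's documented constructor parameters as follows (comments added for clarity):
new LengthFieldBasedFrameDecoder(
        65536, // maxFrameLength
        4,     // lengthFieldOffset: the length field is the second 4-byte word
        4,     // lengthFieldLength
        -8,    // lengthAdjustment: the length value already counts the length field and the word before it
        0);    // initialBytesToStrip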
I am reading the file with various block sizes. The size of the first message is 348 bytes, the second is 456 bytes. If a block size of 512, 3456, or larger is used, both messages are read and correctly framed to the next handler, which for diagnostic purposes prints out the contents of the buffer it received as a hexadecimal string. If a smaller block size is used, framing errors occur. The code used to read the file and feed it to the channel is shown below.
public class NCCTBinAToCSV {
    private static String inputFileName = "/tmp/combined.bin";
    private static final int BLOCKSIZE = 456;

    public static void main(String[] args) throws Exception {
        byte[] bytes = new byte[BLOCKSIZE];
        EmbeddedChannel channel = new EmbeddedChannel(
                new LengthFieldBasedFrameDecoder(65536, 4, 4, -8, 0),
                new NCCTMessageDecoder(),
                new StringOutputHandler());
        FileInputStream fis = new FileInputStream(new File(inputFileName));
        int bytesRead = 0;
        while ((bytesRead = fis.read(bytes)) != -1) {
            ByteBuf buf = Unpooled.wrappedBuffer(bytes, 0, bytesRead);
            channel.writeInbound(buf);
        }
        channel.flush();
    }
}
Output from a successful run with a block size of 356 bytes is shown below (with the body of the messages truncated for brevity):
LOG:DEBUG 2017-04-24 04:19:24,675[main](netty.NCCTMessageDecoder) - com.ticomgeo.mtr.ncct.netty.NCCTMessageDecoder.decode(NCCTMessageDecoder.java:21) ]received 348 bytes
Frame Start========================================
(byte) 0xbb, (byte) 0x55, (byte) 0x05, (byte) 0x16,
(byte) 0x00, (byte) 0x00, (byte) 0x01, (byte) 0x5c,
(byte) 0x01, (byte) 0x01, (byte) 0x02, (byte) 0x02,
(byte) 0x05, (byte) 0x00, (byte) 0x00, (byte) 0x00,
(byte) 0x50, (byte) 0x3a, (byte) 0xc9, (byte) 0x17,
....
Frame End========================================
Frame Start========================================
(byte) 0xbb, (byte) 0x55, (byte) 0x05, (byte) 0x1c,
(byte) 0x00, (byte) 0x00, (byte) 0x01, (byte) 0xc8,
(byte) 0x01, (byte) 0x01, (byte) 0x02, (byte) 0x02,
(byte) 0x05, (byte) 0x00, (byte) 0x00, (byte) 0x00,
(byte) 0x04, (byte) 0x02, (byte) 0x00, (byte) 0x01,
If I change the block size to 256, the wrong bytes seem to be read as the length field.
Exception in thread "main" io.netty.handler.codec.TooLongFrameException: Adjusted frame length exceeds 65536: 4294967040 - discarded
at io.netty.handler.codec.LengthFieldBasedFrameDecoder.fail(LengthFieldBasedFrameDecoder.java:499)
at io.netty.handler.codec.LengthFieldBasedFrameDecoder.failIfNecessary(LengthFieldBasedFrameDecoder.java:477)
at io.netty.handler.codec.LengthFieldBasedFrameDecoder.decode(LengthFieldBasedFrameDecoder.java:403)
TL;DR: your problem is caused by Netty keeping hold of the passed-in ByteBuf while you overwrite its contents.
LengthFieldBasedFrameDecoder is designed to reuse the passed-in ByteBuf rather than letting it be garbage collected, since its reference count is 1 and it can be kept around. The problem is that you reuse the same byte array for every read, so you change the internals of the ByteBuf the decoder is still holding and therefore change the frame on the fly. Instead of Unpooled.wrappedBuffer, which uses your array as its backing storage, use Unpooled.copiedBuffer, which makes a proper copy, so the internals of LengthFieldBasedFrameDecoder can freely do things with it.
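Applied to the read loop above, the fix is a one-line change:
while ((bytesRead = fis.read(bytes)) != -1) {
    // copiedBuffer copies the data, so reusing the bytes array on the next read is safe
    ByteBuf buf = Unpooled.copiedBuffer(bytes, 0, bytesRead);
    channel.writeInbound(buf);
}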

int to unsigned char array in Java

I'm trying to connect Android (Java) with Linux (Qt C++) using a socket. I want to transfer the length of a message in bytes. To convert an int into an unsigned char array on the C++ side I use:
QByteArray IntToArray(qint32 source)
{
    QByteArray tmp;
    QDataStream data(&tmp, QIODevice::ReadWrite);
    data << source;
    return tmp;
}
But I don't know how to do the same conversion on the Java side, because Java doesn't have unsigned types. I tried some examples but always got different results. So I need a Java method which returns this for source = 17:
0x00, 0x00, 0x00, 0x11
I understand that it's a very simple question, but I'm new in Java so it's not clear to me.
UPD:
Java:
PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
out.print(ByteBuffer.allocate(4).putInt(17).array());
Qt C++:
QByteArray* buffer = new QByteArray();
buffer->append(socket->readAll());
Output:
buffer = 0x5b, 0x42, 0x40, 0x61, 0x39, 0x65, 0x31,
0x62, 0x66, 0x35.
UPD2:
Java:
out.print(toBytes(17));
...
byte[] toBytes(int i)
{
    byte[] result = new byte[4];
    result[0] = (byte) (i >> 24);
    result[1] = (byte) (i >> 16);
    result[2] = (byte) (i >> 8);
    result[3] = (byte) (i /*>> 0*/);
    return result;
}
Qt C++: same
Output:
buffer = 0x5b, 0x42, 0x40, 0x63, 0x38, 0x61, 0x39,
0x33, 0x38, 0x33.
UPD3:
Qt C++:
QByteArray buffer = socket->readAll();
for (int i = 0; i < buffer.length(); ++i) {
    std::cout << buffer[i];
}
std::cout << std::endl;
Output:
[B@938a15c
First of all, don't use PrintWriter.
Here's something to remember about Java I/O:
Streams are for bytes, Readers/Writers are for characters.
In Java, a character is not a byte. Characters have an encoding associated with them, like UTF-8. Bytes don't.
When you wrap a Stream in a Reader or a Writer, you are taking a byte stream and imposing a character encoding on that byte stream. You don't want that here. (It also explains your output: PrintWriter has no print(byte[]) overload, so print(Object) is chosen and what gets sent is the array's toString(), something like "[B@938a15c", as text — exactly the 0x5b, 0x42, 0x40, ... bytes you saw on the Qt side.)
Just try this:
OutputStream out = socket.getOutputStream();
out.write(toBytes(17));
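If you also need to send the message body after the length, a minimal sketch (assuming the Qt side expects a big-endian 4-byte length, which is QDataStream's default, and using java.io.DataOutputStream with java.nio.charset.StandardCharsets) could look like this:
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
byte[] payload = "hello".getBytes(StandardCharsets.UTF_8); // hypothetical message body
out.writeInt(payload.length); // the length as 4 big-endian bytes: 0x00 0x00 0x00 0x05
out.write(payload);
out.flush();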

Android - Writing to ISO15693 Tags

I'm currently trying to write a couple of bytes to a specific block. My read commands work fine and I am able to read any block of my tag using the code below:
command = new byte[]{
        (byte) 0x02, // Flags
        (byte) 0x23, // Command: Read multiple blocks
        (byte) 0x09, // First block (offset)
        (byte) 0x03  // Number of blocks // MAX READ SIZE: 32 blocks: 1F
};
byte[] data = nfcvTag.transceive(command);
When I try to write with the code below, my app crashes.
Write = new byte[]{
        (byte) 0x02, // Flags
        (byte) 0x21, // Command: Write single block
        (byte) 0x5A, // First block (offset)
        (byte) 0x41  // Data
};
nfcvTag.transceive(Write);
I'm doing this in an AsyncTask and getting the java.lang.RuntimeException: Can't create handler inside thread that has not called Looper.prepare() exception.
Any tips? The tag is an STMicroelectronics M24LR04E-R.
Figured it out. I was only writing 8 bits of data, while the tag has 32 bits per block. Adding three 0x00 padding bytes made the write succeed.
Write = new byte[]{
        (byte) 0x02, // Flags
        (byte) 0x21, // Command: Write single block
        (byte) 0x5A, // First block (offset)
        (byte) 0x41, // Data
        (byte) 0x00, // Padding to fill the 32-bit block
        (byte) 0x00,
        (byte) 0x00
};
nfcvTag.transceive(Write);
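For completeness, a small sketch that builds the same write command from an arbitrary 4-byte block of data (same flags and block address as above; blockData and writeCommand are illustrative names, not from any library):
byte[] blockData = { (byte) 0x41, 0x00, 0x00, 0x00 }; // must match the tag's 4-byte block size
byte[] writeCommand = new byte[3 + blockData.length];
writeCommand[0] = (byte) 0x02; // Flags
writeCommand[1] = (byte) 0x21; // Command: Write single block
writeCommand[2] = (byte) 0x5A; // Block address (offset)
System.arraycopy(blockData, 0, writeCommand, 3, blockData.length);
byte[] response = nfcvTag.transceive(writeCommand);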

How to use Java Card crypto sample?

I'm trying to run an example from the IBM website.
I wrote this method:
public static byte[] cipher(byte[] inputData) {
    Cipher cipher = Cipher.getInstance(Cipher.ALG_DES_CBC_NOPAD, true);
    DESKey desKey = (DESKey) KeyBuilder.buildKey(
            KeyBuilder.TYPE_DES,
            KeyBuilder.LENGTH_DES,
            false);
    byte[] keyBytes = {(byte) 0x01, (byte) 0x02, (byte) 0x03, (byte) 0x04};
    desKey.setKey(keyBytes, (short) 0);
    cipher.init(desKey, Cipher.MODE_ENCRYPT);
    byte[] outputData = new byte[8];
    cipher.doFinal(inputData, (short) 0, (short) inputData.length, outputData, (short) 0);
    return outputData;
}
I call this method as cipher("test".getBytes()). When I call the servlet, the server gives me an Internal Server Error and a javacard.security.CryptoException.
I tried ALG_DES_CBC_ISO9797_M1, ALG_DES_CBC_ISO9797_M2 (and others) and got the same exception.
How can I get a simple cipher example running on Java Card Connected?
UPDATE
As @vojta said, the key must be 8 bytes long. So it must be something like this:
byte[] keyBytes = {(byte) 0x01, (byte) 0x02, (byte) 0x03, (byte) 0x04, (byte) 0x01, (byte) 0x02, (byte) 0x03, (byte) 0x04};
I don't know why, but it only works if I replace
Cipher cipher = Cipher.getInstance(Cipher.ALG_DES_CBC_NOPAD, true);
with
Cipher cipher = Cipher.getInstance(Cipher.ALG_DES_CBC_ISO9797_M2, false);
I could not find anything about this in the documentation.
These lines seem to be wrong:
byte[] keyBytes = {(byte) 0x01, (byte) 0x02, (byte) 0x03, (byte) 0x04};
desKey.setKey(keyBytes, (short) 0);
A DES key should be longer than 4 bytes, right? A standard DES key is 8 bytes long (with an effective strength of 56 bits).
In addition to @vojta's answer, the input data should be block aligned.
Your input data "test".getBytes() has length 4, which is not valid for Cipher.ALG_DES_CBC_NOPAD (but is valid for Cipher.ALG_DES_CBC_ISO9797_M2, which pads the input).
What is strange is that this should raise the CryptoException.ILLEGAL_USE reason (which is 5, as opposed to the 3 you are getting)...
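Putting both answers together, a sketch of the corrected method (8-byte key, padded CBC mode; the key bytes are just the example values from the update above, not a real key) might look like this:
public static byte[] cipher(byte[] inputData) {
    // ISO 9797 M2 padding accepts input that is not a multiple of the 8-byte DES block size
    Cipher cipher = Cipher.getInstance(Cipher.ALG_DES_CBC_ISO9797_M2, false);
    DESKey desKey = (DESKey) KeyBuilder.buildKey(
            KeyBuilder.TYPE_DES, KeyBuilder.LENGTH_DES, false);
    byte[] keyBytes = {(byte) 0x01, (byte) 0x02, (byte) 0x03, (byte) 0x04,
                       (byte) 0x01, (byte) 0x02, (byte) 0x03, (byte) 0x04};
    desKey.setKey(keyBytes, (short) 0);
    cipher.init(desKey, Cipher.MODE_ENCRYPT);
    // with padding, a 4-byte input produces one 8-byte output block
    byte[] outputData = new byte[8];
    cipher.doFinal(inputData, (short) 0, (short) inputData.length, outputData, (short) 0);
    return outputData;
}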

Tokenising binary data in java

I am writing some code to handle a stream of binary data. It is received in chunks represented by byte arrays. Combined, the byte arrays represent a sequential stream of messages, each of which ends with the same constant terminator value (0xff in my case). However, the terminator value can appear at any point in any given chunk of data. A single chunk can contain part of a message, multiple messages, or anything in between.
Here is a small sampling of what data handled by this might look like:
[0x00, 0x0a, 0xff, 0x01]
[0x01, 0x01]
[0xff, 0x01, 0xff]
This data should be converted into these messages:
[0x00, 0x0a, 0xff]
[0x01, 0x01, 0x01, 0xff]
[0x01, 0xff]
I have written a small class to handle this. It has a method to add some data in byte array format, which is then placed in a buffer array. When the terminator character is encountered, the byte array is cleared and the complete message is placed in a message queue, which can be accessed using hasNext() and next() methods (similar to an iterator).
This solution works fine, but as I finished it, I realized that there might already be some stable, performant and tested code in an established library that I could be using instead.
So my question is - do you know of a utility library that would have such a class, or maybe there is something in the standard Java 6 library that can do this already?
I don't think you need a framework, as a custom parser is simple enough.
InputStream in = new ByteArrayInputStream(new byte[]{
        0x00, 0x0a, (byte) 0xff,
        0x01, 0x01, 0x01, (byte) 0xff,
        0x01, (byte) 0xff});
ByteArrayOutputStream baos = new ByteArrayOutputStream();
for (int b; (b = in.read()) >= 0; ) {
    baos.write(b);
    if (b == 0xff) {
        byte[] bytes = baos.toByteArray();
        System.out.println(Arrays.toString(bytes));
        baos = new ByteArrayOutputStream();
    }
}
This prints the following (note that (byte) 0xFF prints as -1):
[0, 10, -1]
[1, 1, 1, -1]
[1, -1]
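If you do want the chunk-oriented add/hasNext/next shape described in the question, here is a small sketch that runs on Java 6 (the class and method names are illustrative, not from any library):
import java.io.ByteArrayOutputStream;
import java.util.ArrayDeque;
import java.util.Queue;

class MessageTokenizer {
    private static final byte TERMINATOR = (byte) 0xff;
    private ByteArrayOutputStream current = new ByteArrayOutputStream();
    private final Queue<byte[]> messages = new ArrayDeque<byte[]>();

    // feed one received chunk; complete messages become available via hasNext()/next()
    void add(byte[] chunk) {
        for (byte b : chunk) {
            current.write(b);
            if (b == TERMINATOR) {
                messages.add(current.toByteArray());
                current = new ByteArrayOutputStream();
            }
        }
    }

    boolean hasNext() { return !messages.isEmpty(); }

    byte[] next() { return messages.remove(); }
}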
