Netty java getting data from ByteBuf

How to get a byte array from ByteBuf efficiently in the code below? I need to get the array and then serialize it.
package testingNetty;

import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

public class ServerHandler extends ChannelInboundHandlerAdapter {

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        System.out.println("Message receive");
        ByteBuf buff = (ByteBuf) msg;
        // Here I need to get the bytes from buff and serialize them
        byte[] bytes = BuffConvertor.GetBytes(buff);
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
        // Close the connection when an exception is raised.
        cause.printStackTrace();
        ctx.close();
    }
}

ByteBuf buf = ...
byte[] bytes = new byte[buf.readableBytes()];
buf.readBytes(bytes);
If you don't want the readerIndex to change:
ByteBuf buf = ...
byte[] bytes = new byte[buf.readableBytes()];
int readerIndex = buf.readerIndex();
buf.getBytes(readerIndex, bytes);
If you want to minimize the memory copy, you can use the backing array of the ByteBuf, if it's available:
ByteBuf buf = ...
byte[] bytes;
int offset;
int length = buf.readableBytes();
if (buf.hasArray()) {
    bytes = buf.array();
    // the readable bytes start at arrayOffset() + readerIndex(), not at arrayOffset() alone
    offset = buf.arrayOffset() + buf.readerIndex();
} else {
    bytes = new byte[length];
    buf.getBytes(buf.readerIndex(), bytes);
    offset = 0;
}
Please note that you can't simply use buf.array(), because:
Not all ByteBufs have a backing array. Some are off-heap buffers (i.e. direct memory).
Even if a ByteBuf has a backing array (i.e. buf.hasArray() returns true), the following isn't necessarily true, because the buffer might be a slice of another buffer or a pooled buffer:
buf.array()[0] == buf.getByte(0)
buf.array().length == buf.capacity()
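For instance, a minimal illustration of the slice case (assuming a heap buffer created with Unpooled; the sample bytes are made up):

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;

public class SlicePitfall {
    public static void main(String[] args) {
        ByteBuf parent = Unpooled.wrappedBuffer(new byte[]{10, 20, 30, 40, 50});
        ByteBuf slice = parent.slice(2, 3);

        System.out.println(slice.getByte(0));     // 30, indexed relative to the slice
        System.out.println(slice.array()[0]);     // 10, array() is the parent's whole backing array
        System.out.println(slice.array().length); // 5, while slice.capacity() is 3
        // The array index that corresponds to slice.getByte(0) is
        // slice.arrayOffset() + slice.readerIndex().
    }
}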

Another option is ByteBufUtil.getBytes(ByteBuf buf, int start, int length, boolean copy)
See ByteBufUtil
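A minimal sketch of that variant (assuming Netty 4.1.x; passing true forces a copy, while false may hand back the backing array when one is available, so only pass false if you will not modify the result):

ByteBuf buf = ...
// copies the readable bytes; the readerIndex is left untouched
byte[] bytes = ByteBufUtil.getBytes(buf, buf.readerIndex(), buf.readableBytes(), true);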

Related

Decompressing byte[] using LZ4

I am using LZ4 for compressing and decompressing a string. I have tried the following:
import java.io.IOException;
import java.util.zip.DataFormatException;

import net.jpountz.lz4.LZ4Compressor;
import net.jpountz.lz4.LZ4Factory;
import net.jpountz.lz4.LZ4FastDecompressor;

public class CompressionDemo {

    public static byte[] compressLZ4(LZ4Factory factory, String data) throws IOException {
        final int decompressedLength = data.getBytes().length;
        LZ4Compressor compressor = factory.fastCompressor();
        int maxCompressedLength = compressor.maxCompressedLength(decompressedLength);
        byte[] compressed = new byte[maxCompressedLength];
        compressor.compress(data.getBytes(), 0, decompressedLength, compressed, 0, maxCompressedLength);
        return compressed;
    }

    public static String deCompressLZ4(LZ4Factory factory, byte[] data) throws IOException {
        LZ4FastDecompressor decompressor = factory.fastDecompressor();
        byte[] restored = new byte[data.length];
        decompressor.decompress(data, 0, restored, 0, data.length);
        return new String(restored);
    }

    public static void main(String[] args) throws IOException, DataFormatException {
        String string = "kjshfhshfashfhsakjfhksjafhkjsafhkjashfkjhfjkfhhjdshfhhjdfhdsjkfhdshfdskjfhksjdfhskjdhfkjsdhfk";
        LZ4Factory factory = LZ4Factory.fastestInstance();
        byte[] arr = compressLZ4(factory, string);
        System.out.println(arr.length);
        System.out.println(deCompressLZ4(factory, arr) + "decom");
    }
}
It gives the following exception:
Exception in thread "main" net.jpountz.lz4.LZ4Exception: Error decoding offset 92 of input buffer
The problem here is that decompression only works if I pass the byte[] length of the original string, i.e.
public static String deCompressLZ4(LZ4Factory factory, byte[] data) throws IOException {
    LZ4FastDecompressor decompressor = factory.fastDecompressor();
    byte[] restored = new byte[data.length];
    decompressor.decompress(data, 0, restored, 0, "kjshfhshfashfhsakjfhksjafhkjsafhkjashfkjhfjkfhhjdshfhhjdfhdsjkfhdshfdskjfhksjdfhskjdhfkjsdhfk".getBytes().length);
    return new String(restored);
}
It expects the byte[] size of the original string.
Can someone help me with this?
Since compression and decompression may happen on different machines, or the machine's default character encoding is not one of the Unicode formats, the encoding should be specified explicitly.
Apart from that, the code below uses the actual compressed and decompressed lengths, and it is better to also store the size of the uncompressed data in plain form, so it can be read back before decompressing.
public static byte[] compressLZ4(LZ4Factory factory, String data) throws IOException {
    byte[] decompressed = data.getBytes(StandardCharsets.UTF_8);
    LZ4Compressor compressor = factory.fastCompressor();
    int maxCompressedLength = compressor.maxCompressedLength(decompressed.length);
    byte[] compressed = new byte[4 + maxCompressedLength];
    int compressedSize = compressor.compress(decompressed, 0, decompressed.length,
            compressed, 4, maxCompressedLength);
    // store the uncompressed length in the first 4 bytes
    ByteBuffer.wrap(compressed).putInt(decompressed.length);
    return Arrays.copyOf(compressed, 4 + compressedSize);
}

public static String deCompressLZ4(LZ4Factory factory, byte[] data) throws IOException {
    LZ4FastDecompressor decompressor = factory.fastDecompressor();
    int decompressedLength = ByteBuffer.wrap(data).getInt();
    byte[] restored = new byte[decompressedLength];
    decompressor.decompress(data, 4, restored, 0, decompressedLength);
    return new String(restored, StandardCharsets.UTF_8);
}
It should be noted that String is not suited for binary data; this compression/decompression is for text handling only. (A String holds Unicode text as UTF-16 two-byte chars, so converting it to and from binary data always involves a conversion with the encoding of that binary data. That costs memory and speed, and risks data corruption.)
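To see why this matters, here is a small illustration (the two bytes are made up and do not form valid UTF-8) of how routing arbitrary binary data through String loses information:

byte[] binary = { (byte) 0xC3, (byte) 0x28 };               // not a valid UTF-8 sequence
String asText = new String(binary, StandardCharsets.UTF_8); // the malformed byte becomes a replacement char
byte[] back = asText.getBytes(StandardCharsets.UTF_8);
System.out.println(binary.length + " -> " + back.length);   // prints "2 -> 4", the original bytes are gone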
I just faced the same error on Android and resolved it based on the issue below:
https://github.com/lz4/lz4-java/issues/68
In short, make sure you use the same factory for both operations (compression and decompression) and use Arrays.copyOf() as below:
byte[] compress(final byte[] data) {
    LZ4Factory lz4Factory = LZ4Factory.safeInstance();
    LZ4Compressor fastCompressor = lz4Factory.fastCompressor();
    int maxCompressedLength = fastCompressor.maxCompressedLength(data.length);
    byte[] comp = new byte[maxCompressedLength];
    int compressedLength = fastCompressor.compress(data, 0, data.length, comp, 0, maxCompressedLength);
    return Arrays.copyOf(comp, compressedLength);
}

byte[] decompress(final byte[] compressed) {
    LZ4Factory lz4Factory = LZ4Factory.safeInstance();
    LZ4SafeDecompressor decompressor = lz4Factory.safeDecompressor();
    byte[] decomp = new byte[compressed.length * 4]; // you might need to allocate more
    decomp = decompressor.decompress(Arrays.copyOf(compressed, compressed.length), decomp.length);
    return decomp;
}
Hope this will help.
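A possible round trip with these two helpers (a minimal sketch; the sample payload is made up, and whether the returned array needs trimming depends on the lz4-java version):

byte[] original = "some text worth compressing, repeated to make it compressible".getBytes(StandardCharsets.UTF_8);
byte[] packed = compress(original);
byte[] unpacked = decompress(packed);
// unpacked should contain the original bytes again; compare with Arrays.equals(original, unpacked),
// trimming unpacked to original.length first if your lz4-java version returns an oversized array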
The restored byte[] is too small; you should not size it from the compressed data.length. Instead, use data.length * 3 or more.
I resolved it like this:
public static byte[] decompress(byte[] finalCompressedArray, String... extInfo) {
    int len = finalCompressedArray.length * 3;
    int i = 5;
    while (i > 0) {
        try {
            return decompress(finalCompressedArray, len);
        } catch (Exception e) {
            len = len * 2;
            i--;
            if (LOGGER.isInfoEnabled()) {
                LOGGER.info("decompress Error: extInfo ={} ", extInfo, e);
            }
        }
    }
    throw new ItemException(1, "decompress error");
}
/**
 * Decompress an array.
 *
 * @param finalCompressedArray the compressed data
 * @param length               the original data length; it must be exact, neither larger nor smaller
 * @return the decompressed data
 */
private static byte[] decompress(byte[] finalCompressedArray, int length) {
    byte[] desc = new byte[length];
    int decompressLen = decompressor.decompress(finalCompressedArray, desc);
    byte[] result = new byte[decompressLen];
    System.arraycopy(desc, 0, result, 0, decompressLen);
    return result;
}

Serialization gives wrong size of object

I'm working on an Android app where I use serialization to convert an object to a byte array. After the conversion I read the size, and the byte array is much bigger than I expected.
The method I have made is as follows:
public void Send(testpacket packet) {
    try {
        // First convert the CommStruct to a byte array
        // Then send the byte array
        byte[] buffer = toByteArray(packet);
        int size = buffer.length;
        System.out.println("SIZE OF BYTE ARRAY: " + size);
        server.send(buffer);
    } catch (IOException e) {
        Log.e("USBCommunicator", "problem sending TCP message", e);
    }
}
The serialization method toByteArray converts an object to a byte array and looks as follows:
public static byte[] toByteArray(Object obj) throws IOException {
    byte[] bytes = null;
    ByteArrayOutputStream bos = null;
    ObjectOutputStream oos = null;
    try {
        bos = new ByteArrayOutputStream();
        oos = new ObjectOutputStream(bos);
        oos.writeObject(obj);
        oos.flush();
        bytes = bos.toByteArray();
    } finally {
        if (oos != null) {
            Log.i(TAG, "not null");
            oos.close();
        }
        if (bos != null) {
            bos.close();
            Log.i(TAG, "not null");
        }
    }
    return bytes;
}
The packet object consists of two classes with a total of 7 integers (so the size should be 28 bytes), and is defined as follows:
public class testpacket implements java.io.Serializable {

    public ObjectInfo VisionData;
    public SensorDataStruct SensorData;

    // Constructor
    public testpacket() {
        // Call constructors
        VisionData = new ObjectInfo();
        SensorData = new SensorDataStruct();
    }
}
ObjectInfo consists of the following:
// ObjectInfo struct definition
public class ObjectInfo implements java.io.Serializable {

    public int ObjectXCor;
    public int ObjectYCor;
    public int ObjectMass;

    // Constructor
    public ObjectInfo() {
        ObjectMass = 0;
        ObjectXCor = 0;
        ObjectYCor = 0;
    }
}
And SensorDataStruct is as followed:
// SensorDataStruct struct definition
public class SensorDataStruct implements java.io.Serializable {

    public int PingData;
    public int IRData;
    public int ForceData;
    public int CompassData;

    // Constructor
    public SensorDataStruct() {
        CompassData = 0;
        ForceData = 0;
        IRData = 0;
        PingData = 0;
    }
}
But when I read out the length of the byte buffer after the conversion, the size is 426. Does anybody have an idea or suggestion why this is not 28 bytes? If I need to supply more information, please say so! Any tips and suggestions are welcome!
Update
I have changed the code with the help of EJP. I use a DataOutputStream to convert the object data (the actual variable values) to bytes. The object described above in this post contains 7 integers, and when the object is created the starting value of all these integers is 0.
The conversion function is as follows:
public static byte[] toByteArray(testpacket obj) throws IOException {
    byte[] bytes = null;
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DataOutputStream w = new DataOutputStream(baos);
    w.write(obj.SensorData.CompassData);
    w.write(obj.SensorData.ForceData);
    w.write(obj.SensorData.IRData);
    w.write(obj.SensorData.PingData);
    w.write(obj.VisionData.ObjectMass);
    w.write(obj.VisionData.ObjectXCor);
    w.write(obj.VisionData.ObjectYCor);
    //w.flush();
    bytes = baos.toByteArray();
    int size = bytes.length;
    System.out.println("SIZE OF BYTE ARRAY IN CONVERTION FUNCTION: " + size);
    return bytes;
}
Now I only have one question: the size is 7 when I read out the size of the byte buffer. This is (I think) because all the integer values (0s) are so small that they fit in one byte each. My question is: how can I make sure that four bytes are always used for each integer value in the data stream? Any suggestions are welcome!
The serialized stream for your object contains:
An object stream header.
Tag information saying the next item is an object.
Class information for the object.
Version information for the object.
Type-name-value tuples, for each serialized member of the object.
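As for the follow-up question about forcing four bytes per value: DataOutputStream.write(int) writes only the low-order byte of its argument, whereas writeInt always writes exactly four bytes. A minimal sketch of the conversion function along those lines (not part of the original answer) would be:

public static byte[] toByteArray(testpacket obj) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    DataOutputStream w = new DataOutputStream(baos);
    // writeInt emits four bytes per value regardless of its magnitude
    w.writeInt(obj.SensorData.CompassData);
    w.writeInt(obj.SensorData.ForceData);
    w.writeInt(obj.SensorData.IRData);
    w.writeInt(obj.SensorData.PingData);
    w.writeInt(obj.VisionData.ObjectMass);
    w.writeInt(obj.VisionData.ObjectXCor);
    w.writeInt(obj.VisionData.ObjectYCor);
    w.flush();
    return baos.toByteArray(); // 7 ints * 4 bytes = 28 bytes
}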

Netty FrameDecoder error

I have implemented a frame decoder for an application I am building. So far, when I run the code, the packets are decoded correctly and I can see that the received packet is correct. However, I am getting an exception:
decode() method must read at least one byte if it returned a frame (caused by: class com.smsgh.unitysmpp.Plugins.Smp.SmpPduFrameDecoder)
at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:434)
at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:84)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.processSelectedKeys(AbstractNioWorker.java:471)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:332)
at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:35)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
This is the code I wrote:
/**
 * @author Arsene
 */
public class SmpPduFrameDecoder extends FrameDecoder {

    private static Logger logger = LoggerFactory.getLogger(SmpPduFrameDecoder.class);

    @Override
    protected Object decode(ChannelHandlerContext ctx, Channel channel,
            ChannelBuffer buffer) throws Exception {
        ChannelBuffer leFrame = null;
        // Check the Endianness of the packet sent
        if (buffer.order() == ByteOrder.BIG_ENDIAN) {
            if (buffer.hasArray()) {
                leFrame = ChannelBuffers.wrappedBuffer(ByteOrder.LITTLE_ENDIAN, buffer.array(), buffer.arrayOffset(), buffer.capacity());
            } else {
                leFrame = ChannelBuffers.wrappedBuffer(buffer.toByteBuffer().order(ByteOrder.LITTLE_ENDIAN));
            }
        }
        // Read the byte length
        // wait until the length prefix is available
        if (leFrame.readableBytes() < 4) {
            logger.debug("Unable to Read the Length");
            return null;
        }
        // parse the frame length (first 4 bytes)
        int frameLength = leFrame.getInt(leFrame.readerIndex());
        logger.info("Packet Received Length " + frameLength);
        // wait until the whole data is available
        if (leFrame.readableBytes() < frameLength) {
            logger.debug("Unable to read the full PDU received");
            return null;
        }
        leFrame.skipBytes(4);
        int readerIndex = leFrame.readerIndex();
        ChannelBuffer frame = leFrame.readBytes(frameLength);
        leFrame.readerIndex(readerIndex + frameLength);
        byte[] byteArray = frame.array();
        StringBuilder sb = new StringBuilder();
        for (byte b : byteArray) {
            sb.append(HexUtil.toHexString(b));
        }
        logger.info("Decode Frame Received without the length " + sb.toString());
        logger.debug("Full PDU has been read");
        return frame;
    }
}
Can someone tell me what is wrong with my code and help me solve that exception?
I have solved the issue. Instead of copying the buffer into a new buffer, I just have to use swapInt to get the actual frame length, as in this code:
public class SmpPduFrameDecoder extends FrameDecoder {

    private static Logger logger = LoggerFactory.getLogger(SmpPduFrameDecoder.class);

    @Override
    protected Object decode(ChannelHandlerContext ctx, Channel channel,
            ChannelBuffer buffer) throws Exception {
        // Read the byte length
        // wait until the length prefix is available
        if (buffer.readableBytes() < 4) {
            logger.debug("Unable to Read the Length");
            return null;
        }
        // parse the frame length (first 4 bytes)
        //int frameLength = leFrame.getInt(leFrame.readerIndex());
        int frameLength = ChannelBuffers.swapInt(buffer.getInt(buffer.readerIndex()));
        logger.info("Packet Received Length " + frameLength);
        // wait until the whole data is available
        if (buffer.readableBytes() < frameLength) {
            logger.debug("Unable to read the full PDU received");
            return null;
        }
        buffer.skipBytes(4);
        ChannelBuffer frame = buffer.readBytes(frameLength);
        buffer.readerIndex(frameLength + 4);
        byte[] byteArray = frame.array();
        StringBuilder sb = new StringBuilder();
        for (byte b : byteArray) {
            sb.append(HexUtil.toHexString(b));
        }
        logger.info("Decode Frame Received without the length " + sb.toString());
        logger.debug("Full PDU has been read");
        return frame;
    }
}

decodeByteArray and copyPixelsToBuffer not working. SkImageDecoder::Factory returned null

I have a class TouchPoint which implements Serializable, and because it contains a Bitmap I wrote writeObject and readObject for that class:
private void writeObject(ObjectOutputStream oos) throws IOException {
    long t1 = System.currentTimeMillis();
    oos.defaultWriteObject();
    if (_bmp != null) {
        int bytes = _bmp.getWidth() * _bmp.getHeight() * 4;
        ByteBuffer buffer = ByteBuffer.allocate(bytes);
        _bmp.copyPixelsToBuffer(buffer);
        byte[] array = buffer.array();
        oos.writeObject(array);
    }
    Log.v("PaintFX", "Elapsed Time: " + (System.currentTimeMillis() - t1));
}

private void readObject(ObjectInputStream ois) throws IOException, ClassNotFoundException {
    ois.defaultReadObject();
    byte[] data = (byte[]) ois.readObject();
    if (data != null && data.length > 0) {
        _bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
    }
}
The problem is that I get
SkImageDecoder::Factory returned null
So how can I fix it? I know that a possible solution is to change writeObject() to:
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
_bmp.compress(Bitmap.CompressFormat.PNG, 100, byteStream);
oos.writeObject(byteStream.toByteArray());
BUT this method is almost 10x slower:
copyPixelsToBuffer: ~14 ms for writing the image
_bmp.compress: ~160 ms
UPDATE
I found out that the actual problem is that after
buffer.array();
all of the byte[] array elements are 0.
Finally I found a way to make it work and be faster at the same time. I encountered two issues using this method:
I had to pass the Bitmap.Config param as well; without it I can't decode the byte array.
_bmp.compress and _bmp.copyPixelsToBuffer give different arrays, so I couldn't use decodeByteArray.
I solved them this way:
private void writeObject(ObjectOutputStream oos) throws IOException {
    oos.defaultWriteObject();
    if (_bmp != null) {
        int bytes = _bmp.getWidth() * _bmp.getHeight() * 4;
        ByteBuffer buffer = ByteBuffer.allocate(bytes);
        _bmp.copyPixelsToBuffer(buffer);
        byte[] array = new byte[bytes]; // looks like this is extraneous memory allocation
        if (buffer.hasArray()) {
            try {
                array = buffer.array();
            } catch (BufferUnderflowException e) {
                e.printStackTrace();
            }
        }
        String configName = _bmp.getConfig().name();
        oos.writeObject(array);
        oos.writeInt(_bmp.getWidth());
        oos.writeInt(_bmp.getHeight());
        oos.writeObject(configName);
    } else {
        oos.writeObject(null);
    }
}

private void readObject(ObjectInputStream ois) throws IOException, ClassNotFoundException {
    ois.defaultReadObject();
    byte[] data = (byte[]) ois.readObject();
    if (data != null) {
        int w = ois.readInt();
        int h = ois.readInt();
        String configName = (String) ois.readObject();
        Bitmap.Config configBmp = Bitmap.Config.valueOf(configName);
        Bitmap bitmap_tmp = Bitmap.createBitmap(w, h, configBmp);
        ByteBuffer buffer = ByteBuffer.wrap(data);
        bitmap_tmp.copyPixelsFromBuffer(buffer);
        _bmp = bitmap_tmp.copy(configBmp, true);
        bitmap_tmp.recycle();
    } else {
        _bmp = null;
    }
}
This is fast enough for me: about 15x faster than the bmp.compress way. Hope this helps :)
Bitmap to byte[]:
Bitmap bmp; // your bitmap
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
Use buffered streams (e.g. BufferedOutputStream) for better performance.
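For example, when writing the compressed bitmap to a file instead of to memory, wrap the destination stream in a BufferedOutputStream (a minimal sketch; the file path and method name are made up for illustration):

void saveBitmap(Bitmap bmp) throws IOException {
    try (OutputStream out = new BufferedOutputStream(new FileOutputStream("/sdcard/bitmap.png"))) {
        bmp.compress(Bitmap.CompressFormat.PNG, 100, out);
    }
}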

Blowfish code should be equivalent but is not

We have a class which wraps BouncyCastle (actually SpongyCastle for Android) Blowfish to encrypt data to a stream:
public class BlowfishOutputStream extends OutputStream
{
    private final OutputStream os;
    private final PaddedBufferedBlockCipher bufferedCipher;
Our original code encrypted a whole byte array before writing to the output stream in a single operation
public void write(byte[] raw, int offset, int length) throws IOException
{
    byte[] out = new byte[bufferedCipher.getOutputSize(length)];
    int result = this.bufferedCipher.processBytes(raw, 0, length, out, 0);
    if (result > 0)
    {
        this.os.write(out, 0, result);
    }
}
When sending images (i.e. a large amount of data at once) this results in two copies being held in memory at the same time.
The following code is meant to be equivalent, but it is not, and I do not know why. I can verify that data is being sent (the sum of c2 equals the length), but an intermediate process when it is received on our server discards the image before we get to see what arrives. All I know at this stage is that when the initial code is used, the response is received and the included images can be extracted; when the replacement code is used, the response is received (and accepted) but the images do not appear to be extracted.
public void write(byte[] raw, int offset, int length) throws IOException
{
    // write to the output stream as we encrypt, not all at once.
    final byte[] inBuffer = new byte[Constants.ByteBufferSize];
    final byte[] outBuffer = new byte[Constants.ByteBufferSize];
    ByteArrayInputStream bis = new ByteArrayInputStream(raw);
    // read into inBuffer, encrypt into outBuffer and write to output stream
    for (int len; (len = bis.read(inBuffer)) != -1;)
    {
        int c2 = this.bufferedCipher.processBytes(inBuffer, 0, len, outBuffer, 0);
        this.os.write(outBuffer, 0, c2);
    }
}
Note that the issue is not due to a missing call to doFinal, as this is called when the stream is closed.
public void close() throws IOException
{
    byte[] out = new byte[bufferedCipher.getOutputSize(0)];
    int result = this.bufferedCipher.doFinal(out, 0);
    if (result > 0)
    {
        this.os.write(out, 0, result);
    }
    // NB: try/catch omitted
}
Confirmed, although ironically the issue was not with the images but with earlier data: that code was encrypting the complete raw byte array rather than just the range specified by offset and length. The equivalent code for encrypting the byte array on the fly is:
@Override
public void write(byte[] raw, int offset, int length) throws IOException
{
    // write to the stream as we encrypt, not all at once.
    final byte[] inBuffer = new byte[Constants.ByteBufferSize];
    final byte[] outBuffer = new byte[Constants.ByteBufferSize];
    int readStart = offset;
    final int readEnd = offset + length; // length is a count, so the valid range is [offset, offset + length)
    // read into inBuffer, encrypt into outBuffer and write to output stream
    while (readStart < readEnd)
    {
        int readAmount = Math.min(readEnd - readStart, inBuffer.length);
        System.arraycopy(raw, readStart, inBuffer, 0, readAmount);
        readStart += readAmount;
        int c2 = this.bufferedCipher.processBytes(inBuffer, 0, readAmount, outBuffer, 0);
        this.os.write(outBuffer, 0, c2);
    }
}
