Zlib from Python to Java

I use this in Python:
test = zlib.compress(test, 1)
Now I want to do the same in Java, but I don't know how.
At the end I need to convert the result to a string.
Thanks in advance for your help!

You need to be more specific in describing what you want to achieve.
Are you just seeking to decompress a zlib-compressed piece of data?
Or are you seeking for a way to exchange data between Python and Java, possibly by transferring it across a network?
For the former, it's basically sticking the compressed datastream in an Inflater (not Deflater!). Something like:
Inflater decompresser = new Inflater();
decompresser.setInput(data);
ByteArrayOutputStream bos = new ByteArrayOutputStream(data.length);
byte[] buffer = new byte[8192];
while (!decompresser.finished()) {
    int size = decompresser.inflate(buffer);
    bos.write(buffer, 0, size);
}
byte[] unzippeddata = bos.toByteArray();
decompresser.end();
For the latter, alex's answer already points to Pyro/Pyrolite.
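Since the original question was about the compression direction, here is a minimal sketch of the Java equivalent of Python's zlib.compress(test, 1): a Deflater at BEST_SPEED (level 1) produces the same zlib-wrapped format, and Base64 is one safe way to turn the binary result into a String (the class name and sample input are made up for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.util.Base64;
import java.util.zip.Deflater;

public class ZlibCompress {
    // Equivalent of Python's zlib.compress(data, 1): zlib-wrapped deflate, level 1.
    static byte[] compress(byte[] data) {
        Deflater deflater = new Deflater(Deflater.BEST_SPEED); // compression level 1
        deflater.setInput(data);
        deflater.finish();
        ByteArrayOutputStream bos = new ByteArrayOutputStream(data.length);
        byte[] buffer = new byte[8192];
        while (!deflater.finished()) {
            int size = deflater.deflate(buffer);
            bos.write(buffer, 0, size);
        }
        deflater.end();
        return bos.toByteArray();
    }

    public static void main(String[] args) {
        byte[] compressed = compress("hello hello hello".getBytes());
        // Base64 makes the binary result safe to carry in a String.
        String asString = Base64.getEncoder().encodeToString(compressed);
        System.out.println(asString);
    }
}
```

On the Python side, such a string can be reversed with zlib.decompress(base64.b64decode(s)).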

You can try using an RPC framework to create a server in Python:
https://github.com/irmen/Pyro4
and access that server from Java:
https://github.com/irmen/Pyrolite

Related

Compression using GZIPOutputStream and decompression in PHP

My Java application communicates with a PHP script over a socket; on the Java end I compress the result and write those bytes to PHP.
public static byte[] compressString(byte[] b) throws IOException {
    byte[] compressedBytes = null;
    ByteArrayOutputStream out = new ByteArrayOutputStream(b.length);
    try {
        GZIPOutputStream gzip = new GZIPOutputStream(out);
        gzip.write(b, 0, b.length);
        gzip.finish();
        gzip.flush();
        gzip.close();
        compressedBytes = out.toByteArray();
        System.out.println("comp1::" + compressedBytes + ", length::" + compressedBytes.length);
    } finally {
        out.close();
    }
    return compressedBytes;
}
But when I perform
$data = fgets($fp);
$decompData = gzuncompress($data);
$decompData is returned as null.
Kindly provide a solution, as I have tried Deflater with gzuncompress, gzdecode and every other option I could find. There must be something I am missing here. Consider me a newbie in PHP.
You need to use gzdecode() for gzip streams, and you may need to open the file using "rb" to read binary data without translation.
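To rule out the Java side, you can round-trip the gzip bytes locally with GZIPInputStream; if this works, the remaining problem is in how the bytes travel to PHP (fgets reads a line of text and will stop or mangle data at newline/NUL bytes, which gzip output freely contains). A self-contained sketch (class and sample data are mine, not from the question):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    static byte[] compress(byte[] b) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream(b.length);
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            gzip.write(b);
        }
        return out.toByteArray();
    }

    static byte[] decompress(byte[] b) throws IOException {
        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(b))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] gz = compress("hello php".getBytes());
        // Every gzip stream starts with the magic bytes 0x1f 0x8b -- it is binary data,
        // so it must be read with binary-safe functions (e.g. PHP's fread on an "rb" handle).
        System.out.printf("magic: %02x %02x%n", gz[0], gz[1]);
        System.out.println(new String(decompress(gz)));
    }
}
```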

Decompressing PHP's gzcompress in Java

I'm trying to decompress a json object in Java that was initially compressed in PHP. Here's how it gets compressed into PHP:
function zip_json_encode(&$arr) {
    $uncompressed = json_encode($arr);
    return pack('L', strlen($uncompressed)).gzcompress($uncompressed);
}
and decoded (again in PHP):
function unzip_json_decode(&$data) {
    $uncompressed = @gzuncompress(substr($data,4));
    return json_decode($uncompressed, $array_instead_of_object);
}
That gets put into MySQL and now it must be pulled out of the db by Java. We pull it out from the ResultSet like this:
String field = rs.getString("field");
I then pass that string to a method to decompress it. This is where it falls apart.
private String decompressHistory(String historyString) throws SQLException {
    StringBuffer buffer = new StringBuffer();
    try {
        byte[] historyBytes = historyString.substring(4).getBytes();
        ByteArrayInputStream bin = new ByteArrayInputStream(historyBytes);
        InflaterInputStream in = new InflaterInputStream(bin, new Inflater(true));
        int len;
        byte[] buf = new byte[1024];
        while ((len = in.read(buf)) != -1) {
            // buf should be decoded, right?
        }
    } catch (IOException e) {
        e.getStackTrace();
    }
    return buffer.toString();
}
Not quite sure what's going wrong here, but any pointers would be appreciated!
You need to get rid of the true in Inflater(true). Use just Inflater(). The true makes it expect raw deflate data. Without the true, it is expecting zlib-wrapped deflate data. PHP's gzcompress() produces zlib-wrapped deflate data.
Gzipped data is binary (byte[]). Going through String, which holds Unicode text, not only requires a conversion but is faulty.
For instance, this involves a conversion:
byte[] historyBytes = historyString.substring(4).getBytes();
as opposed to the explicit
byte[] historyBytes = historyString.substring(4).getBytes("ISO-8859-1");
The first version uses the default platform encoding, making the application non-portable.
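The danger is easy to demonstrate: round-tripping arbitrary bytes through a String with a UTF-8 (or UTF-8-like default) charset silently replaces invalid byte sequences, while ISO-8859-1 maps every byte one-to-one. A small sketch (the sample bytes are typical zlib stream bytes, chosen by me for illustration):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class CharsetRoundTrip {
    public static void main(String[] args) {
        // 0x9c appears in a typical zlib header; on its own it is not valid UTF-8.
        byte[] original = { 0x78, (byte) 0x9c, 0x01 };

        // Decoding as UTF-8 turns the invalid byte into U+FFFD, so re-encoding
        // does NOT give back the original bytes.
        byte[] viaUtf8 = new String(original, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);

        // ISO-8859-1 maps bytes 0x00-0xFF one-to-one, so the round trip is lossless.
        byte[] viaLatin1 = new String(original, StandardCharsets.ISO_8859_1)
                .getBytes(StandardCharsets.ISO_8859_1);

        System.out.println("UTF-8 preserves bytes:      " + Arrays.equals(original, viaUtf8));      // false
        System.out.println("ISO-8859-1 preserves bytes: " + Arrays.equals(original, viaLatin1));    // true
    }
}
```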
The first thing to do is to store the binary data in the database as VARBINARY or BLOB:
InputStream field = rs.getBinaryStream("field");
try (InputStream in = new InflaterInputStream(field)) {
    ...
}
Or so. Mind the other answer: gzcompress() produces zlib-wrapped data, so it is InflaterInputStream, not GZIPInputStream, that can read it.
In the end, neither of the above solutions worked on its own, but both have merit. When we pulled the data out of MySQL and cast it to bytes, a number of character bytes (67 of them) were missing, which made it impossible to decompress on the Java side. As for the answers above: Mark is correct that gzcompress() uses zlib, and therefore you should use the Inflater() class in Java.
Joop is correct that the data conversion is faulty. Our table was too large to convert to VARBINARY or BLOB; that might have solved the problem, but it didn't work for us. We ended up having Java make a request to our PHP app and simply unpacking the compressed data on the PHP side. This worked well. Hopefully this is helpful to anyone else who stumbles across it.

Java: is it possible to store raw data in source file?

OK, I know this is a bit of a weird question:
I'm writing a piece of Java code and need to load raw data (approx. 130,000 floating-point values).
This data never changes, and since I don't want to write different loading methods for PC and Android, I was thinking of embedding it into the source file as a float[].
Too bad, there seems to be a limit of 65535 entries; is there an efficient way to do it?
Store that data in a file in the classpath; then read that data as a ByteBuffer which you then "convert" to a FloatBuffer. Note that the below code assumes big endian:
final InputStream in = getClass().getResourceAsStream("/path/to/data");
final ByteArrayOutputStream out = new ByteArrayOutputStream();
final byte[] buf = new byte[8192];
int count;
try {
    while ((count = in.read(buf)) != -1)
        out.write(buf, 0, count);
} finally {
    out.close();
    in.close();
}
final FloatBuffer floatBuf = ByteBuffer.wrap(out.toByteArray()).asFloatBuffer();
You can then .get() from the FloatBuffer.
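For the write side, a hypothetical helper (the class and method names are mine, not from the answer): DataOutputStream writes floats in big endian, which matches ByteBuffer's default byte order on the read side, so a file produced this way can be consumed exactly as shown above.

```java
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class FloatDump {
    // Writes the float table once, in big endian, so that the reader's
    // ByteBuffer.wrap(bytes).asFloatBuffer() sees the same values back.
    static void write(String path, float[] values) throws IOException {
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream(path))) {
            for (float v : values) {
                out.writeFloat(v); // 4 bytes, big endian (network order)
            }
        }
    }
}
```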
You could use 2 or 3 arrays to get around the limit, if that was your only problem with that approach.

Java mutable byte array data structure

I'm trying to find an easy way to create a mutable byte array that can automatically append any primitive Java data type. I've been searching but could not find anything useful.
I'm looking for something like this
ByteAppender byteStructure = new ByteAppender();
byteStructure.appendInt(5);
byteStructure.appendDouble(10.0);
byte[] bytes = byteStructure.toByteArray();
There is ByteBuffer, which is great, but you have to know the size of the buffer before you start, which won't work in my case. There is a similar thing (StringBuilder) for creating Strings, but I cannot find one for bytes.
I thought this would be obvious in Java.
I guess you are looking for java.io.DataOutputStream
ByteArrayOutputStream out = new ByteArrayOutputStream();
DataOutputStream dout = new DataOutputStream(out);
dout.writeInt(1234);
dout.writeLong(123L);
dout.writeFloat(1.2f);
byte[] storingData = out.toByteArray();
To read storingData back:
ByteArrayInputStream in = new ByteArrayInputStream(storingData);
DataInputStream din = new DataInputStream(in);
int v1 = din.readInt();//1234
long v2 = din.readLong();//123L
float v3 = din.readFloat();//1.2f

Java reading file into memory and how not to blow up memory

I'm a bit of a newbie in Java and I'm trying to perform a MAC calculation on a file.
Since the size of the file is not known at runtime, I can't just load all of it into memory, so I wrote the code to read it in chunks (4 KB in this case).
The issue is that I tried loading the entire file into memory to check that both methods produce the same hash, but they seem to produce different hashes.
Here's the bit by bit code:
FileInputStream fis = new FileInputStream("sbs.dat");
byte[] file = new byte[4096];
m = Mac.getInstance("HmacSHA1");
int i = fis.read(file);
m.init(key);
while (i != -1)
{
    m.update(file);
    i = fis.read(file);
}
mac = m.doFinal();
And here's the all at once approach:
File f = new File("sbs.dat");
long size = f.length();
byte[] file = new byte[(int) size];
fis.read(file);
m = Mac.getInstance("HmacSHA1");
m.init(key);
m.update(file);
mac = m.doFinal();
Shouldn't they both produce the same hash?
The question, however, is more generic: is the first snippet the correct way of loading a file into memory in pieces and performing whatever we want inside the while loop (socket send, ciphering a file, etc.)?
This question is useful because every tutorial I've seen just loads everything at once...
Update: Working :-D. Will this approach work properly sending a file in pieces through a socket?
No. You have no guarantee that fis.read(file) will read file.length bytes. This is why read() returns an int telling you how many bytes it actually read.
You should instead do this:
m.init(key);
int i = fis.read(file);
while (i != -1)
{
    m.update(file, 0, i);
    i = fis.read(file);
}
taking advantage of the Mac.update(byte[] data, int offset, int len) method, which lets you specify the length of the actual data in the byte[] array.
The read function will not necessarily fill up your entire array. So you need to check how many bytes were returned by the read function, and only use that many bytes of your buffer.
Just like Jason LeBrun says - The read method will not always read the specified amount of bytes. For example: What do you think will happen if the file does not contain a multiple of 4096 bytes?
I would go for something like this:
FileInputStream fis = new FileInputStream(filename);
byte[] buffer = new byte[buffersize];
Mac m = Mac.getInstance("HmacSHA1");
m.init(key);
int n;
while ((n = fis.read(buffer)) != -1)
{
    m.update(buffer, 0, n);
}
byte[] mac = m.doFinal();
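To convince yourself that the chunked loop and the all-at-once call agree, you can compare the two MACs directly. A self-contained sketch (the key, sample data, and the deliberately odd 7-byte buffer are made up for the test; a real buffer would be 4 KB or more):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.GeneralSecurityException;
import java.util.Arrays;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class MacCheck {
    // Chunked HMAC: only the n bytes actually read are fed to update().
    static byte[] macOf(InputStream is, byte[] keyBytes)
            throws IOException, GeneralSecurityException {
        Mac m = Mac.getInstance("HmacSHA1");
        m.init(new SecretKeySpec(keyBytes, "HmacSHA1"));
        byte[] buffer = new byte[7]; // odd size to force partial final reads
        int n;
        while ((n = is.read(buffer)) != -1) {
            m.update(buffer, 0, n);
        }
        return m.doFinal();
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "secret".getBytes();
        byte[] data = "some file contents that do not fit one buffer".getBytes();

        byte[] chunked = macOf(new ByteArrayInputStream(data), key);

        // All-at-once HMAC over the same data.
        Mac m = Mac.getInstance("HmacSHA1");
        m.init(new SecretKeySpec(key, "HmacSHA1"));
        byte[] whole = m.doFinal(data);

        System.out.println(Arrays.equals(chunked, whole)); // prints "true"
    }
}
```

The same pattern carries over to sending a file in pieces through a socket: always pass the actual read count, never the full buffer length.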
