PrintWriter and byte[] problem - java

byte[] data = (byte[])opBinding.execute();
PrintWriter out = new PrintWriter(outputStream);
out.println(data);
out.flush();
out.close();
but instead of text I get #84654. How can I write a byte[] to a PrintWriter? I need the byte[] and not a String because I have encoding problems with čćžšđ.

You can use the OutputStream directly to write the bytes:
outputStream.write(data);

PrintWriter is meant for text data, not binary data.
It sounds like you should quite possibly be converting your byte[] to a String, and then writing that string out - assuming the PrintWriter you're writing to uses an encoding which supports the characters you're interested in.
You'll also need to know the encoding that the original text data has been encoded in for the byte[], in order to successfully convert to text to start with.
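For example, if the bytes are known to hold UTF-8 text and the target should also be UTF-8, a minimal sketch could look like this (it reuses the question's outputStream; the charset choice is an assumption):
// Decode with an explicit charset instead of relying on the platform default.
byte[] data = (byte[]) opBinding.execute();
String text = new String(data, java.nio.charset.StandardCharsets.UTF_8);

// Make the writer's charset explicit as well, so čćžšđ round-trips correctly.
PrintWriter out = new PrintWriter(
        new OutputStreamWriter(outputStream, java.nio.charset.StandardCharsets.UTF_8));
out.println(text);
out.flush();
out.close();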

The problem is that println(Object) implicitly calls data.toString(), so what ends up in the output is the array's default toString() result rather than its contents.

try this
byte[] data = (byte[])opBinding.execute();
PrintWriter out = new PrintWriter(outputStream);
out.println(new String(data));
out.flush();
out.close();

It worked for me when I used
PrintWriter out = new PrintWriter(System.out);
Note, though, that this converts the byte data to a String via its toString() method, so that may be the reason for your encoding problem.

Related

What can be replaced by fileInputStream.available()?

When learning Java IO, I found that FileInputStream has an available() method, which can equal the file size when reading a local file. So if the size of the file can be known directly, is it still necessary to wrap the stream in a BufferedInputStream when the whole file needs to be read?
like this:
FileInputStream fileInputStream=new FileInputStream("F:\\test.txt");
byte[] data=new byte[fileInputStream.available()];
if (fileInputStream.read(data)!=-1) {
System.out.println(new String(data));
}
or
BufferedReader bufferedReader = new BufferedReader(new FileReader("F:\\test.txt"));
StringBuilder stringBuilder=new StringBuilder();
for (String line;(line=bufferedReader.readLine())!=null;){
stringBuilder.append(line);
}
System.out.println(stringBuilder.toString());
or
BufferedInputStream bufferedInputStream=new BufferedInputStream(new FileInputStream("F:\\test.txt"));
byte[] data=new byte[bufferedInputStream.available()];
if (bufferedInputStream.read(data)!=-1) {
System.out.println(new String(data));
}
What are the pros and cons of these methods? Which one is better?
thx.
You are wrong about the meaning of available(). It returns an estimate of the number of bytes you can read without blocking. From the documentation:
Note that while some implementations of InputStream will return the total number of bytes in the stream, many will not. It is never correct to use the return value of this method to allocate a buffer intended to hold all data in this stream.
So, if you want to convert a stream to a byte array, you should use a corresponding library, such as Commons IO's IOUtils:
byte[] out = IOUtils.toByteArray(stream);
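If you'd rather avoid the extra dependency, a plain-JDK sketch works too (Files.readAllBytes needs Java 7+, InputStream.readAllBytes needs Java 9+; the stream variable is assumed):
// Whole local file, without touching available():
byte[] data = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get("F:\\test.txt"));
System.out.println(new String(data, java.nio.charset.StandardCharsets.UTF_8));

// Arbitrary InputStream, Java 9 or later:
byte[] fromStream = stream.readAllBytes();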

GZIPInputStream unable to decode at receiver side (invalid code lengths set)

I'm attempting to encode a String in a client using GZIPOutputStream, then decode the String in a server using GZIPInputStream.
The client's side code (after the initial socket connection establishment) is:
// ... Establishing connection, getting a socket object.
// ... Now proceeding to send data using that socket:
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
String message = "Hello World!";
ByteArrayOutputStream out = new ByteArrayOutputStream();
GZIPOutputStream gzip = new GZIPOutputStream(out);
gzip.write(message);
gzip.close();
String encMessage = out.toString();
out.writeInt(encMessage.getBytes().length);
out.write(encMessage.getBytes());
out.flush();
And the server's side code (again, after establishing a connection):
DataInputStream input = new DataInputStream(socket.getInputStream());
int length = input.readInt();
byte[] buffer = new byte[length];
input.readFully(buffer);
GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(buffer));
BufferedReader r = new BufferedReader(new InputStreamReader(gz));
String s = "";
String line;
while ((line = r.readLine()) != null)
{
s += line;
}
I checked and the buffer length (i.e., the coded message's size) is passed correctly, so the right number of bytes is transferred.
However, I'm getting this:
java.util.zip.ZipException: invalid code lengths set
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:164)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:117)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:122)
at parsing.ReceiveResponsesTest$TestReceiver.run(ReceiveResponsesTest.java:147)
at java.lang.Thread.run(Thread.java:745)
Any ideas?
Thanks in advance for any assistance!
You're calling toString() on the ByteArrayOutputStream - that is incorrect, and it opens up all kinds of character encoding problems that are probably biting you here. You need to call toByteArray instead:
byte[] encMessage = out.toByteArray();
out.writeInt(encMessage.length);
out.write(encMessage);
Detail:
If you use toString(), Java will decode your bytes using your platform's default character encoding. That could be some Windows codepage, UTF-8, or whatnot.
However, not every byte sequence is valid in that encoding; invalid sequences will be replaced by a substitution character, a question mark perhaps. Without knowing the details, it's hard to tell.
But in any case, decoding the byte array to a String and then encoding it back to a byte array when you write it out is very likely to change the data in the byte array. And there is no need to do it; you can just get the byte array straight away, as shown in the code above.
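A quick way to see the damage, as a sketch (gzipped here stands for the compressed bytes from the ByteArrayOutputStream; the default-charset round trip mirrors what toString()/getBytes() do):
byte[] original = gzipped;                     // compressed, arbitrary binary data
String asText = new String(original);          // what out.toString() effectively does
byte[] roundTrip = asText.getBytes();          // what encMessage.getBytes() does
System.out.println(java.util.Arrays.equals(original, roundTrip));  // almost certainly false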
Why on earth are you indulging in all this complication? You can reduce it all to this:
GZIPOutputStream gzip = new GZIPOutputStream(socket.getOutputStream());
DataOutputStream out = new DataOutputStream(gzip);
String message = "Hello World!";
out.writeUTF(message);
out.close();
// ...
GZIPInputStream gz = new GZIPInputStream(socket.getInputStream());
DataInputStream input = new DataInputStream(gz);
String line = input.readUTF();
I further note that your code doesn't actually compile. I would also note that unless the messages are several orders of magnitude larger, there is no benefit to the GZipping.

Writing byte array to an UTF8-encoded file

Given a byte array in UTF-8 encoding (as result of base64 decoding of a String) - what is please a correct way to write it to a file in UTF-8 encoding?
Is the following source code (writing the array byte by byte) correct?
OutputStreamWriter osw = new OutputStreamWriter(
new FileOutputStream(tmpFile), Charset.forName("UTF-8"));
for (byte b: buffer)
osw.write(b);
osw.close();
Don't use a Writer. Just use the OutputStream. A complete solution using try-with-resources looks as follows:
try (FileOutputStream fos = new FileOutputStream(tmpFile)) {
fos.write(buffer);
}
Or even better, as Jon points out below:
Files.write(Paths.get(tmpFile), buffer);
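Tying it back to the base64 step in the question, a sketch (base64String is assumed, and tmpFile is treated as a path string here):
// The decoded bytes are already UTF-8 text, so write them unchanged;
// no Writer and no re-encoding involved.
byte[] buffer = java.util.Base64.getDecoder().decode(base64String);
java.nio.file.Files.write(java.nio.file.Paths.get(tmpFile), buffer);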

Java PrintWriter Not sending Byte Array

A byte array of unknown size needs to be sent over a socket. When I try to write the byte array to a PrintWriter like
writeServer = new PrintWriter(socketconnection.getOutputStream());
writeServer.flush();
writeServer.println("Hello World");
writeServer.println(byteArray.toString());
It is received at the server, but it is only a string of 5-6 characters, always starting with [B#..... But when I send it through the output stream like
writeServer.println("Hello World");
socketconnection.getOutputStream().write(byteArray);
It is received at the server correctly. But the issue is that, with the second option, the "Hello World" string does not go through to the server. I want both of these delivered to the server.
How should I do it?
You are trying to mix binary and text which is bound to be confusing. I suggest you use one or the other.
// as text
PrintWriter pw = new PrintWriter(socketconnection.getOutputStream());
pw.println("Hello World");
pw.println(new String(byteArray, charSet));
pw.flush();
or
// as binary
BufferedOutputStream out = new BufferedOutputStream(socketconnection.getOutputStream());
out.write("Hello World\n".getBytes(charSet));
out.write(byteArray);
out.write('\n');
out.flush();
byteArray.toString() does not give you a human-readable form of the array's contents; for arrays, toString() only returns the type and a hash code (something like [B@...).
If you want to transfer byteArray as a String, then you should use
String str = new String(bytes, Charset.defaultCharset());//Specify different charset value if required.
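If both the text and the raw bytes must arrive unchanged, another option is to frame the message yourself with a DataOutputStream instead of mixing a PrintWriter with the raw stream; a sketch (reusing the question's socketconnection on the sender, and assuming a matching socket on the server):
// Sender: text header, then length-prefixed binary payload.
DataOutputStream out = new DataOutputStream(socketconnection.getOutputStream());
out.writeUTF("Hello World");       // text part, charset handled by writeUTF
out.writeInt(byteArray.length);    // tell the receiver how many bytes follow
out.write(byteArray);              // raw bytes, untouched by any charset
out.flush();

// Receiver: read everything back in the same order.
DataInputStream in = new DataInputStream(socket.getInputStream());
String header = in.readUTF();
byte[] payload = new byte[in.readInt()];
in.readFully(payload);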

how to insert and read utf-8 text on RecordStore

How can I insert a UTF-8 String into a RecordStore and read it back as a UTF-8 String?
Thanks.
//write
ByteArrayOutputStream boStream = new ByteArrayOutputStream();
DataOutputStream doStream = new DataOutputStream(boStream);
doStream.writeUTF(myString);
temp.addRecord(boStream.toByteArray(), 0, boStream.size());
//read
ByteArrayInputStream biStream = new ByteArrayInputStream(temp.getRecord(id));
DataInputStream diStream = new DataInputStream(biStream);
myString = diStream.readUTF();
I got the question slightly wrong at first: RecordStore still stores byte arrays, so what you need to do is convert the String into a byte array and back again. Use string.getBytes("UTF-8") to store it, and the opposite, String str = new String(bytes, "UTF-8");, to read it back. Specifying the charset explicitly means you don't depend on the platform default. Hope that helps.
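A short sketch of that getBytes() route with the charset made explicit (temp is the RecordStore from the snippet above; exception handling is omitted):
// Write: store the string's UTF-8 bytes directly.
byte[] bytes = myString.getBytes("UTF-8");
int id = temp.addRecord(bytes, 0, bytes.length);

// Read: decode with the same charset.
String restored = new String(temp.getRecord(id), "UTF-8");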
