can't work with BufferedInputStream and BufferedReader together - java

I'm trying to read the first line from a socket stream with a BufferedReader that wraps a BufferedInputStream. The first line (1) gives the size of some content (2), and inside that content is the size of another block of content (3).
(1) reads correctly (with the BufferedReader, _bin.readLine()).
(2) reads correctly too (with _in.read(byte[] b)).
(3) won't read; there seems to be more content left than the size I read in (2).
I think the problem is that I'm mixing reads between the BufferedReader and the BufferedInputStream. Can anyone help me?
public HashMap<String, byte[]> readHead() throws IOException {
    JSONObject json;
    try {
        HashMap<String, byte[]> map = new HashMap<>();
        System.out.println("reading header");
        int headersize = Integer.parseInt(_bin.readLine());
        byte[] parsable = new byte[headersize];
        _in.read(parsable);
        json = new JSONObject(new String(parsable));
        map.put("id", lTob(json.getLong(SagConstants.KEY_ID)));
        map.put("length", iTob(json.getInt(SagConstants.KEY_SIZE)));
        map.put("type", new byte[]{(byte) json.getInt(SagConstants.KEY_TYPE)});
        return map;
    } catch (SocketException | JSONException e) {
        _exception = e.getMessage();
        _error_code = SagConstants.ERROR_OCCOURED_EXCEPTION;
        return null;
    }
}
Sorry for the bad English and the rough explanation; I tried to describe my problem as well as I could, I hope you understand.
The file format is:
size1
{json whose length is size1; it contains size2}
{second json whose length is size2}
_in is a BufferedInputStream;
_bin is a BufferedReader wrapped around _in.
With _bin I read the first line (size1) and convert it to an integer.
With _in I read the next block, which is size1 bytes long and contains size2.
Then I try to read the last block, whose size is size2,
something like this:
byte[] b = new byte[secondSize];
_in.read(b);
and nothing happens here; the program just hangs...

can't work with BufferedInputStream and BufferedReader together
That's correct. If you use any buffered stream or reader on a socket (or indeed on any data source), you can't use any other stream or reader with it at all. Data will get 'lost', that is, read ahead into the buffer of the buffered stream or reader, and it will not be available to the other stream or reader.
You need to rethink your design.
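One way to rethink it (a sketch, not the poster's code) is to do all reading through the raw InputStream: read the decimal size line byte by byte yourself, then read exactly that many bytes, for example with java.io.DataInputStream.readFully. The wire format assumed here is the one described in the question (an ASCII decimal size, a newline, then that many bytes of JSON):
static byte[] readSizedBlock(InputStream in) throws IOException {
    // read the ASCII size line directly from the stream, no Reader involved
    StringBuilder sizeLine = new StringBuilder();
    int c;
    while ((c = in.read()) != -1 && c != '\n') {
        if (c != '\r') {
            sizeLine.append((char) c);
        }
    }
    int size = Integer.parseInt(sizeLine.toString().trim());
    byte[] block = new byte[size];
    new DataInputStream(in).readFully(block); // blocks until all 'size' bytes have arrived
    return block;
}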

You create one BufferedReader _bin and one BufferedInputStream _in over the same stream. The BufferedReader reads ahead into its own buffer, so bytes it has already buffered are no longer available to _in, and the two objects do not share a position. You should read size1 with _in too:
int headersize = Integer.parseInt(readLine(_in, ""));
byte[] parsable = new byte[headersize];
_in.read(parsable);
Use the readLine method below to read the line-oriented parts with the BufferedInputStream.
private final static byte NL = 10;  // new line
private final static byte EOL = 0;  // end of line

private static String readLine(BufferedInputStream reader, String accumulator) throws IOException {
    byte[] container = new byte[1];
    if (reader.read(container) == -1) {  // end of stream
        return accumulator;
    }
    byte byteRead = container[0];
    if (byteRead == NL || byteRead == EOL) {
        return accumulator;
    }
    accumulator = accumulator + new String(container, 0, 1);
    return readLine(reader, accumulator);
}
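With a helper like that, the header can be read through _in alone. Note also that a single _in.read(parsable) is not guaranteed to fill the array; a small loop makes the read robust. A sketch using the same field names as the question:
int headersize = Integer.parseInt(readLine(_in, ""));
byte[] parsable = new byte[headersize];
int filled = 0;
while (filled < headersize) {
    int n = _in.read(parsable, filled, headersize - filled);
    if (n == -1) {
        throw new IOException("stream ended before " + headersize + " bytes were read");
    }
    filled += n;
}
json = new JSONObject(new String(parsable));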

Related

How to properly handle InputStream as a method argument

I've been googling for almost two days and I still can't figure it out. I have this exercise where an InputStream is passed as an argument, and I'm expected to store whatever is passed and return the count, but I can't seem to figure out how to handle the InputStream properly. I always get an argument error.
Code:
class Subtitles {
    int redenBroj;
    int vrPocetok;
    int vrKraj;
    String text;

    public Subtitles() {
        redenBroj = 0;
        vrPocetok = 0;
        vrKraj = 0;
        text = null;
    }

    int loadSubtitles(InputStream is) {
    }
}
InputStream is an abstract class. Therefore, the implementation of the method int loadSubtitles should not care how the given InputStream is implemented - it can be anything, as long as it is a type of InputStream.
You can choose from different subclasses of InputStream so that you can test your method with your own data format:
FileInputStream -- You can use this type of input stream if you want to stream a file:
File sourceFile = new File("source.txt");
InputStream inputStream = new FileInputStream(sourceFile);
ByteArrayInputStream -- This is used to stream an array of bytes.
byte[] input = "this is an example array".getBytes();
InputStream inputStream = new ByteArrayInputStream(input);
Now that you have built an input stream, you can use it regardless of how it was built:
// Java 9+
byte[] content = inputStream.readAllBytes();
// do something with `content`

// before Java 9
int data = inputStream.read();
while (data != -1) {
    // do something with `data`
    data = inputStream.read(); // read the next byte
}
inputStream.close(); // or use the try-with-resources syntax
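Applied to the exercise, a minimal sketch of loadSubtitles could look like this (assuming the count to return is the number of bytes read, which the exercise does not spell out):
int loadSubtitles(InputStream is) {
    try {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = is.read(chunk)) != -1) {
            buffer.write(chunk, 0, n);   // keep everything that was passed in
        }
        text = buffer.toString("UTF-8"); // store the content in the existing field
        return buffer.size();            // number of bytes read
    } catch (IOException e) {
        return -1;
    }
}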

How to read /write XORed txt file UTF8 in java?

What I did so far:
I read a file1 with text, XORed the bytes with a key, and wrote the result to another file2.
My problem: I read, for example, 'H' from file1; its byte value is 72.
72 XOR -32 = -88
Now I wrote -88 into file2.
When I read file2 back I should get -88 as the first byte, but I get -3.
public byte[] readInput(String File) throws IOException {
    Path path = Paths.get(File);
    byte[] data = Files.readAllBytes(path);
    byte[] x = new byte[data.length];
    FileInputStream fis = new FileInputStream(File);
    InputStreamReader isr = new InputStreamReader(fis); // utf8
    Reader in = new BufferedReader(isr);
    int ch;
    int s = 0;
    while ((ch = in.read()) > -1) { // read till EOF
        x[s] = (byte) (ch);
    }
    in.close();
    return x;
}
public void writeOutput(byte encrypted[], String file) {
    try {
        FileOutputStream fos = new FileOutputStream(file);
        Writer out = new OutputStreamWriter(fos, "UTF-8"); // utf8
        String s = new String(encrypted, "UTF-8");
        out.write(s);
        out.close();
    }
    catch (IOException e) {
        e.printStackTrace();
    }
}
public byte[] DNcryption(byte[] key, byte[] mssg) {
    if (mssg.length == key.length) {
        byte[] encryptedBytes = new byte[key.length];
        for (int i = 0; i < key.length; i++) {
            encryptedBytes[i] = Byte.valueOf((byte) (mssg[i] ^ key[i])); // XOR
        }
        return encryptedBytes;
    }
    else {
        return null;
    }
}
You're not reading the file as bytes - you're reading it as characters. The encrypted data isn't valid UTF-8-encoded text, so you shouldn't try to read it as such.
Likewise, you shouldn't be writing arbitrary byte arrays as if they're UTF-8-encoded text.
Basically, your methods have signatures accepting or returning arbitrary binary data - don't use Writer or Reader classes at all. Just write the data straight to the stream. (And don't swallow the exception, either - do you really want to continue if you've failed to write important data?)
I would actually remove both your readInput and writeOutput methods entirely. Instead, use Files.readAllBytes and Files.write.
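A sketch of that approach (the file names are stand-ins, key is the poster's key array, and DNcryption requires the key to be the same length as the data):
byte[] plain = Files.readAllBytes(Paths.get("file1.txt"));  // raw bytes, no Reader
byte[] encrypted = DNcryption(key, plain);                  // XOR as before
Files.write(Paths.get("file2.txt"), encrypted);             // raw bytes, no Writer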
In the writeOutput method you convert the encrypted byte array into a UTF-8 String, which changes the actual bytes you later write to the file. Try this snippet to see what happens when you convert a byte array containing negative values to a UTF-8 String:
final String s = new String(new byte[]{-1}, "UTF-8");
System.out.println(Arrays.toString(s.getBytes("UTF-8")));
It will print something like [-17, -65, -67]. Use an OutputStream to write the bytes to the file instead, for example:
new FileOutputStream(file).write(encrypted);
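If you write it that way, close the stream when you are done, for example with try-with-resources (a minimal sketch):
try (FileOutputStream fos = new FileOutputStream(file)) {
    fos.write(encrypted); // raw bytes, no String conversion in between
}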

Safely reading http request headers in java

I'm building my own HTTP webserver in java and would like to implement some security measures while reading the http request header from a socket inputstream.
I'm trying to prevent scenarios where someone sending extremely long single-line headers or an absurd number of header lines could cause memory overflows or other things you don't want.
I'm currently trying to do this by reading 8 KB of data into a byte array and parsing all the headers within the buffer I just created.
But as far as I know, this means the input stream's current offset is always 8 KB past its starting point, even if there were only 100 bytes of headers.
The code I have so far:
InputStream stream = socket.getInputStream();
HashMap<String, String> headers = new HashMap<String, String>();
byte[] buffer = new byte[8 * 1024];
stream.read(buffer, 0, 8 * 1024);

ByteArrayInputStream bytestream = new ByteArrayInputStream(buffer);
InputStreamReader streamReader = new InputStreamReader(bytestream);
BufferedReader reader = new BufferedReader(streamReader);

String requestline = reader.readLine();
for (;;) {
    String line = reader.readLine();
    if (line.equals("")) {
        break;
    }
    String[] header = line.split(":", 2);
    headers.put(header[0], header[1]); // TODO: check for bad header
}
// if contentlength > 0
//     read body
So my question is: how can I be sure that I'm reading the body data (if any) starting from the correct position in the input stream? I don't use streams a lot, so I don't really have a feel for them, and Google hasn't been helpful so far.

I figured out an answer myself (it was easier than I thought it would be). If I were to guess, it's not buffered (I have no idea when something counts as buffered anyway), but it works.
public class SafeHttpHeaderReader
{
    public static final int MAX_READ = 8 * 1024;

    private InputStream stream;
    private int bytesRead;

    public SafeHttpHeaderReader(InputStream stream)
    {
        this.stream = stream;
        bytesRead = 0;
    }

    public boolean hasReachedMax()
    {
        return bytesRead >= MAX_READ;
    }

    public String readLine() throws IOException, Http400Exception
    {
        String s = "";
        while(bytesRead < MAX_READ)
        {
            String n = read();
            if(n.equals(""))
                break;
            if(n.equals("\r"))
            {
                if(read().equals("\n"))
                    break;
                throw new Http400Exception();
            }
            s += n;
        }
        return s;
    }

    private String read() throws IOException
    {
        byte b = readByte();
        if(b == -1)
            return "";
        return new String(new byte[]{b}, "ASCII");
    }

    private byte readByte() throws IOException
    {
        byte b = (byte) stream.read();
        bytesRead++;
        return b;
    }
}
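For completeness, here is roughly how this reader replaces the 8 KB pre-read in the original loop (a sketch reusing the question's variable names; exception handling is left to the caller):
InputStream stream = socket.getInputStream();
SafeHttpHeaderReader headerReader = new SafeHttpHeaderReader(stream);
HashMap<String, String> headers = new HashMap<String, String>();

String requestline = headerReader.readLine();
for (;;) {
    String line = headerReader.readLine();
    if (line.equals("")) {
        break; // blank line marks the end of the headers
    }
    String[] header = line.split(":", 2);
    headers.put(header[0], header[1]);
}
// 'stream' is now positioned right after the headers, so any body bytes
// (if Content-Length > 0) can be read directly from it.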

InputStreamReader don't limit returned length

I am working on learning Java and am going through the examples on the Android website. I am getting remote contents of an XML file. I am able to get the contents of the file, but then I need to convert the InputStream into a String.
public String readIt(InputStream stream, int len) throws IOException, UnsupportedEncodingException {
    InputStreamReader reader = null;
    reader = new InputStreamReader(stream, "UTF-8");
    char[] buffer = new char[len];
    reader.read(buffer);
    return new String(buffer);
}
The issue I am having is that I don't want the string to be limited by the len variable, but I don't know Java well enough to know how to change this.
How can I create the char array without a fixed length?
Generally speaking it's bad practice to not have a max length on input strings like that due to the possibility of running out of available memory to store it.
That said, you could ignore the len variable and just loop on reader.read(...) and append the buffer to your string until you've read the entire InputStream like so:
public String readIt(InputStream stream, int len) throws IOException, UnsupportedEncodingException {
    String result = "";
    InputStreamReader reader = new InputStreamReader(stream, "UTF-8");
    char[] buffer = new char[len];
    int count;
    while ((count = reader.read(buffer)) >= 0) {
        result = result + new String(buffer, 0, count); // append only the characters actually read
    }
    return result;
}
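A quick way to try it out without anything Android-specific (len now only controls the chunk size, not a cap on the result):
InputStream stream = new ByteArrayInputStream("hello world".getBytes("UTF-8"));
System.out.println(readIt(stream, 4)); // prints "hello world" even though len is 4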

Changing encoding in java

I am writing a function that should detect the charset in use and then convert the content to UTF-8. I am using juniversalchardet, the Java port of Mozilla's universalchardet.
This is my code:
private List<List<String>> setProperEncoding(List<List<String>> input) {
    try {
        // Detect used charset
        UniversalDetector detector = new UniversalDetector(null);
        int position = 0;
        while ((position < input.size()) & (!detector.isDone())) {
            String row = null;
            for (String cell : input.get(position)) {
                row += cell;
            }
            byte[] bytes = row.getBytes();
            detector.handleData(bytes, 0, bytes.length);
            position++;
        }
        detector.dataEnd();
        Charset charset = Charset.forName(detector.getDetectedCharset());
        Charset utf8 = Charset.forName("UTF-8");
        System.out.println("Detected charset: " + charset);

        // rewrite input using proper charset
        List<List<String>> newLines = new ArrayList<List<String>>();
        for (List<String> row : input) {
            List<String> newRow = new ArrayList<String>();
            for (String cell : row) {
                //newRow.add(new String(cell.getBytes(charset)));
                ByteBuffer bb = ByteBuffer.wrap(cell.getBytes(charset));
                CharBuffer cb = charset.decode(bb);
                bb = utf8.encode(cb);
                newRow.add(new String(bb.array()));
            }
            newLines.add(newRow);
        }
        return newLines;
    } catch (Exception e) {
        e.printStackTrace();
        return input;
    }
}
My problem is that when I read a file with characters from, for example, the Polish alphabet, letters like ł, ą, ć and similar are replaced by ? and other strange characters. What am I doing wrong?
EDIT:
I compile with Eclipse.
The method parameter is the result of reading a MultipartFile. I just use a FileInputStream to get every line and then split each line on a separator (it is prepared for xls, xlsx and csv files). Nothing special there.
First of all, you have your data somewhere in a binary format. For the sake of simplicity, I suppose it comes from an InputStream.
You want to write the output as UTF-8; I suppose it can go to an OutputStream.
I would recommend creating an AutoDetectInputStream:
public class AutoDetectInputStream extends InputStream {
    private InputStream is;
    private byte[] sampleData = new byte[4096];
    private int sampleLen;
    private int sampleIndex = 0;

    public AutoDetectInputStream(InputStream is) throws IOException {
        this.is = is;
        // pre-read the data
        sampleLen = is.read(sampleData);
    }

    public Charset getCharset() {
        // detect the charset from the pre-read sample
        UniversalDetector detector = new UniversalDetector(null);
        detector.handleData(sampleData, 0, sampleLen);
        detector.dataEnd();
        return Charset.forName(detector.getDetectedCharset());
    }

    @Override
    public int read() throws IOException {
        // replay the pre-read sample before delegating to the underlying stream
        if (sampleIndex < sampleLen) {
            return sampleData[sampleIndex++] & 0xFF; // read() must return 0..255
        }
        return is.read();
    }
}
The second task is quite simple, because an OutputStreamWriter created with the UTF-8 charset takes care of the encoding for you. So, here's your code:
// open input with the detector stream
// we use a BufferedReader so we can read lines
InputStream is = new FileInputStream("in.txt");
AutoDetectInputStream detector = new AutoDetectInputStream(is);
Charset charset = detector.getCharset();
// here we can use the charset to decode the bytes into characters
BufferedReader rdr = new BufferedReader(new InputStreamReader(detector, charset));

// open output to write to
OutputStream os = new FileOutputStream("out.txt");
Writer utf8Writer = new OutputStreamWriter(os, Charset.forName("UTF-8"));

// copy the whole file, restoring the line breaks that readLine() strips
String line;
while ((line = rdr.readLine()) != null) {
    utf8Writer.append(line).append('\n');
}

// close streams
rdr.close();
utf8Writer.flush();
utf8Writer.close();
So, finally, the whole txt file is transcoded to UTF-8.
Note that the sample buffer in AutoDetectInputStream has to be big enough to give the UniversalDetector enough data to work with.

Categories

Resources