Why does this function return a null outstream? - Java

When this function is called in a loop, it sometimes gives a null/empty outstream and other times not. Any reason why? I am writing the outstream into a text file, and sometimes I get an empty text file. Why? If I run the loop 20 times, I sometimes get an empty text file on 2, 3, or 4 random occasions. What should I do?
public void decrypt(InputStream in, OutputStream out) {
    try {
        // Bytes read from in will be decrypted
        in = new CipherInputStream(in, dcipher);
        // Read in the decrypted bytes and write the cleartext to out
        int numRead = 0;
        while ((numRead = in.read(buf)) >= 0) {
            out.write(buf, 0, numRead);
        }
        out.close();
    }
    catch (java.io.IOException e) {
    }
}

I think this happens because you are closing the output stream in your function. This way, the next iteration of your loop will try to write to an already closed output stream. That will throw an IOException, but you are ignoring it. Try closing the output stream after your loop, not in the method.
InputStream in = null;
OutputStream out = null;
try {
    in = ...;  // initialize the input stream
    out = ...; // initialize the output stream
    for (int i = 0; i < 10; i++) {
        decrypt(in, out);
    }
} finally {
    try {
        if (out != null)
            out.close();
    } finally {
        if (in != null)
            in.close();
    }
}

If an exception is thrown by any code in your try block, it is ignored (since you have nothing in your catch clause).
You might want to:
actually do something in the catch clause (at least print the message of the exception - try e.printStackTrace())
instead of doing the out.close() call in the try block, do it in a finally clause after the catch block (so that it happens even if there is an error)
also, as pointed out by bruno, if you're always reusing the same output stream for every call of decrypt, you should not close it inside the function. However, you might want to flush() it inside your loop (see the sketch below).
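
A minimal sketch of decrypt along those lines - flushing instead of closing, and at least logging the failure (it assumes the same dcipher and buf fields as in the question; this is only an illustration, not the asker's final code):
public void decrypt(InputStream in, OutputStream out) {
    try {
        // Bytes read from in will be decrypted
        CipherInputStream cin = new CipherInputStream(in, dcipher);
        // Read in the decrypted bytes and write the cleartext to out
        int numRead;
        while ((numRead = cin.read(buf)) >= 0) {
            out.write(buf, 0, numRead);
        }
        // Push any buffered bytes through, but leave closing to the caller
        out.flush();
    } catch (java.io.IOException e) {
        // At the very least, make the failure visible
        e.printStackTrace();
    }
}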

You should definitely fix this part of your code:
catch (java.io.IOException e) {
}
and do at least some logging there. That way you'll find out why you have the problem you described.

"Never close something that you haven't opened" - don't know if that's a golden rule, but it nearly always leads to trouble when you close a resource in a subroutine - either the ressource is closed next time you need it or the resource is not closed because you changed the code...


Reading and writing into the same file with FileInputStream and FileOutputStream

The problem with my code is an infinite loop of reading and writing.
I can't find a solution or a concept for this problem.
FileInputStream in = new FileInputStream("toto.txt");
FileOutputStream out = new FileOutputStream("toto.txt", false);
int m;
while ((m = in.read()) != 0) {
    System.out.print(m);
    out.write(m);
}
in.close();
out.close();
Alter the loop condition to the following:
while ((m = in.read()) != -1)
The problem with my code is an infinite loop of reading and writing. I can't find a solution or a concept for this problem.
There are a number of problems with your code:
The file will be treated as empty after the FileOutputStream gets instantiated, because you've set the append flag to false. And method read() will always return -1 because there's no content to read.
The condition is incorrect: read() signals end of stream with -1, not 0. Only because of that does control enter the loop, and EOF (-1) is repeatedly written into the file. If you fixed the condition to (m = in.read()) != -1, the loop body would never execute because the file is blank from the start.
If you did both - fixed the condition and changed the append flag to true - you would get another flavor of infinite loop: all the contents of the file would be successfully read and repeatedly appended to the file.
So under any conditions, reading and writing simultaneously to the same file isn't a good idea.
One important note in regard to exception handling:
Because there's no catch block in your code snippet, I assume that you've added a throws clause to main() - that's not a nice idea. The close() methods in your code will be invoked only in case of successful execution; if an exception occurs, the resources will never get released.
Instead, I suggest you make use of try-with-resources. That provides an implicit finally block that takes care of closing the resources regardless of whether an exception occurred or not (as written, your invocations of close() will not get executed in case of an exception). Another option is to declare a finally block explicitly and close the resources inside it.
Try-with-resources is the more concise and cleaner way to ensure that resources get released.
Also consider wrapping both streams with buffered high-level streams to improve performance. It will significantly reduce the number of times your application needs to access the file system.
try (var in = new BufferedInputStream(new FileInputStream("source.txt"));
     var out = new BufferedOutputStream(new FileOutputStream("destination.txt", false))) {
    int next; // a subsequent byte that has been read from the source
    while ((next = in.read()) != -1) {
        out.write(next);
    }
} catch (IOException e) {
    e.printStackTrace();
}
It goes into an infinite loop because reads will see the results of past writes.
Reading and Writing the same file using FileInputStream and FileOutputStream is not possible. Use RandomAccessFile if you want to read/write to the same file. You can specify the position as well if you want to write at a specific place in your file.
If you want to write to the end of the file and then read all the lines on the file then here is a sample for that:
RandomAccessFile file = new RandomAccessFile("toto.txt", "rw");
file.seek(file.length());
file.writeBytes("This is a temp file");
file.seek(0); // sets the pointer to the first byte
String line;
while ((line = file.readLine()) != null) {
    System.out.println(line);
}

How to fix Sonar issue "stream is not closed" when the stream really is closed, but inside a lambda

Sonar raises the issue that fileStream is not closed in the code below. However, it is closed - just inside the lambda expression.
try {
    final InputStream fileStream = new FileInputStream(copy);
    return (OutputStream outputStream) -> {
        int n;
        byte[] buffer = new byte[1024];
        while ((n = fileStream.read(buffer)) > -1) {
            outputStream.write(buffer, 0, n);
        }
        fileStream.close();
    };
} catch (IOException exception) {
    //...
}
When I change it to use the try-with-resources pattern, I get java.io.IOException: Stream Closed on the line that reads fileStream:
try (final InputStream fileStream = new FileInputStream(copy)) {
    return (OutputStream outputStream) -> {
        int n;
        byte[] buffer = new byte[1024];
        while ((n = fileStream.read(buffer)) > -1) {
            outputStream.write(buffer, 0, n);
        }
    };
} catch (IOException exception) {
    //...
}
Thus the second version resolves the bug detected by Sonar; however, it just doesn't work, as fileStream is closed before the lambda code is invoked.
What would you suggest to fix this?
As noted in the comments by #Krashen, your first version could throw an exception before close() is called.
Your second version creates the InputStream in a try-with-resources in this method and then tries to return it as part of the lambda expression. But a try-with-resources ensures its resources are closed, and as far as I can tell, that close happens just before the method exits. In other words, by the time the caller receives the returned lambda, the InputStream has already been closed.
So... your best bet is to either extract your logic from the lambda and return the results, or assign the lambda's result to a variable and then return that variable. Doing the latter will likely raise an issue from S1488 (Local variables should not be declared and then immediately returned or thrown), which I would simply close as Won't Fix.
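
Not taken from the answer above - just a sketch of one common way to satisfy both Sonar and the runtime: open the stream inside the lambda with try-with-resources, so it is created and closed only when the callback actually runs. The OutputStreamConsumer interface and streamOf method are hypothetical stand-ins for whatever functional type the original code returns:
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical stand-in for the functional type the original method returns
// (e.g. a StreamingOutput-style callback).
interface OutputStreamConsumer {
    void write(OutputStream outputStream) throws IOException;
}

class CopyStreamingSketch {
    // Open the stream inside the lambda, so it is created and closed
    // only when the callback runs, on every execution path.
    static OutputStreamConsumer streamOf(File copy) {
        return (OutputStream outputStream) -> {
            try (InputStream fileStream = new FileInputStream(copy)) {
                byte[] buffer = new byte[1024];
                int n;
                while ((n = fileStream.read(buffer)) > -1) {
                    outputStream.write(buffer, 0, n);
                }
            }
        };
    }
}
This assumes the copied file is still available at the time the lambda is invoked.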

Reading a >4GB file in Java

I have a mainframe data file which is greater than 4 GB. I need to read and process the data in 500-byte records. I have tried using FileChannel; however, I am getting an error with the message "Integer.MAX_VALUE exceeded".
public void getFileContent(String fileName) {
    RandomAccessFile aFile = null;
    FileChannel inChannel = null;
    try {
        aFile = new RandomAccessFile(Paths.get(fileName).toFile(), "r");
        inChannel = aFile.getChannel();
        ByteBuffer buffer = ByteBuffer.allocate(500 * 100000);
        while (inChannel.read(buffer) > 0) {
            buffer.flip();
            for (int i = 0; i < buffer.limit(); i++) {
                byte[] data = new byte[500];
                buffer.get(data);
                processData(new String(data));
                buffer.clear();
            }
        }
    } catch (Exception ex) {
        // TODO
    } finally {
        try {
            inChannel.close();
            aFile.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Can you help me out with a solution?
The worst problem of your code is the
catch (Exception ex) {
// TODO
}
part, which implies that you won't notice any exceptions thrown by your code. Since there is nothing in the JRE printing an "Integer.MAX_VALUE exceeded" message, that problem must be connected to your processData method.
It might be worth noting that this method will be invoked way too often with repeated data.
Your loop
for (int i = 0; i < buffer.limit(); i++) {
implies that you iterate as many times as there are bytes within the buffer, up to 500 * 100000 times. You are extracting 500 bytes from the buffer in each iteration, processing a total of up to 500 * 500 * 100000 bytes after each read, but since you have a misplaced buffer.clear(); at the end of the loop body, you will never experience a BufferUnderflowException. Instead, you will invoke processData each of the up to 500 * 100000 times with the first 500 bytes of the buffer.
But the whole conversion from bytes to a String is unnecessarily verbose and contains unnecessary copy operations. Instead of implementing this yourself, you can and should just use a Reader.
Besides that, your code makes a strange detour. It starts with a Java 7 API, Paths.get, to convert it to a legacy File object, create a legacy RandomAccessFile to eventually acquire a FileChannel. If you have a Path and want a FileChannel, you should open it directly via FileChannel.open. And, of course, use a try(…) { … } statement to ensure proper closing.
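For illustration, a minimal sketch of that direct route ("data.bin" is only a placeholder path, not from the question):
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class ChannelOpenSketch {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get("data.bin"); // placeholder file name
        // Open the channel directly from the Path; try-with-resources closes it
        try (FileChannel channel = FileChannel.open(path, StandardOpenOption.READ)) {
            System.out.println("size in bytes: " + channel.size());
        }
    }
}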
But, as said, if you want to process the contents as Strings, you surely want to use a Reader instead:
public void getFileContent(String fileName) {
    try (Reader reader = Files.newBufferedReader(Paths.get(fileName))) {
        CharBuffer buffer = CharBuffer.allocate(500 * 100000);
        while (reader.read(buffer) > 0) {
            buffer.flip();
            while (buffer.remaining() > 500) {
                processData(buffer.slice().limit(500).toString());
                buffer.position(buffer.position() + 500);
            }
            buffer.compact();
        }
        // there might be a remaining chunk of less than 500 characters
        if (buffer.position() > 0) {
            processData(buffer.flip().toString());
        }
    } catch (Exception ex) {
        // the *minimum* to do:
        ex.printStackTrace();
        // TODO real exception handling
    }
}
There is no problem with processing files >4 GB; I just tested it with an 8 GB file. Note that the code above uses the UTF-8 encoding. If you want to retain the behavior of your original code of using whatever happens to be your system's default encoding, you may create the Reader using
Files.newBufferedReader(Paths.get(fileName), Charset.defaultCharset())
instead.

Read a Java socket InputStream without Thread.sleep() in the code below?

public static void waitUntil(String prompt, InputStream instr) {
    while (true) {
        try {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            if (instr.available() >= 5) {
                byte[] buff = new byte[1024];
                int ret_read = 0;
                ret_read = instr.read(buff);
                if (ret_read > 0) {
                    if ((new String(buff, 0, ret_read)).contains(prompt)
                            && flag) {
                        break;
                    }
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
If I remove that Thread.sleep(1000), or even reduce it to less than 1000, it does not work properly.
Question: How can I read from a Java socket InputStream without Thread.sleep(), until all the incoming bytes have arrived?
if (instr.available() >= 5) {
Don't do that.
Instead of checking how many bytes are available, just try to read some into a buffer.
That will block until at least one byte is available, and then return as many as there are (that also fit into the buffer).
If that does not return all the bytes you need, loop until you get them.
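A minimal sketch of such a loop, wrapped in an illustrative helper (the class and method names are not from the question):
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public final class ReadLoopSketch {
    // Blocks on read() until exactly n bytes have arrived, instead of
    // polling available() with a sleep.
    static byte[] readExactly(InputStream in, int n) throws IOException {
        byte[] data = new byte[n];
        int offset = 0;
        while (offset < n) {
            int read = in.read(data, offset, n - offset);
            if (read == -1) {
                throw new EOFException("stream ended after " + offset + " of " + n + " bytes");
            }
            offset += read;
        }
        return data;
    }
}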
If you just want to read everything, check out this thread: Convert InputStream to byte array in Java . Personally, I use Commons IO for this.
Use DataInputStream.readFully() with a buffer size of 5 (in this case, or more generally the size of data you're expecting), and get rid of both the sleep and the available() test.
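For example, a small fragment under the same assumptions as the question (instr is the socket's InputStream and exactly 5 bytes are expected; DataInputStream and EOFException come from java.io):
// readFully blocks until the whole buffer is filled, or throws EOFException,
// so no sleep and no available() check are needed.
DataInputStream dataIn = new DataInputStream(instr);
byte[] buff = new byte[5];
dataIn.readFully(buff);
String chunk = new String(buff); // platform default charset, as in the question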

Unable to read data bytes from input stream in Java TCP socket

In my server code, I send different requests to the client and get back responses, but only the first read request works; when the second read statement executes, it is unable to read the data bytes. My code is as follows:
private static boolean Rt_Request(int id, Object client) throws Exception {
    try {
        int size = 5;
        byte[] buf = new byte[size];
        char[] cbuf = new char[32];
        int byteRead;
        Socket s = (Socket) client;
        BufferedReader in1 = new BufferedReader(new InputStreamReader(s.getInputStream()));
        PrintStream out = new PrintStream(s.getOutputStream());
        try {
            buf[0] = 0x02;
            buf[1] = 0x09;
            buf[2] = 0x01;
            buf[3] = 0x00;
            buf[4] = 0x03;
            Thread.sleep(5000);
            out.write(buf, 0, 5);
        } catch (Exception e) {
            System.out.println("Error Occured...." + e);
        }
        byteRead = 0;
        while (byteRead != 1) {
            try {
                byteRead = in1.read(cbuf, 0, 1); // Problem on this line: here I am unable to read data bytes.
                for (int i = 0; i < byteRead; i++) {
                    System.out.println(cbuf[i]);
                }
                if (byteRead == 0) {
                    System.out.println("Breaking.....");
                    return false;
                }
            } catch (Exception e) {
                System.out.println("Error Occured...." + e);
                return false;
            }
        }
        return true;
    } catch (Exception e) {
        System.out.println("System is not Connected..." + e);
        return false;
    }
}
I have tried almost everything: the socket is not closed, read.available(), read.fully(), etc., but I am unable to find the solution. I have written this function in the run method of a TimerTask class.
Any help will be greatly appreciated.
The Javadoc for BufferedReader#read(char[], int, int) says it returns:
"The number of characters read, or -1 if the end of the stream has been reached"
Since you do
byteRead = in1.read(cbuf, 0, 1);
inside
while (byteRead != 1)
change it to
while (byteRead != -1)
    byteRead = in1.read(cbuf, 0, 1);
That line only reads in one value, and since you don't call it again before you enter the for loop, you should be getting one println of the value that was read displayed on stdout.
read() blocks until at least one byte is available. Maybe you haven't sent it, or flushed it properly, or maybe you are creating multiple BufferedReaders on the same socket.
NB: byteRead can never be zero after a successful read(cbuf, 0, 1).
The read method of the underlying InputStream will block (i.e. hang/wait) if no data is available.
This method blocks until input data is available, end of file is detected, or an exception is thrown.
I strongly suspect this is the case.
You can check this by calling in1.ready() on the reader.
Flush the output buffer
out.flush();
after writing the bytes, or they may get buffered locally.
