Failing to redirect standard output in Java

I am attempting to redirect the console output of a Java library to a JTextArea in my application. The library in question is JPL, a bridge between Java and SWI-Prolog (though I doubt this has much relevance conceptually).
Here is my code:
PrintStream out = new PrintStream(new MyOutputStream(), true, "UTF-8");
System.setOut(out);
System.setErr(out);
System.out.println("This text is successfully redirected.");
Hashtable<String, Term> solutions[] = Query.allSolutions("append([1],[2],[1,2])");
When I run this code, the string "This text is successfully redirected" is redirected to my PrintStream as expected. However, the last line of the above snippet generates some text output which is still printed at the Java console instead of in my PrintStream.
What could the method allSolutions(String) be doing so that its output is not redirected? Could it be the fact that it calls an OS process that generates the output (it does call the swipl process)? If so, can I redirect its output without modifying the JPL code?
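For reference, MyOutputStream is not shown above; a minimal sketch of such a class, assuming it simply appends everything written to it to the JTextArea (the constructor argument is illustrative), could look like this:
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import javax.swing.JTextArea;
import javax.swing.SwingUtilities;

// Sketch: an OutputStream that appends whatever is written to it to a JTextArea.
public class MyOutputStream extends OutputStream {
    private final JTextArea textArea;

    public MyOutputStream(JTextArea textArea) {
        this.textArea = textArea;
    }

    @Override
    public void write(byte[] b, int off, int len) {
        // Decode as UTF-8 (matching the PrintStream above) and append on the
        // Event Dispatch Thread, since Swing components are not thread-safe.
        String text = new String(b, off, len, StandardCharsets.UTF_8);
        SwingUtilities.invokeLater(() -> textArea.append(text));
    }

    @Override
    public void write(int b) {
        write(new byte[] { (byte) b }, 0, 1);
    }
}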

Yes, your assumption is correct: the process being executed is printing directly to the console. What you can do is capture the output of that process and pass it to your PrintStream, as shown by @StijnGeukens (code copied below):
String line;
Process p = Runtime.getRuntime().exec(...);
BufferedReader input = new BufferedReader(new InputStreamReader(p.getInputStream()));
while ((line = input.readLine()) != null) {
    System.out.println(line);
}
input.close();
But that's only if you actually have access to the internals of allSolutions.
I do not know a way of capturing any other output, I'm afraid.
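If you did have that access, a sketch of how this could be wired up (the command below is a placeholder): merge stderr into stdout and pump the process output on a background thread, so everything ends up in the PrintStream installed via System.setOut:
// "some-external-command" is a placeholder for the actual invocation.
ProcessBuilder pb = new ProcessBuilder("some-external-command");
pb.redirectErrorStream(true); // merge stderr into stdout
Process p = pb.start();
// Pump the output on a background thread; System.out.println goes to the
// PrintStream installed via System.setOut, and so into the JTextArea.
new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(p.getInputStream(), StandardCharsets.UTF_8))) {
        String outLine;
        while ((outLine = reader.readLine()) != null) {
            System.out.println(outLine);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}).start();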

Related

Java Process causing python script JSON error

I have a Java program that manages data. When it wants to create a report from that data, it saves a JSON file with the relevant data and then launches a Python script using a ProcessBuilder. However, I get a weird error whenever I try to extract data from the output of the Python script.
ProcessBuilder pythonProcess = new ProcessBuilder("python", "ReportingTool.py");
pythonProcess.directory(new File("invoice_python_files\\"));
Process pythonRunnable = pythonProcess.start();
/*
BufferedReader outputReader = new BufferedReader(new InputStreamReader(pythonRunnable.getInputStream()));
BufferedReader errorReader = new BufferedReader(new InputStreamReader(pythonRunnable.getErrorStream()));
String line = null;
System.out.println("<ERROR>");
while ((line = errorReader.readLine()) != null)
    System.out.println(line);
System.out.println("</ERROR>");
System.out.println("<OUTPUT>");
while ((line = outputReader.readLine()) != null)
    System.out.println(line);
System.out.println("</OUTPUT>");
*/
This works fine and produces the report as expected while the input stream code is commented out. If I then uncomment that code, I get an error from the Python script:
File "C:\Users\o.cohen\AppData\Local\Programs\Python\Python36-
32\lib\json\decoder.py", line 357, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
I don't understand how the Java process is causing the error and, more importantly, how to fix it. Below is the Python code that raises the error (specifically the last line):
with open("InvoiceMakerDoc.json") as json_file:
    json_data = json_file.read()
decoded_data = json.loads(json_data)
Your Java application needs to wait for the Python job to complete first. Hence you need to add pythonRunnable.waitFor() before reading the input stream.
Process pythonRunnable =pythonProcess.start();
pythonRunnable.waitFor();
Managed to fix it.
The program was writing to the JSON file before launching the Python script, and the FileWriter object hadn't been closed, which caused the issue.
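For anyone hitting the same thing, a minimal sketch of that fix (the file name and the reportJson variable are illustrative): write and close the file, for example with try-with-resources, before starting the process:
// Write and close the JSON file before launching the Python script;
// try-with-resources guarantees the FileWriter is flushed and closed here.
try (FileWriter writer = new FileWriter("invoice_python_files\\InvoiceMakerDoc.json")) {
    writer.write(reportJson); // reportJson holds the report data (illustrative name)
}

ProcessBuilder pythonProcess = new ProcessBuilder("python", "ReportingTool.py");
pythonProcess.directory(new File("invoice_python_files\\"));
Process pythonRunnable = pythonProcess.start();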

Get STDOUT from Runtime.getRuntime().exec() line-by-line rather than all at once

I'm executing a script from Java with the following code.
try (BufferedReader input = new BufferedReader(new InputStreamReader(Runtime.getRuntime()
        .exec("ruby test.rb").getInputStream()))) {
    String line = null;
    while ((line = input.readLine()) != null) {
        System.out.println("Got line: " + line);
    }
}
Where test.rb is simply:
puts "one"
sleep 1
puts "two"
The problem is that all of the input arrives in the BufferedReader at the same time. There should be a one-second delay between the two lines. Is there a way to flush the InputStream after every line in the script?
What I'm trying to accomplish
I want to call some Ruby scripts from a Java GUI and have the output of the script show up incrementally in a text pane. Using the code above, STDOUT from the script shows up in the text pane all at once when the script terminates (and the InputStream is presumably flushed).
Edit:
The issue was that Ruby was buffering STDOUT. Fixed by $stdout.sync = true.
Have you tried Thread.sleep(1000) after System.out.println()? You must catch InterruptedException; your editor will tell you for sure.

No input from ProcessBuilder's InputStream until external process is closed

While building a wrapper for a console application, I came across this weird issue where the InputStream connected to the output (stdout) of the external process is completely blank until the external process exits.
My code as below:
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.Charset;

public class Example {
    public static void main(String args[]) throws IOException {
        File executable = ...
        ProcessBuilder pb = new ProcessBuilder(executable.getCanonicalPath());
        pb.redirectErrorStream(true);
        Process p = pb.start();
        BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream(), Charset.forName("UTF-8")));
        String line;
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
    }
}
I've tried several variants of reading from the input stream and all resulted in the same behavior.
I've tried:
CharBuffer charBuf = CharBuffer.allocate(1000);
InputStreamReader isr = new InputStreamReader(p.getInputStream(), Charset.forName("UTF-8"));
while (isr.read(charBuf) != -1) {
    System.out.print(charBuf.flip().toString());
}
and
byte[] buf = new byte[1000];
int r;
while ((r = p.getInputStream().read(buf)) != -1) {
    System.out.print(new String(buf, 0, r));
}
all to no avail.
Somewhere along the line the output from the external process is being buffered (indefinitely) and I can't figure out where. Launching the process from the command line works fine: output appears instantaneously. The strangest part is that the termination of the external process results in a flood of all the "buffered" output at once (a lot of stuff).
Unfortunately I don't have access to the source of the external process, but given that it writes to stdout fine when run in a console, that shouldn't really make a difference (as far as I know).
Any ideas are welcome.
Edit:
One answer recommended rewriting the reader for the output and error streams to run on a separate thread. My actual implementation already does that! And yet the problem still exists. The code posted above is an SSCCE of my actual code, condensed for readability; the actual code does read from the InputStream on a separate thread.
Edit 2:
User FoggyDay seems to have provided the answer, which explains how output-buffering behavior changes between console and non-console output. While a process that detects it is writing to a console uses line buffering (the buffer is flushed on every newline), output to a non-console (anything the process detects is not a console) may be fully buffered (to a size of something like 8K). If I make the external process spam output (8K of lorem ipsum in a for loop), output does indeed appear. I guess my question now is how to make my Java program trigger line buffering in the external process.
To your question "how to make my Java program trigger line buffering in the external process":
On Linux you can use the "stdbuf" program (coreutils package): stdbuf -oL your_program program_args
You only need to change stdout since stderr is unbuffered by default. The man page of setlinebuf gives additional background information if you're interested: http://linux.die.net/man/3/setlinebuf
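From the Java side that just means prefixing the command, e.g. (a sketch reusing executable from the question, Linux only):
// Sketch (Linux only): prefix the command with stdbuf so the child's stdout is line-buffered.
ProcessBuilder pb = new ProcessBuilder("stdbuf", "-oL", executable.getCanonicalPath());
pb.redirectErrorStream(true);
Process p = pb.start();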
Some software checks whether it is writing to a terminal and switches its behavior to unbuffered output. This could be a reason why it works in the terminal. Pipe the output to cat and see if the output still appears immediately.
Another reason could be that the program is waiting for input, or for its stdin to be closed, before it does something, although this does not really match the symptoms described so far.

Java: Redirecting inner process output when running from command line

I use the following code, for redirecting the output of a process I launch from my Java app:
ProcessBuilder builder = new ProcessBuilder("MyProcess.exe");
builder.redirectOutput(Redirect.INHERIT);
builder.redirectErrorStream(true);
Now, this works fine when I run the code from Eclipse - I can see the output in Eclipse's console.
Yet when I create a jar file and run it from a cmd window, e.g. java -jar MyJar.jar, it doesn't print the output of the process. What could be the reason for this?
I know I'm late in answering, but I came across this question before coming across the answer, and wanted to save anybody else in the same boat some searching.
This is actually a known bug for Windows: https://bugs.openjdk.java.net/browse/JDK-8023130
You can get around it by redirecting the streams yourself:
Process p = pb.start();
BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line = null;
while ((line = br.readLine()) != null) {
    System.out.println(line);
}
p.waitFor();
br.close();
It may be that the process is printing an error and exiting for some reason. In that case the actual output goes into the error stream and not into the output stream. Your code redirects the output stream only, so important error information from the process may be lost. I would suggest inheriting both the output and error streams using this code:
ProcessBuilder builder = new ProcessBuilder("MyProcess.exe");
builder.inheritIO();
One more reason to redirect both streams is related to output buffering in the child process. If the parent process (your Java application) is not reading or redirecting the standard streams (out and err) of the child process, the latter may block after a while, unable to make any further progress.
It definitely wouldn't hurt to have possible errors in the output anyway.
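A sketch of that: either use inheritIO() as above, or merge and drain the streams yourself on a background thread, so the child can never fill its pipe buffer and stall:
ProcessBuilder builder = new ProcessBuilder("MyProcess.exe");
builder.redirectErrorStream(true); // merge stderr into stdout
Process process = builder.start();
// Drain the merged output on a background thread so the child never blocks
// on a full pipe buffer.
new Thread(() -> {
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(process.getInputStream()))) {
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}).start();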

Interacting with a console app using Java

I want to open an external app using Java.
Process p = Runtime.getRuntime().exec("/Users/kausar/myApp");
This runs the process, as I can see in Activity Monitor.
Now, the file I run is actually a console app which then takes commands and gives responses based on those commands.
For example, if I go to the terminal and run the same thing:
Kausars-MacBook-Air:~ kausar$ /Users/kausar/myApp
myApp>
Now I can give commands to the app, for example:
myApp> SHOW 'Hi There'
These commands are taken as keyboard input in the console app; they are not parameters. I have seen different approaches that use parameters. I tried the following as well but couldn't get it to work.
String res;
String cmnd = "SHOW \'Hi There\'";
OutputStream stdin = null;
InputStream stdout = null;
stdout = p.getInputStream();
stdin = p.getOutputStream();
stdin.write(cmnd.getBytes());
stdin.flush();
p.waitFor();
BufferedReader input = new BufferedReader(new InputStreamReader(stdout));
while ((res = input.readLine()) != null) {
    System.out.println(res);
}
input.close();
p.destroy();
It displays nothing, while the same procedure with "/bin/bash -c ls" works just fine.
Please help!
Offhand I would say the problem is with p.waitFor().
Exactly which object to call notify() or notifyAll() on to wake up the reading thread would be something like stdout, and it might require restructuring the process handling.
Note: an interesting feature is the field in BufferedReader called "lock"; the API docs mention a way of structuring your program so that it can be notified.
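For what it's worth, a common pattern for this kind of one-shot interaction (a sketch, assuming myApp exits once its input is closed) is to write the command with a trailing newline, flush, close stdin, and only then read the reply and call waitFor(), since waitFor() blocks until the process exits:
Process p = Runtime.getRuntime().exec("/Users/kausar/myApp");
BufferedWriter stdin = new BufferedWriter(new OutputStreamWriter(p.getOutputStream()));
BufferedReader stdout = new BufferedReader(new InputStreamReader(p.getInputStream()));
// Send one command; the trailing newline is what makes the console app "see" it.
stdin.write("SHOW 'Hi There'");
stdin.newLine();
stdin.flush();
stdin.close(); // closing stdin signals end-of-input (assumed to make the app exit)
// Read the app's output; readLine() returns null once the app closes its stdout.
String res;
while ((res = stdout.readLine()) != null) {
    System.out.println(res);
}
p.waitFor(); // now safe: the output has already been consumed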
