I have a Java project that fetches information about all the videos in a YouTube playlist using youtube-dl. Here is Main.java:
import java.io.*;

public class Main {

    public static void main(String[] args) throws Exception {
        String command1 = "/usr/local/bin/youtube-dl --flat-playlist --dump-single-json https://www.youtube.com/playlist?list=PLFcOH1YaRqUo1yEjY5ly09RFbIpUePF7G";
        String command2 = "/usr/local/bin/youtube-dl --flat-playlist --dump-single-json https://www.youtube.com/playlist?list=PLC6A0625DCA9AAE2D";
        System.out.println(executeCommand(command1));
    }

    private static String executeCommand(String command) throws IOException, InterruptedException {
        int exitCode = 0;
        String result = "";
        Process process;
        ProcessBuilder builder = new ProcessBuilder(command.replaceAll("[ ]+", " ").split(" "));
        builder.directory(new File("/tmp/test"));
        process = builder.start();
        exitCode = process.waitFor();
        return getStringFromInputStream(process.getInputStream());
    }

    private static String getStringFromInputStream(InputStream is) {
        BufferedReader br = null;
        StringBuilder sb = new StringBuilder();
        String line;
        try {
            br = new BufferedReader(new InputStreamReader(is));
            while ((line = br.readLine()) != null) {
                sb.append(line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (br != null) {
                try {
                    br.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        return sb.toString();
    }
}
command1 and command2 are identical except for the YouTube playlist parameter. The playlist in command1 has 200 videos and the one in command2 has 409. I can get the result successfully for both commands in a terminal; command2 just takes a few seconds longer. But when I run Main.java (javac Main.java; java Main), command1 prints its result successfully, while command2 hangs for several minutes without producing any result. Here is the jstack output for the process:
"main" #1 prio=5 os_prio=0 tid=0x00007f828c009800 nid=0xce7 in Object.wait() [0x00007f8293cf7000]
java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x0000000771df9fe0> (a java.lang.UNIXProcess)
at java.lang.Object.wait(Object.java:502)
at java.lang.UNIXProcess.waitFor(UNIXProcess.java:396)
- locked <0x0000000771df9fe0> (a java.lang.UNIXProcess)
at Main.executeCommand(Main.java:18)
at Main.main(Main.java:8)
It hangs at exitCode = process.waitFor();. I have no idea what is going on. Can anyone help me? Many thanks.
As mentioned in the comments, by default the output of the subprocess is sent to a pipe which can be read using Process.getInputStream(). If the subprocess generates lots of output and the Java program doesn't consume it, the pipe's buffer will fill up and the subprocess will block on writing.
The easiest solution is to call .inheritIO() on the ProcessBuilder. That will send the output to the console instead of buffering it in memory (same for input and error streams).
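As a sketch of that fix (the class name and the echo test command below are stand-ins for illustration, not the asker's exact code), the hanging executeCommand could be reduced to:

```java
import java.io.IOException;

public class InheritIODemo {

    // Run a command with stdin/stdout/stderr inherited from this JVM:
    // the child writes straight to our console, so no pipe buffer can
    // fill up and waitFor() cannot deadlock.
    public static int runInherited(String... cmd) throws IOException, InterruptedException {
        ProcessBuilder builder = new ProcessBuilder(cmd);
        builder.inheritIO();
        Process process = builder.start();
        return process.waitFor();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("exit code: " + runInherited("echo", "hello from child"));
    }
}
```

Note that with inheritIO() the JSON goes to the console rather than into a String; if the program needs to capture it, the alternative is to read process.getInputStream() to completion first and only then call waitFor().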
Related
I would like to run a hidden script file that resides in the current directory using ProcessBuilder, with the following code:
// System.out.println("line" + reader.readLine());
ProcessBuilder builder = new ProcessBuilder(shfile.getAbsolutePath());
builder.redirectErrorStream(true);
Process process = builder.start();
BufferedReader br = new BufferedReader(new InputStreamReader(process.getInputStream()));
String output = null;
System.out.println("out"); // === printing this
while (null != (output = br.readLine())) {
    System.out.println("in");  // not printing this
    System.out.println(">>" + output);
}
int rs = process.waitFor();
But it hangs at br.readLine().
When I run the same script file with the following command in a terminal:
sh .script.sh
it executes and gives me the expected results.
I have looked through the related threads in this forum; everyone says to handle the input stream and error stream in separate threads, or to redirect the error stream. I have added redirectErrorStream(true), but it still hangs.
When I press Ctrl+C, it prints the initial lines of the output and exits.
Content of my script file
#!/bin/sh
cd /home/ats/cloudripper/lke_factory_asb_v2/lk_assets_factory_release/
sh ./LKE_run_Diablo.sh 0a0e0c3dc893
How should I handle this situation?
ProcessBuilder has a dedicated API to redirect the child process's input, output and error streams. See the documentation.
If you need the child and parent processes to share the same console, use INHERIT mode redirection. An example:
import java.io.IOException;
import java.lang.ProcessBuilder.Redirect;

public class ChildProcessOutputProxy {

    public static void main(String[] args) {
        ProcessBuilder builder = new ProcessBuilder("whoami");
        builder.redirectOutput(Redirect.INHERIT);
        builder.redirectErrorStream(true);
        try {
            var child = builder.start();
            child.waitFor();
        } catch (IOException e) {
            System.err.println(e.getMessage());
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
I've configured a PDF printer that uses Ghostscript to convert the document to a PDF, which is then processed and used by my Java desktop application. It redirects the printer data via a RedMon port. For most documents I print, it works fine and produces the PDF file as expected. However, with documents with a certain number of pages, the process simply freezes: no error is thrown, the process simply holds. It seems independent of filesize or printer properties (though the latter seems to influence the number of pages that do get printed).
After stopping the Java application, I'm left with a document with a fixed number of pages (usually 265 pages, but it has also ended at 263 or 247 pages). The second-to-last page is incomplete (as in, partially printed tables and text), whereas the last page prints as an error:
ERROR: syntaxerror
OFFENDING COMMAND: --nostringval--
STACK:
/[NUMBER]
Where [NUMBER] is any given single-digit number.
Here is my Ghostscript integrator class:
public class GhostScriptIntegrator {

    public static void createPDF(String[] args, String filename) {
        if (args.length > 0) {
            try {
                Process process = Runtime.getRuntime().exec(
                        args[0] + " -sOutputFile=\"" + filename
                        + "\" -c save pop -f -");
                OutputStream os = process.getOutputStream();
                BufferedReader sc = null;
                try (PrintWriter writer = new PrintWriter(os)) {
                    sc = new BufferedReader(new InputStreamReader(System.in));
                    String line;
                    while ((line = sc.readLine()) != null) {
                        writer.println(line);
                    }
                    writer.flush();
                } catch (Exception ex) {
                    Logger.getLogger(GhostScriptIntegrator.class.getName()).log(Level.SEVERE, null, ex);
                } finally {
                    if (sc != null) {
                        sc.close();
                    }
                }
                process.waitFor();
            } catch (InterruptedException | IOException ex) {
                Logger.getLogger(GhostScriptIntegrator.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
    }
}
The args parameter is handled by my virtual printer (similarly to how it was presented in my previous post):
Full argument:
-jar "C:\Program Files (x86)\Impressora SPE\ImpressoraSPE.jar" "C:\Program Files (x86)\gs\gs9.21\bin\gswin32c -I\"C:\Program Files (x86)\gs\gs9.21\lib\" -dSAFER -dBATCH -dNOPAUSE -sDEVICE=pdfwrite -sPAPERSIZE=a4 -q -dPDFA=2 -dPDFACompatibilityPolicy=1 -dSimulateOverprint=true -dCompatibilityLevel=1.3 -dPDFSETTINGS=/screen -dEmbedAllFonts=true -dSubsetFonts=true -dAutoRotatePages=/None -dColorImageDownsampleType=/Bicubic -dColorImageResolution=150"
I have a second virtual printer that works perfectly, and there seems to be no significant difference between them: same drivers, same port arguments, same setup, very similar code. Yet, it does not freeze after a certain number of pages, and the output file is as expected.
What's causing my printer to stop responding?
It turns out there is no problem with your printer, but rather with your code. More specifically, how you [do not] handle the Runtime streams. What your process is missing is a StreamGobbler.
A StreamGobbler is an InputStream that uses an internal worker thread to constantly consume input from another InputStream. It uses a buffer to store the consumed data. The buffer size is automatically adjusted, if needed.
Your process hangs because it cannot fully read the input stream. The following articles provide a very in-depth explanation as to why it happens and how to fix it:
When Runtime.exec() won't - Part 1
When Runtime.exec() won't - Part 2
But to quote the article itself (which, in turn, quotes the JDK Javadoc):
Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
The solution is to simply exhaust each input stream from your process by implementing a StreamGobbler class:
public class GhostScriptIntegrator {

    public static void createPDF(String[] args, String filename) throws FileNotFoundException {
        if (args.length > 0) {
            try {
                Process process = Runtime.getRuntime().exec(
                        args[0] + " -sOutputFile=\"" + filename
                        + "\" -c save pop -f -");
                OutputStream os = process.getOutputStream();
                BufferedReader sc = null;
                InputStreamReader ir = new InputStreamReader(System.in);
                try (PrintWriter writer = new PrintWriter(os)) {
                    StreamGobbler errorGobbler = new StreamGobbler(
                            process.getErrorStream(), "ERROR");
                    StreamGobbler outputGobbler = new StreamGobbler(
                            process.getInputStream(), "OUTPUT");
                    errorGobbler.start();
                    outputGobbler.start();
                    sc = new BufferedReader(ir);
                    String line;
                    while ((line = sc.readLine()) != null) {
                        writer.println(line);
                        writer.flush();
                    }
                } catch (IOException ex) {
                    Logger.getLogger(GhostScriptIntegrator.class.getName()).log(Level.SEVERE, null, ex);
                } finally {
                    if (sc != null) {
                        sc.close();
                    }
                    ir.close();
                    if (os != null) {
                        os.close();
                    }
                }
                process.waitFor();
            } catch (InterruptedException | IOException ex) {
                Logger.getLogger(GhostScriptIntegrator.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
    }
}

class StreamGobbler extends Thread {

    private final InputStream is;
    private final String type;

    StreamGobbler(InputStream is, String type) {
        this.is = is;
        this.type = type;
    }

    @Override
    public void run() {
        try {
            InputStreamReader isr = new InputStreamReader(is);
            BufferedReader br = new BufferedReader(isr);
            while (br.readLine() != null) {
                // Discard the line; the point is only to drain the stream.
            }
        } catch (IOException ex) {
            Logger.getLogger(StreamGobbler.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
I'm trying to make a program that runs some executable program (call it p) under a time limit of t ms. It does the following tasks:
If program p executes normally, print its output to the console.
If program p couldn't finish within the time limit, print "Sorry, needs more time!" and then terminate p.
If program p terminated abnormally (e.g. with a runtime error), print "Can I've some debugger?"
I'm using the ProcessResultReader class in the following program, from here. My program works as long as p finishes normally or terminates abnormally. But it doesn't terminate if p itself never terminates (try p with a simple while(true) loop with no exit condition). It seems that the stdout thread is still alive even after stdout.stop() executes. What am I doing wrong in this code?
Thanks.
import java.util.concurrent.TimeUnit;
import java.io.*;

class ProcessResultReader extends Thread {

    final InputStream is;
    final StringBuilder sb;

    ProcessResultReader(final InputStream is) {
        this.is = is;
        this.sb = new StringBuilder();
    }

    @Override
    public void run() {
        try {
            final InputStreamReader isr = new InputStreamReader(is);
            final BufferedReader br = new BufferedReader(isr);
            String line = null;
            while ((line = br.readLine()) != null) {
                this.sb.append(line).append("\n");
            }
        } catch (final IOException ioe) {
            System.err.println(ioe.getMessage());
            throw new RuntimeException(ioe);
        }
    }

    @Override
    public String toString() {
        return this.sb.toString();
    }

    public static void main(String[] args) throws Exception {
        int t = 1000;
        Process p = Runtime.getRuntime().exec(cmd); // cmd is the command to execute program p
        ProcessResultReader stdout = new ProcessResultReader(p.getInputStream());
        stdout.start();
        if (!p.waitFor(t, TimeUnit.MILLISECONDS)) {
            stdout.stop();
            p.destroy();
            System.out.println("Sorry, needs more time!");
        } else {
            if (p.exitValue() == 0) System.out.println(stdout.toString());
            else System.out.println("Can I've some debugger?");
        }
    }
}
According to the Java docs, Thread.stop() is deprecated, and Thread.destroy() was never even implemented.
For more information, see Why are Thread.stop, Thread.suspend and Thread.resume Deprecated?.
You could try this instead:
String cmd = "cmd /c sleep 5";
int timeout = 1;
Process p = Runtime.getRuntime().exec(cmd); // cmd is the command to execute program p
ProcessResultReader stdout = new ProcessResultReader(p.getInputStream());
stdout.start();
if (!p.waitFor(timeout, TimeUnit.MILLISECONDS)) {
    p.destroy();   // killing the process closes its stdout, so readLine() returns null
    stdout.join(); // and the reader thread finishes on its own
    System.out.println("Sorry, needs more time!");
    System.out.flush();
} else {
    if (p.exitValue() == 0) System.out.println(stdout.toString());
    else System.out.println("Can I've some debugger?");
}
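A self-contained variant of the same idea (a sketch assuming Java 8+; the TimedRunner name and its parameters are made up for illustration) avoids Thread.stop() entirely: forcibly killing the process closes its stdout, so the reader thread's readLine() unblocks and the thread exits on its own.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.concurrent.TimeUnit;

public class TimedRunner {

    // Run cmd with a timeout in milliseconds.
    // Returns the captured output on normal exit, or null on timeout / nonzero exit.
    public static String run(long timeoutMillis, String... cmd)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        StringBuilder sb = new StringBuilder();
        Thread reader = new Thread(() -> {
            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = br.readLine()) != null) {
                    sb.append(line).append('\n');
                }
            } catch (IOException ignored) {
                // Stream closed by destroyForcibly(); just let the thread end.
            }
        });
        reader.start();
        if (!p.waitFor(timeoutMillis, TimeUnit.MILLISECONDS)) {
            p.destroyForcibly(); // kills the child; its stdout closes...
            reader.join();       // ...so the reader thread ends by itself
            System.out.println("Sorry, needs more time!");
            return null;
        }
        reader.join(); // make sure all output has been collected
        if (p.exitValue() != 0) {
            System.out.println("Can I've some debugger?");
            return null;
        }
        return sb.toString();
    }
}
```

The join() after waitFor() matters: the process can exit before the reader thread has drained the tail of its output.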
I'm having a problem calling some simple command-line programs with r.exec. For some reason, given a file X, the command 'echo full/path/to/X' works fine (both in the output and with p.exitValue() == 0), but 'cat full/path/to/X' does not (it has p.exitValue() == 1). Both cat and echo live in /bin on my OS X machine. Am I missing something? The code is below (and, as it happens, any suggestions to improve the code generally are welcome):
private String takeCommand(Runtime r, String command) throws IOException {
    String returnValue;
    System.out.println("We are given the command" + command);
    Process p = r.exec(command.split(" "));
    InputStream in = p.getInputStream();
    BufferedInputStream buf = new BufferedInputStream(in);
    InputStreamReader inread = new InputStreamReader(buf);
    BufferedReader bufferedreader = new BufferedReader(inread);
    // Read the ls output
    String line;
    returnValue = "";
    while ((line = bufferedreader.readLine()) != null) {
        System.out.println(line);
        returnValue = returnValue + line;
    }
    try { // Check for failure
        if (p.waitFor() != 0) {
            System.out.println("XXXXexit value = " + p.exitValue());
        }
    } catch (InterruptedException e) {
        System.err.println(e);
    } finally {
        // Close the InputStream
        bufferedreader.close();
        inread.close();
        buf.close();
        in.close();
    }
    try { // should slow this down a little
        p.waitFor();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    return returnValue;
}
You should be consuming stdout and stderr asynchronously.
Otherwise it's possible for the output of the command to fill the pipe's buffer, at which point everything grinds to a halt (that's probably what's happening with your cat command, since it dumps much more output than echo).
I would also not expect to have to call waitFor() twice.
Check out this SO answer for more info on output consumption, and this JavaWorld article for more Runtime.exec() pitfalls.
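A minimal sketch of that asynchronous consumption (AsyncConsumer is a made-up name; it drains both streams on helper threads before the single waitFor() call):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class AsyncConsumer {

    // Drain a stream on its own thread so the child can never block on a full pipe.
    static Thread gobble(InputStream in, StringBuilder sink) {
        Thread t = new Thread(() -> {
            try (BufferedReader br = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = br.readLine()) != null) {
                    synchronized (sink) {
                        sink.append(line).append('\n');
                    }
                }
            } catch (IOException ignored) {
                // Stream closed; let the thread end.
            }
        });
        t.start();
        return t;
    }

    public static String run(String... cmd) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).start();
        StringBuilder out = new StringBuilder();
        StringBuilder err = new StringBuilder();
        Thread outThread = gobble(p.getInputStream(), out);
        Thread errThread = gobble(p.getErrorStream(), err);
        int code = p.waitFor(); // single waitFor(); both streams drain concurrently
        outThread.join();
        errThread.join();
        if (code != 0) {
            System.err.println("exit " + code + ": " + err);
        }
        return out.toString();
    }
}
```

With this structure, the size of the command's output no longer matters: cat and echo behave the same way, and only one waitFor() is needed.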
I have some issues regarding ProcessBuilder.
The program is basically a simple wrapper invoking a command line script.
When running the script on its own via the terminal, the memory consumption stays below 2G.
When running the script via the java wrapper, the memory consumption explodes and even 8G is quickly filled up, resulting in out-of-memory errors.
The code to launch the process is simply:
public static int execute(String command) throws IOException {
    System.out.println("Executing: " + command);
    ProcessBuilder pb = new ProcessBuilder(command.split(" +"));
    Process p = pb.start();
    // display any output in stderr or stdout
    StreamConsumer stderr = new StreamConsumer(p.getErrorStream(), "stderr");
    StreamConsumer stdout = new StreamConsumer(p.getInputStream(), "stdout");
    new Thread(stderr).start();
    new Thread(stdout).start();
    try {
        return p.waitFor();
    } catch (InterruptedException e) {
        throw new RuntimeException(e);
    }
}
The StreamConsumer class simply consumes the stdout/stderr streams and displays them on the console.
...the question is: why on earth does the memory consumption explode?
Regards,
Arnaud
Edit:
Whether I use ProcessBuilder or Runtime.getRuntime().exec(...), the result is the same.
The memory bursts tend to appear during the Unix sort invoked by the shell script:
sort big-text-file > big-text-file.sorted
Edit 2 on request of Jim Garrison:
Ok, here is the StreamConsumer class which I omitted because it is rather simple:
class StreamConsumer implements Runnable {

    InputStream stream;
    String descr;

    StreamConsumer(InputStream stream, String descr) {
        this.stream = stream;
        this.descr = descr;
    }

    @Override
    public void run() {
        String line;
        BufferedReader brCleanUp =
                new BufferedReader(new InputStreamReader(stream));
        try {
            while ((line = brCleanUp.readLine()) != null)
                System.out.println("[" + descr + "] " + line);
            brCleanUp.close();
        } catch (IOException e) {
            // TODO: handle exception
        }
    }
}
If you change your command like this:
sort -o big-text-file.sorted big-text-file
is the result still the same?
Maybe it's because those StreamConsumer threads are not daemon threads, so they don't die and get garbage-collected when your processes return. You could try:
//...
final StreamConsumer stderr = new StreamConsumer(p.getErrorStream(), "stderr");
final StreamConsumer stdout = new StreamConsumer(p.getInputStream(), "stdout");
final Thread stderrThread = new Thread(stderr);
final Thread stdoutThread = new Thread(stdout);
stderrThread.setDaemon(true);
stdoutThread.setDaemon(true);
stderrThread.start();
stdoutThread.start();
//...
Does this behavior happen on a single invocation, or only after doing this many times?