Good morning guys,
I'm trying to develop an Eclipse plugin that runs an external program with ProcessBuilder.
During execution, I try to write the output to disk, but the plugin doesn't write anything until I close the Eclipse application.
public void run()
{
    ProcessBuilder pb = new ProcessBuilder("NuSMV.exe", "-int");
    Process process = null;
    try {
        process = pb.start();
    } catch (IOException e) {
        e.printStackTrace();
    }

    OutputStream out = process.getOutputStream();

    // Write commands
    PrintWriter commands = new PrintWriter(out);
    commands.println("reset");
    commands.println("set default_trace_plugin 4");
    commands.println("read_model -i C:\\temp/ascensore.smv");
    commands.println("go");
    commands.println("check_ctlspec");
    commands.println("show_traces -o C:\\temp/showtraces.xml");
    commands.close();
    process.getOutputStream().close();
}
showtraces.xml is only written after Eclipse exits. How can I get the output before this closing?
First flush the PrintWriter, then close it, i.e.
commands.flush();
Refer to:
http://docs.oracle.com/javase/8/docs/api/java/io/PrintWriter.html#flush--
I am not sure it will help, but try changing this:
commands.close();
process.getOutputStream().close();
To this:
commands.flush();
commands.close();
out.flush();
out.close();
I've already tried calling commands.flush() before the close() method, but it doesn't change anything.
Also, if I write more than one file, for example file-1.xml, file-2.xml, file-3.xml, files number one and two are written before I close the application, and file number three is written only after closing.
Use another constructor to have the autoflush mode set (see Javadoc).
PrintWriter commands = new PrintWriter(out, true);
Changing this allows you to keep your code with minimal change.
Every call to println, printf and format flushes the buffer automatically, fixing your issues.
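Applied to the code in the question, a minimal sketch could look like this (same commands and paths as above; note that NuSMV itself may still buffer its own file output until it exits, so autoflush only guarantees that the commands reach the process promptly):
ProcessBuilder pb = new ProcessBuilder("NuSMV.exe", "-int");
Process process = pb.start(); // throws IOException

// second argument true = autoflush: each println() is flushed to the process immediately
PrintWriter commands = new PrintWriter(process.getOutputStream(), true);
commands.println("reset");
commands.println("set default_trace_plugin 4");
commands.println("read_model -i C:\\temp/ascensore.smv");
commands.println("go");
commands.println("check_ctlspec");
commands.println("show_traces -o C:\\temp/showtraces.xml");
commands.close(); // closing the writer also closes the process's stdin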
Related
I have this code that copies array elements into a text file, and after copying I have a button which opens the file I copied.
try
{
    print = new PrintWriter("C:\\Users\\Jofrank\\workspace\\Java\\src\\payroll\\report.txt");
    print.println("EMPLOYEES PAYROLL RECORD AS OF " + dateFormat.format(date));
    print.println();
    for (int x = 0; x < department.length; x++)
    {
        print.println("DEPARTMENT: " + department[x].toUpperCase());
        print.println("\tPAYROLL PERIOD\tEMPLOYEE NUMBER\tNAME\tPAY RATE\tHOURS WORKED\tSALARY");
        print.println();
        for (int y = 0; y < trans.length; y++)
        {
            if (trans[y] == null)
            {
                continue;
            }
            if (trans[y].getDepartment().equals(department[x]))
            {
                print.println("\t" + trans[y].getPayrollPeriod() + "\t" + trans[y].getEmpNo() + "\t\t" + trans[y].getName() + "\t" + trans[y].getPayRate() + "\t\t" + trans[y].getHoursWorked() + "\t\t" + String.format("%,.2f", (trans[y].getPayRate() * trans[y].getHoursWorked())));
                total += (trans[y].getPayRate() * trans[y].getHoursWorked());
            }
        }
        print.println("\t\t\t\t\t\t\t\t\tTOTAL:\t" + String.format("%,.2f", total));
        print.println();
        total = 0;
    }
    print.close();
}
catch (FileNotFoundException e)
{
    e.printStackTrace();
}
Unfortunately, my text file is not updated unless I close the system.
Is there a way that my text file will be updated automatically without closing the system?
You can actually create a PrintWriter with autoFlush turned on:
print = new PrintWriter(new FileOutputStream
("C:\\Users\\Jofrank\\workspace\\Java\\src\\payroll\\report.txt"), true);
Here the 2nd parameter is true. As per the Javadoc:
autoFlush - A boolean; if true, the println, printf, or format
methods will flush the output buffer
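For example, a minimal sketch of the report code from the question with autoflush enabled and the writer closed in a finally block (the per-department loop is elided; everything else is from the original):
PrintWriter print = null;
try {
    // true = autoflush: each println() is pushed to report.txt immediately
    print = new PrintWriter(new FileOutputStream(
            "C:\\Users\\Jofrank\\workspace\\Java\\src\\payroll\\report.txt"), true);
    print.println("EMPLOYEES PAYROLL RECORD AS OF " + dateFormat.format(date));
    // ... same per-department loop as above ...
} catch (FileNotFoundException e) {
    e.printStackTrace();
} finally {
    if (print != null)
        print.close(); // close() flushes anything still buffered
}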
That depends much on the system.
flush() usually should work, but none of this is guaranteed.
There are some embedded flash file systems where you might need to call sync, too.
But as a first approach, try flush().
You need to empty your stream for it to be written to the file. As others have suggested, use .flush() to accomplish this without having to .close() your stream. Otherwise, .close() automagically calls .flush() for you, to ensure your stream has been emptied and its contents written to disk or wherever you are directing it.
The documentation says that PrintWriter does not flush lines automatically by default.
You may need to use a different constructor for PrintWriter:
PrintWriter(File file)
but then you have to close it yourself after writing is done.
I want to run a shell script from a Java program. This shell script invokes a system library which needs a big file as a resource.
My Java program calls this script for every word in a document. If I call the script again and again using Runtime.exec(), the time taken is very high, since the resource loading takes a lot of time.
To overcome this, I thought of writing the shell script as follows (to make it run continuously in the background):
count=0
while [ "$count" -lt 10 ]; do
    read WORD
    # execute command on this line
    count=$((count+1))
done
I need to retrieve the output of the command in my Java program and process it further.
How should I code the I/O operations to achieve this?
I have tried writing words to the process's output stream and reading back the output from the process's input stream, but this does not work and throws a broken pipe exception.
try {
    parseResult = Runtime.getRuntime().exec(parseCommand);
    parsingResultsReader = new BufferedReader(new InputStreamReader(parseResult.getInputStream()));
    errorReader = new BufferedReader(new InputStreamReader(parseResult.getErrorStream()));
    parseResultsWriter = new BufferedWriter(new OutputStreamWriter(parseResult.getOutputStream()));
} catch (IOException e) {
    e.printStackTrace();
}

parseResultsWriter.write(word);
parseResultsWriter.flush();

while ((line = parsingResultsReader.readLine()) != null) {
    // capture output in list here
}
Kindly help with this issue
# execute command on this line
Is this command a separate program? Then it will be launched for every word, so you'll only get rid of the shell process, which is lightweight anyway.
You have to learn how to run the heavyweight command for many words at once.
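If the command can read its words from stdin, a minimal sketch of running it once and feeding it word by word might look like this (exception handling omitted; "mycommand" and "words" are placeholders, and the command is assumed to print exactly one result line per input word, otherwise its output should be drained on a separate thread):
// start the heavyweight command once and reuse it for every word
ProcessBuilder pb = new ProcessBuilder("mycommand");
pb.redirectErrorStream(true);
Process process = pb.start();

BufferedWriter toProcess = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
BufferedReader fromProcess = new BufferedReader(new InputStreamReader(process.getInputStream()));

for (String word : words) {
    toProcess.write(word);
    toProcess.newLine();  // the newline lets "read WORD" on the other side return
    toProcess.flush();
    String result = fromProcess.readLine(); // assumes one output line per word
    // process "result" here
}
toProcess.close();   // signals end-of-input to the command
process.waitFor();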
I'm launching wkhtmltopdf from within my Java app (part of a Tomcat server, running in debug mode within Eclipse Helios on Win7 64-bit): I'd like to wait for it to complete, then Do More Stuff.
String cmd[] = {"wkhtmltopdf", htmlPathIn, pdfPathOut};
Process proc = Runtime.getRuntime().exec( cmd, null );
proc.waitFor();
But waitFor() never returns. I can still see the process in the Windows Task Manager (with the command line I passed to exec(): looks fine). AND IT WORKS. wkhtmltopdf produces the PDF I'd expect, right where I'd expect it. I can open it, rename it, whatever, even while the process is still running (before I manually terminate it).
From the command line, everything is fine:
c:\wrk>wkhtmltopdf C:\Temp\foo.html c:\wrk\foo.pdf
Loading pages (1/6)
Counting pages (2/6)
Resolving links (4/6)
Loading headers and footers (5/6)
Printing pages (6/6)
Done
The process exits just fine, and life goes on.
So what is it about runtime.exec() that's causing wkhtmltopdf to never terminate?
I could grab proc.getInputStream() and look for "Done", but that's... vile. I want something that is more general.
I've tried calling exec() with and without a working directory, and with and without an empty "env" array. No joy.
Why is my process hanging, and what can I do to fix it?
PS: I've tried this with a couple other command line apps, and they both exhibit the same behavior.
Further exec woes.
I'm trying to read standard out & error, without success. From the command line, I know there's supposed to be output remarkably like my command-line experience, but when I read the input stream returned by proc.getInputStream(), I immediately get EOF (-1; I'm using inputStream.read()).
I checked the JavaDoc for Process, and found this
The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
Emphasis added. So I tried that. The first 'read()' on the Standard Out inputStream blocked until I killed the process...
WITH WKHTMLTOPDF
With the generic command-line app & no params (so it should "dump usage and terminate"), it sucks out the appropriate std::out, then terminates.
Interesting!
JVM version issue? I'm using 1.6.0_23. The latest is... v24. I just checked the change log and don't see anything promising, but I'll try updating anyway.
Okay. Don't let the Input Streams fill or they'll block. Check. .close() can also prevent this, but isn't terribly bright.
That works in general (including the generic command line apps I've tested).
In specific however, it falls down. It appears that wkhtmltopdf is using some terminal manipulation/cursor stuff to do an ASCII-graphic progress bar. I believe this is causing the inputStream to immediately return EOF rather than giving me the correct values.
Any ideas? Hardly a deal-breaker, but it would definitely be Nice To Have.
I had the exact same issue as you, and I solved it. Here are my findings:
For some reason, the output from wkhtmltopdf goes to STDERR of the process and NOT STDOUT. I have verified this by calling wkhtmltopdf from Java as well as Perl.
So, for example in Java, you would have to do:
// ProcessBuilder is the recommended way of creating processes since Java 1.5;
// prefer it to Runtime.getRuntime().exec().
ProcessBuilder pb = new ProcessBuilder("wkhtmltopdf.exe", htmlFilePath, pdfFilePath);
Process process = pb.start();
BufferedReader errStreamReader = new BufferedReader(new InputStreamReader(process.getErrorStream()));
// not "process.getInputStream()"
String line = errStreamReader.readLine();
while (line != null)
{
    System.out.println(line); // or whatever else
    line = errStreamReader.readLine();
}
On a side note, if you spawn a process from Java, you MUST read from the stdout and stderr streams (even if you do nothing with them), because otherwise the stream buffer will fill and the process will hang and never return.
To futureproof your code, just in case the devs of wkhtmltopdf decide to write to stdout, you can redirect stderr of the child process to stdout and read only one stream like this:
ProcessBuilder pb = new ProcessBuilder("wkhtmltopdf.exe", htmlFilePath, pdfFilePath);
pb.redirectErrorStream(true);
Process process = pb.start();
BufferedReader inStreamReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
Actually, I do this in all the cases where I have to spawn an external process from java. That way I don't have to read two streams.
You should also read the streams of the spawned process on different threads if you don't want your main thread to block, since reading from streams is blocking.
Hope this helps.
UPDATE: I raised this issue on the project page and was told that this is by design, because wkhtmltopdf supports writing the actual PDF output to STDOUT. Please see the link for more details and Java code.
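Putting the pieces together, a minimal sketch of launching wkhtmltopdf, draining its (merged) output on a separate thread, and then waiting for it might look like this (exception handling mostly omitted):
ProcessBuilder pb = new ProcessBuilder("wkhtmltopdf.exe", htmlFilePath, pdfPathOut);
pb.redirectErrorStream(true);            // merge stderr into stdout
final Process process = pb.start();

Thread drainer = new Thread(new Runnable() {
    public void run() {
        try {
            BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);  // or log / discard
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
drainer.start();

int exitCode = process.waitFor();        // no longer hangs, because the buffer is being drained
drainer.join();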
A process has 3 streams: input, output and error. You can read both the output and error streams at the same time using separate threads. See this question and its accepted answer, and also this one, for example.
You should read from the streams in a different thread.
final Semaphore semaphore = new Semaphore(numOfThreads);
final String whktmlExe = tmpwhktmlExePath;
int doccount = 0;
try {
    File fileObject = new File(inputDir);
    for (final File f : fileObject.listFiles()) {
        if (f.getAbsolutePath().endsWith(".html")) {
            doccount++;
            if (doccount > 500) {
                LOG.info(" done with conversion of 1000 docs exiting ");
                break;
            }
            System.out.println(" inside for before " + semaphore.availablePermits());
            semaphore.acquire();
            System.out.println(" inside for after " + semaphore.availablePermits() + " ---" + f.getName());
            new java.lang.Thread() {
                public void run() {
                    try {
                        String F_ = f.getName().replaceAll(".html", ".pdf");
                        ProcessBuilder pb = new ProcessBuilder(whktmlExe, f.getAbsolutePath(), outPutDir + F_.replaceAll(" ", "_")); // "wkhtmltopdf.exe", htmlFilePath, pdfFilePath
                        pb.redirectErrorStream(true);
                        Process process = pb.start();
                        BufferedReader errStreamReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
                        String line = errStreamReader.readLine();
                        while (line != null) {
                            System.err.println(line); // or whatever else
                            line = errStreamReader.readLine();
                        }
                        System.out.println("after completion for ");
                    } catch (Exception e) {
                        e.printStackTrace();
                    } finally {
                        System.out.println(" in finally releasing ");
                        semaphore.release();
                    }
                }
            }.start();
        }
    }
} catch (Exception ex) {
    LOG.error(" *** Error in pdf generation *** ", ex);
}
while (semaphore.availablePermits() < numOfThreads) { // till all threads finish
    LOG.info(" Waiting for all threads to exit " + semaphore.availablePermits() + " --- " + (numOfThreads - semaphore.availablePermits()));
    java.lang.Thread.sleep(10000);
}
I use the following code to write some data to files:
BufferedWriter writer = null;
try {
    writer = new BufferedWriter(new FileWriter(file));
    writer.write(...);
    writer.flush();
}
finally {
    if (writer != null)
        writer.close();
}
After invoking the method multiple times, I get a FileNotFoundException because too many files are open.
Obviously Java does not close the file handles when I close the writer stream. Closing the FileWriter separately does not help.
Is there something I can do to force Java to close the files?
Your code looks fine. It could be another part of your application which is leaking file handles.
You can monitor file handles using lsof on Linux or pfiles on Solaris. On Windows, you can use Process Explorer.
No, Java does close the file handles when you close the writer. It's actually built using the Decorator pattern. Hence, it must be something else. Show the stack trace.
See this thread about writing to files, good tips there. Pay attention to the finally block in Anon's reply.
BufferedWriter closes the underlying stream. Probably, this is a multithreading issue. You can keep an instance of FileOutputStream and close it. Something like:
java.io.FileOutputStream out = new java.io.FileOutputStream(file);
try {
    // make buffered writer, etc.
} finally {
    out.close();
}
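A sketch of what the elided part might look like ("data" is a placeholder for whatever content is being written):
java.io.FileOutputStream out = new java.io.FileOutputStream(file);
try {
    // the writers only wrap "out"; closing "out" is what releases the OS file handle
    BufferedWriter writer = new BufferedWriter(new java.io.OutputStreamWriter(out));
    writer.write(data);
    writer.flush(); // push the buffered content down to the FileOutputStream
} finally {
    out.close();    // runs even if write() throws, so the handle cannot leak
}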
I noticed in a Java program the line below, used to open a file and process it:
BufferedReader inp = new BufferedReader(new FileReader(inputFile));
In the Java program, inp is not closed before exiting; the line below is missing:
if (inp != null)
    try {
        inp.close();
    } catch (IOException logOrIgnore) {}
The program has exits in a lot of places, but they do not close the file. Do I need to put this line everywhere? If I don't close the file when the program exits, will it be an issue?
Does the garbage collector close the file?
You should use try/finally:
Reader inp = new BufferedReader(new FileReader(inputFile));
try {
    // Do stuff with "inp"
} finally {
    IOUtils.closeQuietly(inp);
}
IOUtils is from Apache Commons IO. Its closeQuietly method is like your code snippet above: it calls close, and ignores any exceptions thrown.
The garbage collector does not close the file. If you know your program will not be long running or open many files, you can get away without closing the file. But otherwise you need to close it manually.
It sounds like you're using the BufferedReader without returning to the context in which it was declared (possibly an instance variable?). In that instance, you must close it manually upon each possible exit from your application. You cannot rely on the garbage collector to do this for you.