I am trying to write a Unix terminal emulator in Java, and I am having a lot of trouble. It doesn't seem like I can change the working directory of the program, so commands like "cd" aren't working properly. My question is this: if I run a command that requires input from the user, is there any way to send that input to the running process?
Thanks so much, that was a lot of help. Here's an example:
InputStream in = null;
OutputStream outS = null;
StringBuffer commandResult = new StringBuffer();
String line = null;
int readInt;
p = Runtime.getRuntime().exec("gksudo apt-get install firefox");
int returnVal = p.waitFor();
in = p.getInputStream();
while ((readInt = in.read()) != -1)
commandResult.append((char)readInt);
outS = (BufferedOutputStream) p.getOutputStream();
outS.write("Y".getBytes());
outS.close();
System.out.println(commandResult.toString());
in.close();
This is the output:
Reading package lists...
Building dependency tree...
Reading state information...
The following packages were automatically installed and are no longer required:
libmono2.0-cil libmono-data-tds2.0-cil libmono-system-data2.0-cil
libdbus-glib1.0-cil librsvg2-2.18-cil libvncserver0 libsqlite0
libmono-messaging2.0-cil libmono-system-messaging2.0-cil
libmono-system-data-linq2.0-cil libmono-sqlite2.0-cil
libmono-system-web2.0-cil libwnck2.20-cil libgnome-keyring1.0-cil
libdbus1.0-cil libmono-wcf3.0-cil libgdiplus libgnomedesktop2.20-cil
Use 'apt-get autoremove' to remove them.
The following extra packages will be installed:
firefox-globalmenu
Suggested packages:
firefox-gnome-support firefox-kde-support latex-xft-fonts
The following NEW packages will be installed:
firefox firefox-globalmenu
0 upgraded, 2 newly installed, 0 to remove and 5 not upgraded.
Need to get 15.2 MB of archives.
After this operation, 30.6 MB of additional disk space will be used.
Do you want to continue [Y/n]? Abort.
Why is it aborting before I can pipe in the "Y"?
Yes; see Process#getOutputStream() to get the "standard input" (stdin) stream for a Process object.
As for the issue of changing directory, I don't believe the JVM can change its working directory once it has launched. However, your program could model the idea of the "current working directory" as a variable which it uses when it does things which are relative to that location (e.g. launching processes, listing directory contents, etc). The ProcessBuilder class even has a way to set the working directory for Processes it produces.
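For illustration, here is a minimal sketch of both ideas; the /tmp directory and the cat command are placeholders, not anything from your program. The emulator keeps its own "current working directory" variable, hands it to ProcessBuilder.directory(), and writes to the child's stdin (obtained via getOutputStream()) before waiting for it to finish:
import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;

public class CwdDemo {
    public static void main(String[] args) throws Exception {
        File cwd = new File("/tmp");                    // the emulator's own notion of "cwd"
        ProcessBuilder pb = new ProcessBuilder("cat");  // any command that reads stdin
        pb.directory(cwd);                              // the spawned process starts in that directory
        pb.redirectErrorStream(true);
        Process p = pb.start();

        // Send input to the process before waiting for it to finish.
        try (Writer stdin = new OutputStreamWriter(p.getOutputStream())) {
            stdin.write("hello from the emulator\n");
        }

        // Read whatever the process printed, then wait for it to exit.
        try (BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("exit code: " + p.waitFor());
    }
}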
Like the title says, I'm not able to read the contents of a file (a CSV file) when running the same code on a Linux container.
private Set<VehicleConfiguration> loadConfigurations(Path file, CodeType codeType) throws IOException {
log.debug("File exists? " + Files.exists(file));
log.debug("Path " + file.toString());
log.debug("File " + file.toFile().toString());
log.debug("File absolute path " + file.toAbsolutePath().toString());
String line;
Set<VehicleConfiguration> configurations = new HashSet<>(); // this way we ignore duplicates in the same file
try(BufferedReader br = new BufferedReader(new FileReader(file.toFile()))){
while ((line = br.readLine()) != null) {
configurations.add(build(line, codeType));
}
}
log.debug("Loaded " + configurations.size() + " configurations");
return configurations;
}
The logs return "true" and the path for the file on both systems (locally on Windows and on a Linux Docker container). On Windows it loads "15185 configurations" but on the container it loads "0 configurations".
The file exists on Linux; I checked it myself with bash, and the head command shows the file has lines.
Before this I tried with Files.lines like so:
var vehicleConfigurations = Files.lines(file)
.map(line -> build(line, codeType))
.collect(Collectors.toCollection(HashSet::new));
But this has a problem (on the container only) regarding the contents. It reads the file, but not the whole file: it reaches a given line (say line 8000) and does not read it completely (it reads about half a line, stopping before the comma separator). Then I get a java.lang.ArrayIndexOutOfBoundsException because my build method tries to split the line and access index 1 (which it doesn't have, only 0):
private VehicleConfiguration build(String line, CodeType codeType) {
String[] cells = line.split(lineSeparator);
var vc = new VehicleConfiguration();
vc.setVin(cells[0]);
vc.setCode(cells[1]);
vc.setType(codeType);
return vc;
}
What could be the issue? I don't understand how the same code (in Java) works on Windows but not on a Linux container. It makes no sense.
I'm using Java 11. The file is copied using volumes in a docker-compose file like this:
volumes:
- ./file-sources:/file-sources
I then copy the file (using the cp command on the Linux container) from file-sources to /root because that's where the app is listening for new files to arrive. The file contents are then read with the methods I described. Example file data (it does not have weird characters):
Thanks in advance.
UPDATE: Tried with the newBufferedReader method; same result (works on Windows, doesn't work on the Linux container):
private Set<VehicleConfiguration> loadConfigurations(Path file, CodeType codeType) throws IOException {
String line;
Set<VehicleConfiguration> configurations = new HashSet<>(); // this way we ignore duplicates in the same file
try(BufferedReader br = Files.newBufferedReader(file)){
while ((line = br.readLine()) != null) {
configurations.add(build(line, codeType));
}
}
log.debug("Loaded " + configurations.size() + " configurations");
return configurations;
}
wc -l in the linux container (in /root) returns: 15185 hard_001.csv
Update: This is no solution, but I found out that by dropping the files directly into the file-sources folder and making that the folder the code listens to, the files are read. So basically, the problem seems to be more apparent when using cp/mv inside the container to another folder. Maybe the file is read before it is fully copied/moved, and that's why it reads 0 configurations?
There are a few methods in Java you should never use. Ever.
new FileReader(File) is one of them.
Any time that you have a thing that represents bytes and somehow chars or Strings fall out, or vice versa? Don't ever use those, unless the spec of said method explicitly points out that it always uses a pre-set charset. Almost all such methods use the 'system default charset' which means that the operation depends on the machine you run it on. That is shorthand for 'this will fail, and your tests won't catch it'. Which you don't want.
Which is why you should never use these things.
FileReader has been fixed (there is a second constructor that takes a charset), but that's only since JDK11. You already have the nice new API, why do you switch back to the dinky old File API? Don't do that.
All the various methods in Files, such as Files.newBufferedReader, are specced to use UTF-8 if you don't specify a charset (in that way, Files is more useful than, and unlike, most other Java core libraries). Thus:
try (BufferedReader br = Files.newBufferedReader(file)) {
which is just.. better.. than your line.
Now, it'll probably still fail on you. But that's good! It'll also fail on your dev machine. Most likely, the file you are reading is not, in fact, in UTF_8. This is the likely guess; most linuxen are deployed with a UTF_8 default charset, and most dev machines are not; if your dev machine is working and your deployment environment isn't, the obvious conclusion is that your input file is not UTF_8. It does not need to be what your dev machine has as its default either; something like ISO_8859_1 will never throw exceptions, but it will read gobbledygook instead. Your code may seem to work (no crashes), but the text you read is still incorrect.
Figure out what text encoding you got, and then specify it. If it's ISO_8859_1, for example:
try (BufferedReader br = Files.newBufferedReader(file, StandardCharsets.ISO_8859_1)) {
and now your code no longer has the 'works on some machines but not on others' nature.
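The same applies to the Files.lines variant from the question; a quick sketch, assuming the file really turns out to be ISO_8859_1 (substitute whatever encoding you determine):
// Sketch only: pass the charset you determined; ISO_8859_1 is just the example here.
var vehicleConfigurations = Files.lines(file, StandardCharsets.ISO_8859_1)
        .map(line -> build(line, codeType))
        .collect(Collectors.toCollection(HashSet::new));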
Inspect the line where it fails, in a hex editor if you have to. I bet you dollars to donuts there will be a byte there which is 0x80 or higher (in decimal, 128 or higher). Everything up to and including 127 tends to mean the exact same thing in a wide variety of text encodings, from ASCII to any ISO-8859 variant to UTF-8 to Windows Cp1252 to MacRoman to so many other things, so as long as it's all just plain letters and digits, having the wrong encoding is not going to make any difference. But once you get to 0x80 or higher they're all different. Armed with that byte plus some knowledge of what character it is supposed to be, you usually have a good start on figuring out what encoding that text file is in.
NB: If this isn't it, check how the text file is being copied from your dev machine to your deployment environment. Are you sure it is the same file? If it's being copied through a textual mechanism, charset encoding again can be to blame, but this time in how the file is written, instead of how your java app reads it.
I'm currently changing our system to use another server for getting files (files generated for tracking something, not important). This system is based on Java, and the code for getting these files uses Linux commands. The code is:
session = connection.openSession();
session.execCommand("ls -B -A " + filelocation);
output = new BufferedReader(new InputStreamReader(new StreamGobbler(session.getStdout()), "UTF-8"));
This did, however, work on our original server (x86_64 GNU/Linux), but it does not work on the "new" server (SunOS 5.10 Generic January). When running this command on the SunOS server I get:
ls: illegal option -- B
usage: ls -1RaAdCxmnlhogrtuvVcpFbqisfHLeE# [files]
I am far from well versed with the command line, and I did not write the original code. But this is what I figured out:
-A, --almost-all Do not list implied . and ..
-B, --ignore-backups Do not list implied entries ending with ~
Is there an alternative way of getting this to work on the SunOS server?
EDIT
Ended up checking each String read with line.endsWith("~"):
while ((outputString = output.readLine()) != null) {
if(!outputString.endsWith("~")){
fileList.add(outputString);
}
}
Either you can write a shell script new_ls that calls ls and removes the lines that end with "~",
or, when you process the results in Java, you can ignore lines read from the BufferedReader by checking each String read with line.endsWith("~") (see the sketch below).
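A sketch of the second option, reusing the variables from the snippet above; the -A flag appears in the SunOS usage line, so only -B needs to go:
// Drop the unsupported -B flag and filter the backup entries (ending in "~") in Java.
session = connection.openSession();
session.execCommand("ls -A " + filelocation);
output = new BufferedReader(new InputStreamReader(new StreamGobbler(session.getStdout()), "UTF-8"));

String outputString;
while ((outputString = output.readLine()) != null) {
    if (!outputString.endsWith("~")) {
        fileList.add(outputString);
    }
}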
I'm trying to convert files from PNGs to PDF using ImageMagick and Java. I've got everything working up to the point where I execute the ImageMagick command to actually merge multiple PNGs into one PDF. The command itself looks correct, and it works fine when executed in the terminal, but my application gives me an error showing that ImageMagick can't open the file (even though it exists and I've set the folder's permissions to 777):
line: convert: unable to open image `"/Users/mk/Documents/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/sch-java/print-1357784001005.png"': No such file or directory # error/blob.c/OpenBlob/2642.
This is my command:
/opt/local/bin/convert "/Users/mk/Documents/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/sch-java/print-1357784001005.png" "/Users/mk/Documents/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/sch-java/print-1357784001219.png" "/Users/mk/Documents/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/sch-java/complete-exportedPanel2013-01-1003:13:17.212.pdf"
And my Java code:
String filesString = "";
for (String s : pdfs){
filesString += "\""+ s + "\" ";
}
Process imgkProcess = null;
BufferedReader br = null;
File f1 = new File(pdfs[0]);
//returns true
System.out.println("OE: "+f1.exists());
String cmd = imgkPath+"convert "+ filesString+ " \""+outputPath+outName+"\"";
try {
imgkProcess = Runtime.getRuntime().exec(cmd);
InputStream stderr = imgkProcess.getErrorStream();
InputStreamReader isr = new InputStreamReader(stderr);
br = new BufferedReader(isr);
} catch (IOException e1) {
msg = e1.getMessage();
}
imgkProcess.waitFor();
while( (line=br.readLine() ) != null){
System.out.println("line: "+line);
}
The whole thing is executed from a Java servlet controller after getting a request from a form. Any ideas what could cause this? I'm using the latest ImageMagick, the latest JDK, and OS X 10.7.
A few things:
When spawning anything but really trivial processes, it's usually better to use ProcessBuilder than Runtime.exec(); it gives you much better control (see the sketch after this list)
Even with ProcessBuilder, it often works better to write a shell script that does what you need, then spawn a process to run the script. You get a lot more control in a shell script than you do in ProcessBuilder
Remember that a spawned process is not a shell. It can't, for instance, evaluate expressions, or expand shell variables. If you need that, then you must execute a shell (like sh or bash). Better yet, write a shell script as described above
If all you need to do is execute some ImageMagick commands, it would probably be easier to use JMagick, a Java interface to ImageMagick; see http://www.jmagick.org/
Actually, since you're assembling images into a PDF, the iText library (http://itextpdf.com) is probably the best tool for the job, as it is native Java code, does not require spawning a native process, and will therefore be much more portable.
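To illustrate the first point, here is a rough ProcessBuilder sketch for the convert call; it assumes the imgkPath, pdfs, outputPath and outName variables from the question and the usual java.util/java.io imports:
// Build the argument list explicitly so no shell quoting is needed.
List<String> cmd = new ArrayList<>();
cmd.add(imgkPath + "convert");
for (String png : pdfs) {
    cmd.add(png);                      // each path is passed as a single argument, spaces and all
}
cmd.add(outputPath + outName);

ProcessBuilder pb = new ProcessBuilder(cmd);
pb.redirectErrorStream(true);          // merge stderr into stdout so one reader sees everything
Process proc = pb.start();
try (BufferedReader r = new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
    String line;
    while ((line = r.readLine()) != null) {
        System.out.println("line: " + line);
    }
}
System.out.println("exit: " + proc.waitFor());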
Solved it by adding all the arguments to an ArrayList and then converting it to a String array.
ArrayList<String> cmd = new ArrayList<String>();
cmd.add(imgkPath+"convert");
for (int i=0, l=pdfs.length; i<l; i++){
cmd.add(pdfs[i]);
}
cmd.add(outputPath+outName);
imgkProcess = Runtime.getRuntime().exec(cmd.toArray(new String[cmd.size()]));
I'm running a Windows program from within Java:
String command = "cmd /C start "+fileName+".bat";
Runtime rt = Runtime.getRuntime();
Process pr = rt.exec(command, null, new File(currWD));
int exitValue = pr.waitFor();
The program completes successfully (exitValue == 0) and creates a file "fileName" in the working directory. I am trying in the same routine to find the size of this file:
xmlFileSize = (new File(fileName)).length();
Java finds the file, yet it appears to be empty (xmlFileSize == 0). Once Java finishes, however, I can see that the file is non-empty.
How can I resolve this? All I want is for Java to correctly assess the size of the file created by the Windows program that Java has executed.
A zero-length file indicates that the file may not exist. From the docs:
The length, in bytes, of the file denoted by this abstract pathname, or 0L if the file does not exist.
Note that you use currWD as the working directory for your bat file. You could try:
new File(currWD, fileName).length()
to make sure you look for the file in the right directory.
It probably has to do with executing the bat file from a command shell. What does the bat file do? Is it launching a program?
I'm guessing that the script calls or executes another program and returns, which allows the shell to die. This in turn lets the Java process continue while the process from the script continues executing asynchronously.
According to the Java API for Process, that's allowed, as it most definitely should be (see java.lang.Process).
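If that is what's happening, one thing worth trying (this is a guess on my part, not something from the question) is telling start to wait for the program it launches:
// Assumption: the .bat (or whatever it launches) is started via "start"; the /wait
// flag asks cmd to wait for that window/program to finish before returning.
String command = "cmd /C start /wait " + fileName + ".bat";
Process pr = Runtime.getRuntime().exec(command, null, new File(currWD));
int exitValue = pr.waitFor();   // should now return only after the batch job is done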
I credit this answer to aioobe and John. As John suggests, the external program started by the batch file spawns a process that seems to keep running for a while (50-300 ms) after the Java sub-process running the batch file has returned. I resolved the problem by introducing a pause (as suggested by aioobe):
int exitValue = pr.waitFor();
try {Thread.sleep(300);} catch (InterruptedException e) {e.printStackTrace();}
After the pause Java seems to be able to see the files created by the external program. Thanks again to both contributors who helped me resolve this issue!
If anyone finds a more elegant solution, please, feel welcome to post.
I'm currently working on a web application that involves mounting a drive and extracting a tar.gz file, all in Java. Since the application runs in a Linux environment, I figured I'd try using Unix commands like "mount" and "tar".
Runtime runtime = Runtime.getRuntime();
Process proc;
String mountCommand = "mount -t cifs -o username=...";
String extractCommand = "tar xzf ...";
proc = runtime.exec(mountCommand);
proc.waitFor();
proc = runtime.exec(extractCommand);
proc.waitFor();
Running the mount command and the extract command in the terminal works fine, but it fails when FIRST run in Java. The second proc.waitFor() returns exit code 2. However, running this code after the first failed attempt works fine. I have a feeling that the problem is that waitFor() isn't waiting until the mount command has fully completed. Am I missing anything important in my code?
Also, I'd rather do this all in Java, but I had a really hard time figuring out how to untar a file, so I'm taking this approach. (Oh, if anyone can tell me how to do this I would be very happy.) Any suggestions would be muuuuuuuuuuch appreciated!
Making progress. In case anyone was wondering, here is how I am extracting a tar.gz file in Java. Put together from a few online tutorials.
public static void extract(String tgzFile, String outputDirectory)
throws Exception {
// Create the Tar input stream.
FileInputStream fin = new FileInputStream(tgzFile);
GZIPInputStream gin = new GZIPInputStream(fin);
TarInputStream tin = new TarInputStream(gin);
// Create the destination directory.
File outputDir = new File(outputDirectory);
outputDir.mkdir();
// Extract files.
TarEntry tarEntry = tin.getNextEntry();
while (tarEntry != null) {
File destPath = new File(outputDirectory + File.separator + tarEntry.getName());
if (tarEntry.isDirectory()) {
destPath.mkdirs();
} else {
// If the parent directory of a file doesn't exist, create it.
if (!destPath.getParentFile().exists())
destPath.getParentFile().mkdirs();
FileOutputStream fout = new FileOutputStream(destPath);
tin.copyEntryContents(fout);
fout.close();
// Preserve the last modified date of the tar'd files.
destPath.setLastModified(tarEntry.getModTime().getTime());
}
tarEntry = tin.getNextEntry();
}
tin.close();
}
Quick Answer
Since a dependency on external commands exists, simplify it like this:
#!/bin/bash
mount -t cifs -o username=...
tar xzf ...
Name it mount-extract.sh then call it using a single Runtime.exec() call.
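Invoking the script from Java is then one call; the script path here is just a placeholder:
// Hypothetical location for the script described above.
Process proc = Runtime.getRuntime().exec(new String[] {"/bin/bash", "/path/to/mount-extract.sh"});
int exit = proc.waitFor();   // waits for both mount and tar, since the script runs them in sequence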
Semi-integrated Answer
Use Java APIs.
http://java.sun.com/j2se/1.4.2/docs/api/java/util/zip/GZIPInputStream.html
http://www.jajakarta.org/ant/ant-1.6.1/docs/ja/manual/api/org/apache/tools/tar/TarInputStream.html
You will need Runtime.exec to execute the mount command.
Forward Looking
Since Java is a cross-platform software development tool, consider abstracting the mount command in your application to be derived dynamically based on the underlying operating system.
See: How can I mount a windows drive in Java?
See: http://java.sun.com/j2se/1.4.2/docs/api/java/lang/System.html#getProperties()
Of course, Agile development would insist that this not be done until it is needed. So keep it in the back of your mind until then (as you might never run the application on anything but Unix-based systems).
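As a rough sketch of that abstraction (the os.name check is standard; the two mount helpers are hypothetical methods you would write yourself):
// Derive the platform at runtime and dispatch to a platform-specific mount routine.
// mountOnWindows()/mountOnLinux() are hypothetical; share, driveLetter and mountPoint are placeholders.
String os = System.getProperty("os.name").toLowerCase();
if (os.contains("windows")) {
    mountOnWindows(share, driveLetter);
} else {
    mountOnLinux(share, mountPoint);
}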
Take a look at the org.apache.tools.tar package in the Ant codebase. There is a class in that package, TarInputStream, that can be used to read tar archives.
It may be related to the way you call the method.
See this answer
Basically try using
.exec( String [] command );
instead of
.exec( String command );
I'm not sure if it is even related, because you mention it runs the second time. Give it a try and let us know.
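For the mount command from the question, that would look roughly like this; the option values, share and mount point are placeholders because the real ones are elided in the question:
// Placeholder arguments; substitute your real options, share and mount point.
String[] mountCommand = {
    "mount", "-t", "cifs",
    "-o", "username=USER,password=PASS",
    "//server/share", "/mnt/share"
};
proc = runtime.exec(mountCommand);
proc.waitFor();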
This can all be done in Java, but you have to be aware of caveats when dealing with native processes.
The waitFor() call may not be doing what you hope: if the process you started has a child process that does the actual work you need, then waitFor(), which returns when the parent process has finished, may not have allowed enough time for the child process to finish.
One way to get around this is to loop over some test to see that the native processes you started have finished to your satisfaction; in this case, perhaps checking whether some java.io.File exists (a small sketch follows).
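For example (the marker file and the timeout are illustrative placeholders):
// Poll for something the native process is expected to produce, rather than relying
// on waitFor() alone. The path and the ten-second timeout are placeholders.
File marker = new File("/mnt/share/some-extracted-file");
long deadline = System.currentTimeMillis() + 10_000;
while (System.currentTimeMillis() < deadline && !marker.exists()) {
    try { Thread.sleep(100); } catch (InterruptedException e) { break; }
}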