socat command tailing from Java app isn't working - java

Here is my runCommand method, which takes a Linux command as an input string:
public static ArrayList<String> runCommand(String command) {
    ArrayList<String> arrayList = new ArrayList<>();
    try {
        Process process = Runtime.getRuntime().exec(command);
        BufferedReader reader =
                new BufferedReader(new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            arrayList.add(line);
            System.out.println(line);
        }
        int exitCode = process.waitFor();
        System.out.println("\nExited with error code : " + exitCode);
        System.out.println();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    return arrayList;
}
It works great with normal commands like ls or sudo lsof -t -i:132, and even tails live output from ping google.com. My socat command
sudo SOCAT_SOCKADDR=192.168.11.131 socat -d -d -T 10 UDP4-LISTEN:132,reuseaddr,fork UDP4:192.168.11.130:130,bind=192.168.11.131:133
(-d -d for verbose), when run in a terminal, creates the socat process and tails its log there, printing things like incoming connections and connection status. But if I run this command via my runCommand(), nothing is printed in the terminal where the Java jar application is running, even though the method works fine for other commands.
I tried placing process.waitFor() before the reading loop, but nothing changed. What is the problem here? My main goal is to parse the live-tailed log and do some other stuff depending on it.
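One thing worth checking (an observation added here, not from the original post): socat writes its -d -d diagnostics to stderr, not stdout, and runCommand() only ever reads process.getInputStream(), i.e. stdout. A minimal sketch of the same method with stderr merged into stdout, assuming that is the cause:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;

public static ArrayList<String> runCommandMerged(String command)
        throws IOException, InterruptedException {
    ArrayList<String> lines = new ArrayList<>();
    ProcessBuilder pb = new ProcessBuilder(command.split("\\s+"));
    pb.redirectErrorStream(true);  // fold stderr (socat's -d -d log) into stdout
    Process process = pb.start();
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(process.getInputStream()))) {
        String line;
        while ((line = reader.readLine()) != null) {
            lines.add(line);       // now includes the diagnostic lines
            System.out.println(line);
        }
    }
    System.out.println("\nExited with code: " + process.waitFor());
    return lines;
}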

Related

Shell script exits too early due to docker pull when running through ProcessBuilder

I have the following problem, which seems to be caused by the "docker pull" in my shell script, as the pull works concurrently:
#!/bin/bash
#VARIABLES
NAME="my-app"
IMAGE="my-image:latest"
#DOCKER
docker stop $NAME
docker rm $NAME
docker pull -q $IMAGE
docker run --name $NAME -d -p 1234:8080 --log-opt fluentd-address=localhost:2233 $IMAGE
Running the script from the terminal works just fine; everything behaves as expected. But when I run it with Java's ProcessBuilder, the script exits much sooner and seems to skip the "docker pull" step. As I am not a Java developer and not very familiar with the language, I have the feeling this is related to the concurrent nature of the docker pull command and the way the Java ProcessBuilder executes the shell script.
The Java code that runs the shell script is this:
try {
    Collection<Task> tasks = taskService.getProjectTasksByProjectKey(projectId);
    Task findTask = findTaskByTaskId(tasks, taskId);
    if (findTask.getTaskId() != null) {
        ProcessBuilder pb = new ProcessBuilder(findTask.getCmdPath());
        Process process = pb.start();
        String output;
        try (InputStream in = process.getInputStream();
             InputStream err = process.getErrorStream();
             OutputStream closeOnly = process.getOutputStream()) {
            while (process.isAlive()) {
                long skipped = in.skip(in.available())
                        + err.skip(err.available());
                if (skipped == 0L) {
                    process.waitFor(5L, TimeUnit.MILLISECONDS);
                }
            }
            output = loadStream(in);
        } finally {
            process.destroy();
        }
        // String error = loadStream(process.getErrorStream());
        // int rc = process.waitFor();
        // log.debug("exit code ->>> " + rc);
        // StringBuilder output = new StringBuilder();
        // BufferedReader reader = new BufferedReader(
        //         new InputStreamReader(process.getInputStream()));
        //
        // String line;
        //
        // while ((line = reader.readLine()) != null) {
        //     output.append(line + "\n");
        // }
        //
        // int exitVal = process.waitFor();
        // if (exitVal == 0) {
        //     System.out.println(output);
        //     return output.toString();
        // } else {
        //     // abnormal...
        // }
        return output;
    } else {
        throw new InvalidTaskModelException(taskId);
    }
} catch (InvalidModelException e) {
    throw new InvalidModelException(projectId);
} catch (IOException e) {
    e.printStackTrace();
} catch (InterruptedException e) {
    e.printStackTrace();
} catch (Exception e) {
    e.printStackTrace();
}
return null;
}

private static String loadStream(InputStream s) throws Exception {
    BufferedReader br = new BufferedReader(new InputStreamReader(s));
    StringBuilder sb = new StringBuilder();
    String line;
    while ((line = br.readLine()) != null)
        sb.append(line).append("\n");
    return sb.toString();
}
The commented lines are different ways I tried to do it.
If anyone has encountered a similar problem, any help would be much appreciated!
It is good that you already take care of the process's STDOUT and STDERR. But rather than skipping them, copy them to System.out so you can see what is going on. I suspect something is not going as you expect.
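To make that concrete, a minimal sketch (an illustration using the question's findTask.getCmdPath(); inheritIO() needs Java 7+) that mirrors the child's output to this JVM's console instead of skipping it:

ProcessBuilder pb = new ProcessBuilder(findTask.getCmdPath());
pb.inheritIO();  // the child writes directly to this JVM's stdout/stderr
Process process = pb.start();
int exitCode = process.waitFor();  // blocks until the whole script finishes
System.out.println("script exited with code " + exitCode);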
Looking at the bash script you posted and the fact that you are trying to run several processes: is it possible your Java code is running the bash script line by line? Be aware that your Java program is not a bash interpreter, so e.g. variable substitution would not work.
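One quick way to test that hypothesis (a suggestion added here, not from the original comment) is to run the script through bash explicitly with tracing enabled, so every line bash executes is echoed to stderr:

// bash -x prints each command before executing it, so you can see
// whether "docker pull" actually runs, and in what order.
Process p = new ProcessBuilder("/bin/bash", "-x", findTask.getCmdPath())
        .inheritIO()
        .start();
p.waitFor();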
Why not run each command in a thread and join them, so that the next thread cannot start until thread 1 has completed? Also, please add a command to verify the image was downloaded successfully:
docker pull -q $IMAGE
docker images | grep $IMAGE
docker run --name $NAME -d -p 1234:8080 --log-opt fluentd-address=localhost:2233 $IMAGE
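In that direction, a minimal sketch (an illustration, not the original code) that runs the script's steps from plain Java one at a time; waitFor() acts as the join, so docker pull must finish before docker run starts (the fluentd log option is omitted for brevity):

String name = "my-app";
String image = "my-image:latest";
String[][] steps = {
    {"docker", "stop", name},
    {"docker", "rm", name},
    {"docker", "pull", "-q", image},
    {"docker", "run", "--name", name, "-d", "-p", "1234:8080", image},
};
for (String[] step : steps) {
    // inheritIO() shows each command's output; waitFor() blocks until it exits
    Process p = new ProcessBuilder(step).inheritIO().start();
    int code = p.waitFor();
    System.out.println(String.join(" ", step) + " exited with " + code);
}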
I am guessing there are two possibilities here:
1. Check the local directory permissions; if possible, give it 755 permissions.
2. The Java process itself may not be able to execute the docker command due to a permission issue; run the process as a sudo user.

How to run a .sh file using ProcessBuilder?

I already created the .sh file, and its content is:
sudo iptables --flush
sudo iptables -A INPUT -m mac --mac-source 00:00:00:00:00:00 -j DROP
It works normally when I run it in the terminal, but when I use ProcessBuilder, it doesn't do anything. There is no error, but nothing happens. This is my Java code:
Process pb = new ProcessBuilder("/bin/bash","/my/file.sh").start();
I have already looked for the answer, but I still fail to run the .sh file, even when I do the same thing as people who have already done it.
Sorry if this is a bad question, and thank you.
Are you sure that bash is not run? Did you check the Process object returned by the start method? You can get the exit value, the output stream, etc. from this object.
Check your streams and exit value for errors... sudo is probably the problem here.
Not necessarily the best code, but it gets the job done: it executes a process, takes the process streams, and prints them to System.out. It might at least help to find out what the issue actually is.
ProcessBuilder pb = new ProcessBuilder(args);
pb.redirectErrorStream(true);
final Process proc = pb.start();

final StringBuilder builder = new StringBuilder("Process output");
final Thread logThread = new Thread() {
    @Override
    public void run() {
        InputStream is = proc.getInputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(is));
        try {
            String line;
            do {
                line = reader.readLine();
                builder.append(line == null ? "" : line);
                builder.append("<br/>");
            } while (line != null);
        } catch (IOException e) {
            builder.append("Exception! ").append(e.getMessage());
        } finally {
            try {
                reader.close();
            } catch (IOException e) {
                builder.append("Exception! ").append(e.getMessage());
            }
        }
    }
};
logThread.start();

int retVal = proc.waitFor();
System.out.println(builder.toString());
From the Java API Runtime docs: http://docs.oracle.com/javase/7/docs/api/java/lang/Runtime.html
// Java runtime
Runtime runtime = Runtime.getRuntime();
// Command
String[] command = {"/bin/bash", "/my/file.sh"};
// Process
Process process = runtime.exec(command);
Also, you should be careful with sudo commands that may ask for the root password.
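On that point, one defensive option (an assumption added here, not part of the answers above) is to invoke sudo non-interactively, so a password prompt fails fast instead of silently hanging a child process that has no terminal:

// sudo -n (non-interactive) exits with an error instead of prompting for
// a password, which would otherwise block a process with no TTY attached.
Process p = new ProcessBuilder("sudo", "-n", "iptables", "--flush")
        .inheritIO()
        .start();
int code = p.waitFor();  // non-zero here likely means sudo wanted a password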

Execute spark-submit programmatically from Java

I am trying to execute it via:
Process process = Runtime.getRuntime().exec(spark_cmd);
with no luck. The command run via a shell starts my application, which succeeds. Running it via exec starts a process which dies shortly after and does nothing.
When I try
process.waitFor();
it hangs and waits forever. Real magic begins when I try to read something from the process:
InputStreamReader isr = new InputStreamReader(process.getErrorStream());
BufferedReader br = new BufferedReader(isr);
To do so I start a thread that reads from the stream in a while loop:
class ReadingThread extends Thread {
    BufferedReader reader;

    ReadingThread(BufferedReader reader) {
        this.reader = reader;
    }

    @Override
    public void run() {
        String line;
        try {
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The application starts, does some stuff, and hangs. When I abort my application, the Spark application wakes up (??) and completes the remaining work. Does anyone have a reasonable explanation of what is happening?
thanks
You can submit a Spark job as spark-submit from Java code with the help of SparkLauncher, so you can go through the link below and check it out:
https://spark.apache.org/docs/1.4.0/api/java/org/apache/spark/launcher/SparkLauncher.html
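For reference, a minimal SparkLauncher sketch; the jar path, main class, and master below are placeholders, not values from the question:

import org.apache.spark.launcher.SparkLauncher;

Process spark = new SparkLauncher()
        .setAppResource("/path/to/your-app.jar")  // placeholder jar
        .setMainClass("com.example.YourApp")      // placeholder main class
        .setMaster("yarn")                        // placeholder master
        .launch();
int exitCode = spark.waitFor();  // remember to drain the streams, as above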
One way is the Spark launcher, as told by @Sandeep Purohit.
I'd offer a shell script approach with the nohup command to submit the job, like this:
This worked for me in the case of mapreduce executions; the same way you can
try for Spark background jobs as well.
Have a look at https://en.wikipedia.org/wiki/Nohup
"nohup spark-submit <parameters> 2>&1 < /dev/null &"
Whenever you get a message, you can poll that event and call this shell script. Below is the code snippet to do this:
/**
 * This method is to spark submit.
 * <pre>You can call spark-submit or a mapreduce job on the fly like this, by calling a shell script.</pre>
 * @param commandToExecute String
 */
public static Boolean executeCommand(final String commandToExecute) {
    try {
        final Runtime rt = Runtime.getRuntime();
        // LOG.info("process command -- " + commandToExecute);
        final String[] arr = { "/bin/sh", "-c", commandToExecute };
        final Process proc = rt.exec(arr);
        // LOG.info("process started ");
        final int exitVal = proc.waitFor();
        LOG.trace(" commandToExecute exited with code: " + exitVal);
        proc.destroy();
    } catch (final Exception e) {
        LOG.error("Exception occurred while Launching process : " + e.getMessage());
        return Boolean.FALSE;
    }
    return Boolean.TRUE;
}
Moreover, to debug:
ps -aef | grep "your pid or process name"
The command below will list the files opened by the process:
lsof -p <your process id>
Also, have a look at "process.waitFor() never returns".

How to execute DOS commands in Java with administrative privileges?

I am developing a project in Java to scan the file system, and this involves executing DOS commands in Java with administrative privileges.
I already wrote a program to execute simple DOS commands in Java:
public class doscmd {
    public static void main(String args[]) {
        try {
            Process p = Runtime.getRuntime().exec("cmd /C dir");
            p.waitFor();
            BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String line = reader.readLine();
            while (line != null) {
                System.out.println(line);
                line = reader.readLine();
            }
        } catch (IOException e1) {
        } catch (InterruptedException e2) {
        }
        System.out.println("Done");
    }
}
But as you can see, this does not allow executing elevated commands.
I am developing the project in the NetBeans IDE, and I was hoping one of you folks could tell me if there is any way in Java to get admin privileges, instead of converting the file to .exe and then clicking Run as administrator.
Your JVM needs to be running with admin privileges in order to start a process with admin privileges.
Build your code and run it as an administrator; every process spawned by your class will have administrator privileges as well.
Try this code, it works for me:
String command = "cmd /c start cmd.exe";
Process child = Runtime.getRuntime().exec(command);
OutputStream output = child.getOutputStream();
output.write("cd C:/ \r\n".getBytes());
output.flush();
output.write("DIR \r\n".getBytes());
output.close();

How to execute a custom Linux terminal command using Java

I have the binary of an engine, already developed with Hadoop (HDFS, HBase, MapReduce) and Java, which is used to generate CSV files. The engine performs operations like table creation in HBase and generating CSV files from HBase, but it performs all operations only through the command line: input is given from the Linux terminal in the form of commands. Now my requirement is to drive this Linux terminal workflow from a Java program and run the commands, but I am not able to run any of the commands successfully.
There are two options I tried, but neither of them worked. Please provide any suggestion or solution to this problem, as I am just a beginner with Linux and Hadoop trying to figure out the issue.
1st way:
public class EngineTest {
    public static void main(String[] args) {
        try {
            Process process = Runtime
                    .getRuntime()
                    .exec("/home/cloudera/PVGproto/Base/ anloss -i ${TOOL_INPUT}/census_10000_col5.csv -d ${TOOL_DEF}/attr_all_def.txt -q k=14,dage=2 -g ${TOOL_RES}/census_100_col8_gen.csv");
            process.waitFor();
            BufferedReader bufferedReader = new BufferedReader(
                    new InputStreamReader(process.getInputStream()));
            String line = "";
            String output = "";
            while ((line = bufferedReader.readLine()) != null) {
                output += line + "\n";
            }
            System.out.println(output);
        } catch (IOException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
2nd way:
I made an ex.sh file; in it I put the required command and called this .sh file from the same Java program. But the same thing happened: the command is not run through the Java program.
Process process = Runtime.getRuntime().exec("/home/cloudera/PVGproto/Base/ex.sh");
but if I run the same ex.sh from the Linux terminal, it runs all the commands successfully.
ex.sh
#!/bin/bash
exec /home/cloudera/PVGproto/Base/ anloss -i ${TOOL_INPUT}/census_10000_col5.csv -d ${TOOL_DEF}/attr_all_def.txt -q k=14,age=2 -g ${TOOL_RES}/census_100_col8_gen.csv
echo command run successfully
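A side note on both attempts (an observation added here, not from the original post): Runtime.exec() does not expand shell variables such as ${TOOL_INPUT}; the string is handed to the OS as-is. One way to get shell semantics from Java is to wrap the command in bash -c, sketched below with the question's command (the question shows a space in "Base/ anloss"; "Base/anloss" is assumed here, and the TOOL_* variables must be set in the Java process's environment):

import java.io.BufferedReader;
import java.io.InputStreamReader;

ProcessBuilder pb = new ProcessBuilder("/bin/bash", "-c",
        "/home/cloudera/PVGproto/Base/anloss"      // path assumed, see note above
        + " -i ${TOOL_INPUT}/census_10000_col5.csv"
        + " -d ${TOOL_DEF}/attr_all_def.txt"
        + " -q k=14,dage=2"
        + " -g ${TOOL_RES}/census_100_col8_gen.csv");
pb.redirectErrorStream(true);  // capture stderr as well
Process process = pb.start();
try (BufferedReader reader = new BufferedReader(
        new InputStreamReader(process.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);
    }
}
System.out.println("exit code: " + process.waitFor());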
