ReadLine on tcpdump buffer sometimes blocks until tcpdump is killed - java

I have a problem using tcpdump from my Android application.
It is supposed to read the output from tcpdump line by line and process it within my application. The problem is: sometimes the code works fine and reads the captured packets immediately. But sometimes readLine blocks until I kill the tcpdump process from the Linux console (killall tcpdump). After doing that, my loop is executed once for each pending line (sometimes 10, sometimes 1 or 2), which means readLine should have worked, but didn't.
I read about similar problems, but did not find any solution. Thanks!
public class ListenActivity extends Activity {

    static ArrayList<Packet> packetBuffer = new ArrayList<Packet>();
    static Process tcpDumpProcess = null;
    static ListenThread thread = null;

    public static final String TCPDUMP_COMMAND = "tcpdump -A -s0 | grep -i -e 'Cookie'\n";

    private InputStream inputStream = null;
    private OutputStream outputStream = null;

    @Override
    protected void onStart() {
        super.onStart();
        try {
            tcpDumpProcess = new ProcessBuilder().command("su").redirectErrorStream(true).start();
            inputStream = tcpDumpProcess.getInputStream();
            outputStream = tcpDumpProcess.getOutputStream();
            outputStream.write(TCPDUMP_COMMAND.getBytes("ASCII"));
        } catch (Exception e) {
            Log.e("FSE", "", e);
        }
        thread = new ListenThread(new BufferedReader(new InputStreamReader(inputStream)));
        thread.start();
    }

    private class ListenThread extends Thread {

        private BufferedReader reader = null;

        public ListenThread(BufferedReader reader) {
            this.reader = reader;
        }

        @Override
        public void run() {
            reader = new BufferedReader(new InputStreamReader(inputStream));
            while (true) {
                try {
                    String received = reader.readLine();
                    Log.d("FS", received);
                    Packet pReceived = Packet.analyze(received);
                    if (pReceived != null) {
                        packetBuffer.add(pReceived);
                    }
                } catch (Exception e) {
                    Log.e("FSE", "", e);
                }
            }
        }
    }
}

Because output sent to pipes is usually block buffered, both the tcpdump process and the grep process will wait until they have accumulated enough data to bother passing it on to your program. You're very lucky, though: both programs you have chosen are prepared to modify their buffering behavior (using the setvbuf(3) function internally, in case you're curious about the details):
For tcpdump(8):
-l     Make stdout line buffered. Useful if you want to see
       the data while capturing it. E.g.,
       ``tcpdump -l | tee dat'' or ``tcpdump -l > dat & tail -f dat''.
For grep(1):
--line-buffered
       Use line buffering on output. This can cause a performance penalty.
Try this:
"tcpdump -l -A -s0 | grep --line-buffered -i -e 'Cookie'\n";

I don't understand why, but even with the -l option the buffer is too large if you read the standard output of the process in which you run tcpdump.
I solved this problem by redirecting tcpdump's output to a file and reading that file from another thread. The tcpdump command should be something like:
tcpdump -l -A -s0 > /data/local/output.txt
The run method inside your thread has to be changed to read from the output file:
File dumpedFile = new File("/data/local/output.txt");
// open a reader on the tcpdump output file
BufferedReader reader = new BufferedReader(new FileReader(dumpedFile));
String temp = null;
// the while loop is broken if the thread is interrupted
while (!Thread.interrupted()) {
    temp = reader.readLine();
    if (temp != null) {
        Log.e("READER", temp);
    } else {
        // at end of file: wait briefly for tcpdump to append more data,
        // instead of busy-spinning on readLine()
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            break;
        }
    }
}
I don't know exactly what you want to do with grep, but I think it's possible to achieve the same thing with a regular expression inside the Java code.
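A minimal sketch of that idea (the CookieFilter class name is made up here; the pattern mirrors the question's grep -i -e 'Cookie'):

import java.util.regex.Pattern;

public class CookieFilter {
    // case-insensitive substring match, like grep -i -e 'Cookie'
    private static final Pattern COOKIE =
            Pattern.compile("cookie", Pattern.CASE_INSENSITIVE);

    public static boolean matches(String line) {
        return line != null && COOKIE.matcher(line).find();
    }
}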
You should also be aware that the tcpdump process will never end on its own, so you have to kill it when your activity is paused or destroyed.
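As a minimal sketch of that cleanup (assuming the static tcpDumpProcess field from the question; the su -c killall call mirrors what the asker ran manually from the console):

@Override
protected void onDestroy() {
    super.onDestroy();
    try {
        // tcpdump runs inside the root shell, so destroying the Java-side
        // Process alone may leave it running; kill it by name as root.
        Runtime.getRuntime().exec(new String[] { "su", "-c", "killall tcpdump" }).waitFor();
    } catch (Exception e) {
        Log.e("FSE", "", e);
    }
    if (tcpDumpProcess != null) {
        tcpDumpProcess.destroy();
    }
}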
You can have a look at my blog post, where I explain my whole code to start/stop tcpdump.

Related

Stream not closing appropriately while using named pipes in Java/Linux

I have a program where I use named pipes to share info with an external executable:
Process p = Runtime.getRuntime().exec("mkfifo /tmp/myfifo");
p.waitFor();
Process cat = Runtime.getRuntime().exec("cat /tmp/myfifo");
BufferedWriter fifo = new BufferedWriter(
        new OutputStreamWriter(new FileOutputStream("/tmp/myfifo")));
fifo.write("Hello!\n");
fifo.close();
cat.waitFor();
When I execute this, the program hangs waiting for cat to finish. It seems that cat has not 'realized' that the fifo was closed.
I tried running $> touch /tmp/myfifo in the terminal, and that 'unhung' the process and it finished properly; but when I added code to run this within my program, it would still hang:
fifo.close();
Process touch = Runtime.getRuntime().exec("touch /tmp/myfifo");
touch.waitFor();
cat.waitFor();
The process will still hang waiting for cat to finish. I'm not sure what to do now.
NOTE - I have already added code to consume the output of the cat command, but the problem does not seem to be there.
Anyone know a workaround/fix for this?
From the Process Javadocs: because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
You need to consume the subprocess's output, e.g. print it to stdout or write it to a file. Try something like this:
Process cat = Runtime.getRuntime().exec("cat /tmp/myfifo");
new Thread(new Reader(cat.getErrorStream(), System.err)).start();
new Thread(new Reader(cat.getInputStream(), System.out)).start();
int returnCode = cat.waitFor();
System.out.println("Return code = " + returnCode);

class Reader implements Runnable {
    private final InputStream istrm;
    private final OutputStream ostrm;

    public Reader(InputStream istrm, OutputStream ostrm) {
        this.istrm = istrm;
        this.ostrm = ostrm;
    }

    public void run() {
        try {
            final byte[] buffer = new byte[1024];
            for (int length = 0; (length = istrm.read(buffer)) != -1; ) {
                ostrm.write(buffer, 0, length);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

Problems again with child processes in Java

I am on Ubuntu 14.04.
I am trying to run something like ps aux | grep whatevah through Java's ProcessBuilder class. I create two child processes and make them communicate synchronously, but for some reason I cannot see anything in the terminal.
This is the code:
try {
    // What comes out of process1 is our inputStream
    Process process1 = new ProcessBuilder("ps", "aux").start();
    InputStream is1 = process1.getInputStream();
    BufferedReader br1 = new BufferedReader(new InputStreamReader(is1));

    // What goes into process2 is our outputStream
    Process process2 = new ProcessBuilder("grep", "gedit").start();
    OutputStream os = process2.getOutputStream();
    BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(os));

    // Send the output of process1 to the input of process2
    String p1Output = null;
    while ((p1Output = br1.readLine()) != null) {
        bw.write(p1Output);
        System.out.println(p1Output);
    }

    // Synchronization
    int finish = process2.waitFor();
    System.out.println(finish);

    // What comes out of process2 is our inputStream
    InputStream is2 = process2.getInputStream();
    BufferedReader br2 = new BufferedReader(new InputStreamReader(is2));
    String combOutput = null;
    while ((combOutput = br2.readLine()) != null)
        System.out.println(combOutput);

    os.close();
    is1.close();
    is2.close();
} catch (IOException e) {
    System.out.println("Command execution error: " + e.getMessage());
} catch (Exception e) {
    System.out.println("General error: " + e.getMessage());
}
(The System.out.println(p1Output); is just for me to check; the print that has to work is the last one, printing the result of ps aux | grep whatevah.)
I've tried several things; the less silly ones include:
If I comment out everything regarding process2, I get the result of ps aux printed on the terminal.
If I run the program as is, it prints nothing to the terminal.
If I uncomment the waitFor call, only ps aux gets printed.
If I change the commands to, for example, ls -al and ls -al, then both get printed.
I tried changing "aux" to "aux |", but still nothing is printed.
I closed the buffers; also nothing.
etc.
Any help will be sorely appreciated.
Cheers!
EDIT
Minutes after accepting Ryan's amazing answer, I made one last attempt to get this code to work. And I succeeded! I changed:
while ((p1Output = br1.readLine()) != null) {
    bw.write(p1Output);
    System.out.println(p1Output);
}
to:
while ((p1Output = br1.readLine()) != null) {
    bw.write(p1Output + "\n");
    System.out.println(p1Output);
}
bw.close();
and it works! readLine strips the line terminator, so without the appended "\n" grep never received a complete line, and without the close() it never saw end-of-input. I remember closing the buffer before, so I don't know what went wrong earlier. Turns out you should not stay awake until late trying to make a piece of code work XD.
Ryan's answer down here is still amazing, though.
Given the advice in the comments, the important thing to note is the necessity of using threads to process a process's input/output in order to achieve what you want.
I've used the link posted by jtahlborn and adapted this solution, which you might be able to use.
I created a simple example that lists the files in a directory and greps through the output.
The example simulates the command ls -1 | grep some in a directory called test with three files: somefile.txt, someotherfile.txt and this_other_file.csv.
EDIT: The original solution didn't really use the "pipe" methodology fully, as it waited for p1 to finish completely before starting p2. Rather, it should start them both and then pipe the output of the first to the second. I've updated the solution with a class that accomplishes this.
import java.io.*;
import java.util.Scanner;

public class Main {
    public static void main(String[] args) {
        try {
            // construct a process
            ProcessBuilder pb1 = new ProcessBuilder("ls", "-1");
            // set working directory
            pb1.directory(new File("test"));
            // start process
            final Process process1 = pb1.start();

            // get input/error streams
            final InputStream p1InStream = process1.getInputStream();
            final InputStream p1ErrStream = process1.getErrorStream();

            // handle error stream
            Thread t1Err = new InputReaderThread(p1ErrStream, "Process 1 Err");
            t1Err.start();

            // this will print out the data from process 1 (for illustration purposes)
            // and redirect it to process 2
            Process process2 = new ProcessBuilder("grep", "some").start();

            // process 2 streams
            final InputStream p2InStream = process2.getInputStream();
            final InputStream p2ErrStream = process2.getErrorStream();
            final OutputStream p2OutStream = process2.getOutputStream();

            // do the same as process 1 for process 2...
            Thread t2In = new InputReaderThread(p2InStream, "Process 2 Out");
            t2In.start();
            Thread t2Err = new InputReaderThread(p2ErrStream, "Process 2 Err");
            t2Err.start();

            // create a new thread with our pipe class
            // pass in the input stream of p1, the output stream of p2, and the name of the input stream
            new Thread(new PipeClass(p1InStream, p2OutStream, "Process 1 Out")).start();

            // wait for p2 to finish
            process2.waitFor();
        } catch (IOException e) {
            System.out.println("Command execution error: " + e.getMessage());
        } catch (Exception e) {
            System.out.println("General error: " + e.getMessage());
        }
    }
}
This class simulates a pipe between two processes. It uses some loops to copy bytes around and could be more efficient, depending on your needs, but for illustration purposes it should work.
// this class simulates a pipe between two processes
public class PipeClass implements Runnable {
    // the input stream
    InputStream is;
    // the output stream
    OutputStream os;
    // the name associated with the input stream (for printing purposes only...)
    String isName;

    // constructor
    public PipeClass(InputStream is, OutputStream os, String isName) {
        this.is = is;
        this.os = os;
        this.isName = isName;
    }

    @Override
    public void run() {
        try {
            // use a byte array output stream so we can clone the data and use it multiple times
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            // read the data into the output stream (it has to fit in memory for this to work...)
            byte[] buffer = new byte[512]; // Adjust if you want
            int bytesRead;
            while ((bytesRead = is.read(buffer)) != -1) {
                baos.write(buffer, 0, bytesRead);
            }

            // clone it so we can print it out
            InputStream clonedIs1 = new ByteArrayInputStream(baos.toByteArray());
            Scanner sc = new Scanner(clonedIs1);
            // print the info
            while (sc.hasNextLine()) {
                System.out.println(this.isName + " >> " + sc.nextLine());
            }

            // clone again to redirect to the output of the other process
            InputStream clonedIs2 = new ByteArrayInputStream(baos.toByteArray());
            buffer = new byte[512]; // Adjust if you want
            while ((bytesRead = clonedIs2.read(buffer)) != -1) {
                // write it out to the output stream
                os.write(buffer, 0, bytesRead);
            }
        } catch (IOException ex) {
            ex.printStackTrace();
        } finally {
            try {
                // close so the process will finish
                is.close();
                os.close();
            } catch (Exception ex) {
                ex.printStackTrace();
            }
        }
    }
}
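If you can rely on Java 9 or newer, InputStream.transferTo offers a much shorter pipe thread that streams bytes directly instead of buffering everything in memory first. A sketch (it drops the console echo that PipeClass provides, and assumes the process1/process2 handles from the Main example above):

// pipe p1's stdout straight into p2's stdin, then close both ends
new Thread(() -> {
    try (InputStream in = process1.getInputStream();
         OutputStream out = process2.getOutputStream()) {
        in.transferTo(out);
    } catch (IOException ex) {
        ex.printStackTrace();
    }
}).start();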
This is a class that was created for handling process output, adapted from this reference
// Thread reader class adapted from
// http://www.javaworld.com/article/2071275/core-java/when-runtime-exec---won-t.html
public class InputReaderThread extends Thread {
    // input stream
    InputStream is;
    // name
    String name;
    // is there data?
    boolean hasData = false;
    // data itself
    StringBuilder data = new StringBuilder();

    // constructor
    public InputReaderThread(InputStream is, String name) {
        this.is = is;
        this.name = name;
    }

    // set if there's data to read
    public synchronized void setHasData(boolean hasData) {
        this.hasData = hasData;
    }

    // data available?
    public boolean hasData() { return this.hasData; }

    // get the data
    public StringBuilder getData() {
        setHasData(false); // clear flag
        StringBuilder returnData = this.data;
        this.data = new StringBuilder();
        return returnData;
    }

    @Override
    public void run() {
        // input reader
        InputStreamReader isr = new InputStreamReader(this.is);
        Scanner sc = new Scanner(isr);
        // while data remains
        while (sc.hasNextLine()) {
            // print out and append to data
            String line = sc.nextLine();
            System.out.println(this.name + " >> " + line);
            this.data.append(line + "\n");
        }
        // flag there's data available
        setHasData(true);
    }
}
The produced output is:
Process 1 Out >> somefile.txt
Process 1 Out >> someotherfile.txt
Process 1 Out >> this_other_file.csv
Process 2 Out >> somefile.txt
Process 2 Out >> someotherfile.txt
To show that piping is really working, change the command to ps -a | grep usr; the output is:
Process 1 Out >> PID PPID PGID WINPID TTY UID STIME COMMAND
Process 1 Out >> I 15016 1 15016 15016 con 400 13:45:59 /usr/bin/grep
Process 1 Out >> 15156 1 15156 15156 con 400 14:21:54 /usr/bin/ps
Process 1 Out >> I 9784 1 9784 9784 con 400 14:21:54 /usr/bin/grep
Process 2 Out >> I 15016 1 15016 15016 con 400 13:45:59 /usr/bin/grep
Process 2 Out >> 15156 1 15156 15156 con 400 14:21:54 /usr/bin/ps
Process 2 Out >> I 9784 1 9784 9784 con 400 14:21:54 /usr/bin/grep
Seeing the grep command in process 2's output shows that the piping is working; with the old solution I posted, this would be missing.
Note the handling of the error stream, which is always good practice, even if you don't plan to use it.
This is a quick and dirty solution that could benefit from some additional thread management techniques, but it should get you what you want.
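For completeness: if Java 9+ is available, ProcessBuilder.startPipeline wires the pipe at the OS level, so none of the copy threads above are needed. A sketch, not part of the answer's original code:

import java.io.IOException;
import java.util.List;

public class PipelineDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        // The OS connects ps's stdout to grep's stdin; the last process
        // inherits this JVM's stdout so its matches print to the console.
        List<Process> pipeline = ProcessBuilder.startPipeline(List.of(
                new ProcessBuilder("ps", "aux"),
                new ProcessBuilder("grep", "gedit")
                        .redirectOutput(ProcessBuilder.Redirect.INHERIT)));
        pipeline.get(pipeline.size() - 1).waitFor();
    }
}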

Not able to execute sort command using Runtime or ProcessBuilder

I am trying to execute the command sort --field-separator="," --key=2 /home/dummy/Desktop/sample.csv -o /home/dummy/Desktop/sample_temp.csv using Java's Runtime and ProcessBuilder.
Manually I am able to execute this command in Linux, but through Runtime or ProcessBuilder the command does not execute; it returns with an error code = 2.
Edit:
If I try to execute the 'ls' command in Linux through Java, I get the list of files in the current directory. But if I try to execute the command 'ls | grep a', an IOException is thrown with error code 2.
Here is the snippet:
public static void main(String[] args) throws IOException {
    InputStream is = null;
    ByteArrayOutputStream baos = null;
    ProcessBuilder pb = new ProcessBuilder("ls | grep a");
    try {
        Process prs = pb.start();
        is = prs.getInputStream();
        byte[] b = new byte[1024];
        int size = 0;
        baos = new ByteArrayOutputStream();
        while ((size = is.read(b)) != -1) {
            baos.write(b, 0, size);
        }
        System.out.println(new String(baos.toByteArray()));
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            if (is != null) is.close();
            if (baos != null) baos.close();
        } catch (Exception ex) {}
    }
}
There could be a range of issues with your code. Since you did not supply your code, I can only guess:
The output file needs to already be created.
The ',' field separator does not need the quotes around it (see code below).
After fixing these two issues (both of which make the program exit with '2'), this code actually works:
import java.io.IOException;
import java.util.Arrays;

public class Test {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(Arrays.asList(
                "sort", "--field-separator=,", "--key=2",
                "/tmp/sample.csv", "-o", "/tmp/sample_temp.csv"));
        Process p = pb.start();
        int returnCode = p.waitFor();
        System.out.println(returnCode);
    }
}
This will print '0' and sort the file correctly.
For the 'ls | grep' issue, read this great article: http://www.javaworld.com/article/2071275/core-java/when-runtime-exec---won-t.html
The article basically explains that Runtime.exec (and the ProcessBuilder wrapper) is for running processes, not a shell (the ls | grep you are trying is actually two processes in Linux communicating with each other through stdout/stdin).
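If you really want shell semantics, one workaround is to hand the entire pipeline to a shell as a single argument, so the shell does the | wiring itself (a sketch):

import java.io.IOException;

public class ShellPipeDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        // The whole pipeline is a single argument to bash -c, so the shell,
        // not Java, performs the | wiring between ls and grep.
        ProcessBuilder pb = new ProcessBuilder("bash", "-c", "ls | grep a");
        pb.inheritIO(); // print straight to this JVM's console for the demo
        int exit = pb.start().waitFor();
        System.out.println("exit code: " + exit);
    }
}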
I am able to execute that manually. And error code 2 means 'misuse of shell builtins'.
I see in your example you are only invoking "ls", not "/usr/bin/ls" (or something like that).
When you execute manually, you have the luxury of the PATH environment variable, which is not available to the process you create.
Use "which ls" to discover the location of 'ls' on your target system. For your code to be portable you will have to make it a configurable option.
This is a way to execute bash commands like sort, ls, or cat (with sub-options). Please find the snippet:
private String executeCommand(String command) {
    StringBuffer output = new StringBuffer();
    Process p;
    try {
        // script.sh wraps the actual bash command (sort, ls, cat, ...)
        p = Runtime.getRuntime().exec("script.sh");
        BufferedReader reader =
                new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line = "";
        while ((line = reader.readLine()) != null) {
            output.append(line + "\n");
        }
        // wait for the exit status only after the output has been drained;
        // calling waitFor() first can deadlock on large output
        p.waitFor();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return output.toString();
}
In the exec() method I passed a shell script which contains the bash command within it. That Linux command will be executed, and you can carry on with the next task. Hope this was helpful.

java: ProcessBuilder makes a memory hog

I have some issues regarding ProcessBuilder.
The program is basically a simple wrapper invoking a command line script.
When running the script on its own via the terminal, the memory consumption stays below 2G.
When running the script via the Java wrapper, the memory consumption explodes; even 8G is quickly filled up, resulting in out-of-memory errors.
The code to launch the process is simply:
public static int execute(String command) throws IOException {
    System.out.println("Executing: " + command);
    ProcessBuilder pb = new ProcessBuilder(command.split(" +"));
    Process p = pb.start();

    // display any output in stderr or stdout
    StreamConsumer stderr = new StreamConsumer(p.getErrorStream(), "stderr");
    StreamConsumer stdout = new StreamConsumer(p.getInputStream(), "stdout");
    new Thread(stderr).start();
    new Thread(stdout).start();

    try {
        return p.waitFor();
    } catch (InterruptedException e) {
        throw new RuntimeException(e);
    }
}
The StreamConsumer class is simply a class which consumes the stdout/stderr streams and displays them on the console.
...the question is: why on earth does the memory consumption explode?
Regards,
Arnaud
Edit:
Whether I use ProcessBuilder or Runtime.getRuntime().exec(...), the result is the same.
The memory bursts tend to appear during the Unix 'sort' invoked by the shell script:
sort big-text-file > big-text-file.sorted
Ok, here is the StreamConsumer class which I omitted because it is rather simple:
class StreamConsumer implements Runnable {
    InputStream stream;
    String descr;

    StreamConsumer(InputStream stream, String descr) {
        this.stream = stream;
        this.descr = descr;
    }

    @Override
    public void run() {
        String line;
        BufferedReader brCleanUp =
                new BufferedReader(new InputStreamReader(stream));
        try {
            while ((line = brCleanUp.readLine()) != null)
                System.out.println("[" + descr + "] " + line);
            brCleanUp.close();
        } catch (IOException e) {
            // TODO: handle exception
        }
    }
}
If you change your command like this:
sort -o big-text-file.sorted big-text-file
is it still the same?
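A related trick on the Java side, if the wrapper itself launches the script: ProcessBuilder's file redirection (Java 7+) sends the child's output straight to files, so nothing accumulates in the JVM. A sketch, not something the answer above proposed; the script and file names are hypothetical:

import java.io.File;
import java.io.IOException;

public class RedirectDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder("./the-script.sh"); // hypothetical script
        // Send stdout/stderr straight to files: no bytes flow through the JVM.
        pb.redirectOutput(new File("script.out"));
        pb.redirectError(new File("script.err"));
        System.out.println("exit: " + pb.start().waitFor());
    }
}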
Maybe it's because those StreamConsumer threads are not daemons, so they don't die and get garbage collected when your processes return? You could try:
//...
final StreamConsumer stderr = new StreamConsumer(p.getErrorStream(), "stderr");
final StreamConsumer stdout = new StreamConsumer(p.getInputStream(), "stdout");
final Thread stderrThread = new Thread(stderr);
final Thread stdoutThread = new Thread(stdout);
stderrThread.setDaemon(true);
stdoutThread.setDaemon(true);
stderrThread.start();
stdoutThread.start();
//...
Is this behavior happening for a single invocation, or only after doing this many times?

Calling a shell script from java hangs

So I'm trying to execute, from a Java file, a shell script which produces a lot of output (hundreds of MBs).
This hangs the process, and it never completes.
However, if within the shell script I redirect the script's output to some log file or /dev/null, the Java program executes and completes in a jiffy.
Is it because of the amount of data that the Java program never completes?
If so, is there any documentation on this, or any documented limit on the amount of data?
Here's how you can simulate this scenario.
The Java file will look like:
Java file will look like:
import java.io.InputStream;

public class LotOfOutput {
    public static void main(String[] args) {
        String cmd = "sh a-script-which-outputs-huuggee-data.sh";
        try {
            ProcessBuilder pb = new ProcessBuilder("bash", "-c", cmd);
            pb.redirectErrorStream(true);
            Process shell = pb.start();
            InputStream shellIn = shell.getInputStream();
            int shellExitStatus = shell.waitFor();
            System.out.println(shellExitStatus);
            shellIn.close();
        } catch (Exception ignoreMe) {
        }
    }
}
The script 'a-script-which-outputs-huuggee-data.sh' may look like:
#!/bin/sh
# Toggle the line below
exec 3>&1 > /dev/null 2>&1
count=1
while [ $count -le 1000 ]
do
    cat some-big-file
    ((count++))
done
echo
echo Yes I m done
Free beer for the right answer. :)
It's because you're not reading from the process's output.
As per the class's Javadocs, if you don't do this then you may end up with a deadlock: the process fills its I/O buffer and waits for the "shell" (or listening process) to read from it and empty it. Meanwhile your process, which should be doing this, is blocked waiting for the process to exit.
You'll want to call getInputStream() and read from that reliably (perhaps from another thread) to stop the process blocking.
Also take a look at Five Java Process Pitfalls and When Runtime.exec() Won't - both informative articles about common problems with Process.
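On Java 9+ there is also a built-in equivalent of the > /dev/null redirect that already worked for you, in case you never need the output at all (a sketch reusing the question's script name):

import java.io.IOException;

public class DiscardDemo {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "bash", "-c", "sh a-script-which-outputs-huuggee-data.sh");
        // Java-side equivalent of > /dev/null: the pipe never fills up.
        pb.redirectOutput(ProcessBuilder.Redirect.DISCARD);
        pb.redirectError(ProcessBuilder.Redirect.DISCARD);
        System.out.println(pb.start().waitFor());
    }
}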
You're never reading the input stream, so it's probably blocking because the buffer is full.
The input/output buffers have a limited size (depending on the operating system); if I remember correctly it wasn't big, on Windows XP at least. Try creating a thread that reads the InputStream as fast as possible.
Something along these lines:
// Fixed up to compile: implements Runnable (the original referenced an
// undefined Worker interface), declares the missing buffer field, and
// drops a stray closing brace.
class StdInWorker implements Runnable {
    private final BufferedReader br;
    private final List<String> buffer = new ArrayList<String>();
    private volatile boolean run = true;
    private int linesRead = 0;

    StdInWorker(Process prcs) {
        this.br = new BufferedReader(
                new InputStreamReader(prcs.getInputStream()));
    }

    public synchronized void run() {
        String in;
        try {
            while (this.run) {
                while ((in = this.br.readLine()) != null) {
                    this.buffer.add(in);
                    linesRead++;
                }
                Thread.sleep(50);
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        } catch (InterruptedException ie) {
        }
    }
}
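Hypothetical wiring for the worker above (the class and script names are just the question's placeholders): start draining the output before blocking in waitFor, so the pipe cannot fill up.

import java.io.IOException;

public class Demo {
    public static void main(String[] args) throws IOException, InterruptedException {
        Process shell = new ProcessBuilder(
                "bash", "-c", "sh a-script-which-outputs-huuggee-data.sh").start();
        // drain stdout in the background, then wait for the exit status
        new Thread(new StdInWorker(shell)).start();
        System.out.println(shell.waitFor());
    }
}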
