Starting a process with inherited stdin/stdout/stderr in Java 6 - java

If I start a process via Java's ProcessBuilder class, I have full access to that process's standard in, standard out, and standard error streams as Java InputStreams and OutputStreams. However, I can't find a way to seamlessly connect those streams to System.in, System.out, and System.err.
It's possible to use redirectErrorStream() to get a single InputStream that contains the subprocess's standard out and standard error, and just loop through that and send it through my standard out—but I can't find a way to do that and let the user type into the process, as he or she could if I used the C system() call.
This appears to be possible in Java SE 7 when it comes out—I'm just wondering if there's a workaround now. Bonus points if the result of isatty() in the child process carries through the redirection.

You will need to copy the Process out, err, and input streams to the System versions. The easiest way to do that is using the IOUtils class from the Commons IO package. The copy method looks to be what you need. The copy method invocations will need to be in separate threads.
Here is the basic code:
// Assume you already have a processBuilder all configured and ready to go
final Process process = processBuilder.start();
new Thread(new Runnable() {
    public void run() {
        IOUtils.copy(process.getInputStream(), System.out);
    }
}).start();
new Thread(new Runnable() {
    public void run() {
        IOUtils.copy(process.getErrorStream(), System.err);
    }
}).start();
new Thread(new Runnable() {
    public void run() {
        IOUtils.copy(System.in, process.getOutputStream());
    }
}).start();

A variation on John's answer that doesn't require you to use Commons IO:
private static void pipeOutput(Process process) {
    pipe(process.getErrorStream(), System.err);
    pipe(process.getInputStream(), System.out);
}

private static void pipe(final InputStream src, final PrintStream dest) {
    new Thread(new Runnable() {
        public void run() {
            try {
                byte[] buffer = new byte[1024];
                for (int n = 0; n != -1; n = src.read(buffer)) {
                    dest.write(buffer, 0, n);
                }
            } catch (IOException e) { // just exit
            }
        }
    }).start();
}

For System.in, use the following pipein() instead of pipe():
pipein(System.in, p.getOutputStream());
Implementation:
private static void pipein(final InputStream src, final OutputStream dest) {
    new Thread(new Runnable() {
        public void run() {
            try {
                int ret = -1;
                while ((ret = src.read()) != -1) {
                    dest.write(ret);
                    dest.flush();
                }
            } catch (IOException e) { // just exit
            }
        }
    }).start();
}
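For completeness, the two helpers combine roughly like this. This is only a sketch: it assumes a configured ProcessBuilder named processBuilder and that the enclosing method declares IOException and InterruptedException.
// Sketch: wire the helpers above to a freshly started process
Process p = processBuilder.start();
pipeOutput(p);                           // child's stdout/stderr -> System.out/System.err
pipein(System.in, p.getOutputStream());  // our stdin -> child's stdin
int exitCode = p.waitFor();              // block until the child exits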

Related

I am trying to make an interface from which we can execute .exe files on a button click to install some software.

This is my main class. In run(), I am calling another method, installSetup(), which runs the .exe files.
public static void main(String[] args) {
    launch(args);
}

public void startSetup() {
    Runnable task = new Runnable() {
        @Override
        public void run() {
            try {
                Thread.sleep(1000);
                installSetup();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    };
    Thread thread = new Thread(task);
    thread.start();
}
Here is my installSetup() method:
public void installSetup() {
    try {
        Runtime.getRuntime().exec("cmd /c C:path\\setup.exe", null, new File("C:pathfolder\\01_Setupexe"));
        //process.waitFor();
    } catch (IOException e) {
        e.printStackTrace();
    }
};
I am calling it in my controller class like this:
public class Controller extends Thread {
    @FXML
    private ComboBox<?> dsetup;

    public void generateRandom() {
        if (dsetup.getValue() != null) dsetupValue = dsetup.getValue().toString();
        if (dsetupValue != null) call.startSetup();
Before, I was calling the install files with the exec method without any threads; the application worked, but it executed all the .exe files at once and then my interface froze. So now I am using threads and trying to run one at a time. I don't know whether this approach is wrong, but I do not get any error in the console.
Runtime.exec has been obsolete for many years. Use ProcessBuilder instead:
ProcessBuilder builder = new ProcessBuilder("C:\\path\\setup.exe");
builder.directory(new File("C:pathfolder\\01_Setupexe"));
builder.inheritIO();
builder.start();
The inheritIO() method will make the spawned process use the Java program’s stdin, stdout, and stderr, so it will not hang waiting for input or waiting for an available output buffer.
I doubt you need the new Thread or the sleep call, but I don’t know what files you’re calling or whether they depend on each other.
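If they do have to run one after another, a minimal sketch along these lines would wait for each installer to finish before starting the next; the path below is just the question's placeholder:
// Sketch: run each installer to completion before starting the next one
String[] installers = { "C:\\path\\setup.exe" };
for (String exe : installers) {
    try {
        ProcessBuilder builder = new ProcessBuilder(exe);
        builder.inheritIO();                 // share this JVM's stdin/stdout/stderr
        Process process = builder.start();
        int exitCode = process.waitFor();    // block until this installer exits
        System.out.println(exe + " exited with code " + exitCode);
    } catch (IOException | InterruptedException e) {
        e.printStackTrace();
    }
}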
Sadly, exec has some pitfalls. Most of the time, handling the process streams as in the approach below (Listing 4.3 from the linked article) has saved me from buffer-related issues and the like.
https://www.javaworld.com/article/2071275/core-java/when-runtime-exec---won-t.html
import java.util.*;
import java.io.*;

public class MediocreExecJavac
{
    public static void main(String args[])
    {
        try
        {
            Runtime rt = Runtime.getRuntime();
            Process proc = rt.exec("javac");
            InputStream stderr = proc.getErrorStream();
            InputStreamReader isr = new InputStreamReader(stderr);
            BufferedReader br = new BufferedReader(isr);
            String line = null;
            System.out.println("<ERROR>");
            while ((line = br.readLine()) != null)
                System.out.println(line);
            System.out.println("</ERROR>");
            int exitVal = proc.waitFor();
            System.out.println("Process exitValue: " + exitVal);
        } catch (Throwable t)
        {
            t.printStackTrace();
        }
    }
}
Source: javaworld

Java Thread doesn't live as long as the main program does

What I'm trying to accomplish here is to launch, via Apache Commons Exec, an instance of a second .jar from the main Java program.
What this second .jar does is basically send bytes to the stdout. This is the code that launches this program.
private void runJar(PipedOutputStream output) throws IOException {
DefaultExecutor executor = new DefaultExecutor();
CommandLine commandLine;
String executeMe = "java -jar myjar.jar";
commandLine = CommandLine.parse(executeMe);
executor.setStreamHandler(new PumpStreamHandler(output, null));
executor.execute(commandLine, new DefaultExecuteResultHandler());
}
But so far I couldn't find a way to execute this with the library without blocking the normal flow of the program, so as a workaround I created a Thread, like this...
Thread t3 = new Thread() {
public void run() {
try {
runJar(output);
} catch (IOException e) {
e.printStackTrace();
}
}
};
t3.start();
I've tried launching the commands via the cmd and everything works properly, but in this case when I run everything inside Java the Thread seems to stop after several seconds.
Am I missing something when I create a new Thread so it can live as long as the main program does?
UPDATE:
A runnable example of the second .jar can be this code...
public class SecondApp {
    public static void main(String[] args) throws ClassNotFoundException, IOException, InterruptedException {
        File in = new File(args[0]);
        try (InputStream input = new FileInputStream(in)) {
            int bytesRead, CHUNK_SIZE = 4096;
            byte[] data = new byte[CHUNK_SIZE];
            while (true) {
                bytesRead = input.read(data, 0, CHUNK_SIZE);
                if (bytesRead > 0) {
                    System.out.write(data, 0, bytesRead);
                    System.out.flush();
                } else if (bytesRead == -1) {
                    System.exit(0);
                }
            }
        }
    }
}
It basically spits out bytes to the stdout. The third app is just an external one (ffmpeg), which receives those bytes from this second .jar via a PipedInputStream, like this...
PipedOutputStream output = new PipedOutputStream();
PipedInputStream input = new PipedInputStream();
output.connect(input);
In order to debug and check the Thread status successfully, I've changed the asynchronous execution to a synchronous one, and now I see that both threads are in the RUNNABLE state and are ALIVE.
@Override
public void run() {
try {
DefaultExecutor executor = new DefaultExecutor();
CommandLine commandLine;
String executeMe = "java -jar myjar.jar";
commandLine = CommandLine.parse(executeMe);
executor.setStreamHandler(new PumpStreamHandler(output, null));
executor.execute(commandLine); //, new DefaultExecuteResultHandler());
} catch (ExecuteException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
As @matt stated, it may be a problem of the output from the FFMPEG thread not being consumed.
executor.setStreamHandler(
new PumpStreamHandler(
null, // stdout
null, // stderr
input)); // stdin
That was the previous line from the FFMPEG executor, now changed to use a NullOutputStream (the one from Apache Commons Exec).
executor.setStreamHandler(
new PumpStreamHandler(
new NullOutputStream(), // stdout
null, // stder
input)); // stdin
After those changes, the stream still seems to stop after a few seconds, so it may be for the reason @matt said: the stream isn't being consumed and gets full. Why isn't it working with a NullOutputStream()?

How to make a JTextComponent for input and output of a single process run through ProcessBuilder, like the NetBeans console [duplicate]

I am trying to create a sort of console/terminal that allows the user to input a string, which then gets made into a process and the results are printed out. Just like a normal console. But I am having trouble managing the input/output streams. I have looked into this thread, but that solution sadly doesn't apply to my problem.
Along with the standard commands like "ipconfig" and "cmd.exe", I need to be able to run a script and use the same inputstream to pass some arguments, if the script is asking for input.
For example, after running a script with "python pyScript.py", I should be able to pass further input to the script if it is asking for it (example: raw_input), while also printing the output from the script. The basic behavior you would expect from a terminal.
What I've got so far:
import java.awt.BorderLayout;
import java.awt.Color;
import java.awt.Dimension;
import java.awt.event.KeyEvent;
import java.awt.event.KeyListener;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.JScrollPane;
import javax.swing.JTextPane;
import javax.swing.text.BadLocationException;
import javax.swing.text.Document;
public class Console extends JFrame{
JTextPane inPane, outPane;
InputStream inStream, inErrStream;
OutputStream outStream;
public Console(){
super("Console");
setPreferredSize(new Dimension(500, 600));
setLocationByPlatform(true);
setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
// GUI
outPane = new JTextPane();
outPane.setEditable(false);
outPane.setBackground(new Color(20, 20, 20));
outPane.setForeground(Color.white);
inPane = new JTextPane();
inPane.setBackground(new Color(40, 40, 40));
inPane.setForeground(Color.white);
inPane.setCaretColor(Color.white);
JPanel panel = new JPanel(new BorderLayout());
panel.add(outPane, BorderLayout.CENTER);
panel.add(inPane, BorderLayout.SOUTH);
JScrollPane scrollPanel = new JScrollPane(panel);
getContentPane().add(scrollPanel);
// LISTENER
inPane.addKeyListener(new KeyListener(){
@Override
public void keyPressed(KeyEvent e){
if(e.getKeyCode() == KeyEvent.VK_ENTER){
e.consume();
read(inPane.getText());
}
}
@Override
public void keyTyped(KeyEvent e) {}
@Override
public void keyReleased(KeyEvent e) {}
});
pack();
setVisible(true);
}
private void read(String command){
println(command);
// Write to Process
if (outStream != null) {
System.out.println("Outstream again");
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(outStream));
try {
writer.write(command);
//writer.flush();
//writer.close();
} catch (IOException e1) {
e1.printStackTrace();
}
}
// Execute Command
try {
exec(command);
} catch (IOException e) {}
inPane.setText("");
}
private void exec(String command) throws IOException{
Process pro = Runtime.getRuntime().exec(command, null);
inStream = pro.getInputStream();
inErrStream = pro.getErrorStream();
outStream = pro.getOutputStream();
Thread t1 = new Thread(new Runnable() {
public void run() {
try {
String line = null;
while(true){
BufferedReader in = new BufferedReader(new InputStreamReader(inStream));
while ((line = in.readLine()) != null) {
println(line);
}
BufferedReader inErr = new BufferedReader(new InputStreamReader(inErrStream));
while ((line = inErr.readLine()) != null) {
println(line);
}
Thread.sleep(1000);
}
} catch (Exception e) {
e.printStackTrace();
}
}
});
t1.start();
}
public void println(String line) {
Document doc = outPane.getDocument();
try {
doc.insertString(doc.getLength(), line + "\n", null);
} catch (BadLocationException e) {}
}
public static void main(String[] args){
new Console();
}
}
I don't use the mentioned ProcessBuilder, since I do like to differentiate between error and normal stream.
UPDATE 29.08.2016
With the help of @ArcticLord we have achieved what was asked in the original question.
Now it is just a matter of ironing out any strange behavior, like the non-terminating process. The Console has a "stop" button that simply calls pro.destroy(), but for some reason this does not work for infinitely running processes that spam output.
Console: http://pastebin.com/vyxfPEXC
InputStreamLineBuffer: http://pastebin.com/TzFamwZ1
Example code that does not stop:
public class Infinity{
public static void main(String[] args){
while(true){
System.out.println(".");
}
}
}
Example code that does stop:
import java.util.concurrent.TimeUnit;
public class InfinitySlow{
public static void main(String[] args){
while(true){
try {
TimeUnit.SECONDS.sleep(1);
} catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println(".");
}
}
}
You are on the right track with your code; there are only a few minor things you missed.
Let's start with your read method:
private void read(String command){
[...]
// Write to Process
if (outStream != null) {
[...]
try {
writer.write(command + "\n"); // add newline so your input will get proceed
writer.flush(); // flush your input to your process
} catch (IOException e1) {
e1.printStackTrace();
}
}
// ELSE!! - if no outputstream is available
// Execute Command
else {
try {
exec(command);
} catch (IOException e) {
// Handle the exception here. Mostly this means
// that the command could not get executed
// because command was not found.
println("Command not found: " + command);
}
}
inPane.setText("");
}
Now let's fix your exec method. You should use separate threads for reading the normal process output and the error output. Additionally, I introduce a third thread that waits for the process to end and then closes the output stream, so the next user input is not sent to the process but is treated as a new command.
private void exec(String command) throws IOException{
Process pro = Runtime.getRuntime().exec(command, null);
inStream = pro.getInputStream();
inErrStream = pro.getErrorStream();
outStream = pro.getOutputStream();
// Thread that reads process output
Thread outStreamReader = new Thread(new Runnable() {
public void run() {
try {
String line = null;
BufferedReader in = new BufferedReader(new InputStreamReader(inStream));
while ((line = in.readLine()) != null) {
println(line);
}
} catch (Exception e) {
e.printStackTrace();
}
System.out.println("Exit reading process output");
}
});
outStreamReader.start();
// Thread that reads process error output
Thread errStreamReader = new Thread(new Runnable() {
public void run() {
try {
String line = null;
BufferedReader inErr = new BufferedReader(new InputStreamReader(inErrStream));
while ((line = inErr.readLine()) != null) {
println(line);
}
} catch (Exception e) {
e.printStackTrace();
}
System.out.println("Exit reading error stream");
}
});
errStreamReader.start();
// Thread that waits for process to end
Thread exitWaiter = new Thread(new Runnable() {
public void run() {
try {
int retValue = pro.waitFor();
println("Command exit with return value " + retValue);
// close outStream
outStream.close();
outStream = null;
} catch (InterruptedException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
});
exitWaiter.start();
}
Now this should work.
If you enter ipconfig it prints the command output, closes the output stream and is ready for a new command.
If you enter cmd it prints the output and lets you enter more cmd commands like dir or cd, and so on until you enter exit. Then it closes the output stream and is ready for a new command.
You may run into problems executing Python scripts, because Java has trouble reading a Process's InputStream if the output is not flushed into the system pipeline.
See this example python script
print "Input something!"
str = raw_input()
print "Received input is : ", str
You could run this with your Java program and also enter the input, but you will not see the script's output until the script is finished.
The only fix I could find is to manually flush the output in the script.
import sys
print "Input something!"
sys.stdout.flush()
str = raw_input()
print "Received input is : ", str
sys.stdout.flush()
Running this script will behave as you expect.
You can read more about this problem at
Java: is there a way to run a system command and print the output during execution?
Why does reading from Process' InputStream block altough data is available
Java: can't get stdout data from Process unless its manually flushed
EDIT: I have just found another very easy solution for the stdout.flush() problem with Python Scripts. Start them with python -u script.py and you don't need to flush manually. This should solve your problem.
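Applied to the question's pyScript.py, the exec call would then be roughly this (a sketch; the script name is the question's placeholder):
// Sketch: -u makes Python run unbuffered, so no manual sys.stdout.flush() is needed
Process pro = Runtime.getRuntime().exec("python -u pyScript.py", null);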
EDIT 2: We discussed in the comments that with this solution the output and error streams will get mixed up, since they are read in different threads. The problem here is that we cannot tell whether output writing is finished when the error stream thread comes up; otherwise classic thread scheduling with locks could handle this situation. But each stream stays open until the process finishes, whether or not data is flowing, so we need a mechanism that tracks how much time has elapsed since the last line was read from each stream.
For this I will introduce a class that takes an InputStream and starts a Thread for reading the incoming data. This Thread stores each line in a Queue and stops when the end of the stream arrives. Additionally it holds the time when the last line was read and added to the Queue.
public class InputStreamLineBuffer{
private InputStream inputStream;
private ConcurrentLinkedQueue<String> lines;
private long lastTimeModified;
private Thread inputCatcher;
private boolean isAlive;
public InputStreamLineBuffer(InputStream is){
inputStream = is;
lines = new ConcurrentLinkedQueue<String>();
lastTimeModified = System.currentTimeMillis();
isAlive = false;
inputCatcher = new Thread(new Runnable(){
@Override
public void run() {
StringBuilder sb = new StringBuilder(100);
int b;
try{
while ((b = inputStream.read()) != -1){
// read one char
if((char)b == '\n'){
// new Line -> add to queue
lines.offer(sb.toString());
sb.setLength(0); // reset StringBuilder
lastTimeModified = System.currentTimeMillis();
}
else sb.append((char)b); // append char to stringbuilder
}
} catch (IOException e){
e.printStackTrace();
} finally {
isAlive = false;
}
}});
}
// is the input reader thread alive
public boolean isAlive(){
return isAlive;
}
// start the input reader thread
public void start(){
isAlive = true;
inputCatcher.start();
}
// has Queue some lines
public boolean hasNext(){
return lines.size() > 0;
}
// get next line from Queue
public String getNext(){
return lines.poll();
}
// how much time has elapsed since last line was read
public long timeElapsed(){
return (System.currentTimeMillis() - lastTimeModified);
}
}
With this class we can combine the output and error reading threads into one, which lives while the input reader buffer threads are alive or still have unconsumed data. On each pass it checks whether some time has passed since the last output was read and, if so, prints all unprinted lines in one go; the same for the error output. Then it sleeps for a few millis so as not to waste CPU time.
private void exec(String command) throws IOException{
Process pro = Runtime.getRuntime().exec(command, null);
inStream = pro.getInputStream();
inErrStream = pro.getErrorStream();
outStream = pro.getOutputStream();
InputStreamLineBuffer outBuff = new InputStreamLineBuffer(inStream);
InputStreamLineBuffer errBuff = new InputStreamLineBuffer(inErrStream);
Thread streamReader = new Thread(new Runnable() {
public void run() {
// start the input reader buffer threads
outBuff.start();
errBuff.start();
// while an input reader buffer thread is alive
// or there are unconsumed data left
while(outBuff.isAlive() || outBuff.hasNext() ||
errBuff.isAlive() || errBuff.hasNext()){
// get the normal output if at least 50 millis have passed
if(outBuff.timeElapsed() > 50)
while(outBuff.hasNext())
println(outBuff.getNext());
// get the error output if at least 50 millis have passed
if(errBuff.timeElapsed() > 50)
while(errBuff.hasNext())
println(errBuff.getNext());
// sleep a bit before the next run
try {
Thread.sleep(100);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
System.out.println("Finish reading error and output stream");
}
});
streamReader.start();
// remove outStreamReader and errStreamReader Thread
[...]
}
Maybe this is not a perfect solution but it should handle the situation here.
EDIT (31.8.2016)
We discussed in the comments that there is still a problem with the code when implementing a stop button that kills the started process using Process#destroy(). A process that produces a great deal of output, e.g. in an infinite loop, will be destroyed immediately by calling destroy(). But since it has already produced a lot of output that still has to be consumed by our streamReader, we can't get back to normal program behaviour.
So we need some small changes here:
We will introduce a destroy() method to the InputStreamLineBuffer that stops the output reading and clears the queue.
The changes will look like this:
public class InputStreamLineBuffer{
private boolean emergencyBrake = false;
[...]
public InputStreamLineBuffer(InputStream is){
[...]
while ((b = inputStream.read()) != -1 && !emergencyBrake){
[...]
}
}
[...]
// exits immediately and clears line buffer
public void destroy(){
emergencyBrake = true;
lines.clear();
}
}
And some small changes in the main program:
public class ExeConsole extends JFrame{
[...]
// The line buffers must be declared outside the method
InputStreamLineBuffer outBuff, errBuff;
public ExeConsole() {
[...]
btnStop.addActionListener(new ActionListener() {
public void actionPerformed(ActionEvent e) {
if(pro != null){
pro.destroy();
outBuff.destroy();
errBuff.destroy();
}
}});
}
[...]
private void exec(String command) throws IOException{
[...]
//InputStreamLineBuffer outBuff = new InputStreamLineBuffer(inStream);
//InputStreamLineBuffer errBuff = new InputStreamLineBuffer(inErrStream);
outBuff = new InputStreamLineBuffer(inStream);
errBuff = new InputStreamLineBuffer(inErrStream);
[...]
}
}
Now it should be able to destroy even some output spamming processes.
Note: I found out that Process#destroy() is not able to destroy child processes. So if you start cmd on Windows
and start a Java program from there, you will end up destroying the cmd process while the Java program is still running;
you will see it in the Task Manager. This problem cannot be solved with Java itself; it needs
some OS-dependent external tools to get the PIDs of these processes and kill them manually.
Although @ArcticLord's solution is nice and neat, recently I faced the same kind of problem and came up with a solution that's conceptually equivalent, but slightly different in its implementation.
The concept is the same, namely "bulk reads": when a reader thread acquires its turn, it consumes all of the stream it handles, and passes the hand on only when it is done.
This guarantees the out/err print order.
But instead of using a timer-based turn assignment, I use a lock-based non-blocking read simulation:
// main method for testability: replace with private void exec(String command)
public static void main(String[] args) throws Exception
{
// create a lock that will be shared between reader threads
// the lock is fair to minimize starvation possibilities
ReentrantLock lock = new ReentrantLock(true);
// exec the command: I use nslookup for testing on windows
// because it is interactive and prints to stderr too
Process p = Runtime.getRuntime().exec("nslookup");
// create a thread to handle output from process (uses a test consumer)
Thread outThread = createThread(p.getInputStream(), lock, System.out::print);
outThread.setName("outThread");
outThread.start();
// create a thread to handle error from process (test consumer, again)
Thread errThread = createThread(p.getErrorStream(), lock, System.err::print);
errThread.setName("errThread");
errThread.start();
// create a thread to handle input to process (read from stdin for testing purpose)
PrintWriter writer = new PrintWriter(p.getOutputStream());
Thread inThread = createThread(System.in, null, str ->
{
writer.print(str);
writer.flush();
});
inThread.setName("inThread");
inThread.start();
// create a thread to handle termination gracefully. Not really needed in this simple
// scenario, but on a real application we don't want to block the UI until process dies
Thread endThread = new Thread(() ->
{
try
{
// wait until process is done
p.waitFor();
logger.debug("process exit");
// signal threads to exit
outThread.interrupt();
errThread.interrupt();
inThread.interrupt();
// close process streams
p.getOutputStream().close();
p.getInputStream().close();
p.getErrorStream().close();
// wait for threads to exit
outThread.join();
errThread.join();
inThread.join();
logger.debug("exit");
}
catch(Exception e)
{
throw new RuntimeException(e.getMessage(), e);
}
});
endThread.setName("endThread");
endThread.start();
// wait for full termination (process and related threads by cascade joins)
endThread.join();
logger.debug("END");
}
// convenience method to create a specific reader thread with exclusion by lock behavior
private static Thread createThread(InputStream input, ReentrantLock lock, Consumer<String> consumer)
{
return new Thread(() ->
{
// wrap input to be buffered (enables ready()) and to read chars
// using explicit encoding may be relevant in some case
BufferedReader reader = new BufferedReader(new InputStreamReader(input));
// create a char buffer for reading
char[] buffer = new char[8192];
try
{
// repeat until EOF or interruption
while(true)
{
try
{
// wait for your turn to bulk read
if(lock != null && !lock.isHeldByCurrentThread())
{
lock.lockInterruptibly();
}
// when there's nothing to read, pass the hand (bulk read ended)
if(!reader.ready())
{
if(lock != null)
{
lock.unlock();
}
// this enables a soft busy-waiting loop, that simultates non-blocking reads
Thread.sleep(100);
continue;
}
// perform the read, as we are sure it will not block (input is "ready")
int len = reader.read(buffer);
if(len == -1)
{
return;
}
// transform to string an let consumer consume it
String str = new String(buffer, 0, len);
consumer.accept(str);
}
catch(InterruptedException e)
{
// catch interruptions either when sleeping and waiting for lock
// and restore interrupted flag (not necessary in this case, however it's a best practice)
Thread.currentThread().interrupt();
return;
}
catch(IOException e)
{
throw new RuntimeException(e.getMessage(), e);
}
}
}
finally
{
// protect the lock against unhandled exceptions
if(lock != null && lock.isHeldByCurrentThread())
{
lock.unlock();
}
logger.debug("exit");
}
});
}
Note that both solutions, @ArcticLord's and mine, are not totally starvation-safe; the chance of starvation (really small) is inversely proportional to the consumers' speed.
Happy 2016! ;)

DataOutputStream different approach

I am using this code in an application for sending some strings through a socket.
public class OutgoingData {
public static DataOutputStream dos = null;
public static String toSend = "";
public static volatile boolean continuousSending = true;
public static String toSendTemp = "";
public static void startSending(final DataOutputStream d) {
new Thread(new Runnable() {
public void run() {
try {
dos = d;
while (continuousSending) {
if (!toSend.equals(toSendTemp)) {
dos.writeUTF(toSend);
dos.flush();
toSendTemp = toSend;
}
}
} catch (IOException e) {
e.printStackTrace();
}
}
}).start();
}
}
And from another thread I am calling this method
private void send(String str) {
OutgoingData.toSend = str;
}
Are there any problems that could appear using this implementation, apart from the case when send() is called simultaneously from two threads?
I am not using something like this:
private void send(final String str){
new Thread(new Runnable() {
@Override
public void run() {
synchronized (OutgoingData.dos) {
try {
OutgoingData.dos.writeUTF(str);
OutgoingData.dos.flush();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}).start();
}
Because the system on which this code runs has a limit on the number of threads a process can create, and it takes a long time to get a lock on an object.
Your implementation is not thread safe:
if (!toSend.equals(toSendTemp)) {
// toSend can be changed before this line happens
// causing you to miss data
dos.writeUTF(toSend);
dos.flush();
// or here
toSendTemp = toSend;
}
You need some form of thread synchronization, regardless of whether or not it is "slow".
A better choice than busy-waiting on a field is to use a BlockingQueue<String>. This will ensure you never miss a value, nor do you consume CPU when there is nothing to do.
A good way of wrapping up a Queue and a Thread (pool) is to use an ExecutorService which does both.
In your case, a Socket stream is a queue already, so queuing writes into another queue is likely to be redundant; all you really need is to buffer your output stream.
Because the system on which this code runs has a limit on the number of threads a process can create, and it takes a long time to get a lock on an object.
Creating a thread is more than 100x more expensive than acquiring a lock. Ideally you don't want to do either on every send. Note: the Socket already has a write lock.
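Going back to the BlockingQueue suggestion above, a minimal sketch of it could look like this: one long-lived writer thread drains the queue, so callers never busy-wait and no value is skipped. The class and method names here are made up for illustration.
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class QueuedSender {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<String>();

    public QueuedSender(final DataOutputStream dos) {
        Thread writer = new Thread(new Runnable() {
            public void run() {
                try {
                    while (true) {
                        String next = queue.take();  // blocks until a value is available
                        dos.writeUTF(next);
                        dos.flush();
                    }
                } catch (InterruptedException e) {
                    // interrupted: stop the writer thread
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        });
        writer.setDaemon(true);
        writer.start();
    }

    public void send(String str) throws InterruptedException {
        queue.put(str);  // unlike the busy-wait on a field, no value is ever lost
    }
}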

Input and Output Stream Pipe in Java

Does anyone have any good suggestions for creating a Pipe object in Java which is both an InputStream and an OutputStream, since Java does not have multiple inheritance and both of the streams are abstract classes instead of interfaces?
The underlying need is to have a single object that can be passed to things which need either an InputStream or an OutputStream to pipe output from one thread to input for another.
It seems the point of this question is being missed. If I understand you correctly, you want an object that functions like an InputStream in one thread, and an OutputStream in another to create a means of communicating between the two threads.
Perhaps one answer is to use composition instead of inheritance (which is recommended practice anyway). Create a Pipe which contains a PipedInputStream and a PipedOutputStream connected to each other, with getInputStream() and getOutputStream() methods.
You can't directly pass the Pipe object to something needing a stream, but you can pass the return value of its getter methods to do it.
Does that work for you?
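A minimal sketch of such a composed Pipe might look like this (the class itself is illustrative, not an existing API):
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

// Sketch: composition instead of inheritance - a Pipe that owns a connected
// PipedInputStream/PipedOutputStream pair and hands out each end via a getter.
public class Pipe {
    private final PipedOutputStream out = new PipedOutputStream();
    private final PipedInputStream in;

    public Pipe() throws IOException {
        in = new PipedInputStream(out);  // connect the two ends
    }

    public PipedInputStream getInputStream() {   // read end, used by the consumer thread
        return in;
    }

    public PipedOutputStream getOutputStream() { // write end, used by the producer thread
        return out;
    }
}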
java.io.PipedOutputStream and java.io.PipedInputStream look to be the classes to use for this scenario. They are designed to be used together to pipe data between threads.
If you really want some single object to pass around it would need to contain one of each of these and expose them via getters.
This is a pretty common thing to do, I think. See this question: Easy way to write contents of a Java InputStream to an OutputStream.
You can't create a class which derives both from InputStream and OutputStream because these aren't interfaces and they have common methods and Java doesn't allow multiple inheritance (the compiler doesn't know whether to call InputStream.close() or OutputStream.close() if you call close() on your new object).
The other problem is the buffer. Java allocates a fixed-size buffer for the data (it doesn't grow). This means that when you use the java.io.PipedXxxStream classes, writing data to them will eventually block unless you use two different threads.
So the answer from Apocalisp is correct: You must write a copy loop.
I suggest that you include Apache's commons-io in your project; it contains many helper routines for just such tasks (copying data between streams, files, strings and all combinations thereof).
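For instance, such a copy loop run in its own thread with Commons IO could look like this (a sketch; the stream names are placeholders):
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.apache.commons.io.IOUtils;

// Sketch: copies everything from 'source' to 'sink' on a separate thread.
public class CopyLoop implements Runnable {
    private final InputStream source;
    private final OutputStream sink;

    public CopyLoop(InputStream source, OutputStream sink) {
        this.source = source;
        this.sink = sink;
    }

    public void run() {
        try {
            IOUtils.copy(source, sink);  // blocks until 'source' reaches end of stream
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            IOUtils.closeQuietly(sink);  // lets the reader on the other end see end of stream
        }
    }
}
// usage: new Thread(new CopyLoop(someInputStream, someOutputStream)).start();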
See http://ostermiller.org/utils/CircularBuffer.html
I had to implement a filter for slow connections to Servlets, so basically I wrapped the servlet output stream in a QueueOutputStream that adds every byte (in small buffers) to a queue and then writes those small buffers to a second output stream. In a way this acts as a combined input/output stream. IMHO this is better than JDK pipes, which don't scale that well: there is too much context switching in the standard JDK implementation (per read/write), whereas a blocking queue is just about perfect for a single producer/consumer scenario:
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.*;
public class QueueOutputStream extends OutputStream
{
private static final int DEFAULT_BUFFER_SIZE=1024;
private static final byte[] END_SIGNAL=new byte[]{};
private final BlockingQueue<byte[]> queue=new LinkedBlockingDeque<>();
private final byte[] buffer;
private boolean closed=false;
private int count=0;
public QueueOutputStream()
{
this(DEFAULT_BUFFER_SIZE);
}
public QueueOutputStream(final int bufferSize)
{
if(bufferSize<=0){
throw new IllegalArgumentException("Buffer size <= 0");
}
this.buffer=new byte[bufferSize];
}
private synchronized void flushBuffer()
{
if(count>0){
final byte[] copy=new byte[count];
System.arraycopy(buffer,0,copy,0,count);
queue.offer(copy);
count=0;
}
}
@Override
public synchronized void write(final int b) throws IOException
{
if(closed){
throw new IllegalStateException("Stream is closed");
}
if(count>=buffer.length){
flushBuffer();
}
buffer[count++]=(byte)b;
}
@Override
public synchronized void write(final byte[] b, final int off, final int len) throws IOException
{
super.write(b,off,len);
}
@Override
public synchronized void close() throws IOException
{
flushBuffer();
queue.offer(END_SIGNAL);
closed=true;
}
public Future<Void> asyncSendToOutputStream(final ExecutorService executor, final OutputStream outputStream)
{
return executor.submit(
new Callable<Void>()
{
@Override
public Void call() throws Exception
{
try{
byte[] buffer=queue.take();
while(buffer!=END_SIGNAL){
outputStream.write(buffer);
buffer=queue.take();
}
outputStream.flush();
} catch(Exception e){
close();
throw e;
} finally{
outputStream.close();
}
return null;
}
}
);
}
}
Better to use Pipe or ArrayBlockingQueue. I recommend not using PipedInputStream/PipedOutputStream: they are considered bad practice, and as you can see in the link below there has even been a request to deprecate them because they cause many issues.
https://bugs.openjdk.java.net/browse/JDK-8223048
Here is a simple example of each, first with Pipe and then with a BlockingQueue.
Pipe:
Pipe pipe = Pipe.open();
Pipe.SinkChannel sinkChannel = pipe.sink();
String newData = "New String to write to file..." + System.currentTimeMillis();
ByteBuffer buf = ByteBuffer.allocate(48);
buf.clear();
buf.put(newData.getBytes());
buf.flip();
while(buf.hasRemaining()) {
sinkChannel.write(buf);
}
Pipe.SourceChannel sourceChannel = pipe.source();
ByteBuffer readBuf = ByteBuffer.allocate(48);
int bytesRead = sourceChannel.read(readBuf);
Reference: http://tutorials.jenkov.com/java-nio/pipe.html
BlockingQueue:
//Shared class used by threads
public class Buffer {
// ArrayBlockingQueue
private BlockingQueue<Integer> blockingQueue = new ArrayBlockingQueue<Integer>(1);
public void get() {
// retrieve from ArrayBlockingQueue
try {
System.out.println("Consumer received - " + blockingQueue.take());
} catch (InterruptedException e) {
e.printStackTrace();
}
}
public void put(int data) {
try {
// putting in ArrayBlockingQueue
blockingQueue.put(data);
System.out.println("Producer produced - " + data);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
public static void main(String[] args) {
// Starting two threads
ExecutorService executorService = null;
try {
Buffer buffer = new Buffer();
executorService = Executors.newFixedThreadPool(2);
executorService.execute(new Producer(buffer));
executorService.execute(new Consumer(buffer));
} catch (Exception e) {
e.printStackTrace();
}finally {
if(executorService != null) {
executorService.shutdown();
}
}
}
public class Consumer implements Runnable {
private Buffer buffer;
public Consumer(Buffer buffer) {
this.buffer = buffer;
}
@Override
public void run() {
while (true) {
try {
buffer.get();
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
}
public class Producer implements Runnable {
private Buffer buffer;
public Producer(Buffer buffer) {
this.buffer = buffer;
}
@Override
public void run() {
while (true) {
Random random = new Random();
int data = random.nextInt(1000);
buffer.put(data);
}
}
}
Reference:
https://github.com/kishanjavatrainer/ArrayBlockingQueueDemo/tree/master/ArrayBlockingQueueDemo
