How to write bytes byte by byte and display them continuously - Java

I have an encrypted video file, and while decrypting it I have defined a buffer byte[] input = new byte[1024]; to write it to the output file.
Here I want to write the first 1024 bytes to the output file and, at the same time, be able to play that output file without waiting for the whole file to be written, like video streaming.
When the first 1024 bytes are written, the video file should start playing and keep playing until the whole file has been written.

You'll have to set up your input stream and output stream depending on where you're getting the data and where you're saving/viewing it. Performance could also likely be improved with some buffering on the output, but you should get the general idea.
import java.io.InputStream;
import java.io.OutputStream;
import java.util.concurrent.BlockingDeque;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingDeque;

public class DecryptionWotsit {
    private final BlockingDeque<Byte> queue = new LinkedBlockingDeque<Byte>();
    private final InputStream in;
    private final OutputStream out;

    public DecryptionWotsit(InputStream in, OutputStream out) {
        this.in = in;
        this.out = out;
    }

    public void go() {
        final Runnable decryptionTask = new Runnable() {
            @Override
            public void run() {
                try {
                    byte[] encrypted = new byte[1024];
                    byte[] decrypted = new byte[1024];
                    int encryptedBytes;
                    // read() returns -1 at end of stream
                    while ((encryptedBytes = in.read(encrypted)) != -1) {
                        // TODO: decrypt into decrypted, set decryptedBytes
                        int decryptedBytes = 0;
                        for (int i = 0; i < decryptedBytes; i++)
                            queue.addFirst(decrypted[i]);
                    }
                } catch (Exception e) {
                    // exception handling left for the reader
                    throw new RuntimeException(e);
                }
            }
        };
        final Runnable playTask = new Runnable() {
            @Override
            public void run() {
                try {
                    while (true) {
                        // takeLast() blocks until a byte is available,
                        // so playback keeps pace with decryption
                        out.write(queue.takeLast());
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        };
        Executors.newSingleThreadExecutor().execute(decryptionTask);
        Executors.newSingleThreadExecutor().execute(playTask);
    }
}
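For example, wiring it up to files might look like this (the file names are hypothetical; the player would then open the growing output file):
import java.io.FileInputStream;
import java.io.FileOutputStream;

public class DecryptionDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical file names: decrypts video.enc into video.mp4 as bytes arrive.
        new DecryptionWotsit(new FileInputStream("video.enc"),
                new FileOutputStream("video.mp4")).go();
    }
}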

You will have to do the writing in a separate thread.
Since writing to a file is a lot slower than displaying video, expect the file-writing thread to still be running long after you've quit watching the video. Unless (as I understand it) you intend to write only the first 1024 bytes to the file.
If you intend to write the entire video to file, a single 1024-byte buffer will slow you down. You will either have to use a much larger buffer or a lot of these 1024-byte buffers. (I suppose the 1024-byte buffer size is a consequence of the decryption algorithm?)
Also, you may want to check how much memory is available to the JVM, to make sure that you won't get an OutOfMemoryError halfway through. You can use the -Xms and -Xmx options to set the amount of memory available to the JVM.
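For example (the heap sizes and main class here are only illustrative):
java -Xms256m -Xmx2g com.example.VideoDecryptor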

A simple way to write to a file that you also want to process is to open the file twice (or more). In one thread you write to the file and update a counter to record how much you have written, e.g. a long protected by a synchronized block. The reading thread(s) get this value and read up to that point, repeatedly, until the writer has finished. A simple way to signal that the writer has finished is to set the size to Long.MAX_VALUE, causing the readers to read until EOF. To stop the readers busy-waiting, you can have them wait() until the amount written is greater than the amount read; a sketch follows below.
This approach uses a fixed amount of memory, e.g. 16-128 KB, regardless of how far behind the readers are from the writer.
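A minimal sketch of that idea, assuming plain FileInputStream/FileOutputStream access (the class name, method names, and 64 KB buffer size are made up for illustration):
import java.io.*;

public class FileFollower {
    private long written = 0; // guarded by this; Long.MAX_VALUE means "writer finished"

    private synchronized void published(long n) {
        written = n;
        notifyAll();
    }

    private synchronized long waitForMoreThan(long readSoFar) throws InterruptedException {
        while (written <= readSoFar)
            wait();
        return written;
    }

    // Writer thread: copy src into the file, publishing progress as it goes.
    public void copyIn(InputStream src, File file) throws IOException {
        long total = 0;
        try (OutputStream out = new FileOutputStream(file)) {
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = src.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
                published(total);
            }
        } finally {
            published(Long.MAX_VALUE); // readers now read until EOF
        }
    }

    // Reader thread: read up to the published position, waiting instead of busy-polling.
    public void follow(File file, OutputStream sink) throws IOException, InterruptedException {
        long readSoFar = 0;
        try (InputStream in = new FileInputStream(file)) {
            byte[] buf = new byte[64 * 1024];
            while (true) {
                long available = waitForMoreThan(readSoFar);
                int n = in.read(buf);
                if (n == -1) {
                    if (available == Long.MAX_VALUE)
                        return; // writer is done and we reached real EOF
                } else {
                    sink.write(buf, 0, n);
                    readSoFar += n;
                }
            }
        }
    }
}
The key point is that the writer and readers share only a counter, so memory use stays constant no matter how large the file grows.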

Related

Split mp3 file into chunks using multiple threads

I have to write a program which can split and merge files with various extensions. While splitting and merging, it should use multiple threads. My code does only half of the task: if I don't use multithreading, it splits the file perfectly. If I do use multithreading, it splits the file but saves only the first part several times.
What should I fix to make it work?
A method of my Splitter class:
public void splitFile(CustomFile customFile, int dataSize) {
    for (int i = 1; i <= partsNumber; i++) {
        FileSplitterThread thread = new FileSplitterThread(customFile, i, dataSize);
        thread.start();
    }
}
Run method of my thread:
Run method of my thread:
@Override
public void run() {
    try {
        fileInputStream = new FileInputStream(initialFile.getData());
        byte[] b = new byte[dataSize];
        String fileName = initialFile.getName() + "_part_" + index + "." + initialFile.getExtension();
        fileOutputStream = new FileOutputStream(fileName);
        int i = fileInputStream.read(b);
        fileOutputStream.write(b, 0, i);
        fileOutputStream.close();
        fileOutputStream = null;
    } catch (IOException e) {
        e.printStackTrace();
    }
}
The reason is that you cannot achieve multi-threaded file splitting with just an InputStream. Since every thread reads the file from the beginning, they all get the same bytes.
For a simple file-splitting mechanism, the general steps could be:
Get the size of the file (data size)
Chunk it into offsets for each thread to read. For example, if you have 2 threads and the data is 1000 bytes, the offsets will be 0 and 500, with a read length of 500. The first thread reads positions 0 to 499; the second starts at 500 and reads up to 999
Get two InputStreams and position them using a Channel (here is a good post: Java how to read part of file from specified position of bytes?)
Encapsulate the above info (InputStream, offset, length to read, output file name, etc.) and provide it to each of the threads
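A sketch of those steps, using FileChannel positioned transfers directly rather than positioned InputStreams (the class name, part naming, and error handling are illustrative):
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.channels.FileChannel;
import java.nio.file.*;

public class PositionedSplitter {
    public static void split(Path source, int parts) throws IOException, InterruptedException {
        long size = Files.size(source);
        long chunk = (size + parts - 1) / parts; // ceiling division
        Thread[] threads = new Thread[parts];
        for (int i = 0; i < parts; i++) {
            final long offset = i * chunk;
            final long length = Math.min(chunk, size - offset);
            // Parts are written to the working directory in this sketch.
            final Path target = Paths.get(source.getFileName() + "_part_" + (i + 1));
            threads[i] = new Thread(() -> {
                // Each thread owns its own channel, positioned at its own offset.
                try (FileChannel in = FileChannel.open(source, StandardOpenOption.READ);
                     FileChannel out = FileChannel.open(target,
                             StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
                    long transferred = 0;
                    while (transferred < length) {
                        transferred += in.transferTo(offset + transferred,
                                length - transferred, out);
                    }
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            threads[i].start();
        }
        for (Thread t : threads) t.join();
    }
}
Whether you hand the threads positioned InputStreams or FileChannels is a detail; the essential part is that every thread gets its own offset and length instead of sharing one stream.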

Reading a >4GB file in Java

I have a mainframe data file which is greater than 4GB. I need to read and process the data every 500 bytes. I have tried using FileChannel, but I am getting an error with the message Integer.MAX_VALUE exceeded
public void getFileContent(String fileName) {
    RandomAccessFile aFile = null;
    FileChannel inChannel = null;
    try {
        aFile = new RandomAccessFile(Paths.get(fileName).toFile(), "r");
        inChannel = aFile.getChannel();
        ByteBuffer buffer = ByteBuffer.allocate(500 * 100000);
        while (inChannel.read(buffer) > 0) {
            buffer.flip();
            for (int i = 0; i < buffer.limit(); i++) {
                byte[] data = new byte[500];
                buffer.get(data);
                processData(new String(data));
                buffer.clear();
            }
        }
    } catch (Exception ex) {
        // TODO
    } finally {
        try {
            inChannel.close();
            aFile.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Can you help me out with a solution?
The worst problem of your code is the
catch (Exception ex) {
    // TODO
}
part, which means you won't notice any exceptions thrown by your code. Since there is nothing in the JRE that prints an "Integer.MAX_VALUE exceeded" message, that problem must be connected to your processData method.
It might be worth noting that this method will be invoked far too often, with repeated data.
Your loop
for (int i = 0; i < buffer.limit(); i++) {
implies that you iterate as many times as there are bytes in the buffer, up to 500 * 100000 times. You extract 500 bytes from the buffer in each iteration, so you would process a total of up to 500 * 500 * 100000 bytes after each read. But since there is a misplaced buffer.clear(); at the end of the loop body, you will never get a BufferUnderflowException; instead, you will invoke processData up to 500 * 100000 times, each time with just the first 500 bytes of the buffer.
Besides that, the whole conversion from bytes to a String is unnecessarily verbose and contains unnecessary copy operations. Instead of implementing this yourself, you can and should just use a Reader.
Your code also makes a strange detour: it starts with a Java 7 API, Paths.get, converts the result to a legacy File object, creates a legacy RandomAccessFile from that, and eventually acquires a FileChannel. If you have a Path and want a FileChannel, you should open it directly via FileChannel.open. And, of course, use a try(…) { … } statement to ensure proper closing.
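For example, the direct route looks like this:
try (FileChannel inChannel = FileChannel.open(Paths.get(fileName), StandardOpenOption.READ)) {
    // work with the channel
}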
But, as said, if you want to process the contents as Strings, you surely want to use a Reader instead:
public void getFileContent(String fileName) {
    try(Reader reader = Files.newBufferedReader(Paths.get(fileName))) {
        CharBuffer buffer = CharBuffer.allocate(500 * 100000);
        while(reader.read(buffer) > 0) {
            buffer.flip();
            while(buffer.remaining() >= 500) {
                processData(buffer.slice().limit(500).toString());
                buffer.position(buffer.position() + 500);
            }
            buffer.compact();
        }
        // there might be a remaining chunk of less than 500 characters
        if(buffer.position() > 0) {
            processData(buffer.flip().toString());
        }
    } catch(Exception ex) {
        // the *minimum* to do:
        ex.printStackTrace();
        // TODO real exception handling
    }
}
There is no problem with processing files >4GB; I just tested it with an 8GB file. Note that the code above uses the UTF-8 encoding. If you want to retain the behavior of your original code, which used whatever happens to be your system's default encoding, you may create the Reader using
Files.newBufferedReader(Paths.get(fileName), Charset.defaultCharset())
instead.

Data loss during Java serial communication

I am trying to receive a telemetry string from an Arduino board in a JavaFX application using jSerialComm.
My Arduino's output rate is currently about 100 Hz. In this situation, I want the application to receive data at about 1 Hz. This is what I am doing (only the important parts of the code):
Runnable r1 = new Runnable() {
    public void run() {
        try {
            while (true) {
                refresher(rx);
                Thread.sleep(1000L);
            }
        } catch (InterruptedException iex) {}
    }
};
Thread thr1 = new Thread(r1);
thr1.start();

public void refresher(SerialPort rx) {
    readRX(rx);
    parseString(lastTelemetry);
}

private void readRX(SerialPort rx) {
    Scanner ss = new Scanner(rx.getInputStream());
    while (ss.hasNextLine()) {
        lastTelemetry = ss.nextLine();
        if (lastTelemetry.isEmpty()) continue;
        System.out.println(lastTelemetry);
        break;
    }
}
But the received strings are not complete: some lines are complete and some are lost. This is what my output looks like:
8,0,330,1306.42,86586.00,0,31.36,0,0,0,0,0,0,0,0,62.27,-6.81,4.53,0.00,00
0,0,0,0,0,0,0,0,66.24,-6.81,4.52,-0.30,00
1.36,0,0,0,0,0,0,0,0,70.22,-6.81,4.52,-0.10,00
7098,0,396,1306.33,86587.00,0,31.36,0,0,0,0,0,0,0,0,75.22,-6.81,4.51,-0.10,00
You probably shouldn't be creating a new stream with new Scanner(rx.getInputStream()) every time you read input. If data is being buffered, it is likely lost when you create a new stream. Create the input stream once, when you open the serial port, and pass that as the parameter to the readRX method instead of the SerialPort.
Also, in a cursory read of the Javadoc I couldn't find where you specify the buffer size, or what happens if the buffer overflows. That is another factor to consider.
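A sketch of that suggestion using the jSerialComm API (the class name and port name here are made up):
import com.fazecast.jSerialComm.SerialPort;
import java.util.Scanner;

public class TelemetryReader {
    private final Scanner ss; // created once, not on every read
    private String lastTelemetry = "";

    public TelemetryReader(String portName) {
        SerialPort rx = SerialPort.getCommPort(portName); // e.g. "COM3" (hypothetical)
        rx.openPort();
        // Block until data arrives so the Scanner does not see a spurious end of input.
        rx.setComPortTimeouts(SerialPort.TIMEOUT_READ_BLOCKING, 0, 0);
        ss = new Scanner(rx.getInputStream());
    }

    public String readRX() {
        while (ss.hasNextLine()) {
            lastTelemetry = ss.nextLine();
            if (!lastTelemetry.isEmpty())
                break;
        }
        return lastTelemetry;
    }
}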
I think there's a problem with your understanding of hasNextLine(). See the following example:
byte[] data = "test".getBytes();
ByteArrayInputStream inputStream = new ByteArrayInputStream(data);
Scanner scanner = new Scanner(inputStream);
while (scanner.hasNextLine()) {
    String text = scanner.nextLine();
    System.out.println(text);
}
There is no \n at the end, or anything like that, but it still outputs "test". So I suspect you are assuming a different behaviour from the nextLine part: nextLine() will happily return whatever is left in the buffer, even if the rest of the line has not arrived yet.
Also, as mentioned by someone else, caching might be an additional source of failure.

How to write a byte array to the OutputStream of a ProcessBuilder (Java)

byte[] bytes = value.getBytes();
Process q = new ProcessBuilder("process","arg1", "arg2").start();
q.getOutputStream().write(bytes);
q.getOutputStream().flush();
System.out.println(q.getInputStream().available());
I'm trying to stream file contents to an executable and capture the output, but the output (InputStream) is always empty. I can capture the output if I specify the file location, but not with streamed input.
How might I overcome this?
Try wrapping your streams with a BufferedInputStream and a BufferedOutputStream:
http://download.oracle.com/javase/6/docs/api/java/lang/Process.html#getOutputStream%28%29
Implementation note: It is a good idea for the output stream to be buffered.
Implementation note: It is a good idea for the input stream to be buffered.
Even with buffered streams, it is still possible for the buffer to fill up when you're dealing with large amounts of data. You can deal with this by starting a separate thread to read from q.getInputStream(), so you can still be reading from the process while writing to it.
Perhaps the program you execute only starts its work when it detects the end of its input data. This is normally signalled by an EOF (end-of-file). You can send it by closing the output stream to the process:
q.getOutputStream().write(bytes);
q.getOutputStream().close();
Try this together with waiting for the process.
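Putting the two suggestions together, a sketch (the command name and arguments are placeholders): write the input, close stdin to signal EOF, read stdout on a separate thread so the pipe cannot fill up, then wait for the process.
import java.io.*;
import java.util.concurrent.*;

public class PipeToProcess {
    public static String run(byte[] bytes) throws Exception {
        Process q = new ProcessBuilder("process", "arg1", "arg2").start();

        // Read stdout on another thread so the child never blocks on a full pipe.
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<String> output = pool.submit(() -> {
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(q.getInputStream()))) {
                StringBuilder sb = new StringBuilder();
                String line;
                while ((line = r.readLine()) != null)
                    sb.append(line).append('\n');
                return sb.toString();
            }
        });

        // Write the input and close the stream: closing is what sends EOF.
        try (OutputStream out = new BufferedOutputStream(q.getOutputStream())) {
            out.write(bytes);
        }

        q.waitFor();
        pool.shutdown();
        return output.get();
    }
}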
I don't know if something else may also be wrong here, but the other process ("process") does not even get time to respond: you are not waiting for it (the available() method does not block). To try this out, first insert a sleep(2000) after the flush(); if that works, switch to querying q.getInputStream().available() multiple times with short pauses in between.
I think you have to wait until the process has finished. I implemented something like this as follows:
public class ProcessReader {
    private static final int PROCESS_LOOP_SLEEP_MILLIS = 100;
    private final String result;

    public ProcessReader(Process process) {
        BufferedReader resultReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        StringBuilder resultOutput = new StringBuilder();
        try {
            while (!checkProcessTerminated(process, resultReader, resultOutput)) {
            }
        } catch (Exception ex1) {
            throw new RuntimeException(ex1);
        }
        result = resultOutput.toString();
    }

    public String getResult() {
        return result;
    }

    private boolean checkProcessTerminated(Process process, BufferedReader resultReader, StringBuilder resultOutput) throws Exception {
        try {
            // exitValue() throws IllegalThreadStateException while the process is still running
            process.exitValue();
            return true;
        } catch (IllegalThreadStateException ex) {
            Thread.sleep(PROCESS_LOOP_SLEEP_MILLIS);
        } finally {
            // drain whatever output is ready, whether or not the process has finished
            while (resultReader.ready()) {
                String out = resultReader.readLine();
                resultOutput.append(out).append("\n");
            }
        }
        return false;
    }
}
I just removed some code specific to my case that you don't need, but it should work; try it.
Regards

Capturing large amounts of output from Apache Commons-Exec

I am writing a video application in Java by executing ffmpeg and capturing its output to standard output. I decided to use Apache Commons-Exec instead of Java's Runtime because it seems better. However, I am having a difficult time capturing all of the output.
I thought pipes would be the way to go, because they are a standard means of inter-process communication. However, my setup using PipedInputStream and PipedOutputStream is wrong. It seems to work, but only for the first 1024 bytes of the stream, which curiously happens to be the value of PipedInputStream.PIPE_SIZE.
I have no love affair with pipes, but I want to avoid disk I/O if possible, because of the speed and volume of the data (a 1m 20s video at 512x384 resolution produces 690M of piped data).
Thoughts on the best solution for handling large amounts of data coming from a pipe? The code for my two classes is below. (Yes, sleep is bad. Thoughts on that? wait() and notifyAll()?)
WriteFrames.java
public class WriteFrames {
    public static void main(String[] args) {
        String commandName = "ffmpeg";
        CommandLine commandLine = new CommandLine(commandName);
        File filename = new File(args[0]);
        String[] options = new String[] {
                "-i",
                filename.getAbsolutePath(),
                "-an",
                "-f",
                "yuv4mpegpipe",
                "-"};
        for (String s : options) {
            commandLine.addArgument(s);
        }
        PipedOutputStream output = new PipedOutputStream();
        PumpStreamHandler streamHandler = new PumpStreamHandler(output, System.err);
        DefaultExecutor executor = new DefaultExecutor();
        try {
            DataInputStream is = new DataInputStream(new PipedInputStream(output));
            YUV4MPEGPipeParser p = new YUV4MPEGPipeParser(is);
            p.start();
            executor.setStreamHandler(streamHandler);
            executor.execute(commandLine);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
YUV4MPEGPipeParser.java
public class YUV4MPEGPipeParser extends Thread {
    private InputStream is;
    int width, height;

    public YUV4MPEGPipeParser(InputStream is) {
        this.is = is;
    }

    public void run() {
        try {
            while (is.available() == 0) {
                Thread.sleep(100);
            }
            while (is.available() != 0) {
                // do stuff.... like write out YUV frames
            }
        } catch (IOException e) {
            e.printStackTrace();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
The problem is in the run method of the YUV4MPEGPipeParser class. There are two successive loops. The second loop terminates immediately if no data is currently available on the stream (e.g. all input so far has been processed by the parser, and ffmpeg or the stream pump was not fast enough to serve new data: available() == 0 -> loop terminates -> pump thread finishes).
Just get rid of these two loops and the sleep, and perform a simple blocking read() instead of checking whether any data is available for processing. There is also probably no need for wait()/notify() or even sleep(), because the parser code already runs on a separate thread.
You can rewrite the run() method like this:
public class YUV4MPEGPipeParser extends Thread {
    ...
    // optimal size of buffer for reading from the pipe stream :-)
    private static final int BUFSIZE = PipedInputStream.PIPE_SIZE;

    public void run() {
        try {
            byte[] buffer = new byte[BUFSIZE];
            int len = 0;
            // read() blocks until data is available and returns -1 at end of stream
            while ((len = is.read(buffer, 0, BUFSIZE)) != -1) {
                // we have valid data available
                // in the first 'len' bytes of the 'buffer' array.
                // do stuff.... like write out YUV frames
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
