I am still trying to get HTML5 playback of video and audio with Jetty to work error-free. For large videos, the video plays up to roughly 75% (just an estimate) and then jumps to the end. This never happens with shorter videos, but it consistently happens on videos longer than about 10 minutes. I don't see any errors in the logs or the browser debugger window. Here are the range requests, and also the couple of errors that do get logged but which I don't think are related.
Video
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:16001618
Content-Range:bytes 0-16001617/16001618
Content-Type:video/webm
Server:Jetty(9.3.z-SNAPSHOT)
REQUEST
Connection:keep-alive
Range:bytes=0-
Audio
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:9044858
Content-Range:bytes 0-9044857/9044858
Content-Type:audio/wav
REQUEST
Range:bytes=0-
Video
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:10834
Content-Range:bytes 15990784-16001617/16001618
Content-Type:video/webm
REQUEST
Connection:keep-alive
Range:bytes=15990784-
Video
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:12069458
Content-Range:bytes 3932160-16001617/16001618
Content-Type:video/webm
REQUEST
Connection:keep-alive
Range:bytes=3932160-
As I said, this happens consistently for larger videos, so I don't think the videos are corrupted. No errors are returned except the following, which, judging by the time it occurs, does not stop video playback anyway:
Accept-Ranges: bytes
Content-Range: bytes 0-11229438/11229439
Content-Length: 11229439
Content-Type: video/webm
, Content-Length=11229439, ScreenID=117435}: {1},
com.service.,java.nio.channels.ClosedChannelException
at org.eclipse.jetty.util.IteratingCallback.close(IteratingCallback.java:427)
at org.eclipse.jetty.server.HttpConnection.onClose(HttpConnection.java:489)
Code that handles the requests:
if (parameters.containsKey(Constants.PARAM_CONTENT_LENGTH)) {
    totalLength = java.lang.Math.toIntExact(
            (long) parameters.get(Constants.PARAM_CONTENT_LENGTH));
} else {
    String range = parameters.get(Constants.PARAM_RANGE_REQUEST).toString();
    String[] ranges = range.split("=")[1].split("-");
    from = Integer.parseInt(ranges[0]);
    if (from >= 0 && from < totalLength) {
        to = (totalLength - 1);
        if (ranges.length == 2) {
            to = Integer.parseInt(ranges[1]);
            if (to >= totalLength) {
                to = (int) (totalLength - 1);
            }
        }
    } else {
        to = -1;
    }
}
return new ContentRange(from, to, totalLength);
Code that writes to the stream:
BufferedOutputStream bout = new BufferedOutputStream(outputStream);
Buffer b = new Buffer();
long skipped = 0;
int byteLength = 0;
do {
    inputStream.read(b);
    int len = b.getLength();
    byte[] data = (byte[]) b.getData();
    int offset = b.getOffset();
    if (len > 0) {
        if (skipped < from) {
            if (skipped + len <= from) {
                // skip the whole buffer
                skipped += len;
            } else {
                // skip only part of the buffer
                offset += (from - skipped);
                len -= (from - skipped);
                skipped = from;
            }
        }
        if (to >= 0) {
            if (from + byteLength + len <= to) {
                // use the full buffer
            } else {
                // trim len so we don't write past "to"
                len = (to + 1) - (from + byteLength);
            }
        }
        byteLength += len;
        bytesWrittenObj.bytesWritten = byteLength;
        bout.write(data, offset, len);
    }
} while (!b.isEOM());
bout.flush();
Related
I followed the Android guide to build a Bluetooth connection.
To keep things separate and independent, I decided to move the sending part of the BT connection to its own thread. To achieve this, I pass the OutputStream of the BT socket to a separate thread class. My problem is that as soon as I start this thread, the incoming messages are no longer read correctly.
I don't know why, because I don't actually use this thread at the moment: it is started, but no messages are written through it.
This is part of the ConnectedToDevice class, which receives the messages. I use a special scheme to detect that my messages have been received completely.
public void run() {
    byte[] buffer = new byte[1024];
    int bytes;
    sendPW();
    int len = 0;
    Communication.getInstance().setFrequentSending(OVS_CONNECTION_IN_PROGRESS);
    // Passing the OutputStream to the sending class.
    Communication.getInstance().setSendingMessages(mmOutStream);
    Communication.getInstance().setReceivingMessages(queueReceivingMsg);
    // Currently only the sending thread is started.
    Communication.getInstance().startThreads();
    while (true) {
        try {
            bytes = this.mmInStream.read(buffer, len, buffer.length - len);
            len += bytes;
            if (len >= 3 && buffer[2] != -1) {
                len = 0;
                Log.d(TAG, "run: To Short?");
            } else if (len >= 5) {
                int dataLength = Integer.parseInt(
                        String.format("%02X", buffer[3]) + String.format("%02X", buffer[4]), 16);
                if (len == 6 + dataLength) {
                    queueReceivingMsg.add(buffer);
                    Log.d(TAG, "run: Added to queue");
                    len = 0;
                }
                Log.d("BSS", "dataLenght: " + Integer.toString(dataLength) + " len " + len);
            }
        } catch (IOException var5) {
            cancel();
            Communication.getInstance().interruptThreads();
            return;
        }
    }
}
The important part of the sending-message class:
public static BlockingQueue<Obj_SendingMessage> sendingMessages = new LinkedBlockingQueue<>();

@Override
public void run() {
    while (!isInterrupted()) {
        if (bGotResponse) {
            try {
                sendingMessage = sendingMessages.take();
                send(sendingMessage.getsData());
                bGotResponse = false;
                lTime = System.currentTimeMillis();
            } catch (InterruptedException e) {
                this.interrupt();
            }
        }
        if ((System.currentTimeMillis() % 500 == 0) && System.currentTimeMillis() <= lTime + 1000) {
            if (sendingMessage != null) {
                send(sendingMessage.getsData());
            }
        } else {
            bGotResponse = true;
        }
    }
}
// Where the OutputStream is used
private void write(int[] buffer) {
    try {
        for (int i : buffer) {
            this.mmOutputStream.write(buffer[i]);
        }
    } catch (IOException var3) {
        // ignored
    }
}
To be clear again: sendingMessages is empty the whole time, yet the messages are still not received correctly anymore.
Here's a proposal for how robust code for reading messages from the stream could look. The code can handle partial and multiple messages by:

- waiting for more data if a message is not complete,
- processing the first message and saving the rest of the data if data for more than one message is available,
- searching for the marker byte 0xff and retaining the data for the next possibly valid message if invalid data needs to be discarded.

While writing this code I noticed another bug in your code: when a message is found, the data is not copied; instead the buffer itself is returned. However, the buffer, and therefore the returned message, might be overwritten by the next message before or while the previous one is being processed.
This bug is more severe than the poor decoding of the stream data.
private byte[] buffer = new byte[1024];
private int numUnprocessedBytes = 0;

public void run() {
    ...
    while (true) {
        try {
            int numBytesRead = mmInStream.read(buffer, numUnprocessedBytes, buffer.length - numUnprocessedBytes);
            numUnprocessedBytes += numBytesRead;
            processBytes();
        } catch (IOException e) {
            ...
        }
    }
}

private void processBytes() {
    boolean tryAgain;
    do {
        tryAgain = processSingleMessage();
    } while (tryAgain);
}

private boolean processSingleMessage() {
    if (numUnprocessedBytes < 5)
        return false; // insufficient data to decode length
    if (buffer[2] != (byte) 0xff)
        // marker byte missing; discard some data
        return discardInvalidData();
    int messageLength = (buffer[3] & 0xff) * 256 + (buffer[4] & 0xff);
    if (messageLength > buffer.length)
        // invalid message length; discard some data
        return discardInvalidData();
    if (messageLength > numUnprocessedBytes)
        return false; // incomplete message; wait for more data
    // complete message received; copy it and add it to the queue
    byte[] message = Arrays.copyOfRange(buffer, 0, messageLength);
    queueReceivingMsg.add(message);
    // move remaining data to the front of the buffer
    if (numUnprocessedBytes > messageLength)
        System.arraycopy(buffer, messageLength, buffer, 0, numUnprocessedBytes - messageLength);
    numUnprocessedBytes -= messageLength;
    return numUnprocessedBytes >= 5;
}

private boolean discardInvalidData() {
    // find the marker byte after index 2
    int index = indexOfByte(buffer, (byte) 0xff, 3, numUnprocessedBytes);
    if (index >= 3) {
        // discard some data and move the remaining bytes to the front of the buffer
        System.arraycopy(buffer, index - 2, buffer, 0, numUnprocessedBytes - (index - 2));
        numUnprocessedBytes -= index - 2;
    } else {
        // discard all data
        numUnprocessedBytes = 0;
    }
    return numUnprocessedBytes >= 5;
}

private static int indexOfByte(byte[] array, byte element, int start, int end) {
    for (int i = start; i < end; i++)
        if (array[i] == element)
            return i;
    return -1;
}
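As a quick illustration of the buffering behavior (a hedged sketch, not part of the original answer; it assumes the fields above are accessible and that bytes 3-4 carry the total message length, matching the decoding in processSingleMessage):

// Feed a message split across two "reads"; nothing is queued until it is complete.
byte[] part1 = {0x01, 0x02, (byte) 0xff, 0x00}; // start of header, marker at index 2
byte[] part2 = {0x08, 0x11, 0x22, 0x33};        // length low byte (8 bytes total), then payload

System.arraycopy(part1, 0, buffer, numUnprocessedBytes, part1.length);
numUnprocessedBytes += part1.length;
processBytes(); // only 4 bytes buffered: too short, nothing queued yet

System.arraycopy(part2, 0, buffer, numUnprocessedBytes, part2.length);
numUnprocessedBytes += part2.length;
processBytes(); // 8 bytes buffered, declared length 8: one message copied and queued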
In trying to fulfill a partial range request, using Chrome as the video playback tool, the video starts playing back, but on reaching about halfway it freezes as if the client were still waiting on more data. At this point the server has already sent the entire requested range. Please observe the following requests and responses, and the code used to send the requested range:
Range:bytes=0-
Accept-Ranges:bytes
Content-Length:546827
Content-Range:bytes 0-546827/546828
Content-Type:video/webm
Accept-Ranges:bytes
Content-Length:6155
Content-Range:bytes 540672-546827/546828
Content-Type:video/webm
Accept-Ranges:bytes
Content-Length:1
Content-Range:bytes 546827-546827/546828
Content-Type:video/webm
Is the second response handled correctly? Because it freezes on this request and then starts making multiple requests again until the request times out.
Code handling the request:
private static void copyStream(PullBufferStream inputStream, OutputStream outputStream,
        BytesStreamObject bytesWrittenObj, int from, int to)
        throws IOException, CodecNotAvailableException {
    // export media stream
    BufferedOutputStream bout = new BufferedOutputStream(outputStream);
    Buffer b = new Buffer();
    long skipped = 0;
    int byteLength = 0;
    do {
        inputStream.read(b);
        int len = b.getLength();
        byte[] data = (byte[]) b.getData();
        int offset = b.getOffset();
        if (len > 0) {
            if (skipped < from) {
                // skip bytes until we reach the "from" position in the input stream
                if (skipped + len <= from) {
                    // skip all bytes just read
                    skipped += len;
                } else {
                    // skip only some of the bytes read
                    offset += (from - skipped);
                    len -= (from - skipped);
                    skipped = from;
                }
            }
            if (to >= 0) {
                if (from + byteLength + len <= to) {
                    // use the full buffer
                } else {
                    // trim len so we don't read past the "to" input parameter
                    len = (to + 1) - (from + byteLength);
                }
            }
            byteLength += len;
            bytesWrittenObj.bytesWritten = byteLength;
            bout.write(data, offset, len);
            //bout.write(data, from, end);
        }
    } while (!b.isEOM());
    //fileProperties.setBytesLength(byteLength);
    //bout.close();
}
I needed to ensure I flushed the output stream, and I also switched to using the actual file size rather than stream.available().
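For reference, a minimal sketch of those two changes. This is hedged: the real code reads from a PullBufferStream, so the RandomAccessFile here only illustrates where the real length comes from and where the flush belongs.

import java.io.*;

// Serve bytes [from, to] of a media file for a 206 response.
static void copyRange(File media, OutputStream out, int from, int to) throws IOException {
    long totalLength = media.length(); // the real size, not stream.available()
    if (to < 0 || to >= totalLength) {
        to = (int) (totalLength - 1); // open-ended range, e.g. "bytes=0-"
    }
    BufferedOutputStream bout = new BufferedOutputStream(out);
    try (RandomAccessFile raf = new RandomAccessFile(media, "r")) {
        raf.seek(from);
        byte[] buf = new byte[8192];
        int remaining = to - from + 1;
        while (remaining > 0) {
            int n = raf.read(buf, 0, Math.min(buf.length, remaining));
            if (n == -1)
                break;
            bout.write(buf, 0, n);
            remaining -= n;
        }
    }
    bout.flush(); // without this, the buffered tail of the range never reaches the client
}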
I used the code below to encode raw data to H.264 in order to create a video. It encodes very well, but the video plays too fast; there seems to be a problem with the presentation time. When recording starts I set the value of tstart, and for each frame I calculate the difference between the current time and tstart and pass it to queueInputBuffer, but nothing changed. Which part has the problem? I know that on Android 4.3 I can pass a Surface to MediaCodec, but I want to support Android 4.1. Thanks in advance.
public void onPreviewFrame(final byte[] bytes, Camera camera) {
    if (recording == true) {
        long time = System.nanoTime();
        time -= tstart;
        if (mThread.isAlive() && recording == true) {
            encode(bytes, time);
        }
    }
}

private synchronized void encode(byte[] dataInput, long time) {
    byte[] data = new byte[dataInput.length];
    NV21toYUV420Planar(dataInput, data, 640, 480);
    inputBuffers = mMediaCodec.getInputBuffers(); // here changes
    outputBuffers = mMediaCodec.getOutputBuffers();
    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);
        time /= 1000; // nanoseconds -> microseconds
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, time, 0);
    } else {
        return;
    }
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    Log.i("tag", "outputBufferIndex-->" + outputBufferIndex);
    do {
        if (outputBufferIndex >= 0) {
            ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
            byte[] outData = new byte[bufferInfo.size];
            outBuffer.get(outData);
            try {
                if (bufferInfo.offset != 0) {
                    fos.write(outData, bufferInfo.offset, outData.length - bufferInfo.offset);
                } else {
                    fos.write(outData, 0, outData.length);
                }
                fos.flush();
                Log.i("camera", "out data -- > " + outData.length);
                mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            } catch (IOException e) {
                e.printStackTrace();
            }
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat format = mMediaCodec.getOutputFormat();
        }
    } while (outputBufferIndex >= 0);
}
Your problem is that you don't write the output frames into a container that actually stores any timestamps at all. You are writing a plain H264 file, which contains only the raw encoded frames: no index, no timestamps, no audio, nothing else.
In order to get proper timestamps in the file, you need to use MediaMuxer (which appeared in 4.3) or a similar third-party library (e.g. libavformat) to write the encoded packets to a file. The timestamp of each output packet is in bufferInfo.presentationTimeUs, and in the if (outputBufferIndex >= 0) clause you don't use it at all; you are basically throwing the timestamps away.
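For the 4.3+ path, a minimal sketch of the MediaMuxer side could look like this. This is illustrative only, not the poster's code: the muxer would be created with new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4), and on 4.1 a third-party muxer would have to stand in for it.

import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaMuxer;

// Drain an encoder into the muxer so bufferInfo.presentationTimeUs is preserved in the file.
static void drainToMuxer(MediaCodec codec, MediaMuxer muxer) {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int track = -1;
    boolean started = false;
    while (true) {
        int index = codec.dequeueOutputBuffer(info, 10000);
        if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            track = muxer.addTrack(codec.getOutputFormat());
            muxer.start();
            started = true;
        } else if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
            break; // no output available right now
        } else if (index >= 0) {
            if (started && info.size > 0) {
                ByteBuffer buf = codec.getOutputBuffers()[index];
                muxer.writeSampleData(track, buf, info); // the timestamp travels with info
            }
            codec.releaseOutputBuffer(index, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0)
                break;
        }
    }
}

After the end-of-stream flag arrives, call muxer.stop() and muxer.release().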
I'm writing code to read the bytes of a request body, and this requires knowing the Content-Length or Transfer-Encoding ahead of time to safely transfer the message to the client. According to RFC 2616, Section 14.13:

Any Content-Length greater than or equal to zero is a valid value.

In my implementation I get the Content-Length header field, which returns 0. I guess that is a valid value, but it is not the required number of bytes. I have tried in the code below to read the InputStream from the socket until that amount is reached, but this seems to be failing. Any pointers on achieving this? I can provide more code if necessary.
Here is the calling method that gets the Content-Length header and reads the bytes in chunks until the exact amount is reached:
// Gets the Content-Length header value
int contLengthOffset = Integer.parseInt(newRequest.getHeaderField("Content-Length"));
int Offset = contLengthOffset;
if (Offset >= 0) {
    // Any Content-Length greater than or equal to zero is a valid value.
    count = QueryStreamClass.ReadFullyHelper(socket.getInputStream(), Offset);
}
Below is the method that reads the body based on the content length:
/**
 * Read the content-length to determine the transfer-length of the message.
 * We need enough bytes to get the required message.
 * @param Stream
 * @param size
 */
public static String ReadFullyHelper(InputStream Stream, int size) {
    int Read;
    int totalRead = 0;
    int toRead = GMInjectHandler.buffer;
    StringBuilder Request = new StringBuilder();
    if (toRead > size) {
        toRead = size;
    }
    while (true) {
        try {
            final byte[] by = new byte[toRead];
            Read = Stream.read(by, 0, toRead);
            if (Read == -1) {
                break;
            }
            Request.append(new String(by, 0, Read));
            totalRead += Read;
            if (size - totalRead < toRead) {
                toRead = size - totalRead;
            }
            if (totalRead == size) {
                break;
            }
        } catch (IOException e) {
            Log.e(TAG, "Error reading stream", e);
        }
    }
    return Request.toString();
}
'This seems to be failing' is not a problem description. But regarding this method:
public static String ReadFullyHelper(InputStream Stream, int size) {
You can reduce this entire method to the following:
DataInputStream din = new DataInputStream(Stream);
byte[] buffer = new byte[size];
din.readFully(buffer);
return new String(buffer, 0, size); // or you may want to use a specific encoding here
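A hedged usage sketch (newRequest and socket are the question's objects; the charset choice is an assumption):

// Read exactly Content-Length bytes of the request body.
int contentLength = Integer.parseInt(newRequest.getHeaderField("Content-Length"));
DataInputStream din = new DataInputStream(socket.getInputStream());
byte[] body = new byte[contentLength];
din.readFully(body); // blocks until contentLength bytes arrive, or throws EOFException
String text = new String(body, java.nio.charset.StandardCharsets.ISO_8859_1);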
I have been working with Xuggler for a week now, and I wrote a method to grab a frame from a video, but if the video is long this method takes too much time:
public static void getFrameBySec(IContainer container, int videoStreamId,
        IStreamCoder videoCoder, IVideoResampler resampler, double sec) {
    BufferedImage javaImage = new BufferedImage(videoCoder.getWidth(),
            videoCoder.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
    IConverter converter = ConverterFactory.createConverter(javaImage, IPixelFormat.Type.BGR24);
    IPacket packet = IPacket.make();
    while (container.readNextPacket(packet) >= 0) {
        if (packet.getStreamIndex() == videoStreamId) {
            IVideoPicture picture = IVideoPicture.make(videoCoder.getPixelType(),
                    videoCoder.getWidth(), videoCoder.getHeight());
            int offset = 0;
            while (offset < packet.getSize()) {
                int bytesDecoded = videoCoder.decodeVideo(picture, packet, offset);
                if (bytesDecoded < 0)
                    throw new RuntimeException("got error decoding video");
                offset += bytesDecoded;
                if (picture.isComplete()) {
                    IVideoPicture newPic = picture;
                    if (resampler != null) {
                        newPic = IVideoPicture.make(resampler.getOutputPixelFormat(),
                                picture.getWidth(), picture.getHeight());
                        if (resampler.resample(newPic, picture) < 0)
                            throw new RuntimeException("could not resample video");
                    }
                    if (newPic.getPixelType() != IPixelFormat.Type.BGR24)
                        throw new RuntimeException("could not decode video as BGR 24 bit data");
                    javaImage = converter.toImage(newPic);
                    try {
                        double seconds = ((double) picture.getPts()) / Global.DEFAULT_PTS_PER_SECOND;
                        if (seconds >= sec && seconds <= (sec + (Global.DEFAULT_PTS_PER_SECOND))) {
                            File file = new File(Config.MULTIMEDIA_PATH, "frame_" + sec + ".png");
                            ImageIO.write(javaImage, "png", file);
                            System.out.printf("at elapsed time of %6.3f seconds wrote: %s \n", seconds, file);
                            return;
                        }
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        } else {
            // This packet isn't part of our video stream, so we just silently drop it.
        }
    }
    converter.delete();
}
Do you know a better way to do this?
Well, just from reading your code I see some optimizations that can be made.

One: first read through the entire file once and build an index of byte offsets and seconds. The function can then look up the byte offset for the requested second, and you can decode the video at that offset and run the rest of your code (see the sketch after the snippet below).

Another option is to keep your method, reading through the whole file each time, but instead of calling all that resampler, newPic, and Java image-converter code, check whether the seconds match up first. If they do, then convert the image into a new picture to be displayed.

So:

if (picture.isComplete()) {
    try {
        double seconds = ((double) picture.getPts()) / Global.DEFAULT_PTS_PER_SECOND;
        if (seconds >= sec && seconds <= (sec + (Global.DEFAULT_PTS_PER_SECOND))) {
            // resample the image
            // convert the image
            // do the file stuff
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
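For the first option, here is a rough, hypothetical sketch of building that index in one pass. It is hedged: it assumes IPacket.getTimeStamp() reports timestamps in the stream's time base (the same time base read via stream.getTimeBase() in the seekToMs snippet further down); verify this against your Xuggler version before relying on it.

import java.util.TreeMap;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IPacket;

// One pass over the file: remember the first packet timestamp seen in each whole second.
static TreeMap<Double, Long> buildSecondsIndex(IContainer container, int videoStreamId, double timeBase) {
    TreeMap<Double, Long> index = new TreeMap<>();
    IPacket packet = IPacket.make();
    while (container.readNextPacket(packet) >= 0) {
        if (packet.getStreamIndex() != videoStreamId)
            continue;
        double seconds = packet.getTimeStamp() * timeBase;
        index.putIfAbsent(Math.floor(seconds), packet.getTimeStamp());
    }
    return index;
}

Later, instead of scanning from the start every time:

// Long ts = index.floorEntry(sec).getValue();
// container.seekKeyFrame(videoStreamId, ts, IContainer.SEEK_FLAG_BACKWARDS);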
Use the seekKeyFrame option. You can use this function to seek to any time in the video file (the time is in milliseconds).
double timeBase = 0;
int videoStreamId = -1;

private void seekToMs(IContainer container, long timeMs) {
    if (videoStreamId == -1) {
        for (int i = 0; i < container.getNumStreams(); i++) {
            IStream stream = container.getStream(i);
            IStreamCoder coder = stream.getStreamCoder();
            if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                videoStreamId = i;
                timeBase = stream.getTimeBase().getDouble();
                break;
            }
        }
    }
    long seekTo = (long) (timeMs / 1000.0 / timeBase);
    container.seekKeyFrame(videoStreamId, seekTo, IContainer.SEEK_FLAG_BACKWARDS);
}
From there you use your classic while (container.readNextPacket(packet) >= 0) loop to get the images into files.
Note: it won't seek to the exact time, only approximately, so you'll still need to step through the packets (though of course far fewer than before).
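Putting the two snippets together, usage could look roughly like this (a hypothetical call sequence using the methods above):

// Jump close to 90 s first, then let the existing loop decode forward
// from the preceding keyframe to the exact frame.
seekToMs(container, 90000);
getFrameBySec(container, videoStreamId, videoCoder, resampler, 90.0);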