I've been working with Xuggler for a week and I wrote a method to get a frame from a video at a given time, but if the video is long this method takes too much time:
public static void getFrameBySec(IContainer container, int videoStreamId, IStreamCoder videoCoder, IVideoResampler resampler, double sec)
{
    BufferedImage javaImage = new BufferedImage(videoCoder.getWidth(), videoCoder.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
    IConverter converter = ConverterFactory.createConverter(javaImage, IPixelFormat.Type.BGR24);
    IPacket packet = IPacket.make();
    while (container.readNextPacket(packet) >= 0)
    {
        if (packet.getStreamIndex() == videoStreamId)
        {
            IVideoPicture picture = IVideoPicture.make(videoCoder.getPixelType(), videoCoder.getWidth(), videoCoder.getHeight());
            int offset = 0;
            while (offset < packet.getSize())
            {
                int bytesDecoded = videoCoder.decodeVideo(picture, packet, offset);
                if (bytesDecoded < 0)
                    throw new RuntimeException("got error decoding video");
                offset += bytesDecoded;
                if (picture.isComplete())
                {
                    IVideoPicture newPic = picture;
                    if (resampler != null)
                    {
                        newPic = IVideoPicture.make(resampler.getOutputPixelFormat(), picture.getWidth(), picture.getHeight());
                        if (resampler.resample(newPic, picture) < 0)
                            throw new RuntimeException("could not resample video");
                    }
                    if (newPic.getPixelType() != IPixelFormat.Type.BGR24)
                        throw new RuntimeException("could not decode video as BGR 24 bit data");
                    javaImage = converter.toImage(newPic);
                    try
                    {
                        double seconds = ((double) picture.getPts()) / Global.DEFAULT_PTS_PER_SECOND;
                        if (seconds >= sec && seconds <= (sec + (Global.DEFAULT_PTS_PER_SECOND)))
                        {
                            File file = new File(Config.MULTIMEDIA_PATH, "frame_" + sec + ".png");
                            ImageIO.write(javaImage, "png", file);
                            System.out.printf("at elapsed time of %6.3f seconds wrote: %s \n", seconds, file);
                            return;
                        }
                    }
                    catch (Exception e)
                    {
                        e.printStackTrace();
                    }
                }
            }
        }
        else
        {
            // This packet isn't part of our video stream, so we just
            // silently drop it.
        }
    }
    converter.delete();
}
Do you know a better way to do this?
Well, from just reading your code, I see some optimizations that can be made.
One option: first read through the entire file once and create an index of byte offsets and seconds. The function can then look up the byte offset for the given number of seconds, and you can decode the video at that offset and run the rest of your code.
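For example, something along these lines might work. This is an untested sketch, and it indexes key-frame timestamps rather than raw byte offsets (a variation on the idea above); it reuses container, videoStreamId and sec from your method, plus the seekKeyFrame call described further down:

// One pass over the file: remember, for each key frame, the time in seconds
// and the raw timestamp it corresponds to. Untested outline.
double timeBase = container.getStream(videoStreamId).getTimeBase().getDouble();
TreeMap<Double, Long> index = new TreeMap<Double, Long>();   // java.util.TreeMap / java.util.Map
IPacket idxPacket = IPacket.make();
while (container.readNextPacket(idxPacket) >= 0) {
    if (idxPacket.getStreamIndex() == videoStreamId && idxPacket.isKeyPacket()) {
        index.put(idxPacket.getTimeStamp() * timeBase, idxPacket.getTimeStamp());
    }
}
// Later, to grab a frame near "sec": seek to the nearest indexed key frame at or
// before it, then decode forward with your existing loop instead of scanning the file.
Map.Entry<Double, Long> entry = index.floorEntry(sec);
if (entry != null) {
    container.seekKeyFrame(videoStreamId, entry.getValue(), IContainer.SEEK_FLAG_BACKWARDS);
}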
Another option is to keep your method of reading through the whole file each time, but instead of running all that resampler, newPic, and Java image converter code on every frame, check whether the seconds match up first. If they do, then resample and convert the picture to be saved. So:
if (picture.isComplete()) {
    try {
        double seconds = ((double) picture.getPts()) / Global.DEFAULT_PTS_PER_SECOND;
        if (seconds >= sec && seconds <= (sec + (Global.DEFAULT_PTS_PER_SECOND)))
        {
            // resample the picture here
            // convert it to a BufferedImage here
            // do the file writing here
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Use the seekKeyFrame option. You can use this function to seek to any time in the video file (the time is in milliseconds).
double timeBase = 0;
int videoStreamId = -1;

private void seekToMs(IContainer container, long timeMs) {
    if (videoStreamId == -1) {
        for (int i = 0; i < container.getNumStreams(); i++) {
            IStream stream = container.getStream(i);
            IStreamCoder coder = stream.getStreamCoder();
            if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                videoStreamId = i;
                timeBase = stream.getTimeBase().getDouble();
                break;
            }
        }
    }
    long seekTo = (long) (timeMs / 1000.0 / timeBase);
    container.seekKeyFrame(videoStreamId, seekTo, IContainer.SEEK_FLAG_BACKWARDS);
}
From there you use your classic while (container.readNextPacket(packet) >= 0) loop to get the images into files.
Note: it won't seek to the exact time, only an approximate position, so you'll still need to step through the packets (but of course far fewer than before).
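For example, the call site could look roughly like this (a minimal sketch; it assumes the container, coder and resampler were set up as in your original code, and 125 seconds is just an arbitrary target):

// Jump close to the target first, then let the existing loop decode forward
// from the nearest preceding key frame until it reaches "sec".
double sec = 125.0;                        // arbitrary example target, in seconds
seekToMs(container, (long) (sec * 1000));  // approximate seek (milliseconds)
getFrameBySec(container, videoStreamId, videoCoder, resampler, sec);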
I am currently making an mp3 player in NetBeans 12.1 and I can't find a way to control the current position of a song.
I have tried using .setMicrosecondPosition(), but it seems it only works with a Clip, not with the line.
Is it even possible for my player to change the current position of the track, or should I change my code?
This is the code of the player.
public void run() {
    final File file = new File(filePath);
    try (final AudioInputStream in = AudioSystem.getAudioInputStream(file)) {
        final AudioFormat outFormat = getOutFormat(in.getFormat());
        final Info info = new Info(SourceDataLine.class, outFormat);
        try (final SourceDataLine line
                = (SourceDataLine) AudioSystem.getLine(info)) {
            getLine(line);
            line.getMicrosecondPosition();
            if (line != null) {
                line.open(outFormat);
                line.start();
                long millis;
                AudioFileFormat fileFormat = AudioSystem.getAudioFileFormat(file);
                Map<?, ?> properties = ((TAudioFileFormat) fileFormat).properties();
                String key = "duration";
                String title = "title";
                Long microseconds = (Long) properties.get(key);
                maksimumSekunde = (int) TimeUnit.MICROSECONDS.toSeconds(microseconds);
                title1 = (String) properties.get(title);
                int mili = (int) (microseconds / 1000);
                sec = (mili / 1000) % 60;
                min = (mili / 1000) / 60;
                setVolumeDown(sliderGlasnoca.getValue());
                //STREAM
                int n = 0;
                final byte[] buffer = new byte[4096];
                AudioInputStream inp = getAudioInputStream(outFormat, in);
                while (n != -1) {
                    if (pauza == true) {
                        break;
                    }
                    if (stop == true) {
                        synchronized (LOCK) {
                            LOCK.wait();
                        }
                    }
                    n = inp.read(buffer, 0, buffer.length);
                    if (n != -1) {
                        line.write(buffer, 0, n);
                    }
                    millis = TimeUnit.MICROSECONDS.toMillis(line.getMicrosecondPosition());
                    trajanjeSekunde = (int) TimeUnit.MICROSECONDS.toSeconds(line.getMicrosecondPosition());
                    minutes = (millis / 1000) / 60;
                    seconds = ((millis / 1000) % 60);
                    //System.out.println(minutes + ":" + seconds + " " + "time = " + min + ":" + sec + " " + title1);
                }
                //STREAM
                line.drain();
                line.stop();
                Finished();
            }
        } catch (InterruptedException ex) {
        }
    } catch (UnsupportedAudioFileException
            | LineUnavailableException
            | IOException e) {
        throw new IllegalStateException(e);
    }
}
It's my first time posting here.
I always just counted and discarded frames from bytes being read via the AudioInputStream, but looking anew at the API, I see that one can use the AudioInputStream.skip(...) method to jump forward a given number of bytes. Calculating the number of bytes corresponding to a given amount of time involves knowing the number of bytes per frame, e.g., 16-bit encoding, stereo is 4 bytes per frame, and the sample rate.
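For example, a minimal sketch (assuming you skip on the decoded PCM stream, inp in your code, and that 30 seconds is the amount you want to jump):

// Jump forward by a given number of seconds on the decoded PCM stream.
// Minimal sketch; partial skips and IOException handling are left out.
AudioFormat fmt = outFormat;                         // the PCM format fed to the line
double jumpSeconds = 30.0;                           // assumed example value
long framesToSkip = (long) (jumpSeconds * fmt.getFrameRate());
long bytesToSkip = framesToSkip * fmt.getFrameSize();
long actuallySkipped = inp.skip(bytesToSkip);        // may skip fewer bytes than requested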
IDK if one can reliably skip backwards. This will depend on whether one can "mark" and "reset" the file being read by the AudioInputStream. If these capabilities are supported, it seems conceivable that one could mark(...) the start of the AudioInputStream. Then, to go backwards, first reset() back to the beginning and then jump forward via skip(...). I haven't tested this. A lot would depend on the number of bytes permitted in the mark(...) method.
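If the stream does support it, the mark/reset-then-skip idea might look roughly like this (untested, as noted above):

// Untested: only meaningful if markSupported() returns true for the stream.
if (inp.markSupported()) {
    inp.mark(Integer.MAX_VALUE);   // mark the start; the read limit is the crucial part
    // ... play for a while ...
    inp.reset();                   // jump back to the mark (the beginning)
    inp.skip(bytesToSkip);         // then skip forward to the new target position
}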
If starting or stopping in the middle of playing audio, the data that is fed to the SourceDataLine would potentially exhibit "clicks" due to the discontinuity in the signal. To deal with that it might be necessary to convert the starts and stops to PCM and ramp the volume up if starting, or down if stopping. The number of frames required would probably need to be determined by experimenting. I'm guessing 64 frames for 44100fps might be a good first try.
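A rough sketch of what ramping the volume up over the first frames of a 16-bit little-endian stereo buffer could look like (the 64-frame count is just the guess mentioned above):

// Fade in the first rampFrames frames of a 16-bit little-endian stereo buffer
// before writing it to the SourceDataLine. Rough, untested sketch.
int rampFrames = 64;
int channels = 2;
for (int frame = 0; frame < rampFrames; frame++) {
    double gain = frame / (double) rampFrames;        // 0.0 -> ~1.0
    for (int ch = 0; ch < channels; ch++) {
        int i = (frame * channels + ch) * 2;
        short sample = (short) ((buffer[i + 1] << 8) | (buffer[i] & 0xFF));
        sample = (short) Math.round(sample * gain);
        buffer[i] = (byte) (sample & 0xFF);
        buffer[i + 1] = (byte) ((sample >> 8) & 0xFF);
    }
}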
I am still trying to get HTML5 playback of video & audio with Jetty to work error-free. For large videos I notice the video plays up to about 75% (just an estimate) and then jumps to the end. This never happens with shorter videos, but it consistently happens on videos longer than 10 minutes or so. I don't see any errors in the logs or the browser debugger window. Here are the range requests, and also the couple of errors that do get logged, though I don't think they are related.
Video
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:16001618
Content-Range:bytes 0-16001617/16001618
Content-Type:video/webm
Server:Jetty(9.3.z-SNAPSHOT)
REQUEST
Connection:keep-alive
Range:bytes=0-
Audio
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:9044858
Content-Range:bytes 0-9044857/9044858
Content-Type:audio/wav
REQUEST
Range:bytes=0-
Video
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:10834
Content-Range:bytes 15990784-16001617/16001618
Content-Type:video/webm
REQUEST
Connection:keep-alive
Range:bytes=15990784-
Video
Status Code:206 Partial Content
Accept-Ranges:bytes
Content-Length:12069458
Content-Range:bytes 3932160-16001617/16001618
Content-Type:video/webm
REQUEST
Connection:keep-alive
Range:bytes=3932160-
Like I said, this happens consistently for larger videos, so I don't think the videos are corrupted. No errors are returned except the following, which doesn't stop video playback anyway based on the time this error occurs:
Accept-Ranges: bytes
Content-Range: bytes 0-11229438/11229439
Content-Length: 11229439
Content-Type: video/webm
, Content-Length=11229439, ScreenID=117435}: {1},
com.service.,java.nio.channels.ClosedChannelException
at
org.eclipse.jetty.util.IteratingCallback.close(IteratingCallback.java:427)
at org.eclipse.jetty.server.HttpConnection.onClose(HttpConnection.java:489)
Code that handles the requests:
if (parameters.containsKey(Constants.PARAM_CONTENT_LENGTH)) {
    totalLength = java.lang.Math.toIntExact((long)
            parameters.get(Constants.PARAM_CONTENT_LENGTH));
} else {
    String range =
            parameters.get(Constants.PARAM_RANGE_REQUEST).toString();
    String[] ranges = range.split("=")[1].split("-");
    from = Integer.parseInt(ranges[0]);
    if (from >= 0 && from < totalLength) {
        to = (totalLength - 1);
        if (ranges.length == 2) {
            to = Integer.parseInt(ranges[1]);
            if (to >= totalLength) {
                to = (int) (totalLength - 1);
            }
        }
    } else {
        to = -1;
    }
}
return new ContentRange(from, to, totalLength);
Code that writes to the stream:
BufferedOutputStream bout = new BufferedOutputStream(outputStream);
Buffer b = new Buffer();
long skipped = 0;
int byteLength = 0;
do {
    inputStream.read(b);
    int len = b.getLength();
    byte[] data = (byte[]) b.getData();
    int offset = b.getOffset();
    if (len > 0) {
        if (skipped < from) {
            if (skipped + len <= from) {
                skipped += len;
            } else {
                offset += (from - skipped);
                len -= (from - skipped);
                skipped = from;
            }
        }
        if (to >= 0) {
            if (from + byteLength + len <= to) {
            } else {
                len = (to + 1) - (from + byteLength);
            }
        }
        byteLength += len;
        bytesWrittenObj.bytesWritten = byteLength;
        bout.write(data, offset, len);
    }
} while (!b.isEOM());
bout.flush();
I want to build an Android download speed test. To do that I am using the TrafficStats class. The problem is that I am getting wrong results: the results are almost the same even when I put a heavy load on my Internet connection before running the test. I download a file for 30 seconds (or until the file finishes downloading) and then calculate the bytes using TrafficStats.
Does someone know where the problem is?
This is the code that I am using:
@Override
protected String doInBackground(String... urls) {
    String downloaded = "";
    // String uploaded = "";
    try {
        long BeforeTime = System.currentTimeMillis();
        long TotalTxBeforeTest = TrafficStats.getTotalTxBytes();
        long TotalRxBeforeTest = TrafficStats.getTotalRxBytes();
        URL url = new URL(urls[0]);
        URLConnection connection = new URL(urls[0]).openConnection();
        connection.setUseCaches(false);
        connection.connect();
        InputStream input = connection.getInputStream();
        BufferedInputStream bufferedInputStream = new BufferedInputStream(input);
        byte[] buffer = new byte[1024];
        int n = 0;
        long endLoop = BeforeTime + 30000;
        while (System.currentTimeMillis() < endLoop) {
            /* if (bufferedInputStream.read(buffer) != -1){
                break;
            }*/
        }
        long TotalTxAfterTest = TrafficStats.getTotalTxBytes();
        long TotalRxAfterTest = TrafficStats.getTotalRxBytes();
        long AfterTime = System.currentTimeMillis();
        double TimeDifference = AfterTime - BeforeTime;
        double rxDiff = TotalRxAfterTest - TotalRxBeforeTest;
        double txDiff = TotalTxAfterTest - TotalTxBeforeTest;
        Log.e(TAG, "Download skinuto. " + rxDiff);
        if ((rxDiff != 0) && (txDiff != 0)) {
            double rxBPS = (rxDiff / (TimeDifference / 1000)); // total rx bytes per second.
            double txBPS = (txDiff / (TimeDifference / 1000)); // total tx bytes per second.
            downloaded = String.valueOf(rxBPS) + "B/s. Total rx = " + rxDiff;
            // uploaded = String.valueOf(txBPS) + "B/s. Total tx = " + txDiff;
        } else {
            downloaded = "No downloaded bytes.";
        }
    } catch (Exception e) {
        Log.e(TAG, "Error while downloading. " + e.getMessage());
    }
    return downloaded;
}
I tried your code and it seems to work fine for me, BUT I changed
while (System.currentTimeMillis() < endLoop) {
    /* if (bufferedInputStream.read(buffer) != -1) {
        break;
    }*/
}
to
while (System.currentTimeMillis() < endLoop) {
    if (bufferedInputStream.read(buffer) == -1) {
        break;
    }
}
since read returns -1 if the end of the stream is reached.
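If you want a sanity check that doesn't rely on TrafficStats at all, you could also count the bytes you read yourself; a rough sketch using the same variables as your code:

// Count downloaded bytes directly as a cross-check against TrafficStats.
long totalBytes = 0;
while (System.currentTimeMillis() < endLoop) {
    int read = bufferedInputStream.read(buffer);
    if (read == -1) {
        break;
    }
    totalBytes += read;
}
double elapsedSeconds = (System.currentTimeMillis() - BeforeTime) / 1000.0;
double bytesPerSecond = totalBytes / elapsedSeconds;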
I used the code below to encode raw data to H.264 in order to create a video. It encodes fine, but the video plays too fast; it seems there is a problem with the presentation time. When recording starts I set the value of "tstart", and for each frame I calculate the difference between the current time and tstart and pass it to queueInputBuffer, but nothing has changed. Which part has the problem? I know that in Android 4.3 I can pass a Surface to MediaCodec, but I want to support Android 4.1. Thanks in advance.
public void onPreviewFrame(final byte[] bytes, Camera camera) {
    if (recording == true) {
        long time = System.nanoTime();
        time -= tstart;
        if (mThread.isAlive() && recording == true) {
            encode(bytes, time);
        }
    }
}

private synchronized void encode(byte[] dataInput, long time)
{
    byte[] data = new byte[dataInput.length];
    NV21toYUV420Planar(dataInput, data, 640, 480);
    inputBuffers = mMediaCodec.getInputBuffers();// here changes
    outputBuffers = mMediaCodec.getOutputBuffers();
    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);
        time /= 1000;
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, time, 0);
    } else {
        return;
    }
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    Log.i("tag", "outputBufferIndex-->" + outputBufferIndex);
    do {
        if (outputBufferIndex >= 0) {
            ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
            byte[] outData = new byte[bufferInfo.size];
            outBuffer.get(outData);
            try {
                if (bufferInfo.offset != 0) {
                    fos.write(outData, bufferInfo.offset, outData.length
                            - bufferInfo.offset);
                } else {
                    fos.write(outData, 0, outData.length);
                }
                fos.flush();
                Log.i("camera", "out data -- > " + outData.length);
                mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
                        0);
            } catch (IOException e) {
                e.printStackTrace();
            }
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat format = mMediaCodec.getOutputFormat();
        }
    } while (outputBufferIndex >= 0);
}
Your problem is that you don't write the output frames into a container that actually stores any timestamps at all. You are writing a plain H264 file, which only contains the raw encoded frames, no index, no timestamps, no audio, nothing else.
In order to get proper timestamps in the file, you need to use MediaMuxer (which appeared in 4.3) or a similar third-party library (e.g. libavformat or similar) to write the encoded packets to a file. The timestamp of the output packet is in bufferInfo.presentationTimeUs, and in the if (outputBufferIndex >= 0) { clause you don't use it at all - you are basically throwing away the timestamps.
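If you can require API 18 (Android 4.3), a minimal MediaMuxer outline looks roughly like this; the output path is illustrative and error handling is omitted, so treat it as a sketch rather than drop-in code (on 4.1 you would need libavformat or similar instead):

// Minimal MediaMuxer outline (API 18+). Illustrative only; IOException handling omitted.
MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int trackIndex = -1;
boolean muxerStarted = false;

// ... inside the drain loop ...
if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    trackIndex = muxer.addTrack(mMediaCodec.getOutputFormat());
    muxer.start();
    muxerStarted = true;
} else if (outputBufferIndex >= 0 && muxerStarted) {
    ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
    outBuffer.position(bufferInfo.offset);
    outBuffer.limit(bufferInfo.offset + bufferInfo.size);
    // bufferInfo.presentationTimeUs carries the timestamp you queued on the input side
    muxer.writeSampleData(trackIndex, outBuffer, bufferInfo);
    mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
}

// ... when recording ends ...
muxer.stop();
muxer.release();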
I am trying to forward live RTMP streaming video from link1 to link2, but when the video stops or pauses at the input end my Java application stops reading packets and gets the error 'unable to read RTMP-Header Packet'. The code is given below:
import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IContainerFormat;
import com.xuggle.xuggler.IPacket;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;
import com.xuggle.xuggler.IVideoPicture;

public class XugglerRecorder
{
    public static void main(String[] args)
    {
        String url = "rtmp://IP:PORT/live2/16_8_2013";
        IContainer readContainer = IContainer.make();
        readContainer.setInputBufferLength(4096);
        IContainer writeContainer = IContainer.make();
        //writeContainer.setInputBufferLength(0);
        IContainerFormat containerFormat_live = IContainerFormat.make();
        containerFormat_live.setOutputFormat("flv", "rtmp://IP:PORT/live/abc", null);
        int retVal = writeContainer.open("rtmp://192.168.1.198:1935/live/abc", IContainer.Type.WRITE, containerFormat_live);
        //writeContainer.setInputBufferLength(0);
        if (retVal < 0) {
            System.err.println("Could not open output container for live stream");
            System.exit(1);
        }
        if (readContainer.open(url, IContainer.Type.READ, null, true, false) < 0) {
            throw new RuntimeException("unable to open read container");
        }
        IStream video = writeContainer.addNewStream(0);
        if (video == null) {
            throw new RuntimeException("unable to add video stream");
        }
        IPacket packet = IPacket.make();
        while (readContainer.readNextPacket(packet) >= 0 && !packet.isKeyPacket()) {}
        IStreamCoder inVideoCoder = null;
        int videoStreamId = -1;
        for (int i = 0; i < readContainer.getNumStreams(); ++i) {
            IStream stream = readContainer.getStream(i);
            IStreamCoder coder = stream.getStreamCoder();
            if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                inVideoCoder = coder;
                videoStreamId = i;
                if (inVideoCoder.open(null, null) < 0) {
                    throw new RuntimeException("Unable to open input video coder");
                }
                //for getting frame params need to decode at least one key frame
                IVideoPicture picture = IVideoPicture.make(inVideoCoder.getPixelType(), 0, 0);
                int bytesDecoded = inVideoCoder.decodeVideo(picture, packet, 0);
                if (bytesDecoded < 0) {
                    throw new RuntimeException("Unable to decode video packet");
                }
            }
        }
        if (videoStreamId == -1) {
            throw new RuntimeException("unable to find video stream");
        }
        IStreamCoder outVideoCoder = video.getStreamCoder();
        outVideoCoder.setCodec(inVideoCoder.getCodec());
        outVideoCoder.setHeight(inVideoCoder.getHeight());
        outVideoCoder.setWidth(inVideoCoder.getWidth());
        outVideoCoder.setPixelType(inVideoCoder.getPixelType());
        outVideoCoder.setBitRate(inVideoCoder.getBitRate());
        outVideoCoder.setTimeBase(inVideoCoder.getTimeBase());
        if (outVideoCoder.open(null, null) < 0) {
            throw new RuntimeException("unable to open output video coder");
        }
        if (writeContainer.writeHeader() < 0) {
            throw new RuntimeException("unable to write header");
        }
        int i = 0;
        doit(readContainer, packet, writeContainer);
        if (writeContainer.writeTrailer() < 0) {
            throw new RuntimeException("unable to write trailer");
        }
    }

    private static void doit(IContainer readContainer, IPacket packet,
            IContainer writeContainer) {
        int i = 0;
        while (readContainer.readNextPacket(packet) >= 0) {
            if (readContainer.readNextPacket(packet) < 0)
            {
                System.out.println("Packet null Hello");
                try {
                    doit(readContainer, packet, writeContainer);
                } catch (Exception e) { e.printStackTrace(); }
                continue;
            }
            if (readContainer.readNextPacket(packet) == -1) {
                System.out.println("Packet is absent");
            }
            if (packet.getStreamIndex() != 0) {
                continue;
            }
            if (writeContainer.writePacket(packet) < 0 && readContainer.readNextPacket(packet) >= 0) {
                try {
                    System.out.println(" packet sleep");
                } catch (Exception e) { e.printStackTrace(); }
            }
        }
    }
}
I am able to publish live video to FMS via RTMP, but when the input video streaming stops or pauses my application cannot pick up where it left off. If there is any time lag in the input streaming, my application should keep checking and waiting for the stream instead of stopping.
Kindly help me out with this. Thanks in advance.
Any help or tips for how to debug this would be immensely appreciated.
I don't know much about Xuggler.
You might try catching the Exception.
Or you might try calling doit() one more time in the main method. Once there are no more packets to read, doit() ends, so it will not try to read again unless you call it again.
Or you can try adding something like
while (readContainer.readNextPacket(packet) < 0) { }
at the start of doit(), so it keeps retrying until a packet is available again.
Also try not checking for isKeyPacket and see what happens.
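Putting those ideas together, a simplified doit() that waits and retries instead of falling out of the loop might look roughly like this (untested sketch; whether readNextPacket actually recovers after the source pauses depends on the RTMP server, and the 500 ms retry interval is arbitrary):

private static void doit(IContainer readContainer, IPacket packet,
        IContainer writeContainer) {
    while (true) {
        int rv = readContainer.readNextPacket(packet);
        if (rv < 0) {
            // No packet right now (e.g. the input stream is paused): wait and retry
            // instead of returning to the caller.
            System.out.println("no packet available, waiting...");
            try {
                Thread.sleep(500);   // arbitrary retry interval
            } catch (InterruptedException e) {
                return;
            }
            continue;
        }
        if (packet.getStreamIndex() != 0) {
            continue;
        }
        if (writeContainer.writePacket(packet) < 0) {
            System.out.println("could not write packet");
        }
    }
}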