Since JavaFX 2.0 has a MediaView, is it somehow possible to live-stream a camera feed into the media component in real time? Since there is no camera API, I am unaware of how to make this happen. Can we use another Java library to work with the camera and then stream the video into the MediaView?
So is it possible, and if yes, how can we do it? Maybe by using some Java API for the camera and then streaming the video into the MediaView?
There is a Java library called Xuggle that is an open source solution for streaming video into Java applications. It is built on top of the ffmpeg libraries.
In my experience it will work with some implementations of the MPEG-2 and MPEG-4 codecs, but not others. If you were not aware, there are something like 800 different versions of those codecs, and some of them stick extra packets at the front, in the middle, or at the end in order to force you to use their decoders when displaying the video. Up to, and perhaps including, the new JavaFX code, there has been very little robust support for streaming video into Java.
You may want to explore doing something like embedding an instance of VLC in a JPanel and displaying that to your user. There are also libraries that attempt to allow some interaction between Flash and Java that could be used to approach this issue.
Good Luck!
It seems that in 2.0 you still cannot attach an external source for the video/audio streams. You need to create a file and provide a URI to that file to play video in the MediaView, which is not acceptable for capturing video from a camera.
I did not do this in JavaFX 2.0, but in 1.3 we simply delivered an image to the ImageView, writing our own capturer/streamer. Possibly you can do this with some 3rd-party lib.
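A rough sketch of that same image-pushing idea against the JavaFX 2 API, assuming a hypothetical capture library that hands you each frame as encoded JPEG bytes (the CameraFeed class and onFrame method here are made up for illustration):

import java.io.ByteArrayInputStream;
import javafx.application.Platform;
import javafx.scene.image.Image;
import javafx.scene.image.ImageView;

public class CameraFeed {

    private final ImageView view = new ImageView();

    // Called from the capture library's background thread, once per frame.
    public void onFrame(byte[] jpegBytes) {
        final Image frame = new Image(new ByteArrayInputStream(jpegBytes));
        Platform.runLater(new Runnable() {          // scene graph updates must run on the FX thread
            public void run() {
                view.setImage(frame);
            }
        });
    }

    // Add this node to your scene instead of a MediaView.
    public ImageView getView() {
        return view;
    }
}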
For performance reasons, I ditched the Python-OpenCV/FFmpeg solution and moved on to Java.
But to my surprise, I am not able to find a solution as complete as what we have in Python. I tried using vlcj, but it just gives more of a command-line kind of interface; I am not able to find any callback mechanism for reading and analyzing all the frames.
I also tried using Java sockets, but wasn't able to do anything more than establish a connection with an IP camera streaming H.264 video over RTSP.
Note: This will be running in a server environment, so we don't want to show any frames; we just need to run certain other operations on them.
Please guide me in the right direction.
If you want to get access to the video frame buffer while media is playing you have a couple of options.
I'll assume you are using vlcj 4.x+, which is current at the time of writing.
First, you can use an EmbeddedMediaPlayer with a CallbackVideoSurface.
You can use the MediaPlayerFactory to create your video surface.
When you create your video surface, it requires a RenderCallback implementation that you provide.
Create the embedded media player as normal, and invoke mediaPlayer.setVideoSurface() to set your video surface.
It is this render callback implementation class that will be called back by VLC with raw video frame data in the form of a ByteBuffer backed by native memory. You can then do your analysis on the data in this byte buffer.
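A minimal sketch of that first approach, assuming vlcj 4.x; the package paths and method names below are from memory and may need small adjustments for your exact vlcj release (newer versions, for example, add an extra method to BufferFormatCallback), and the RTSP URL is a placeholder:

import java.nio.ByteBuffer;
import uk.co.caprica.vlcj.factory.MediaPlayerFactory;
import uk.co.caprica.vlcj.player.base.MediaPlayer;
import uk.co.caprica.vlcj.player.embedded.EmbeddedMediaPlayer;
import uk.co.caprica.vlcj.player.embedded.videosurface.CallbackVideoSurface;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.BufferFormat;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.BufferFormatCallback;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.RenderCallback;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.format.RV32BufferFormat;

public class FrameAnalyser {

    public static void main(String[] args) throws InterruptedException {
        MediaPlayerFactory factory = new MediaPlayerFactory();
        EmbeddedMediaPlayer mediaPlayer = factory.mediaPlayers().newEmbeddedMediaPlayer();

        // Ask VLC for 32-bit RGB frames at the source resolution.
        BufferFormatCallback bufferFormatCallback = new BufferFormatCallback() {
            public BufferFormat getBufferFormat(int sourceWidth, int sourceHeight) {
                return new RV32BufferFormat(sourceWidth, sourceHeight);
            }
        };

        // Called by VLC for every decoded frame with a natively backed ByteBuffer.
        RenderCallback renderCallback = new RenderCallback() {
            public void display(MediaPlayer mediaPlayer, ByteBuffer[] nativeBuffers, BufferFormat bufferFormat) {
                ByteBuffer frame = nativeBuffers[0];
                // ... run your analysis on the pixel data here; keep this callback fast ...
            }
        };

        CallbackVideoSurface videoSurface =
            factory.videoSurfaces().newVideoSurface(bufferFormatCallback, renderCallback, true);
        mediaPlayer.videoSurface().set(videoSurface);   // the "set video surface" step described above

        mediaPlayer.media().play("rtsp://camera-host/stream");   // placeholder RTSP URL

        Thread.currentThread().join();   // keep the JVM alive while frames are delivered
    }
}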
The second approach is to look instead at the CallbackMediaPlayerComponent class - this class aims to make it very easy for you to get an out-of-the-box working media player and provides a way for you to plug in only the bits you want to customise. In this case you plug in your render callback implementation to do your analysis.
There are examples in the vlcj source code at the github project page that show all of this. One of the examples processes this buffer to dynamically convert the video to greyscale, but obviously you can do anything you want with the frame data.
The method is named "onDisplay()" but you do not have to actually display the video anywhere if you're only interested in performing some analysis.
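A sketch of the second approach, again assuming vlcj 4.x and with the caveat that the exact template-method signature may vary between releases; the media URL is a placeholder:

import java.nio.ByteBuffer;
import uk.co.caprica.vlcj.player.base.MediaPlayer;
import uk.co.caprica.vlcj.player.component.CallbackMediaPlayerComponent;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.BufferFormat;

public class AnalysingPlayerComponent extends CallbackMediaPlayerComponent {

    @Override
    protected void onDisplay(MediaPlayer mediaPlayer, ByteBuffer[] nativeBuffers, BufferFormat bufferFormat) {
        // Per-frame analysis goes here; nothing needs to be painted if you are running headless.
    }

    public static void main(String[] args) throws InterruptedException {
        AnalysingPlayerComponent component = new AnalysingPlayerComponent();
        component.mediaPlayer().media().play("rtsp://camera-host/stream");   // placeholder URL
        Thread.currentThread().join();
    }
}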
This is the extent of what vlcj can provide if you want to access the video frame data.
You would have thought that there is a simple solution to this, but there isn't :(
My application needs to capture a stream from a camera connected over USB/FireWire (or whatever the connection is); the result would be a file like output.flv. I would prefer to be able to detect all connected cameras and choose which one(s) to use (one or more at the same time --> one or more output files). The application has to be cross-platform.
Found libraries:
Xuggle - not very good camera support. Good for manipulating images and video.
JMF - an old API, but if I can use it, I will. I don't see a Mac OS X link on the downloads page.
FMJ - looks like a better version of JMF, but I can't find a way of installing it.
LTI-CIVIL - FMJ uses it. It looks like it only captures still images from the camera (not video). I could use Xuggle to create a video from images taken with LTI-CIVIL. And like FMJ, it is difficult to install.
What are your suggestions on this one?
I'd recommend VLCj for this - it should be able to stream from webcams onto a Java canvas without any difficulties. It uses native code so you need to provide libvlc.so / dll but from there on it should work on all the major platforms (Windows, Mac, Linux).
You may need to look at out of process players for complete reliability which is a bit more complex (see here for my efforts so far) but once you've got that in place it should work fine.
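If it helps, here is a bare-bones sketch of a vlcj webcam window, assuming a recent vlcj; the capture MRL is platform-specific (e.g. v4l2:// on Linux or dshow:// on Windows) and the device path is a placeholder:

import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import uk.co.caprica.vlcj.player.component.EmbeddedMediaPlayerComponent;

public class WebcamWindow {

    public static void main(String[] args) {
        SwingUtilities.invokeLater(new Runnable() {
            public void run() {
                EmbeddedMediaPlayerComponent component = new EmbeddedMediaPlayerComponent();

                JFrame frame = new JFrame("Webcam");
                frame.setContentPane(component);
                frame.setSize(800, 600);
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);

                // Pick the MRL for your platform and device.
                component.mediaPlayer().media().play("v4l2:///dev/video0");
            }
        });
    }
}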
There really is no good camera support for Java. You will have to use native code, tailored for each platform, through JNI to get video capture for your project.
There's a related question here. Basically they're suggesting OpenCV wrapped with JNI.
I need to stream multiple videos in a web browser and have them all be synchronized; I also need to be able to switch between audio channels on the fly. Despite a lot of research, it looks as though at present this is impossible using the current browser implementations of HTML5 video. Flash seems to have the same problem. There are forums full of people wanting to do on-the-fly, in-band audio and video switching with sync, and multi-video playback with sync, but no real way to do it.
It's in the HTML5 spec for the future, actually, but has no browser implementation yet.
So, I need to build a custom plugin or application.
What is the best approach to doing this? Would something like Google Web Toolkit be a good place to start? Is client-side Java a good approach to building something this custom but with ease of deployability (vs say OS-specific C++ plugins, for example)?
JavaFX 2 (an officially supported Java library) has a video player, which I've seen modified in interesting ways in a 3D world, basically doing what you want. Currently it only supports FLV, but once it is out of beta they will probably change that, as in JavaFX 1, to support whatever codecs are installed on one's computer.
Edit: Checking over the video stuff again, I'm not totally sure about keeping things in sync, given that this is on their roadmap (http://javafx.com/roadmap/):
Synchronized Media and Animations
Sometimes applications need to have very tightly aligned media and animation in a timeline. JavaFX 2.0 will provide support to tie a timeline to a specific media stream such that events in the timeline occur in sync with events in the media stream.
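For reference, basic playback with the JavaFX 2 media API mentioned above looks roughly like this (a sketch only; the FLV URL is a placeholder, and it does nothing about synchronising multiple streams):

import javafx.application.Application;
import javafx.scene.Group;
import javafx.scene.Scene;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;
import javafx.scene.media.MediaView;
import javafx.stage.Stage;

public class FlvPlayer extends Application {

    @Override
    public void start(Stage stage) {
        Media media = new Media("http://example.com/video.flv");   // placeholder URL
        MediaPlayer player = new MediaPlayer(media);
        player.setAutoPlay(true);
        stage.setScene(new Scene(new Group(new MediaView(player)), 640, 360));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}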
I need simple video playback in Java.
Here are my requirements:
PRODUCTION QUALITY
Open and decode video files whose video and audio codecs can be chosen by me, i.e. I can pick well-behaved codecs.
Be able to play, pause, seekToFrame OR seekToTime, and stop playback. Essentially I wish to be able to play segments of a single video file in a non-linear fashion. For example, I may want to play the segment 20.3 sec to 25.6 sec, pause for 10 seconds, and then play the segment 340.3 sec to 350.5 sec, etc.
During playback, video and audio must be in sync.
The video must be displayed in a Swing JComponent.
Must be usable in a commercial product without having to be open source (i.e. LGPL or commercial licensing is fine)
My research has led me to the following solutions:
Use Java Media Framework + Fobs4JMF
http://fobs.sourceforge.net/f4jmf_first.html
I have implemented a quick prototype and this seems to do what I need. I can play a segment of video using:
player.setStopTime(new Time(end));
player.setMediaTime(new Time(start));
player.start();
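To chain segments non-linearly, the same three calls can be driven from a ControllerListener; a rough sketch only (the file URL and segment times are placeholders, and a real version would insert the 10-second pause between segments):

import java.net.URL;
import java.util.LinkedList;
import java.util.Queue;
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.Manager;
import javax.media.Player;
import javax.media.StopAtTimeEvent;
import javax.media.Time;

public class SegmentPlayback {

    public static void main(String[] args) throws Exception {
        final Player player = Manager.createRealizedPlayer(new URL("file:/path/to/video.avi")); // placeholder
        final Queue<double[]> segments = new LinkedList<double[]>();
        segments.add(new double[] { 20.3, 25.6 });     // start/end in seconds
        segments.add(new double[] { 340.3, 350.5 });

        player.addControllerListener(new ControllerListener() {
            public void controllerUpdate(ControllerEvent event) {
                // Fired when playback reaches the stop time set below.
                if (event instanceof StopAtTimeEvent && !segments.isEmpty()) {
                    // A real implementation could schedule a delay here before the next segment.
                    playSegment(player, segments.poll());
                }
            }
        });

        playSegment(player, segments.poll());   // kick off the first segment
    }

    private static void playSegment(Player player, double[] segment) {
        player.setStopTime(new Time(segment[1]));    // stop at segment end
        player.setMediaTime(new Time(segment[0]));   // seek to segment start
        player.start();
    }
}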
While Fobs4JMF seems to work, I feel the quality of the code is poor and the project is no longer active. Does anyone know of any products which use Fobs4JMF?
Write a Flash application which plays a video and use JFlashPlayer to bring it into my Java Swing application
Unlike Java, Flash is brilliant at playing video. I could write a small Flash application with the methods:
open(String videoFile),
play(),
pause(),
seek(int duration),
stop()
Then bring it into Java using JFlashPlayer which can call Flash functions from Java.
What I like about this solution is that video playback in Flash should be rock solid. Has anyone used JFlashPlayer to play video in Java?
Write a simple media player on top of Xuggler
Xuggler is an FFmpeg wrapper for Java which seems to be quite an active and high-quality project. However, implementing the simple video playback described in the requirements is not trivial (seeking in particular), but some of the work has been done in the MediaTools MediaViewer, which would be the base to build from.
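To give a flavour of building on the MediaTools MediaViewer, the standard Xuggler read-and-view loop looks like this (a sketch only; it plays linearly and does not solve the seeking problem, and the file path is a placeholder):

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.ToolFactory;

public class XugglerPlayback {

    public static void main(String[] args) {
        IMediaReader reader = ToolFactory.makeReader("/path/to/video.mp4");   // placeholder path
        reader.addListener(ToolFactory.makeViewer());   // MediaViewer renders video and audio as packets are decoded
        while (reader.readPacket() == null) {
            // keep decoding until end of media or an error
        }
    }
}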
Use FMJ
I have tried to get FMJ to work but have had no success so far.
I would appreciate your opinions on my problem.
Can a brother get a shout out for Xuggler?
In my mind, VLCJ is the way forward for this type of thing. I love Xuggler for encoding / transcoding work, but unfortunately it's just so complicated to do simple playback and solve all the sync issues and suchlike - and it does very much feel like reinventing the wheel doing so.
The only thing with VLCJ is that to get it to work reliably with multiple players I've had to resort to out of process players. The framework wasn't the simplest thing in the world to get in place, but when it's there it works beautifully. I'm currently running 3 out of process players in my app side by side with no problems whatsoever.
The other caveat is that the embedded media player won't work with a Swing component, just a heavyweight canvas - but that hasn't proven a problem for me at all. If it does, then you can use the direct media player to get a BufferedImage and display that on whatever you choose, but it will eat into your CPU a bit more (though no more than other players that take this approach).
JavaFX has a number of working video and audio codecs built in. It's likely to be the solution with the broadest support at the moment.
I've been using jffmpeg in the same way you use FOBS, it works pretty well, although I haven't compared them.
I would also love to see an easy way to interface with native codecs the way that JavaFX does, but there doesn't seem to be real integration between JavaFX and Java.
There has also been some work trying to get the VLC library libvlc into java. I haven't tried it yet and would be interested to hear back from anyone who has.
I haven't tried Xuggler (which I'm interested in), but I'm having a good time with VLCJ. The only drawback I find is that you have to have VLC installed before running your application.
I'd recommend using MPV. You can use it in combination with JavaFX quite easily, see this example.
In short, you use a little JNA magic to use the MPV native libraries directly, and then let the video display on a JavaFX stage. If you use a child stage, you can even overlay JavaFX controls on top of the video (with full transparency support).
VLC (with VLCJ) can be used in a similar fashion, but I find that the MPV solution performs better (faster seek and start times).
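A rough illustration of that "little JNA magic": the function names below come from the libmpv C API (mpv_create, mpv_initialize, mpv_set_option_string, mpv_command), but the class name, the option values, and especially the window-embedding step are only a sketch, not the API of the example linked above:

import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.Pointer;

public class MpvSketch {

    public interface Mpv extends Library {
        Mpv INSTANCE = Native.load("mpv", Mpv.class);

        Pointer mpv_create();
        int mpv_initialize(Pointer handle);
        int mpv_set_option_string(Pointer handle, String name, String value);
        int mpv_command(Pointer handle, String[] args);   // JNA null-terminates the argument array
    }

    public static void main(String[] args) {
        Pointer handle = Mpv.INSTANCE.mpv_create();
        // "wid" embeds the video into an existing native window; obtaining that window handle
        // from a JavaFX stage is the platform-specific part this sketch leaves out.
        Mpv.INSTANCE.mpv_set_option_string(handle, "wid", String.valueOf(nativeWindowId()));
        Mpv.INSTANCE.mpv_initialize(handle);
        Mpv.INSTANCE.mpv_command(handle, new String[] { "loadfile", "/path/to/video.mkv" });   // placeholder path
    }

    private static long nativeWindowId() {
        return 0L;   // placeholder: look up the real native window handle for your stage
    }
}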
I'm interested in doing image processing in Java with frames collected from a network video adapter. The first challenge is finding network video adapters/cameras which don't require an ActiveX control (and therefore IE) for PTZ control. Then the issue is how to do still-image grabs from network video adapters which only make MP4 available.
Does anyone know of some Java friendly network video cameras and adapters?
Anyone know of some Java code to control PTZ on a network camera?
There are two ways in Java that I know of. The first (and the one I currently recommend) is the LTI-Civil project. The second is to use Xuggler, which uses FFmpeg's webcam code behind the scenes.