I am looking to develop the following application. How should I proceed?
Scan the system for installed webcams and their supported video modes.
Let the user select a cam and a video mode.
Display the video from the camera.
Start a frame grabber/processor; it doesn't have to do anything for now. I want the ability to process frames, or at least one frame every x.
I'm not sure if it's possible, but I'd also need a routine to overlay processed frames on the playing video.
Check this post on SO for inspiration.
The JMF framework supports capturing real-time audio or video data, as detailed in this article.
You can also try LTI-Civil.
I would recommend using the Webcam Capture project, since neither JMF nor LTI-CIVIL is maintained any more. Webcam Capture is a cross-platform, open-source project hosted on GitHub. There are plenty of examples covering the things you've asked about:
Enumerate webcams and listen for newly connected devices
Display video from the camera
Grab frames and take snapshots on demand (see the sketch below)
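A minimal sketch of those three steps with the Webcam Capture API (the com.github.sarxos:webcam-capture artifact); the VGA view size and snapshot file name are just example choices:

```java
import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.WebcamPanel;
import com.github.sarxos.webcam.WebcamResolution;

import javax.imageio.ImageIO;
import javax.swing.JFrame;
import java.io.File;

public class WebcamCaptureSketch {

    public static void main(String[] args) throws Exception {
        // 1. Enumerate the webcams the API can see
        // (Webcam.addDiscoveryListener(...) can additionally notify you when
        // cameras are plugged in or removed)
        for (Webcam cam : Webcam.getWebcams()) {
            System.out.println("Found: " + cam.getName());
        }

        // 2. Open the default camera and show a live preview panel
        Webcam webcam = Webcam.getDefault();
        webcam.setViewSize(WebcamResolution.VGA.getSize());
        webcam.open();

        WebcamPanel panel = new WebcamPanel(webcam);
        JFrame frame = new JFrame("Webcam preview");
        frame.add(panel);
        frame.pack();
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        // 3. Grab a single frame on demand and save it as a snapshot
        ImageIO.write(webcam.getImage(), "png", new File("snapshot.png"));
    }
}
```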
Unfortunately there is no way to overlay an image obtained through the Webcam Capture API on the playing video, at least not within Webcam Capture itself, but you could use Xuggler to do that; it contains an example of how this can be done.
Please note that the Webcam Capture API can be used on top of JMF, FMJ, LTI-CIVIL, GStreamer, OpenIMAJ and others.
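That said, if you grab and paint the frames yourself, a simple per-frame overlay can be done with plain Graphics2D rather than Xuggler; a rough sketch, assuming you already have the grabbed frame and a processed overlay image:

```java
import java.awt.AlphaComposite;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class FrameOverlay {

    // Draws 'overlay' onto a copy of 'frame' at (x, y) with the given opacity
    public static BufferedImage overlay(BufferedImage frame, BufferedImage overlay,
                                        int x, int y, float opacity) {
        BufferedImage out = new BufferedImage(
                frame.getWidth(), frame.getHeight(), BufferedImage.TYPE_INT_RGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(frame, 0, 0, null);
        g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, opacity));
        g.drawImage(overlay, x, y, null);
        g.dispose();
        return out;
    }
}
```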
I'm currently developing a simple video player with vlcj.
Can anyone please give me a clue about changing the audio pitch with it?
Is it possible?
I've been searching but cannot find the right keyword. What I need is
some control (method/function) in vlcj (or any library) to raise the pitch so it sounds like
a kid's voice, or lower it so it sounds like a very old man.
Thanks in advance.
NOTE:
I'm still searching on Google but have found nothing about VLC. What I want is something like the "timbre" control explained at http://www.screamingbee.com/support/morphdoc/MorphDocPitchTimbre.aspx
If you are only interested in playing audio (you don't care about displaying any video at the same time) then vlcj 2.4.0 and later provide a so-called "direct" audio player component.
With this component, your Java application can get direct access to the native audio sample buffer. You can run whatever algorithm you want on those samples, then play out your modified samples via JavaSound or some other API.
There is a sample included in the vlcj distribution that shows how to use this component to play via JavaSound:
https://github.com/caprica/vlcj/tree/vlcj-2.4.1/src/test/java/uk/co/caprica/vlcj/test/directaudio
The example does not show how to change the pitch of the audio, but it does show how to use the direct audio player.
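The vlcj sample itself isn't reproduced here; the following is only a rough sketch of the JavaSound side, assuming you already receive raw 16-bit signed little-endian PCM from the direct audio player callback. The crude trick of playing samples back at a scaled sample rate shifts the pitch but also changes the tempo; a real "timbre" effect needs a proper pitch-shifting algorithm (e.g. a phase vocoder), which is not shown here:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

public class PitchedPlayback {

    // pitchFactor > 1.0 raises the pitch (and tempo), < 1.0 lowers both
    public static SourceDataLine openLine(float sourceRate, float pitchFactor)
            throws LineUnavailableException {
        AudioFormat playbackFormat = new AudioFormat(
                sourceRate * pitchFactor, // playback sample rate, scaled to shift pitch
                16,                       // bits per sample
                2,                        // channels (stereo)
                true,                     // signed
                false);                   // little-endian
        SourceDataLine line = AudioSystem.getSourceDataLine(playbackFormat);
        line.open(playbackFormat);
        line.start();
        return line;
    }

    // Call this from the direct audio player callback with the raw sample bytes
    public static void playSamples(SourceDataLine line, byte[] pcm, int length) {
        line.write(pcm, 0, length);
    }
}
```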
Since JavaFX 2.0 has a MediaView, is it somehow possible to live-stream the camera feed into the Media component in real time? Since there is no camera API, I am unaware of how to make this happen. Can we use another Java library to work with the camera and then stream the video into the MediaView?
So is it possible, and if so, how can we do it? Maybe by using some Java API for the camera and then streaming the video into the MediaView?
There is a Java library called Xuggler that is an open-source solution for streaming video into Java applications. It is built on top of the FFmpeg libraries.
In my experience it will work with some implementations of the MPEG-2 and MPEG-4 codecs, but not others. If you were not aware, there are something like 800 different versions of those codecs and some of them end up sticking packets on the front, or in the middle, or at the end in order to force you to use their decoders when displaying the video. Up to, and perhaps including, the new JavaFX code there has been very little robust support for streaming video into Java.
You may want to explore doing something like embedding an instance of VLC in a JPanel and displaying that to your user. There are also libraries that attempt to allow some interaction between Flash and Java that could be used to approach this issue.
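For the VLC-in-a-JPanel route, vlcj's EmbeddedMediaPlayerComponent does exactly that. A minimal sketch, assuming vlcj 3.x, a local VLC installation, and a placeholder capture MRL:

```java
import uk.co.caprica.vlcj.component.EmbeddedMediaPlayerComponent;
import uk.co.caprica.vlcj.discovery.NativeDiscovery;

import javax.swing.JFrame;
import javax.swing.SwingUtilities;

public class VlcInSwing {

    public static void main(String[] args) {
        // Locate the native VLC libraries (requires VLC to be installed)
        new NativeDiscovery().discover();

        SwingUtilities.invokeLater(() -> {
            EmbeddedMediaPlayerComponent playerComponent = new EmbeddedMediaPlayerComponent();

            JFrame frame = new JFrame("Camera preview");
            frame.setContentPane(playerComponent);
            frame.setSize(800, 600);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);

            // Placeholder MRL: the capture scheme depends on your platform and
            // device (e.g. "dshow://" on Windows); adjust as needed
            playerComponent.getMediaPlayer().playMedia("dshow://");
        });
    }
}
```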
Good Luck!
It seems that in 2.0 you still cannot attach an external source for the video/audio streams. You need to create a file and provide a URI to that file to play video in the MediaView, which is not acceptable for capturing video from the camera.
I have not done this in JavaFX 2.0, but in 1.3 we used to deliver just an image to the ImageView, writing our own capturer/streamer. Possibly you can do this with any third-party lib.
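A rough sketch of that ImageView approach in JavaFX 2.x, assuming the Webcam Capture API mentioned earlier as the frame source and the SwingFXUtils bridge (Java 8 syntax used for brevity):

```java
import com.github.sarxos.webcam.Webcam;
import javafx.application.Application;
import javafx.application.Platform;
import javafx.embed.swing.SwingFXUtils;
import javafx.scene.Scene;
import javafx.scene.image.ImageView;
import javafx.scene.layout.StackPane;
import javafx.stage.Stage;

import java.awt.image.BufferedImage;

public class WebcamImageView extends Application {

    @Override
    public void start(Stage stage) {
        ImageView view = new ImageView();
        StackPane root = new StackPane();
        root.getChildren().add(view);
        stage.setScene(new Scene(root, 640, 480));
        stage.show();

        // Grab frames on a background thread and push them to the ImageView
        Thread grabber = new Thread(() -> {
            Webcam webcam = Webcam.getDefault();
            webcam.open();
            while (!Thread.currentThread().isInterrupted()) {
                BufferedImage frame = webcam.getImage();
                if (frame != null) {
                    Platform.runLater(() ->
                            view.setImage(SwingFXUtils.toFXImage(frame, null)));
                }
            }
        });
        grabber.setDaemon(true);
        grabber.start();
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```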
I am trying to take photos with my web cam, and I'm having some difficulty trying to find a solution in java.
I've successfully set up FMJ and have my video streaming, but I want to take a photo, not a video frame grab. My webcam says it's 1.3 MP, and 640 x 480 is a whopping 0.3 MP!
So to clarify - I want to:
Stream video to my display from my web cam (so I can see what I'm doing)
When I press enter
Stop capturing video
Take full resolution photo (an image; not part of a low-res stream)
Return to step 1 (streaming video to display)
You should use the Java Media Framework (JMF). It exposes a fully functional API for image and movie processing.
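JMF's FrameGrabbingControl can pull a single frame from the running capture stream; note that this grabs at the stream resolution, so it does not by itself solve the full-resolution photo problem. A minimal sketch (the capture format and warm-up delay are guesses that may need adjusting for your device):

```java
import javax.media.Buffer;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Manager;
import javax.media.Player;
import javax.media.control.FrameGrabbingControl;
import javax.media.format.VideoFormat;
import javax.media.util.BufferToImage;
import java.awt.Image;
import java.util.Vector;

public class JmfFrameGrab {

    public static void main(String[] args) throws Exception {
        // Find the first video capture device JMF knows about
        // (the format may need to be RGB instead of YUV for some devices)
        Vector<?> devices = CaptureDeviceManager.getDeviceList(new VideoFormat(VideoFormat.YUV));
        CaptureDeviceInfo device = (CaptureDeviceInfo) devices.get(0);

        // Create and start a player on the device
        Player player = Manager.createRealizedPlayer(device.getLocator());
        player.start();
        Thread.sleep(2000); // crude wait for the stream to start delivering frames

        // Grab a single frame and convert it to an AWT Image
        FrameGrabbingControl grabber = (FrameGrabbingControl)
                player.getControl("javax.media.control.FrameGrabbingControl");
        Buffer buffer = grabber.grabFrame();
        Image frame = new BufferToImage((VideoFormat) buffer.getFormat()).createImage(buffer);

        System.out.println("Grabbed frame: " + frame);
        player.close();
    }
}
```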
Picture Transfer Protocol is the way to go for high res. (I don't yet know what preview functionality is provided; perhaps JMF is used there...?)
If your webcam driver is good/recent (mine isn't), then you'll be able to see it in Windows' "Scanners and Cameras" view. These devices are available under WIA (Windows Image Acquisition) technology. (I'm intending to use Jacob to talk to the Windows libs.)
For Linux, there's a PTP links page over at SourceForge. Note that it indicates that jphoto is obsolete, but cameraptp over at Google Code extended it as recently as Feb 2011.
I am trying to post an event or notification to a dialog or frame when a video capture device is connected to the system (Mac OS X). I am trying to do it with Java.
Any help would be very kind.
You might want to look into the Java Media Framework. Its CaptureDeviceManager class seems to offer the required facilities. I've briefly experimented with JMF and found it quite easy to get a stream from a webcam and output it to a Swing container. It took minimal effort.
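As far as I know JMF does not push hot-plug events, so one simple approach is to poll CaptureDeviceManager and raise a Swing dialog when a new device appears; a rough sketch with an arbitrary polling interval (JMF may also require device re-detection on some platforms):

```java
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.swing.JOptionPane;
import javax.swing.SwingUtilities;
import java.util.HashSet;
import java.util.Set;
import java.util.Vector;

public class DeviceWatcher implements Runnable {

    private final Set<String> known = new HashSet<>();

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            Vector<?> devices = CaptureDeviceManager.getDeviceList(null);
            for (Object o : devices) {
                CaptureDeviceInfo info = (CaptureDeviceInfo) o;
                if (known.add(info.getName())) {
                    // Newly seen device: notify the user on the Swing thread
                    // (on the first pass this also reports already-connected devices)
                    SwingUtilities.invokeLater(() ->
                            JOptionPane.showMessageDialog(null,
                                    "Capture device connected: " + info.getName()));
                }
            }
            try {
                Thread.sleep(2000); // arbitrary polling interval
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }

    public static void main(String[] args) {
        new Thread(new DeviceWatcher()).start();
    }
}
```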
I'm interested in doing image processing in Java with frames collected from a network video adapter. The first challenge is finding network video adapters/cameras which don't require an ActiveX control for PTZ control (and therefore don't require IE). Then the issue is how to do still-image grabs from network video adapters which only make MP4 available.
Does anyone know of some Java friendly network video cameras and adapters?
Anyone know of some Java code to control PTZ on a network camera?
There are two ways in Java that I know of. The first (and the one I currently recommend) is the LTI-Civil project. The second is to use Xuggler, which uses FFmpeg webcam code behind the scenes.
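Neither LTI-Civil nor Xuggler is shown here, but for the still-image part specifically, many network cameras also expose a plain HTTP JPEG snapshot URL alongside their MP4/RTSP streams, which Java can read directly; a minimal sketch with a made-up URL (check your camera's documentation for the real snapshot path):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;
import java.net.URL;

public class IpCameraSnapshot {

    public static void main(String[] args) throws Exception {
        // Hypothetical snapshot URL; the real path and credentials depend on the camera model
        URL snapshotUrl = new URL("http://192.168.1.100/snapshot.jpg");

        // ImageIO fetches the URL and decodes the JPEG into a BufferedImage
        BufferedImage frame = ImageIO.read(snapshotUrl);

        // Hand the frame to your image-processing code, or save it to disk
        ImageIO.write(frame, "png", new File("snapshot.png"));
        System.out.println("Grabbed " + frame.getWidth() + "x" + frame.getHeight() + " frame");
    }
}
```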