I am trying to take photos with my web cam, and I'm having some difficulty trying to find a solution in java.
I've successfully set up FMJ and have my streaming video, but I want to take a photo, not video frame-grab. My web cam says it's 1.3MP, and 640 x 480 is a whopping 0.3MP!
So to clarify - I want to:
Stream video to my display from my web cam (so I can see what I'm doing)
When I press enter
Stop capturing video
Take full resolution photo (an image; not part of a low-res stream)
Return to step 1 (streaming video to display)
You should use the Java Media Framework (JMF). It exposes a fully functional API for image and movie processing.
Picture Transfer Protocol is the way to go for high res. (I don't yet know what preview functionality is provided; perhaps JMF is used there...?)
If your web cam driver is good/recent (mine isn't), then you'll be able to see it in Window's "Scanners and Cameras" view. These devices are available under WIA (Windows Image Acquisition) technology. (I'm intending to use Jacob to talk to the windows libs).
For Linux, there's a PTP page over at SourceForge. Note that it indicates jphoto is obsolete, but cameraptp over at Google Code extends it as recently as Feb 2011.
I download an MP4 file from the internet and play it on a BlackBerry device. I get the following error: "the video portion of the media being played uses an unsupported format". The audio starts playing, but the video doesn't, while showing this error.
It should be noted that this only happens on Device OS 5 & 6. The same video plays properly on OS 7 and OS 7.1. I am guessing this is because RIM included some updates to MMAPI. What could I do to allow devices prior to OS 7 to play the videos? OS 5 & 6 devices play MP4 files, just not all of them.
I have been looking into custom decoding the bytes of the MP4 file, but that will take a lot of time: studying an existing decoder implementation before adapting it to J2ME is not an easy task.
Any help would be great here.
Edit:
The video content owners have control of the videos on the server side, but aren't willing to re-encode, mainly due to size concerns on the server, even though I recommended they do.
The resolution of the video is about 720w x 400h. This is quite high for a BlackBerry, but the Bold 9790 and Torch 9810 both play it without a problem. So why can't the Bold 9780 play the same file?
Update:
Regarding the problem with the video playing on a 9790 and not a 9780, those are different devices. The 9790 came out about a year after the 9780, and apparently RIM added more capability.
From 9780 specs:
Video player DivX/WMV/XviD/3gp
From 9790 specs:
DivX/XviD/MP4/H.264/H.263/WMV player
So, that explains why you can't get that video to play on the 9780. If playing this video is fundamental to your app, you might change the settings in BlackBerry App World to list it as incompatible with 9780s. If this is only one of many features of your app, you might at least catch the media exception and inform the user gracefully that their device can't play the video requested, so they don't think it's your app's fault.
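On a device, the failure surfaces as a `MediaException` from `javax.microedition.media.Manager.createPlayer()`, which you can catch to show a friendly message. The check itself can also be done up front; here is an illustrative sketch in plain Java with a hard-coded capability table taken from the specs quoted above (the model names as map keys and the fallback behaviour are my assumptions, not a real RIM API):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative only: capability table transcribed from the published specs
// quoted above. Not a real RIM API.
public class VideoCompat {

    private static final Map<String, List<String>> SUPPORTED = new HashMap<>();
    static {
        SUPPORTED.put("9780", Arrays.asList("DivX", "WMV", "XviD", "3gp"));
        SUPPORTED.put("9790", Arrays.asList("DivX", "XviD", "MP4", "H.264", "H.263", "WMV"));
    }

    /** Returns true if the given device model lists the codec in its specs. */
    public static boolean canPlay(String model, String codec) {
        List<String> codecs = SUPPORTED.get(model);
        return codecs != null && codecs.contains(codec);
    }

    public static void main(String[] args) {
        System.out.println(canPlay("9790", "H.264")); // supported per the 9790 specs
        System.out.println(canPlay("9780", "H.264")); // not listed - warn the user instead
    }
}
```

If the codec isn't listed, skip `createPlayer()` entirely and show the "your device can't play this video" message instead of letting the exception reach the user.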
Original Answer:
MP4 actually contains a family of related formats.
The actual support for your video probably depends both on your BlackBerry OS version (e.g. 5/6/7) and also the device itself.
Here is a BlackBerry reference document that describes video format capabilities of various BlackBerry devices.
See also this reference document.
Of course, different devices also have different sizes of screens.
It might be useful for you to produce the videos in a variety of formats and resolutions, and have your BlackBerry app download different versions of the video depending on the device. Since video downloads are slow, doing it this way will also ensure that the user sees the fastest possible download on their device. There's no use downloading a higher resolution than the device can display.
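A minimal sketch of that selection logic, assuming three server-side encodes exist (the widths and URL pattern are invented for illustration; on a real BlackBerry the screen width would come from `net.rim.device.api.system.Display.getWidth()`):

```java
// Hypothetical sketch: pick a server-side rendition by display width.
// The rendition widths and URL pattern are assumptions for illustration.
public class RenditionPicker {

    private static final int[] WIDTHS = {320, 480, 640}; // assumed available encodes

    /**
     * Returns the largest rendition width that does not exceed the screen
     * width, falling back to the smallest encode if none fits.
     */
    public static int pickWidth(int screenWidth) {
        int best = WIDTHS[0];
        for (int w : WIDTHS) {
            if (w <= screenWidth) best = w;
        }
        return best;
    }

    public static String videoUrl(String baseName, int screenWidth) {
        return "http://example.com/videos/" + baseName + "_" + pickWidth(screenWidth) + "w.mp4";
    }

    public static void main(String[] args) {
        System.out.println(videoUrl("lecture01", 360)); // picks the 320w encode
    }
}
```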
You didn't specify whether you have control of the videos on the server side or not, so this may not be an option for you.
Since JavaFX 2.0 has a MediaView, is it somehow possible to live-stream the camera feed into the Media component in real time? Since there is no camera API, I am unaware of how to make this happen. Can we use another Java library to work with the camera and then stream the video into the MediaView?
So is it possible, and if yes, how can we do it? Maybe by using some Java API for the camera and then streaming the video into the MediaView?
There is a Java library called Xuggle that is an open source solution for streaming video into Java applications. It is built on top of the ffmpeg libraries.
In my experience it will work with some implementations of the MPEG-2 and MPEG-4 codecs, but not others. If you were not aware, there are something like 800 different versions of those codecs and some of them end up sticking packets on the front, or in the middle, or at the end in order to force you to use their decoders when displaying the video. Up to, and perhaps including, the new JavaFX code there has been very little robust support for streaming video into Java.
You may want to explore doing something like embedding an instance of VLC in a JPanel and displaying that to your user. There are also libraries that attempt to allow some interaction between Flash and Java that could be used to approach this issue.
Good Luck!
It seems that in 2.0 you still cannot attach an external source for the video/audio streams. You need to create a file and provide a URI to this file to play video in the MediaView. Not acceptable for capturing video from the camera.
I did not do this in JavaFX 2.0, but in 1.3 we used to deliver just an image to the ImageView, writing our own capturer/streamer. Possibly you can do this with any 3rd-party lib.
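The same "push frames into an ImageView" approach works in JavaFX 2.0 as well. A sketch, where `grabFrame()` is a placeholder for whatever capture library you pick (v4l4j, Webcam Capture, Xuggler, ...) and is not a real API:

```java
import java.awt.image.BufferedImage;
import javafx.application.Platform;
import javafx.embed.swing.SwingFXUtils;
import javafx.scene.image.ImageView;

// Sketch of delivering captured frames to an ImageView instead of a
// MediaView. grabFrame() is a placeholder, not a real API.
public class CameraView {

    private final ImageView view = new ImageView();

    /** Placeholder: replace with a real capture call from your library. */
    private BufferedImage grabFrame() {
        return new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
    }

    /** Call this from a background capture loop. */
    public void pushFrame() {
        final BufferedImage frame = grabFrame();
        // UI updates must happen on the JavaFX application thread.
        Platform.runLater(new Runnable() {
            public void run() {
                view.setImage(SwingFXUtils.toFXImage(frame, null));
            }
        });
    }
}
```

`SwingFXUtils.toFXImage()` does the `BufferedImage` to JavaFX `Image` conversion, so any library that hands you a `BufferedImage` will plug into this.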
I want to create a C++ cross-platform (Windows and MacOS X) application that sends the screen as a video stream to a server.
The application is needed in the context of lecture capture. The end result will be a Flash based web page that plays back the lecture (presenter video and audio + slides/desktop).
I am currently exploring a few options:
Bundle the VLC (the Video Player) binary with my app and use its desktop streaming features.
Use the Qt Phonon library, but it doesn't seem to be powerful enough.
Send individual screenshots plus a timestamp to the server instead of a video stream. The server then would have to create the video stream.
Implement it in Java and use Xuggler (BigBlueButton uses it for their Desktop Sharing feature)
...?
I would greatly appreciate your insights/comments on how to approach this problem.
I think VNC is a great starting point for a software solution. It's cross-platform and well tested. I can think of a couple of commercial projects derived from VNC - Copilot from Fog Creek springs to mind.
But consider tapping into the projector hardware to capture slides instead of installing software on every computer brought in by lecturers, i.e. a splitter and then a computer to capture the slide video signal as well as the presenter video signal.
Where I worked, lecturers brought in a plethora of laptops for their presentations and rather disliked the idea of installing anything moments before their presentation.
I'd go for a hardware solution - a Mac mini with Boinx.
There is a bunch of screen streaming and recording software available. On the Windows platform you can use Windows Media Encoder to do this and even broadcast a live mms:// stream.
Capturing the screen is not hard to do (unless the content on the screen is overlay video or fullscreen 3D graphics). Streaming it live is complicated; encoding and recording it to disk is quite straightforward with most multimedia frameworks (DirectShow, GStreamer).
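For the "individual screenshots plus a timestamp" option from the question, the capture side needs nothing beyond the JDK. A minimal sketch using `java.awt.Robot` (the upload-to-server part is omitted; this only shows the grab):

```java
import java.awt.AWTException;
import java.awt.GraphicsEnvironment;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;

// Minimal screenshot grabber using only the JDK. Pair each grab with
// System.currentTimeMillis() as the timestamp to send to the server.
public class ScreenGrabber {

    /** Returns one full-screen capture, or null when no display is available. */
    public static BufferedImage grab() {
        if (GraphicsEnvironment.isHeadless()) {
            return null; // e.g. running on a server without a display
        }
        try {
            Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
            return new Robot().createScreenCapture(screen);
        } catch (AWTException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        BufferedImage shot = grab();
        long timestamp = System.currentTimeMillis();
        System.out.println(shot == null ? "no display"
                : shot.getWidth() + "x" + shot.getHeight() + " @ " + timestamp);
    }
}
```

As the answer notes, this will not pick up overlay video or fullscreen 3D content; for those you need DirectShow/GStreamer-level capture.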
My solution was to write a simple GUI application in Qt that invokes a VLC process in the background. This works really well.
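The same "host process drives a background VLC instance" idea can be sketched in Java with `ProcessBuilder`. VLC's `screen://` input is real, but the exact `--sout` transcode chain below is an example configuration (H.264 over HTTP) that may need adjusting for your VLC build:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of invoking VLC's desktop-streaming feature from a host process.
// The --sout chain is an example and may need tuning for your VLC version.
public class VlcScreenStreamer {

    /** Builds the VLC command line without starting anything. */
    public static List<String> buildCommand(int fps, int port) {
        List<String> cmd = new ArrayList<>();
        cmd.add("vlc");
        cmd.add("screen://");
        cmd.add(":screen-fps=" + fps);
        cmd.add("--sout");
        cmd.add("#transcode{vcodec=h264}:http{mux=ts,dst=:" + port + "/}");
        return cmd;
    }

    /** Launches VLC; throws IOException if the vlc binary is not on the PATH. */
    public static Process start(int fps, int port) throws java.io.IOException {
        return new ProcessBuilder(buildCommand(fps, port)).start();
    }

    public static void main(String[] args) {
        System.out.println(buildCommand(15, 8080));
    }
}
```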
I am looking to develop the following application. How should I proceed?
Scan the system for installed webcams and their supported video modes.
Let the user select a cam and a video mode.
Display video from the camera.
Start a frame grabber/processor; it doesn't have to do anything for now. I want the possibility to process frames, or at least one frame every x.
Not sure if it's possible, but I'd also need a routine to overlay processed frames on the playing video.
Check this post on SO for inspiration.
The JMF framework supports capturing real-time data, audio or video, as detailed in this article
You can also try LTI-Civil
I would recommend using the Webcam Capture project, since neither JMF nor LTI-CIVIL is maintained any more. Webcam Capture is a cross-platform, open-source project hosted on GitHub. There are plenty of examples, e.g. of how to do the things you've asked:
How to enumerate webcams and listen for new devices connected
Display video from camera
Enable grabbing and take snapshot on demand
Unfortunately there is no possibility to overlay an image obtained through the Webcam Capture API on the playing video. At least not within Webcam Capture itself, but you could use Xuggler to do that - it contains an example of how this can be done.
Please note that the Webcam Capture API can be used on top of JMF, FMJ, LTI-CIVIL, GStreamer, OpenIMAJ and others.
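To give a flavour of the API, here is a sketch of the enumerate-and-snapshot steps using Webcam Capture (the `com.github.sarxos:webcam-capture` dependency must be on the classpath; returning null when no camera is attached is my own convention for graceful degradation):

```java
import java.awt.image.BufferedImage;
import com.github.sarxos.webcam.Webcam;

// Sketch using the Webcam Capture API: enumerate cameras, then grab one
// frame from the default device. Requires the webcam-capture dependency.
public class SnapshotDemo {

    /** Grabs one frame, or returns null when no camera is attached. */
    public static BufferedImage takeSnapshot() {
        Webcam webcam = Webcam.getDefault(); // first detected camera, or null
        if (webcam == null) {
            return null;
        }
        webcam.open();
        try {
            return webcam.getImage(); // one frame as a BufferedImage
        } finally {
            webcam.close();
        }
    }

    public static void main(String[] args) {
        System.out.println(Webcam.getWebcams()); // enumerate attached cameras
        BufferedImage img = takeSnapshot();
        System.out.println(img == null ? "no camera" : img.getWidth() + "x" + img.getHeight());
    }
}
```

For live display, the project also provides a `WebcamPanel` Swing component that can sit in your UI while snapshots are taken on demand.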
I need to talk to Video4Linux (to capture output from a webcam) on a Debian system running on an armel system (OpenMoko). The Java Media Framework won't work in this case, as it only has x86 and AMD versions. The Linux kernel is 2.6.24 (with the v4l drivers compiled in separately), and I cannot upgrade it (as a newer one is not available for my hardware).
I have been following closely a project called video4linux4java. It now works with a lot of drivers (therefore a lot of webcams & capture cards), and produces a JPEG-encoded stream of images captured from a video device. Recently, the author has added classes to report information on the video device itself (webcam, TV tuner, ...). It is simple to use and comes with some examples. One of them (used to test v4l4j) displays a video stream in a JFrame. I use v4l4j in my own app to capture frames from my Logitech QuickCam Sphere AF and control the pan and tilt. Works great!
Video4Linux devices should be accessible through a device file (like /dev/video).
So I think you can open the device you want to access as you would a normal file and then read the stream coming from it.
For more info about the devices, the video formats, etc., just check the V4L web site.
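A caveat on that approach: opening the device file is easy, but real V4L capture needs `ioctl()` calls to negotiate a pixel format first, which plain Java I/O cannot issue; that gap is exactly what v4l4j's native code fills. A sketch of the plain-file side, for what it is worth:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch of the "open the device file like a normal file" idea. Real V4L
// capture additionally requires ioctl() format negotiation, which plain
// Java I/O cannot do -- use v4l4j (or JNI) for that part.
public class V4LProbe {

    /** Returns a stream for the device file, or null if it does not exist. */
    public static InputStream openDevice(String path) throws IOException {
        File dev = new File(path);
        if (!dev.exists()) {
            return null;
        }
        return new FileInputStream(dev);
    }

    public static void main(String[] args) throws IOException {
        InputStream in = openDevice("/dev/video0");
        System.out.println(in == null ? "no such device" : "device opened");
        if (in != null) in.close();
    }
}
```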