I want to create a C++ cross-platform (Windows and Mac OS X) application that sends the screen as a video stream to a server.
The application is needed in the context of lecture capture. The end result will be a Flash-based web page that plays back the lecture (presenter video and audio plus slides/desktop).
I am currently exploring a few options:
- Bundle the VLC media player binary with my app and use its desktop-streaming features.
- Use the Qt Phonon library, though it doesn't seem to be powerful enough.
- Send individual screenshots plus a timestamp to the server instead of a video stream; the server would then have to assemble the video stream.
- Implement it in Java and use Xuggler (BigBlueButton uses it for its Desktop Sharing feature).
- ...?
I would greatly appreciate your insights/comments on how to approach this problem.
I think VNC is a great starting point for a software solution. Cross-platform and well tested. I can think of a couple of commercial projects that are derived from VNC - Copilot from Fog Creek springs to mind.
But consider tapping into the projector hardware to capture the slides instead of installing software on every computer lecturers bring in - i.e. a splitter, and then a computer that captures the slide video signal as well as the presenter video signal.
Where I worked, lecturers brought in a plethora of laptops for their presentations and rather disliked the idea of installing anything moments before their presentation.
I'd go for a hardware solution - a Mac mini with Boinx.
There is a bunch of screen-streaming and recording software available. On Windows you can use Windows Media Encoder to do this and even broadcast a live mms:// stream.
Capturing the screen is not hard to do (unless the content on the screen is overlay video or fullscreen 3D graphics). Streaming it live is complicated; encoding and recording it to disk is quite straightforward with most multimedia frameworks (DirectShow, GStreamer).
My solution was to write a simple GUI application in Qt that invokes a VLC process in the background. This works really well.
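Invoking VLC out of process can be sketched like this. The `screen://` input, `-I dummy`, and `--sout` options are real VLC command-line features, but the transcode settings and port below are placeholders; this only assembles the command line, and actually launching it requires VLC to be installed:

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch: build the argument list for an out-of-process VLC
// that captures the desktop and streams it over HTTP as an MPEG-TS.
public class VlcScreenStreamer {

    // Assembles the VLC command line; codec settings and port are placeholders.
    public static List<String> buildCommand(String vlcPath, int fps, int port) {
        return Arrays.asList(
            vlcPath,
            "-I", "dummy",                 // no GUI
            "screen://",                   // VLC's screen-capture input
            ":screen-fps=" + fps,
            "--sout",
            "#transcode{vcodec=h264,vb=800}"
                + ":std{access=http,mux=ts,dst=:" + port + "}"
        );
    }

    public static void main(String[] args) {
        List<String> cmd = buildCommand("vlc", 15, 8080);
        System.out.println(String.join(" ", cmd));
        // To actually launch it (requires VLC on the PATH):
        // new ProcessBuilder(cmd).inheritIO().start();
    }
}
```

The GUI then only needs to manage the child process's lifecycle (start, stop, restart on failure), which is much simpler than linking against a multimedia framework directly.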
You would have thought that there is a simple solution to this, but there isn't :(
My application needs to capture a stream from a camera connected over USB/FireWire (or whatever the connection is); the result would be a file like output.flv. I would prefer to be able to detect all connected cameras and choose which one(s) to use (one or more at the same time --> one or more output files). The application has to be cross-platform.
Found libraries:
Xuggler - not very good camera support; good for manipulating images and video.
JMF - an old API, but if I can use it, I will. I don't see a Mac OS X link on the downloads page.
FMJ - looks like a better version of JMF, but I can't find a way of installing it.
LTI-CIVIL - FMJ uses it. It looks like it only captures still images from the camera (not video); I could use Xuggler to create a video from the images taken with LTI-CIVIL. And like FMJ, it is difficult to install.
What are your suggestions on this one?
I'd recommend vlcj for this - it should be able to stream from webcams onto a Java canvas without any difficulties. It uses native code, so you need to provide libvlc.so/.dll, but from there on it should work on all the major platforms (Windows, Mac, Linux).
You may need to look at out-of-process players for complete reliability, which is a bit more complex (see here for my efforts so far), but once you've got that in place it should work fine.
There really is no good camera support for Java. You will have to use native code, tailored for each platform, through JNI to get video capture for your project.
There's a related question here. Basically they're suggesting OpenCV wrapped with JNI.
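The Java side of such a JNI bridge might look roughly like this. Note that the library name `nativecapture` and the `grabFrame()` signature are made up for illustration; the actual capture code would live in a per-platform C/C++ library (e.g. wrapping OpenCV) compiled to a .dll/.so/.dylib:

```java
// Sketch of the Java side of a JNI-based camera-capture bridge.
// "nativecapture" and grabFrame() are hypothetical names; the real
// work happens in a per-platform native library loaded at runtime.
public class CameraCapture {

    // Returns one raw RGB frame from the default camera, or null.
    public static native byte[] grabFrame();

    // Load the native library lazily so the class can still be
    // compiled and inspected without the .dll/.so present.
    public static void init() {
        System.loadLibrary("nativecapture");
    }

    public static void main(String[] args) {
        init();   // throws UnsatisfiedLinkError if the library is missing
        byte[] frame = grabFrame();
        System.out.println("frame bytes: "
                + (frame == null ? 0 : frame.length));
    }
}
```

The frames returned this way could then be fed to Xuggler (as the question suggests for LTI-CIVIL) to encode them into output.flv.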
I am interested in building a music visualiser using fractal patterns for my final-year project. I have googled it quite a lot and know a bit about fractals; however, I was wondering what software would be used to 'animate' the graphics.
I know Java has drawing APIs (AWT and Swing), but they're probably not the best for the animation factor. On the other hand, Flash has new capabilities within AS3 to produce such an effect, but if this app were to move into mobile development, Flash wouldn't be a great choice. So there still exists a gray patch in my head regarding actual app development. Can anyone give me a heads-up on where to start looking?
I would suggest that you first define what it is you want to build and then choose the best technology for the task. For audio visualization you will most likely be using some kind of Fourier data. This and the fractal math concepts should translate across programming languages well.
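To make "Fourier data" concrete: a visualiser is usually driven by one magnitude per frequency bin, computed from a window of audio samples. A minimal sketch of that idea (a naive DFT; a real project would use an FFT library, but the output is the same):

```java
// Minimal DFT magnitude spectrum - the "Fourier data" most audio
// visualisers are driven by: one magnitude per frequency bin.
public class Spectrum {

    public static double[] magnitudes(double[] signal) {
        int n = signal.length;
        double[] mags = new double[n / 2];     // bins up to Nyquist
        for (int k = 0; k < n / 2; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2 * Math.PI * k * t / n;
                re += signal[t] * Math.cos(angle);
                im -= signal[t] * Math.sin(angle);
            }
            mags[k] = Math.hypot(re, im);
        }
        return mags;
    }

    public static void main(String[] args) {
        // A pure 5-cycle sine over 64 samples peaks in bin 5.
        int n = 64;
        double[] sine = new double[n];
        for (int t = 0; t < n; t++) sine[t] = Math.sin(2 * Math.PI * 5 * t / n);
        double[] mags = magnitudes(sine);
        int peak = 0;
        for (int k = 1; k < mags.length; k++) if (mags[k] > mags[peak]) peak = k;
        System.out.println("peak bin: " + peak);  // prints "peak bin: 5"
    }
}
```

Those per-bin magnitudes are what you would map onto fractal parameters (zoom, colour, iteration depth) each animation frame, and the same computation ports cleanly to any of the platforms below.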
I will often build prototypes in ActionScript or Python just to understand the fundamentals of new topics. Once I have an understanding of the concepts and know the target platform, the prototypes are usually very helpful and sometimes can be easily ported over.
As for quickly prototyping audio visualizations, you could use Processing (Java-based), openFrameworks (C++), Cinder (C++) or Flash.
All of these technologies are cross-platform, let you read audio data in real time, make it quick to create windows, and provide easy-to-use drawing APIs.
Also, it sounds like you are thinking about mobile. I believe both Cinder and openFrameworks can be used for iPhone development, and AS3 can be compiled into an AIR app that runs on the Android platform. Performance of Flash on Android devices varies greatly from device to device, though.
Trying to build a very simple video player component in a JPanel (or something similar) to sit in a Swing app, connect to an MPEG (or, really, anything VLC can output) video stream, and play it. Don't need any controls or anything -- just a live connection to the video stream.
It has to be cross-platform -- at least Mac and Windows (linux would be a nice bonus, but not necessary).
I'm developing in NetBeans, so any specifics regarding that would be extra-helpful.
JMF? Xuggler? Help! Thanks.
I would suggest integrating one of the two best-known Open Source media players, VLC or mplayer. Both projects are widely used, in active development, highly flexible and open to integration. Out of the box they are both able to play dozens of video & audio formats on Windows, OS X, and Linux.
For VLC, there are Java bindings: jvlc (older, no longer maintained) and vlcj (newer, simpler). MPlayer can be embedded into a Java component in "slave mode".
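The mplayer slave-mode route can be sketched as follows. The `-slave`, `-quiet` and `-wid` flags are real mplayer options, but the media URL and window id below are placeholders; obtaining the native window id of a Swing component requires something like JNA and is not shown:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of embedding mplayer in "slave mode": mplayer runs as a child
// process drawing into an existing native window (-wid) and is driven
// by text commands written to its stdin.
public class MplayerSlave {

    public static List<String> buildCommand(long windowId, String mediaUrl) {
        return Arrays.asList(
            "mplayer",
            "-slave",                          // accept commands on stdin
            "-quiet",
            "-wid", Long.toString(windowId),   // render into this window
            mediaUrl
        );
    }

    public static void main(String[] args) {
        List<String> cmd = buildCommand(0L, "http://example.com/stream.mpg");
        System.out.println(String.join(" ", cmd));
        // With mplayer installed you would launch and drive it like:
        // Process p = new ProcessBuilder(cmd).start();
        // p.getOutputStream().write("pause\n".getBytes());
        // p.getOutputStream().flush();
    }
}
```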
As an alternative, the Java Media Framework (JMF) may be a Java-friendly way of embedding video, but it is not so up-to-date and few people recommend it. Docs can be found here.
JavaFX will do just that. It's quite simple to use and supports Windows, Mac and Linux. Playing most static video files is okay, but you do have to test streaming.
Let me first state that I do not know Java. I'm a .NET developer with solid C# skills, but I'm actually attempting to learn Java and the Android SDK at the same time (I know it's probably not ideal, but oh well, I'm adventurous :))
That said, my end goal is to write a streaming media player for Android that can accept Windows Media streams. I'm okay with restricting myself to Android 2.0 and greater if I need to. My current device is a Motorola Droid running Android 2.0.1. There is one online radio service I listen to religiously on my PC that only offers Windows Media streaming, and I'd like to transcode the stream so my Android device can play it.
Is such a thing possible? If so, would it be feasible (i.e., would it be too CPU intensive and kill the battery)? Should I be looking into doing this with the NDK in native code instead of Java? I'm not opposed to writing some sort of service in between that runs on a desktop computer (even in C#), but ideally I'd like to explore purely device-based options first. Where should I start?
Thanks in advance for any insight you can provide!
Having a proxy on your PC that captures Windows audio output, encodes it, and sends it to your phone is perfectly possible. I had something like that 8 years ago on a Linux-based PDA (Sharp Zaurus). The trick is that you're not trying to decode or access the radio stream directly; you're simply capturing what is being sent to the speakers on your desktop and re-sending it. There will be a minor hit in audio quality due to the re-encode, but it shouldn't be too bad.
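At its core such a proxy is just a copy loop from the audio capture to the network connection, with an encode step in between. A minimal sketch, with plain streams standing in for the capture device and the socket (a real proxy would read from something like a javax.sound.sampled.TargetDataLine and would compress each chunk before sending):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Core of an audio relay proxy: read captured PCM in small chunks and
// forward it to the network. The capture source and the socket are
// stood in for here by plain streams.
public class AudioRelay {

    public static long relay(InputStream capture, OutputStream network)
            throws IOException {
        byte[] chunk = new byte[4096];
        long total = 0;
        int n;
        while ((n = capture.read(chunk)) != -1) {
            // A real proxy would encode here (e.g. to MP3/AAC) to save
            // bandwidth before writing to the connection.
            network.write(chunk, 0, n);
            total += n;
        }
        network.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] fakePcm = new byte[10_000];   // stand-in for captured audio
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long sent = relay(new ByteArrayInputStream(fakePcm), sink);
        System.out.println("relayed " + sent + " bytes");
    }
}
```

The Android side then only has to play a conventional audio stream, which the platform's MediaPlayer already handles, so no NDK work should be needed for this route.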
I've done cloud-to-phone transcoding using an alpha version of Android Cloud Services. The transcoding is transparently done on a server and the resulting stream is played on the phone. Might be worth having a look: http://positivelydisruptive.blogspot.com/2010/08/streaming-m4a-files-using-android-cloud.html
I'm trying to create an app to control a PC remotely using Java. I want to use Red5 to let the admin control desktops using a Flash movie,
so I need to find Java classes to:
- capture the desktop as live video
- control the mouse and keyboard
TightVNC has a Java viewer, so you can easily manage your server through the VNC protocol and use a Java client (usable as an applet, too).
There are tools, though not in Java, such as vncrec, to record VNC sessions. I don't know if this is exactly what you are looking for, since to distribute video a better choice would be to set up a streaming server.
Stock Java only gets you part of the way here: java.awt.Robot can take screenshots and inject mouse/keyboard events, but it is generally too slow for smooth live capture and misses overlay and accelerated content. For efficient capture you'd need to write a native DLL that hooks into the OS, expose the functionality through JNI, and load that in Java. Then you'd have to convert the image data you get into a format that Flash can understand.
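Whatever mechanism produces the frames (a native capture hook, or java.awt.Robot as a slow pure-Java starting point), each captured BufferedImage has to be converted into something a Flash client can decode. A minimal sketch, JPEG-compressing one frame with the standard ImageIO API (the image here is a synthetic stand-in for real desktop pixels):

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

// Turning a captured frame into bytes a Flash client can decode.
// The BufferedImage here is synthetic; in a real screen-sharing server
// it would come from a JNI capture hook or java.awt.Robot.
public class FrameEncoder {

    public static byte[] toJpeg(BufferedImage frame) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(frame, "jpg", out);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        BufferedImage frame =
            new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = frame.createGraphics();
        g.fillRect(0, 0, 320, 240);    // stand-in for real desktop pixels
        g.dispose();
        byte[] jpeg = toJpeg(frame);
        System.out.println("jpeg bytes: " + jpeg.length);
    }
}
```

A sequence of such JPEG frames could be pushed to Red5 for delivery to the Flash client, though at that point you are effectively streaming images rather than true video.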
Creating a live video stream probably wouldn't be the most effective approach. Streamed video is generally fairly lossy (which isn't great for screen sharing) and at smaller resolutions than you'd probably be dealing with. I'm not sure you could create a video that would both stream fast enough and have high enough image quality to be usable.
You're probably better off using a pre-existing product for this (like TightVNC, as Fernando suggested). Unfortunately, that would require a Java applet (or native application) to view, not Flash.
(Full Disclosure: I've written screen sharing applications that use Java on both ends, both the server and the viewer, and we've looked at trying to make a Flash viewer a number of times.)
Just use VNC. You can launch the VNC server executable from Java, and on the client you can use the Java VNC viewer. I'm sure there's source code somewhere if you really need to make changes. You could use AppletWindow from BlueJ to launch the VNC viewer applet inside a JFrame in any Java app.