I installed Red5 1.0 on EC2 running linux. My goal is to record webcam video from my website -- connect to a user's webcam and save the video to S3.
I tried out the video recorder application in the pre-installed demo apps. It works, but when I play back the recorded flv video, the quality is terrible.
At best, the video is extremely pixelated and blurs with motion.
At worst, the video doesn't even play -- it just stays stuck on one frame.
Most often, the video and audio are totally out of sync and choppy. I found that I could affect this by increasing the buffer allowance on the server using one of the config files, but increasing the buffer only seems to make the video choppier.
I've tried connecting with multiple computers and even a fast corporate internet connection. Interestingly, the quality issues persist even when connecting to localhost, so it doesn't seem to be a network problem.
When I use the red5-recorder.com flash app to record to the demo server app, the quality is even worse.
Ultimately, I just want to get a high quality video recording from a visitor's webcam, but don't want to drop the money for FMS or Wowza.
Any ideas on how to get Red5 to record high quality? Is it always this bad?
Thanks for your help!!
The quality of the recorded video is not related to Red5 settings but to your Flash app settings. Just try setting Camera.setQuality() to something more suitable for your needs. For example, if you use setQuality(0, 100) you'll get the best possible video quality, but bandwidth use will also increase.
We experienced the same problem with version 0.91. I read somewhere that 0.8 was fine. You might try that one.
Try using Red5 RC1; it will surely give you much better recordings. I am also trying to find something even better, but I haven't come up with anything yet. If you have solved your problem, I would be glad to hear a better approach.
All Red5 versions up to and including 1.0.2 have been plagued by serious video recording issues. See this answer for a list of all versions and their issues.
Red5 1.0.3 is the first version of Red5 with the video recording process fixed, because it's the first to contain this awesome patch.
Quick explanation of the two-part cause
Flash Player buffering (only) video packets
Flash Player is known to buffer video packets and send only audio packets when the network conditions do not allow it to send both.
This works very well for live video scenarios where you want to keep at least the audio going but NOT for video recording scenarios where the locally buffered video packets end up arriving too late at the media server (the corresponding .flv section might have already been written to disk).
That's why AMS and Wowza have implemented a delayed write mechanism, where they wait for the video packets to arrive before writing the data to disk.
Red5's bug
Red5 also had such a mechanism but, due to a serious bug, it was dropping the video packets instead of waiting for them.
The bug was fixed with the patch mentioned above.
How long Red5 will wait for buffered video packets is controlled by fileconsumer.queue.size in conf/red5.properties. It defaults to 120, which should be enough for a buffer of 2 minutes of HD video.
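For reference, the relevant entries in conf/red5.properties look roughly like this (the 120 default is the one mentioned above; the delayed.write flag is the one discussed in the mailing-list thread linked below, and exact property names may differ slightly between versions):

```properties
# conf/red5.properties -- FLV file consumer settings (names may vary by version)
fileconsumer.delayed.write=true
fileconsumer.queue.size=120
```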
Further reading
delayed.write and queue.size mechanism flawed? on the Red5 mailing list
Recording high quality (HD) video over slow connections with Red5 is now possible
I'm using vlcj in Java to play back WAV files that are being expanded (appended to) by other services while they play. My problem is the following:
When I open an audio file, the player knows that the duration is x seconds. If it reaches the end, it has to reopen the file (player.controls().start()) to learn the expanded duration. This restart causes a micro stutter in playback (around 0.1 second), but it can be heard, and if the expanding process is slow then the stuttering can happen as often as every 2 seconds.
I have no way to modify the underlying architecture, so I cannot use streaming.
The main issue, as I see it after hours of research, is that vlcj does not provide a way to manually update the duration of the audio file being played (which I could calculate from the file size); it probably loads the file into memory on open, I guess.
Has anybody tried to implement this kind of playback successfully?
After extensive research into the topic, I concluded that stutter-free playback with the setup described above cannot be achieved. It can be optimized greatly, but it will never be perfect. The root of the problem is that the length (duration) of the WAV file being played cannot be set manually, so a reload is required to rescan it.
My solution was to put a live555 RTSP streaming server between the storage and the client. The vlcj player connects to the RTSP stream and plays the (expanding) file seamlessly. The drawbacks are that the play time needs specific handling, and performance is lower (slower response to events).
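For anyone trying the same approach, here is a minimal sketch of the client side, assuming vlcj 4.x and a hypothetical rtsp://localhost:8554/session.wav URL exposed by the live555 server (your stream name and port will differ):

```java
import uk.co.caprica.vlcj.player.component.AudioPlayerComponent;

public class RtspWavClient {

    public static void main(String[] args) throws InterruptedException {
        // Audio-only media player backed by the native VLC engine
        AudioPlayerComponent player = new AudioPlayerComponent();

        // Hypothetical URL of the live555 server fronting the growing WAV file;
        // because this is a live stream, vlcj never has to re-open the file to
        // pick up the new duration, so playback stays seamless.
        player.mediaPlayer().media().play("rtsp://localhost:8554/session.wav");

        // Keep the JVM alive while the stream plays
        Thread.currentThread().join();
    }
}
```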
I am trying to create a low-latency way to use an Android device as a secondary display for a PC. So far, all I have found has been either wireless streaming or a slow USB connection (e.g. using iDisplay).
However, I found a DSLR camera controller app (https://play.google.com/store/apps/details?id=com.dslr.dashboard/) that is able to stream a live feed from the camera to an Android display via USB. Would it be possible to edit the source code of this application so it can read the video output of a PC via USB? If so, how would you go about this? Do you think this would be a low-latency alternative?
Thank you!
Lots of fantasy in your question. Have you ever seen a PC outputting data from one of its USB ports to another device? How are you supposed to do that? With a plain male-to-male USB cable, in case you find one? Sorry, but things don't work that way. To transfer data (files, or a network connection) via USB between two computers you'd need some proprietary/specific software. Of course, once you have accomplished that, it is technically possible to transfer the screen content. But you'd need to develop software that captures the computer screen, compresses it in real time, and sends it over USB with low enough latency to be usable. That's going to be resource intensive.
A better, easier approach might be to use some sort of remote desktop or VNC on the Android device, with the computer acting as a server. That is at least far more feasible than trying to implement a similar protocol yourself.
Sorry but what you are trying to achieve is flawed from the beginning.
I download an MP4 file from the internet and play it on a BlackBerry device. I get the following error: "the video portion of the media being played uses an unsupported format". The audio starts playing, but the video doesn't play while this error is shown.
It should be noted that this only happens on Device OS 5 & 6. The same video plays properly on OS 7 and OS 7.1. I am guessing this is because RIM included some updates to the MMAPI. What could I do to allow devices prior to OS 7 to play the videos? OS 5 & 6 devices play MP4 files, just not all of them.
I have been looking into custom decoding the bytes of the MP4 file, but that would take a lot of time: studying existing decoder implementations before adapting one to J2ME is not an easy task.
Any help would be great here.
Edit:
The video content owners have control of the videos on the server side, but they aren't willing to re-encode, mainly due to size concerns on the server, even though I recommended they do so.
The resolution of the video is about 720w x 400h. This is quite high for a BB, but the Bold 9790 and Torch 9810 both play it without a problem. So why can't the Bold 9780 play the same file?
Update:
Regarding the problem with the video playing on a 9790 and not a 9780, those are different devices. The 9790 came out about a year after the 9780, and apparently RIM added more capability.
From 9780 specs:
Video player DivX/WMV/XviD/3gp
From 9790 specs:
DivX/XviD/MP4/H.264/H.263/WMV player
So, that explains why you can't get that video to play on the 9780. If playing this video is fundamental to your app, you might change the settings in BlackBerry App World to list it as incompatible with 9780s. If this is only one of many features of your app, you might at least catch the media exception and inform the user gracefully that their device can't play the video requested, so they don't think it's your app's fault.
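For the graceful-failure part, here is a rough sketch; the method and URL are made up, and on a real device Dialog.alert must run on the event thread, hence the invokeLater:

```java
import java.io.IOException;
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.component.Dialog;

public final class VideoLauncher {

    public static void playVideo(String url) {
        try {
            Player player = Manager.createPlayer(url);
            player.realize();   // throws MediaException if the codec is unsupported
            player.start();
        } catch (MediaException me) {
            // Unsupported format on this device -- tell the user instead of failing silently
            UiApplication.getUiApplication().invokeLater(new Runnable() {
                public void run() {
                    Dialog.alert("Sorry, this device cannot play this video format.");
                }
            });
        } catch (IOException ioe) {
            // Network / file problem -- handle separately from the format issue
        }
    }
}
```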
Original Answer:
MP4 is actually a family of related formats: the container can hold a number of different audio and video codecs.
The actual support for your video probably depends both on your BlackBerry OS version (e.g. 5/6/7) and also the device itself.
Here is a BlackBerry reference document that describes video format capabilities of various BlackBerry devices.
See also this reference document.
Of course, different devices also have different sizes of screens.
It might be useful for you to produce the videos in a variety of formats and resolutions, and have your BlackBerry app download different versions of the video depending on the device. Since video downloads are slow, doing it this way will also ensure that the user sees the fastest possible download on their device. There's no use downloading a higher resolution than the device can display.
You didn't specify whether you have control of the videos on the server side or not, so this may not be an option for you.
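If the server-side option is available, a crude way to pick a rendition by screen width might look like the sketch below; the URLs and size breakpoints are made up for illustration:

```java
import net.rim.device.api.system.Display;

public final class VideoUrlChooser {

    // Hypothetical base URL -- replace with your own server paths
    private static final String BASE = "http://example.com/videos/";

    public static String pickUrl(String name) {
        int width = Display.getWidth();
        // Wider screens get a larger encode, narrower ones a smaller/lighter file
        if (width >= 640) {
            return BASE + name + "_640x360.mp4";
        } else if (width >= 480) {
            return BASE + name + "_480x270.mp4";
        } else {
            return BASE + name + "_320x180.3gp";
        }
    }
}
```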
I need to set up live streaming from a number of web-cameras to the internet (in browsers), and the streams should be visible only to particular users. I.e. user A logs in to my system with his or her login/password, goes to the video stream page, and sees the stream from a particular cam, and other users cannot see that video, even if they know the url to that stream.
I've looked at a number of solutions so far, but some of them are obsolete, most of them are for image processing, recognition and the like, and some are just a bit too cumbersome, like Red5, for example.
Is there a relatively simple solution that would just allow me to get a video stream from a particular cam connected to my computer?
Thanks in advance.
If you are using Linux, I have had success with V4L4J (Video 4 Linux 4 Java), which is small and really quite cool. You would need to do quite a bit of work to get streaming working over a low-bandwidth connection, but if it is over a LAN, playing back an MJPEG stream over a TCP socket is easy beans :-)
http://code.google.com/p/v4l4j/
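To give an idea of the "easy beans" part, here is a bare-bones MJPEG-over-HTTP server skeleton; grabJpegFrame() is a placeholder for however you pull JPEG-encoded frames out of v4l4j, and the port is arbitrary:

```java
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class MjpegServer {

    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(8080);
        while (true) {
            Socket client = server.accept();
            OutputStream out = client.getOutputStream();
            // Multipart response: each part is one JPEG frame
            out.write(("HTTP/1.0 200 OK\r\n"
                    + "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n\r\n")
                    .getBytes("US-ASCII"));
            try {
                while (true) {
                    byte[] jpeg = grabJpegFrame(); // placeholder: fetch a frame from v4l4j
                    out.write(("--frame\r\nContent-Type: image/jpeg\r\n"
                            + "Content-Length: " + jpeg.length + "\r\n\r\n").getBytes("US-ASCII"));
                    out.write(jpeg);
                    out.write("\r\n".getBytes("US-ASCII"));
                    out.flush();
                }
            } catch (Exception e) {
                client.close(); // client went away; wait for the next one
            }
        }
    }

    private static byte[] grabJpegFrame() {
        // Wire this up to v4l4j's JPEG frame grabber
        throw new UnsupportedOperationException("not implemented");
    }
}
```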
Good luck.
Have a look at the Java Media Framework (JMF) to communicate with your cam.
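A very small sketch of finding a capture device with JMF and starting a player (JMF is old, so the webcam has to be registered with JMF's capture device list first for this to find anything):

```java
import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Manager;
import javax.media.Player;
import javax.media.format.VideoFormat;

public class WebcamPreview {

    public static void main(String[] args) throws Exception {
        // Ask JMF for every registered device that can deliver video
        Vector devices = CaptureDeviceManager.getDeviceList(new VideoFormat(null));
        if (devices.isEmpty()) {
            System.out.println("No JMF video capture device found");
            return;
        }

        CaptureDeviceInfo device = (CaptureDeviceInfo) devices.get(0);
        Player player = Manager.createRealizedPlayer(device.getLocator());
        player.start();
        // player.getVisualComponent() can be dropped into a Swing frame for a local preview
    }
}
```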
Let me first state that I do not know Java. I'm a .NET developer with solid C# skills, but I'm actually attempting to learn Java and the Android SDK at the same time (I know it's probably not ideal, but oh well, I'm adventurous :))
That said, my end goal is to write a streaming media player for Android that can accept Windows Media streams. I'm okay with restricting myself to Android 2.0 and greater if I need to. My current device is a Motorola Droid running Android 2.0.1. There is one online radio service I listen to religiously on my PC that only offers Windows Media streaming, and I'd like to transcode the stream so my Android device can play it.
Is such a thing possible? If so, would it be feasible (i.e., would it be too CPU intensive and kill the battery)? Should I be looking into doing this with the NDK in native code instead of Java? I'm not opposed to writing some sort of service in between that runs on a desktop computer (even in C#), but ideally I'd like to explore purely device-based options first. Where should I start?
Thanks in advance for any insight you can provide!
Having a proxy on your PC that captures the Windows audio output, encodes it, and sends it to your phone is perfectly possible. I had something like that 8 years ago on a Linux-based PDA (Sharp Zaurus). The trick is that you're not trying to decode or access the radio stream directly; you're simply capturing what is being sent to the speakers on your desktop and re-sending it. There will be a minor hit in audio quality due to the re-encode, but it shouldn't be too bad.
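A rough sketch of the desktop side of that capture-and-relay idea, using plain javax.sound.sampled; this assumes a loopback device such as "Stereo Mix" is the default recording device, the port is arbitrary, and it ships uncompressed PCM (a real setup would run the buffer through an MP3/AAC encoder before writing):

```java
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.TargetDataLine;

public class AudioRelay {

    public static void main(String[] args) throws Exception {
        // 44.1 kHz, 16-bit, stereo, signed, little-endian PCM
        AudioFormat format = new AudioFormat(44100f, 16, 2, true, false);
        TargetDataLine line = AudioSystem.getTargetDataLine(format);
        line.open(format);
        line.start();

        ServerSocket server = new ServerSocket(9000);
        Socket phone = server.accept();            // the Android client connects here
        OutputStream out = phone.getOutputStream();

        byte[] buffer = new byte[4096];
        int read;
        while ((read = line.read(buffer, 0, buffer.length)) > 0) {
            out.write(buffer, 0, read);            // raw PCM; swap in an encoder here
        }
    }
}
```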
I've done cloud-to-phone transcoding using an alpha version of Android Cloud Services. The transcoding is done transparently on a server and the resulting stream is played on the phone. It might be worth having a look: http://positivelydisruptive.blogspot.com/2010/08/streaming-m4a-files-using-android-cloud.html