Capture video to memory and play back with ActionScript? - java

This may or may not even be possible, but here's the situation: I want to use the ActionScript 3 Camera class to capture video from a local camera (webcam, built-in camera, etc.) and then play that video back within the Flash application.
I'm considering the possibility of sending it to a Flash Media Server and then streaming it back as an on-demand video, but I would ideally like to keep the whole thing client-side for best performance.
I'm open to the idea of using a different platform (Java was one consideration) as long as it can be embedded in a web page, but I would like to keep development as straightforward as possible and make the process of accessing the application as easy as possible for the end user, which is why I chose Flash initially.
If anyone knows of a way to do this I welcome any input.

Okay, here's an update for anyone else who might be up against the same hurdle I was. I was able to accomplish what I wanted (record a video, let the user preview it, then upload it, all from one Flash application) by using a utility called flvEncoder, written by Lee Felarca (zeropointnine, http://www.zeropointnine.com/).
The concept is as follows:
Record the audio and video data in raw format (much like Valentin Simonov suggested)
Pass the data to flvEncoder for encoding in Flash FLV format and get a ByteArray back. I know it seems redundant to say Flash FLV, but I word it that way because Flash and Adobe Media Player appear to be the only things capable of interpreting the result.
Create a NetStream instance, put it in Data Generation Mode, and use its appendBytes() method to feed the encoded data to a Video object attached to that NetStream.
Use FileReference.upload() to send the data to the server in an HTTP request (a sketch of the receiving end follows below).
It could potentially eat a lot of memory, but I only needed to record short videos anyway. I won't post the code here because it's messy and tied to a proprietary project, but I hope this information is helpful to someone. Thanks for the responses!
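For the final upload step, the receiving end is an ordinary multipart POST. Here is a minimal sketch of a Java servlet that saves the posted FLV; the save path is hypothetical, and "Filedata" is FileReference.upload()'s default form-field name, so adjust it if you pass a different uploadDataFieldName.

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

// Servlet 3.0 endpoint that stores the FLV posted by FileReference.upload().
@MultipartConfig
public class FlvUploadServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Part part = req.getPart("Filedata"); // FileReference's default field name
        try (InputStream in = part.getInputStream()) {
            // Hypothetical destination; use a unique name per upload in practice.
            Files.copy(in, Paths.get("/var/uploads/recording.flv"),
                    StandardCopyOption.REPLACE_EXISTING);
        }
        resp.setStatus(HttpServletResponse.SC_OK);
    }
}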

The easiest way would be to use FMS, Wowza or Red5 media servers. You just use NetStream to send data to your server, save movies there and stream back.
Also, I suppose it is the only reliable way of doing it. Camera, Video and NetStream objects don't give you access to the actual video bytes. What you could do is attach your Camera to a Video instance and draw it into a bitmap every 1/24th of a second. After that you will still have to encode the data or you'll run out of memory very fast. I'm not sure whether any FLV/H.264 encoders written in AS3 are available, but in any case I bet it will be slow.

How to play the audio of a SoundCloud song via URL input

I am coding a Minecraft plugin (a modification to the game Minecraft that runs on a Minecraft server, not a client modification),
and I want to create something that when a player inputs
"/playsong"
along with a SoundCloud URL, it will find that URL and loop it.
(being able to play a SoundCloud playlist would be much appreciated)
All the other work will be done by me. (checking if the URL is null, testing if the player sends /playsong along with a valid URL, etc.)
Adding comments with // telling me what certain bits of code do will be much appreciated.
Thank you for reading.
~Matthew274
This generally is not possible. Minecraft does not have any way to play an arbitrary song that requires downloading from a remote server. There are a few ways you could still do this, but it isn't simple.
One technique would be to take the song and attempt to convert it to a series of notes for use with /playsound (or note blocks). That isn't easy, nor will it give a perfect result, but it is hypothetically doable (though I don't know any of the details).
The other technique would be to create a resource pack, send it to the client, and then have them play the song. That as well is nontrivial, but still doable. The general procedure would be:
Find and download the song from SoundCloud (I don't know SoundCloud's API for this, but I assume they have one).
If necessary, convert the song from .mp3 into a .ogg file. There's probably also a library for this.
Create a resource pack with that sound (and the appropriate sound index) and temporarily make it available for download on your server.
Send a Resource Pack Send packet linking to that resource pack.
Wait for the client to respond with Resource Pack Status of "accepted".
Play the song using the Named Sound Effect packet (or /playsound).
This process is not simple, but if you do some things like keeping a collection of songs in the playlist and sending multiple songs in the resource pack, you should be able to implement it. You'll need ProtocolLib to send custom packets.
I'm sorry I can't give full code for this, but it's a complicated task that would require several parts, none of which I know the full details on; a rough sketch of the resource-pack half follows.
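For steps 4 through 6, a rough sketch using the Bukkit API might look like the following. The pack URL and sound name are hypothetical, and steps 1 through 3 are assumed already done; note that Bukkit's Player#setResourcePack() wraps the Resource Pack Send packet for you, so ProtocolLib is only needed if your server version lacks it.

import org.bukkit.entity.Player;
import org.bukkit.event.EventHandler;
import org.bukkit.event.Listener;
import org.bukkit.event.player.PlayerResourcePackStatusEvent;

// Register this Listener in your plugin's onEnable().
public class SongSender implements Listener {

    // Step 4: ask the client to download the hosted pack.
    public void sendSong(Player player) {
        player.setResourcePack("http://example.com/packs/song-pack.zip"); // hypothetical URL
    }

    // Step 5: wait for the client to report that the pack loaded...
    @EventHandler
    public void onPackStatus(PlayerResourcePackStatusEvent event) {
        if (event.getStatus() == PlayerResourcePackStatusEvent.Status.SUCCESSFULLY_LOADED) {
            // Step 6: ...then play the custom sound by the name from the pack's sound index.
            Player player = event.getPlayer();
            player.playSound(player.getLocation(), "custom.playsong.track", 1.0f, 1.0f);
        }
    }
}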

How to stream server webcam to servlet

I want to show in my servlet the webcam connected to the server. I have read on many sites that I could use getUserMedia(), but that only captures the webcam on the user's machine, not the one on the server.
How can I do that? My servlet is programmed in Javascript.
First, I heavily doubt that your servlet is written in JavaScript. Are you sure we aren't talking about Java?
What you describe sounds like you want to do a live stream of your webcam. Compared to video on demand, this is a demanding task and needs quite some knowledge and experience.
We are not talking about streaming from one point to another ("unicast streaming"), but a multicast stream, where somebody opens a web site and connects to the stream. In order to do that, you have to send the video stream of your webcam to a multicaster, encoded in a way suitable for the intended audience. So what basically happens is that you capture the video data of your webcam, encode it into a format capable of being streamed, and send it to a multicaster, which copies the stream to every client that connects to it. That client can be either a standalone media player such as QuickTime, VLC or WMP, or a player embedded into a website.
So in short and a bit more specific, you have to do the following:
Capture the output of the webcam and encode it according to your intended audience. VLC is a good tool for that.
Set up a multicaster, such as the excellent Darwin Streaming Server, to which you send the stream. This server has to be publicly accessible.
Create a link to the stream's description file (SDP file), usually generated by the Darwin Streaming Server. This will connect the client to the stream. An alternative may be a player embedded into your website; that is basically your choice.
Doing this right is not only a programmer's work, but a lot of sysadmin work, too. You have to do bandwidth and capacity planning, optimize the encoder, choose the right codec, and much, much more. All those choices are heavily influenced by the type and size of your intended audience, the purpose of the stream, and a lot more.

Live Streaming Topic

This could be quite an interesting topic for people who are interested in livestreaming from a device to a webserver. (Primarily Android/Java.)
I have finally found a way to livestream from my device's camera to my webserver (website). On a wifi network it takes approx. 1 frame per second to show up, and it also works on EDGE/3G networks. In this topic/question, I want to discuss new techniques, improvements and ideas about livestreaming, and I will share mine with you (code is appreciated too).
My code repeatedly takes a snapshot from the camera preview using setOneShotPreviewCallback() to call onPreviewFrame(). The frame is delivered in NV21, a YUV semi-planar format, so raw2jpg() converts it into 32-bit ARGB for the JPEG encoder.
getPicture() is called by the application; it produces the JPEG data for the image in the private byte array mCurrentFrame and returns that array.
After this, the byte array mCurrentFrame gets Base64-encoded and sent to my webserver in an HTTP POST, together with an ID code of my own so that other people can't push images to it. At the webserver, it gets decoded again and written to the file test.jpg. PHP and JavaScript run on the webserver: PHP handles the POST, and JavaScript reloads the image every 750 ms. This is basically how it works.
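For reference, a minimal sketch of that capture loop follows. It swaps the hand-rolled raw2jpg() for Android's built-in YuvImage JPEG encoder and assumes a hypothetical postToServer() HTTP helper.

import java.io.ByteArrayOutputStream;

import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.util.Base64;

public class SnapshotStreamer implements Camera.PreviewCallback {

    public SnapshotStreamer(Camera camera) {
        camera.setOneShotPreviewCallback(this); // ask for the first frame
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        // Preview frames arrive as NV21 by default; YuvImage encodes that to JPEG directly.
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 70, jpeg);
        String body = Base64.encodeToString(jpeg.toByteArray(), Base64.DEFAULT);
        postToServer(body);                     // hypothetical HTTP POST helper
        camera.setOneShotPreviewCallback(this); // re-arm for the next frame
    }

    private void postToServer(String base64Jpeg) {
        // POST base64Jpeg (plus an ID token) to the PHP endpoint; omitted here.
    }
}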
Now I am very interested in your ideas, improvements and other things you would like to add/ask. Here are some of my questions:
1) What would be the best method for live streaming WITH audio? Video Recording OR my method + Audio recording?
2) How would you approach video record streaming?
3) How would you stream audio to the webserver? (Main goal) (With Java, PHP and JavaScript)
4) I am also planning to add typical live streaming features to it, e.g. when a famous person appears, you could have the ability to show his name while you are live streaming, or just add an image from your SD card to your livestream. Would you also decode it and overlay the image, or put the image in your livestream in some way?
This topic is primarily for questions, and it could be a great help for some people out here. Therefore I added a bounty of 50 (woot!) rep to it.
Sincerely,
XverhelstX
It strikes me that HTTP posting is probably not a good way to do live streaming of video to your server. Other people have been playing with live streaming, and they've used a socket to broadcast live video and audio streams to their servers.
I thought this was fascinating -- here's a link.
http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system
But the guy also posted a partial code sample -
import java.net.InetAddress;
import java.net.Socket;

import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

// Open a TCP connection to the server and wrap it in a file descriptor
// that MediaRecorder can write to as if it were a local file.
String hostname = "your.host.name";
int port = 1234;
Socket socket = new Socket(InetAddress.getByName(hostname), port);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

MediaRecorder recorder = new MediaRecorder();
// Additional MediaRecorder setup (output format ... etc.) omitted
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
The cool part I didn't know about is the ParcelFileDescriptor, which wraps the socket in a file descriptor, so anything that gets written to that "file" is actually broadcast out over the web to a remote server. Sockets are also the right way to go about doing this sort of thing because they allow you to continuously send data until your recording is complete without having to re-send headers over and over.
What I think is cool about this technique is that he's literally taking the output from MediaRecorder (which is going to be an encoded video stream) and pumping it over a socket to his server. Then he can simply save out the data that's coming in over the socket. No frame-by-frame work, no processing (the Android SDK doesn't expose the encoders very well, and they're pretty performance intensive).
People report that it works, but I haven't tested. Anyway, hope this is helpful.
You are sending a whole snapshot each time? Why don't you try to use some video compression techniques: instead of sending a full image each time, send a compressed version (maybe a diff or something like that), and then on the server you create the image based on your last image and the data just received. I think all video codecs do this; you could try looking at some of the open codec specifications to get some ideas.
About audio: I would send the audio stream separately and then sync it with the video stream based on which video frame is showing right now.
Basically, I would try to get my streaming as close as possible to how real video streaming works. Maybe you could look into ffmpeg; ffmpeg has an RTSP server, and if you could build that for Android then you would simplify your work a lot.
Note: I'm not an Android developer.
From what you've said it seems like you're just taking snapshots instead of doing any real streaming. If you're worried about bandwidth then use a lower resolution. Exactly how to do this in Android I'm not sure.
I think that if there are built-in streaming classes, you'll be able to get both the video stream and the audio stream. Don't do any local transcoding (your raw2jpg() counts as transcoding) as it might use too much processing power. Just take the stream, compress it, and send it to your server.
EDIT:
Some links to get you started:
An interesting project that turns the Android phone into an IP camera. You could dig around the code to figure out how they get hold of the camera stream.
An SO question on this topic
1) What would be the best method for live streaming WITH audio? Video Recording OR my method + Audio recording?
This really depends on your view of "best". If you are optimizing for resource usage rather than quality, then your way is really good.
Otherwise, you should use a native streaming mechanism, or maybe implement a video streaming technique that encodes and streams the video.
3) How would you stream audio to the webserver? (Main goal) (With Java, PHP and JavaScript)
I suggest that you stick with MediaRecorder, because it already does what you're doing in a good way. Still, try to find a way to get at the stream so you can send it your way; files are not the best choice, although you could stick with files and send small files in a timely manner (see the sketch below). That way you put a bigger portion of the load on the server rather than the client.
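As a sketch of the "small files in a timely manner" idea: cap each MediaRecorder run with setMaxDuration() and roll over to a new file whenever the cap is hit. The output path and the upload() helper below are hypothetical.

import java.io.File;
import java.io.IOException;

import android.media.MediaRecorder;

public class SegmentRecorder {
    private int segment = 0;

    public void startNextSegment() {
        final File out = new File("/sdcard/stream/segment-" + (segment++) + ".3gp");
        final MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(out.getAbsolutePath());
        recorder.setMaxDuration(5000); // 5-second segments
        recorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
            public void onInfo(MediaRecorder mr, int what, int extra) {
                if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
                    mr.stop();
                    mr.release();
                    upload(out);        // hypothetical HTTP upload helper
                    startNextSegment(); // roll over to the next segment
                }
            }
        });
        try {
            recorder.prepare();
            recorder.start();
        } catch (IOException e) {
            recorder.release(); // handle setup failure
        }
    }

    private void upload(File segmentFile) {
        // POST the finished segment to the PHP endpoint; omitted here.
    }
}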
4) I am also planning to add typical live streaming features to it, e.g. when a famous person appears, you could have the ability to show his name while you are live streaming, or just add an image from your SD card to your livestream. Would you also decode it and overlay the image, or put the image in your livestream in some way?
Do not even try to put it into your livestream. With your PHP server you have more capabilities: send this info along with a certain tag and let the server do the processing, or maybe the integration of these with the video.

How do you create a web audio playlist?

I'm researching ways to create a web radio station of sorts. It will have streaming MP3 audio from TV programs for users to listen to. They should have the option of just listening to the stream or pick the shows they'd like to hear and add them to their playlist.
It needs to be usable by folks on mobile devices, so Flash is out for that reason. Also, the admin folks should be able to add programs to the player and maintain the list of available programs.
Are there any existing tools for such an app? We work in a Unix, PHP, Java environment with MySQL and Oracle db. We'll even take a solution that's in ASP.NET! Your assistance is much appreciated. Thanks.
As a server, you might consider using SHOUTcast, by the same folks who've made Winamp. SHOUTcast can stream audio in a number of formats. Or, you can write a web application that dishes content over HTTP with the proper MIME type set.
SHOUTcast - download info # classic.shoutcast.com
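If you take the second route and serve the content over HTTP yourself, a minimal sketch of a Java servlet with the proper MIME type might look like this; the file path is hypothetical, and a production version would also support range requests for seeking.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Streams one MP3 with the audio/mpeg MIME type set.
public class AudioServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setContentType("audio/mpeg");
        // Hypothetical location; in practice, map a request parameter to a file.
        Files.copy(Paths.get("/var/audio/show-ep1.mp3"), resp.getOutputStream());
    }
}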
To reference content on clients, you should consider using the .M3U format for delivery. This lets you specify a playlist in an application-agnostic way.
M3U format # Wikipedia
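For illustration, an extended M3U playlist is just a text file; a hypothetical playlist of two shows would look like this (the number after #EXTINF is the track length in seconds; -1 is common for streams of unknown length):

#EXTM3U
#EXTINF:1800,Example Show - Episode 1
http://example.com/audio/show-ep1.mp3
#EXTINF:1800,Example Show - Episode 2
http://example.com/audio/show-ep2.mp3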

How to create a stream of jpegs (live) in c++ or c# or java? RTSP?

I am building a server-side HTML-rendering browser that renders HTML and sends JPEGs to the mobile client. I need to figure out how to build a server that grabs JPEGs and streams them in a session to a client that I am going to write in J2ME.
It isn't completely clear what you mean by "live", but I'm guessing that you are talking about making requests to a server-side process that renders the URLs passed in and returns images. One of the easiest ways I know of doing this is with Java and SWT. You can use the SWT Browser widget, capture its canvas, and convert that to whatever image type you want. The Browser widget uses Firefox to render pages, so they should look pretty good.
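A rough sketch of that capture, with the caveats that the URL is hypothetical, a real version should wait for a ProgressListener completion event instead of a fixed delay, and it assumes the SWT jars plus a working native browser backend:

import org.eclipse.swt.SWT;
import org.eclipse.swt.browser.Browser;
import org.eclipse.swt.graphics.GC;
import org.eclipse.swt.graphics.Image;
import org.eclipse.swt.graphics.ImageData;
import org.eclipse.swt.graphics.ImageLoader;
import org.eclipse.swt.layout.FillLayout;
import org.eclipse.swt.widgets.Display;
import org.eclipse.swt.widgets.Shell;

public class PageSnapshot {
    public static void main(String[] args) {
        final Display display = new Display();
        Shell shell = new Shell(display);
        shell.setLayout(new FillLayout());
        final Browser browser = new Browser(shell, SWT.NONE);
        browser.setUrl("http://example.com"); // hypothetical page to render
        shell.setSize(1024, 768);
        shell.open();

        // Crude: capture after five seconds; use a ProgressListener in real code.
        display.timerExec(5000, new Runnable() {
            public void run() {
                Image image = new Image(display, browser.getBounds());
                GC gc = new GC(browser);
                gc.copyArea(image, 0, 0); // copy the rendered page into the image
                gc.dispose();
                ImageLoader loader = new ImageLoader();
                loader.data = new ImageData[] { image.getImageData() };
                loader.save("snapshot.jpg", SWT.IMAGE_JPEG);
                image.dispose();
            }
        });

        while (!shell.isDisposed()) {
            if (!display.readAndDispatch()) display.sleep();
        }
        display.dispose();
    }
}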
I would write a servlet that serves one JPEG at a time, and a MIDlet that requests the next JPEG every so often.
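A sketch of the MIDlet side of that polling loop; the servlet URL is hypothetical, and note that JPEG support in Image.createImage() is device-dependent (only PNG is mandatory in MIDP):

import java.io.IOException;
import java.io.InputStream;

import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;
import javax.microedition.lcdui.Image;

public class FramePoller {
    // Fetch the next frame; call this from a timer and repaint the Canvas with it.
    public Image fetchNextFrame() throws IOException {
        HttpConnection conn =
                (HttpConnection) Connector.open("http://example.com/frame"); // hypothetical servlet
        try {
            InputStream in = conn.openInputStream();
            return Image.createImage(in); // decodes the JPEG for display
        } finally {
            conn.close();
        }
    }
}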
Well, there are better solutions than plain JPEGs. I've implemented systems like this, and you would do better to use a video codec such as MPEG-2, MPEG-4 ASP or H.264 than JPEG, sending updates as P-frames (i.e. deltas from the previous image); if there's "too big" a change (or a missed update, or a new client added to an existing stream), send an I-frame.
Even without using a video codec, sending differences will often be preferable. Use some other mechanism to encode diffs.
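To make the diff idea concrete, here is a crude sketch: compare the current frame against the previous one in fixed-size tiles and send only the tiles that changed. The wire format and the client-side patching are left out.

import java.awt.image.BufferedImage;
import java.util.ArrayList;
import java.util.List;

public class FrameDiff {
    private static final int TILE = 16; // tile edge length in pixels

    // Returns the top-left corners of all tiles that differ between the frames.
    public static List<int[]> changedTiles(BufferedImage prev, BufferedImage curr) {
        List<int[]> dirty = new ArrayList<int[]>();
        for (int y = 0; y < curr.getHeight(); y += TILE) {
            for (int x = 0; x < curr.getWidth(); x += TILE) {
                if (tileDiffers(prev, curr, x, y)) {
                    dirty.add(new int[] { x, y }); // send this tile as a JPEG patch
                }
            }
        }
        return dirty;
    }

    private static boolean tileDiffers(BufferedImage a, BufferedImage b, int x0, int y0) {
        int maxX = Math.min(x0 + TILE, b.getWidth());
        int maxY = Math.min(y0 + TILE, b.getHeight());
        for (int y = y0; y < maxY; y++)
            for (int x = x0; x < maxX; x++)
                if (a.getRGB(x, y) != b.getRGB(x, y)) return true;
        return false;
    }
}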
In terms of how to get the buffer to send: you can use a number of framebuffers to render into, and set up the framebuffer code to start a timer when a change is made. While changes are occurring, send periodic updates; when enough time has gone by since the last unsent change, send an update (probably after a shorter interval than the first value). You should probably also include some sort of strobe that forces an update on certain occurrences (if you can, for example, get a page-load-completion indication from the browser, which you can get in Firefox with a little work by changing the chrome, etc).
[added]
For examples of other solutions, look at remote-desktop protocols and programs like VNC, RDP (Windows Remote Desktop), etc - that's effectively what they're doing, again with fancier compression and damage-region tracking.
For framebuffers, you can use standard Linux/etc framebuffer code (probably simplest), or even something like Xvfb (which gives you access to more info about what and why things are changing than a raw framebuffer).
