Yes, this topic keeps popping up here on SO from time to time.
I've read a lot about it and tried several solutions, but I have some constraints:
1. browser independence (most browsers should work)
2. platform independence (the major platforms should be supported)
3. works out of the box (no plugins!)
4. low latency (preferably under 1 second)
5. limited bandwidth (so MJPEG is not an option)
6. no transcoding!
So, going forward: an H.264 stream seems perfect for constraints 1 and 2.
Also, my source produces live H.264 (to be exact: MPEG-4 AVC, Part 10) delivered over RTSP.
But RTSP is still not supported in browsers.
What I've checked:
How to embed streaming rtsp media into an html5 page
How can I display an RTSP video stream in a web page?
stream RTSP to HTML website
Displaying RTSP on website
How to get RTSP stream over web application
RTSP solution for JavaScript/HTML5
Streaming via RTSP or RTP in HTML5
All the posts above are related to this question, and they contain a lot of valuable information.
I've also read a very good article from 2014 (!), which is detailed and quite forward-looking.
So, as of today, the best solution looks like this:
1. parse the RTSP and extract the H.264 stream
2. restructure the stream (repackage it as fragmented MP4)
3. push it through a websocket (see below)
4. play the fMP4 with the HTML5 video element, which works as long as the browser has MSE (an alternative is Broadway.js, which is cool but CPU-intensive)
There are solutions where steps 1 and 2 happen on the server side; the fMP4 is then pushed into a websocket, and the client consumes the data from the websocket and passes it to the MSE components for display.
The article from 2014 shows that step 2 can also happen on the client side. In that case only step 1 happens on the server; the raw H.264 is pushed into the websocket, and the restructuring and the display happen on the client.
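For step 3, the server half is small whichever variant you choose. Below is a minimal sketch using the standard JSR-356 (javax.websocket) API; the /live path and the broadcast() hook are placeholders of mine, and any JSR-356 container (Tomcat, Jetty, etc.) provides the runtime:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;
import javax.websocket.OnClose;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

// Fan every fMP4 fragment (one moof+mdat pair) out to all connected browsers.
@ServerEndpoint("/live")
public class StreamEndpoint {
    private static final Set<Session> sessions = new CopyOnWriteArraySet<Session>();

    @OnOpen
    public void onOpen(Session session) { sessions.add(session); }

    @OnClose
    public void onClose(Session session) { sessions.remove(session); }

    // Called by the packaging pipeline (steps 1 and 2) whenever a fragment is ready.
    public static void broadcast(byte[] fragment) {
        for (Session s : sessions) {
            try {
                s.getBasicRemote().sendBinary(ByteBuffer.wrap(fragment));
            } catch (IOException e) {
                sessions.remove(s);  // drop dead connections
            }
        }
    }
}

The client-side counterpart is the usual websocket with binaryType = "arraybuffer" feeding a MediaSource SourceBuffer.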
Streamedian seemed like a good solution at first sight, but they don't publish their server-side code, and their site also returned a 502 error for a day.
I don't want to use GStreamer or ffmpeg; they are both too heavy.
However, there are some nice libraries that can help:
MP4Box.js - segment an MP4 file for use with the Media Source Extension API
mux.js - inspection and manipulation tools for video files
Going back to my list, step 2 can be done with MP4Box.js - at least I believe/hope so.
Steps 3 and 4 are straightforward; there are tons of how-tos for them.
However, I'm a bit puzzled by step 1. It has to be done on the server side, preferably in a language that can easily interact with websockets (like Java).
That is the point of my question: I need to extract the H.264 stream from RTSP in Java - how can I do it simply, but without calling external programs?
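To make it concrete, here is roughly the shape of what I'm after - a heavily simplified sketch that speaks just enough RTSP over TCP to get interleaved RTP, then unwraps the H.264 NAL units per RFC 6184. The host and track path are made up; authentication, SDP parsing, STAP-A packets and RTP header extensions are all ignored:

import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class RtspH264Source {
    public static void main(String[] args) throws Exception {
        String url = "rtsp://camera.example.com/stream1";  // hypothetical camera URL
        Socket sock = new Socket("camera.example.com", 554);
        OutputStream out = sock.getOutputStream();
        DataInputStream in = new DataInputStream(sock.getInputStream());

        request(out, "DESCRIBE " + url + " RTSP/1.0\r\nCSeq: 1\r\nAccept: application/sdp\r\n\r\n");
        response(in);  // real code: parse the video track path and sprop-parameter-sets from the SDP
        request(out, "SETUP " + url + "/trackID=1 RTSP/1.0\r\nCSeq: 2\r\n"
                + "Transport: RTP/AVP/TCP;unicast;interleaved=0-1\r\n\r\n");
        String session = response(in).replaceAll("(?s).*Session: *([^;\r\n]+).*", "$1");
        request(out, "PLAY " + url + " RTSP/1.0\r\nCSeq: 3\r\nSession: " + session + "\r\n\r\n");
        response(in);

        ByteArrayOutputStream fua = new ByteArrayOutputStream();  // FU-A reassembly buffer
        while (true) {
            if (in.readUnsignedByte() != 0x24) continue;  // '$' marks an interleaved frame (RFC 2326)
            int channel = in.readUnsignedByte();          // 0 = RTP, 1 = RTCP (as requested in SETUP)
            byte[] pkt = new byte[in.readUnsignedShort()];
            in.readFully(pkt);
            if (channel != 0 || pkt.length < 14) continue;  // skip RTCP and runt packets
            // Assume a bare 12-byte RTP header (no CSRC list, no extension).
            int nalType = pkt[12] & 0x1F;
            if (nalType >= 1 && nalType <= 23) {            // single NAL unit packet
                emit(pkt, 12, pkt.length - 12);
            } else if (nalType == 28) {                     // FU-A fragment (RFC 6184, 5.8)
                boolean start = (pkt[13] & 0x80) != 0;
                boolean end = (pkt[13] & 0x40) != 0;
                if (start) {
                    fua.reset();
                    fua.write((pkt[12] & 0xE0) | (pkt[13] & 0x1F));  // rebuild the NAL header
                }
                fua.write(pkt, 14, pkt.length - 14);
                if (end) { byte[] nal = fua.toByteArray(); emit(nal, 0, nal.length); }
            }  // STAP-A (type 24) and the other aggregation modes are omitted here
        }
    }

    // Hand one NAL unit onward, e.g. prepend the 00 00 00 01 start code and
    // feed it to the fMP4 packager / websocket of steps 2 and 3.
    private static void emit(byte[] buf, int off, int len) {
        System.out.println("NAL type " + (buf[off] & 0x1F) + ", " + len + " bytes");
    }

    private static void request(OutputStream out, String req) throws Exception {
        out.write(req.getBytes(StandardCharsets.US_ASCII));
        out.flush();
    }

    // Read status line + headers up to the blank line, then a Content-Length body if any.
    private static String response(DataInputStream in) throws Exception {
        StringBuilder sb = new StringBuilder();
        int bodyLen = 0;
        for (String line; !(line = readLine(in)).isEmpty(); ) {
            sb.append(line).append("\r\n");
            if (line.toLowerCase().startsWith("content-length:"))
                bodyLen = Integer.parseInt(line.substring(15).trim());
        }
        byte[] body = new byte[bodyLen];
        in.readFully(body);
        return sb.append(new String(body, StandardCharsets.US_ASCII)).toString();
    }

    private static String readLine(DataInputStream in) throws Exception {
        StringBuilder sb = new StringBuilder();
        for (int c; (c = in.read()) != -1 && c != '\n'; )
            if (c != '\r') sb.append((char) c);
        return sb.toString();
    }
}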
How about using the WebCodecs API to decode the H.264 packets in the browser?
This API takes advantage of the hardware acceleration offered by modern browsers.
By the way, in my opinion the delay mostly occurs in the decoding step.
I'm looking for a way, or a streaming server, that allows streaming a list of local video files to multiple receivers, with all the receivers synced so that they all see the same video output at the same time.
I am not tied to a specific programming language or framework. I know there is a way of doing it with ActionScript & FMS, but since Flash is dying, I'm not considering that a solution.
I suspect that by local video files you actually mean files hosted on a server. Instead of WebRTC, which is peer-to-peer, you would be better off using a media streaming server and creating some kind of "live" stream from those files, so that the receivers stay synced.
On the client side, just use the video tag with the appropriate URL.
I want to show, in my servlet, the webcam connected to the server. I've read on many sites that I could use getUserMedia(), but that only captures the user's webcam, not the one on the server.
How can I do that? My servlet is programmed in JavaScript.
First, I strongly doubt that your servlet is written in JavaScript. Are you sure we aren't talking about Java?
What you describe sounds like you want to do a live stream of your webcam. Compared to video on demand, this is a demanding task that requires quite some knowledge and experience.
We are not talking about streaming from one point to another ("unicast streaming"), but about a multicast stream where somebody opens a web site and connects to the stream. To do that, you have to send the video stream of your webcam to a multicaster, encoded in a way suitable for the intended audience. So what basically happens is that you capture the video data of your webcam, encode it to a format capable of being streamed, and send it to a multicaster, which copies the stream to every client that connects to it. This client can either be a stand-alone media player such as QuickTime, VLC or WMP, or a player embedded in a website.
So, in short and a bit more specifically, you have to do the following:
Capture the output of the webcam and encode it according to your intended audience. VLC is a good tool for that.
Set up a multicaster such as the excellent Darwin Streaming Server, to which you send the stream. This server has to be publicly accessible.
Create a link to the stream's description file (the sdp file) usually generated by the Darwin Streaming Server. This will connect the client to the stream. An alternative may be a player embedded in your website; that choice is basically yours.
Doing this right is not only programmer's work, but a lot of sysadmin work, too. You have to do some bandwidth and capacity planning, optimize the encoder, choose the right codec and much, much more. All those choices are heavily influenced by the type and size of your intended audience, the purpose of the stream, and a lot more.
This could be quite an interesting topic for people who are interested in livestreaming from a device to a web server. (Primarily Android/Java.)
I have finally found a way to livestream from my device's camera to my web server (website). On a Wi-Fi network it shows approximately 1 frame per second, and it also works on EDGE/3G networks. In this topic/question I want to discuss new techniques, improvements and ideas about livestreaming, and I will share mine with you (code is appreciated too).
My code repeatedly takes a snapshot from the camera preview using setOneShotPreviewCallback() to call onPreviewFrame(). The frame is delivered in YUV format, so raw2jpg() converts it into 32-bit ARGB for the JPEG encoder. (NV21 is a YUV planar format.)
getPicture() is called by the application; it produces the JPEG data for the image in the private byte array mCurrentFrame and returns that array.
After this, the byte array mCurrentFrame gets Base64-encoded and sent to my web server in an HTTP POST, together with the Base64 string value and my own ID code so that other people can't send images to it. On the web server it gets decoded again and written to the file test.jpg. PHP and JavaScript run on the web server: PHP handles the POST and JavaScript reloads the image every 750 ms. This is basically how it works.
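(Side note: Android's built-in YuvImage can collapse raw2jpg() plus the JPEG encoder into a single call. A sketch - the class name is mine, and it assumes the preview format really is NV21, which is the default:)

import java.io.ByteArrayOutputStream;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.util.Base64;

public class FrameEncoder {
    // data is the NV21 buffer handed to onPreviewFrame(); the result is the
    // Base64 JPEG string that goes into the HTTP POST body.
    public static String encodeFrame(byte[] data, int width, int height) {
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, width, height), 60, jpeg);  // quality 0-100
        return Base64.encodeToString(jpeg.toByteArray(), Base64.NO_WRAP);
    }
}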
Now I am very interested in your ideas, improvements and other things you would like to add or ask. Here are some of my questions:
1) What would be the best method for live streaming WITH audio? Video Recording OR my method + Audio recording?
2) How would you approach video record streaming?
3) How would you stream audio to the webserver? (Main goal) (With Java, PHP and JavaScript)
4) I am also planning to add typical livestreaming features to it, e.g. when a famous person appears, you could show his name while you are livestreaming, or just add an image from your SD card to your livestream. Would you also decode it and overlay the image, or put the image into your livestream in some other way?
This topic is primarily for questions, and it could be a great help for some people out here. Therefore I added a bounty of 50 (woot!) rep to it.
Sincerely,
XverhelstX
It strikes me that HTTP POSTing is probably not a good way to do live streaming of video to your server. Other people have been playing with live streaming, and they've used a socket to broadcast live video and audio streams to their servers.
I thought this was fascinating -- here's a link.
http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system
But the guy also posted a partial code sample -
import java.net.InetAddress;
import java.net.Socket;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

String hostname = "your.host.name";  // placeholder host
int port = 1234;
Socket socket = new Socket(InetAddress.getByName(hostname), port);
// Wrap the socket so it can be handed to MediaRecorder as if it were a file.
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new MediaRecorder();
// Additional MediaRecorder setup (audio/video source, output format ... etc.) omitted
recorder.setOutputFile(pfd.getFileDescriptor());  // everything written goes out over the socket
recorder.prepare();
recorder.start();
The cool part I didn't know about is the ParcelFileDescriptor - it wraps the socket in a file descriptor, so anything that gets written to that "file" is actually sent over the network to the remote server. Sockets are also the right way to go about this sort of thing, because they allow you to continuously send data until your recording is complete without having to re-send headers over and over.
What I think is cool about this technique is that he's literally taking the output of MediaRecorder (which is going to be an encoded video stream) and pumping it over a socket to his server. Then he can simply save out the data that's coming in over the socket. No frame-by-frame work, no processing (the Android SDK doesn't expose the encoders very well, and they're pretty performance-intensive).
People report that it works, but I haven't tested. Anyway, hope this is helpful.
You are sending a whole snapshot each time? Why not try some video compression techniques: instead of sending a full image each time, send a compressed version (maybe a diff or something like that), and then on the server reconstruct the image from the last image plus the data just received. I think all video codecs do this; you could look at some of the open codec specifications to get ideas.
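To make that idea concrete, here is a toy sketch of the diff approach - a plain XOR against the previous frame followed by deflate. This is nowhere near a real codec's motion-compensated prediction, but it shows the principle: static regions become long zero runs that compress to almost nothing.

import java.io.ByteArrayOutputStream;
import java.util.zip.Deflater;

public class FrameDelta {
    // XOR the new frame against the previous one (same size assumed), then deflate.
    public static byte[] encode(byte[] prev, byte[] cur) {
        byte[] diff = cur.clone();
        if (prev != null)
            for (int i = 0; i < diff.length; i++) diff[i] ^= prev[i];
        Deflater deflater = new Deflater(Deflater.BEST_SPEED);
        deflater.setInput(diff);
        deflater.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        while (!deflater.finished())
            out.write(buf, 0, deflater.deflate(buf));
        deflater.end();
        return out.toByteArray();  // server inflates this and XORs it onto its last frame
    }
}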
About audio: I would send the audio stream separately and then sync it with the video stream based on which video frame is currently being shown.
Basically, I would try to get the streaming as close as possible to how real video streaming works. Maybe you could look into ffmpeg; ffmpeg has an RTSP server, and if you could build it for Android, it would simplify your work a lot.
Note: I'm not an Android developer.
From what you've said, it seems like you're just taking snapshots instead of doing any real streaming. If you're worried about bandwidth, use a lower resolution. Exactly how to do this in Android, I'm not sure.
I think that if there are built-in streaming classes, you'll be able to get both the video stream and the audio stream. Don't do any local transcoding (your raw2jpg() counts as transcoding), as it might use too much processing power. Just take the stream, compress it, and send it to your server.
EDIT:
Some Links to get you started
An interesting project that turns the Android phone into an IP camera. You could dig around the code to figure out how they get ahold of the camera stream
An SO question on this topic
1) What would be the best method for live streaming WITH audio? Video Recording OR my method + Audio recording?
This really depends on your definition of "best". If you care about resource usage rather than quality, then your way is really good.
Otherwise, you should use a native streaming mechanism, or maybe implement a proper video streaming technique to encode and stream the video.
3) How would you stream audio to the webserver? (Main goal) (With Java, PHP and JavaScript)
I suggest you stick with MediaRecorder, because it does what you're doing in a good way. Still, try to find a way to get at the stream itself, since files are not the best choice - although you could stick with files and send small files at short, regular intervals (see the sketch below). That way you put a bigger portion of the load on the server rather than on the client.
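If you do go the small-files route, the upload side can be as simple as this sketch. The ingest URL and the chunking policy are made up: on the device you would stop and restart MediaRecorder every few seconds and hand each finished file to this method.

import java.io.FileInputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkUploader {
    // POST one finished recording chunk to the server, which appends/serves it.
    public static void upload(String path) throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://example.com/ingest").openConnection();  // hypothetical endpoint
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        OutputStream out = conn.getOutputStream();
        FileInputStream in = new FileInputStream(path);
        byte[] buf = new byte[8192];
        for (int n; (n = in.read(buf)) > 0; ) out.write(buf, 0, n);
        in.close();
        out.close();
        conn.getResponseCode();  // actually send the request; real code would check for 200
        conn.disconnect();
    }
}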
4) I am also planning to add typical livestreaming features to it, e.g. when a famous person appears, you could show his name while you are livestreaming, or just add an image from your SD card to your livestream. Would you also decode it and overlay the image, or put the image into your livestream in some other way?
Don't even try to put it into your livestream. With your PHP server you have more flexibility: send this info separately with an appropriate tag and let the server do the processing, or perhaps the integration of it with the video.
I'm creating an application where the user can record a voice message. The problem is that the recorded file is passed on to other encoding servers, which require both video and audio streams to be present.
So my question is: how can I use a static image to emulate a webcam and attach it to the NetStream?
ns = new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS, watchRecording);
//ns.attachCamera(cam);
ns.attachAudio(mic);
ns.publish(fileName.text, "record");
Although I don't have access to the application on the Wowza server, I can negotiate with someone to add a few lines and recompile it. So a server-side solution in Java is also an option here.
I'd recommend a server-side solution, as a client-side solution would involve sending more packets than are actually needed. Plus, I'm sure Java has better tools for emulating "video".
I'm writing a (Java) application that streams video from one peer to another. I use a library that can produce and consume an RTP stream (Xuggler, that is). I thought about using Red5 Media Server to relay the stream. What I need next is to send and receive my video stream.
The documentation I've read so far always deals with recording streams or streaming prerecorded videos (and of course the webcam). There is also quite an amount of ActionScript code, which doesn't help me at the moment. (I believe...)
So my question is: can Red5 help me? (I.e., should I continue reading, or is there another, more direct solution?) Could you please give me some pointers to suitable documentation?
Red5 is primarily for streaming to Flash clients over RTMP, RTMPT, RTMPS, etc. It is not limited to these protocols, but they are the ones available "out of the box". If you want to run Xuggler inside Red5 to consume RTP and then publish to RTMP, you are in luck, as this is very easy to do. If you want to republish as RTP, you have some additional work cut out for you.