I am trying to build a live streaming app for Android. My question is: what is the difference between using HTTP and RTSP? Is there any way to do this with Java code only? All the existing projects I have looked at use Java in combination with another language. Is there an efficient way to stream using Java alone?
RTSP stands for Real Time Streaming Protocol; it is a protocol designed specifically for streaming. With RTSP you can control absolute positioning within the media stream, recording, and possibly device control, etc.
Compared with HTTP:
- RTSP introduces a number of new methods and has a different protocol identifier.
- An RTSP server needs to maintain state in almost all cases, as opposed to the stateless nature of HTTP.
- Both an RTSP server and a client can issue requests.
- Data is carried out-of-band by a different protocol (RTP).
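A simplified, hypothetical RTSP exchange illustrates these points (the URL, ports, and session id are made up):

```
C->S: DESCRIBE rtsp://example.com/stream RTSP/1.0      (new method, RTSP protocol identifier)
      CSeq: 1
S->C: RTSP/1.0 200 OK
      CSeq: 1
      ... SDP description of the media ...

C->S: SETUP rtsp://example.com/stream/track1 RTSP/1.0
      CSeq: 2
      Transport: RTP/AVP;unicast;client_port=8000-8001 (media travels out-of-band over RTP)
S->C: RTSP/1.0 200 OK
      CSeq: 2
      Session: 12345678                                (the server keeps per-session state)

C->S: PLAY rtsp://example.com/stream RTSP/1.0
      CSeq: 3
      Session: 12345678
```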
If you want controllable, real-time video streaming, RTSP is the usual choice.
See this LINK for more details about the protocol RTSP
NB
To show the video content in Android you can use a VideoView:
// Assumes a VideoView with id "myview" in the activity's layout
myVideoView = (VideoView) findViewById(R.id.myview);
// Point it at the RTSP stream (replace SERVER_IP_ADDR and the port with your own)
myVideoView.setVideoPath("rtsp://SERVER_IP_ADDR:5544/");
// Attach the standard play/pause/seek controls
myVideoView.setMediaController(new MediaController(this));
// Begin playback
myVideoView.start();
As described HERE
I would suggest going for RTMP (Real Time Messaging Protocol) instead of RTSP. There are a number of open source plugins available, like the well-known Flowplayer, which can stream video to industry standards using the RTMP protocol. It has also rapidly developed the capability to stream video on Apple devices with the existing Flowplayer plugins. Hope this helps.
Flowplayer: flowplayer website
We are currently using Akamai's streaming capability coupled with the Flowplayer plugin for a flawless streaming experience.
Related
I am stumped.
I need to stream a video using my own server to a player.
How can I stream video to the WebView player?
Server ====socket video stream=====> Client ====> WebView player
If you just want a simple solution that works, you can use a static video file on your server and the HTML5 video tag in your WebView.
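A minimal page for that approach might look like this (the video filename here is just an illustration):

```html
<!-- Served by your server next to the video file itself -->
<video src="myvideo.mp4" controls width="320" height="240">
  Your browser does not support HTML5 video.
</video>
```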
On Android, at this time, you need hardware acceleration turned on to support HTML5 video (see the 'HTML5 Video support' note at the link below). You also need to set a WebChromeClient for the WebView, which is called whenever something affecting the UI happens (see the 'Full screen support' note at the same link):
http://developer.android.com/reference/android/webkit/WebView.html
You can find a full working example which appears to be well maintained here:
https://github.com/cprcrack/VideoEnabledWebView
Your server then just has to serve the static video file the same way it serves any other static content, and it will be delivered using HTTP progressive download, which looks and feels like streaming to your end user.
This does have limitations, however, as it is not 'real' streaming: you can't, for example, support adaptive bit rate streaming, which delivers different bit rates depending on the network connection. It is much simpler, though, and does not require a dedicated streaming server.
If you do want to use a dedicated streaming server then it is worth being aware that this is a relatively complex domain and that there are some open source streaming servers available that you might want to take a look at, for example:
http://gstreamer.freedesktop.org
http://www.videolan.org/vlc/streaming.html
Is there a Java API that provides the NetStream and NetConnection functionality from Adobe? I'm working on my Android app and trying to add a video chat feature. Since the rest of the app is already written in Java, it doesn't seem possible to attach a SWF file to one of the activities, so I'm looking for alternative ways to connect to a web-based Flash video chat site from the app.
There is the JUV RTMP tool, a Java API for accessing RTMP servers. It also supports audio/video streaming, but you will need to provide the codec code yourself.
It is a paid product, but at least it works well. I could never find a good open source solution for this.
I plan to develop an RTSP streaming server in Java, and now have to decide which library to use to decode media and stream the data in RTP format. I am looking at vlcj and Xuggler for video decoding and streaming. I did some research on the differences between these libraries but cannot make a decision yet, so I would like to ask: for a server providing the following features, which one do you think is better?
1. can stream video on demand to multiple users
2. can receive a stream in MMS format and restream it in RTSP format
Initially I tried ffmpeg and ffserver, but there was an audio out-of-sync problem, so I decided to build my own server. The recommended way to use vlcj is out-of-process, but I am worried about its performance for video on demand. I am also considering Xuggler, but I am afraid it will have the same problem I had with ffmpeg.
Could you throw me your opinion which one is appropriate on this situation?
Both vlcj and Xuggler depend on the machine having the necessary native libraries installed. Also, personally, I could never get a pilot streaming server working with vlcj (and I tried for a long time).
Java SE provides a framework called JMF (Java Media Framework) for developing, among other things, a streaming server:
http://www.oracle.com/technetwork/java/javase/tech/index-jsp-140239.html
This framework is not the best there is, but it works.
As a final note, I can say that I have developed a streaming server in Java with JMF; you can see it here, http://code.google.com/p/servidor-streaming-rtp-rstp-java/, for reference.
Regards!
I want to read in a live video stream, like RTSP, run some basic processing on it, and display it on a website. What are some good ways to do this? I have used OpenCV for Python before but found it to be a hassle. I am also familiar with Java and C++ if there are better libraries available. I haven't done a lot of web development before either.
What kind of live video source do you mean? If you don't intend to do this code-wise, you can use the free VLC player to act as a streaming service between any kind of media source (file, network, capture device, disc) and your web video client.
But if you do intend to do it code-wise, you can use the VLCJ library. Other options are Xuggler or FMJ.
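For the non-code VLC route, a single command can act as the in-between streaming service. This is a sketch, not a tested recipe; the input file, port, and stream path are placeholders:

```
# Restream a local file as MPEG-TS over HTTP on port 8080
vlc input.mp4 --sout '#standard{access=http,mux=ts,dst=:8080/stream}'
```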
What is better to study for audio and video recording and streaming: Java or ActionScript?
And for working with p2p.
Can Java stream video/audio via p2p?
Thanks ;)
ActionScript has simple classes with well-defined protocols designed specifically for accessing a machine's audio and video devices. Since Flash 10.0 there is support for direct p2p connections with other machines, and since 10.1, support for p2p mesh networks. See the code example here for out-of-the-box audio/video/p2p: ActionScript 3.0 Reference - NetGroup
With Java you can access the sound system (input and output) using Java Sound, and you can stream it over a socket. You'll probably want compression, and if you want that to be MP3 you'll also need JLayer.
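As a rough sketch of the capture side with Java Sound (standard library only; the class name `MicStreamer` and the helper `bytesPerSecond` are made up here, and compression over the socket is left out):

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.TargetDataLine;
import java.io.OutputStream;

public class MicStreamer {
    // 44.1 kHz, 16-bit, mono, signed, big-endian PCM -- a common capture format
    static final AudioFormat FORMAT = new AudioFormat(44100f, 16, 1, true, true);

    // Raw bytes produced per second by a PCM format; useful for sizing buffers
    static int bytesPerSecond(AudioFormat f) {
        return (int) (f.getSampleRate() * f.getFrameSize());
    }

    // Copy raw PCM from the default microphone to an already-connected socket
    // stream. In practice you would compress first (e.g. to MP3 with JLayer).
    static void stream(OutputStream socketOut) throws Exception {
        TargetDataLine mic = AudioSystem.getTargetDataLine(FORMAT);
        mic.open(FORMAT);
        mic.start();
        byte[] buf = new byte[bytesPerSecond(FORMAT) / 10]; // ~100 ms per write
        try {
            int n;
            while ((n = mic.read(buf, 0, buf.length)) > 0) {
                socketOut.write(buf, 0, n);
            }
        } finally {
            mic.close();
        }
    }
}
```

The receiving end just reads the same byte stream from its socket and feeds it to a playback line opened with the same format.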
I don't know about video in Java, nor do I really know the options of Flash/AIR in this area.