Audio conference using JSP servlet - Java

I want to build a setup where each client sends its audio stream to a server. The server mixes the different audio streams, broadcasts the result to every client, and each client plays that sound. I used WebRTC to grab the microphone, but then I ran into the problem of how to send the data to the server. If I somehow send the blobs to the server over a WebSocket, I hit another issue: how do I mix them and keep them synchronized? I am planning to do this in Java. Most importantly, I have to store the entire audio conversation on the server for later use.
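For the server side, a minimal sketch of the broadcast/mixing idea in Java is below. It assumes the clients send raw 16-bit little-endian PCM frames over a WebSocket (a MediaRecorder blob would actually be compressed WebM/Opus and would need decoding first); the endpoint path and the mixing helper are illustrative, not a complete conferencing server.

```java
// Sketch: a Jakarta WebSocket endpoint that relays incoming PCM frames to the
// other participants, plus a helper showing how two PCM buffers could be mixed.
import jakarta.websocket.*;
import jakarta.websocket.server.ServerEndpoint;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

@ServerEndpoint("/audio")
public class AudioMixEndpoint {

    private static final Set<Session> sessions = new CopyOnWriteArraySet<>();

    @OnOpen
    public void onOpen(Session session) {
        sessions.add(session);
    }

    @OnClose
    public void onClose(Session session) {
        sessions.remove(session);
    }

    @OnMessage
    public void onAudioFrame(ByteBuffer frame, Session sender) {
        // A real mixer would buffer one frame per client, mix them with
        // mixPcm16 below, and also append the mix to a recording file.
        // Here each frame is simply relayed to every other client.
        for (Session s : sessions) {
            if (s.isOpen() && !s.equals(sender)) {
                s.getAsyncRemote().sendBinary(frame.duplicate());
            }
        }
    }

    // Mix two equally sized 16-bit little-endian PCM buffers by summing samples with clipping.
    static byte[] mixPcm16(byte[] a, byte[] b) {
        byte[] out = new byte[Math.min(a.length, b.length)];
        ByteBuffer ba = ByteBuffer.wrap(a).order(ByteOrder.LITTLE_ENDIAN);
        ByteBuffer bb = ByteBuffer.wrap(b).order(ByteOrder.LITTLE_ENDIAN);
        ByteBuffer bo = ByteBuffer.wrap(out).order(ByteOrder.LITTLE_ENDIAN);
        while (bo.remaining() >= 2) {
            int sum = ba.getShort() + bb.getShort();
            sum = Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sum)); // clip to 16 bits
            bo.putShort((short) sum);
        }
        return out;
    }
}
```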

Related

Two-way communication between Android and PC

I am working on an Android application that does real-time communication between an Android device and a PC.
I want to record an audio signal and send it to a server, where it will be saved to a .wav file.
So far I have made an application that streams audio and plays it, but I want to save those bytes to a file on the computer.
The question is: can I send a command from the server that first starts the streaming application on Android, and then send another command that stops receiving bytes? That would give me an array of bytes which can be saved to a .wav file.
I'm using the TCP protocol.
You can use a socket connection between your computer and the mobile device, which the PC can use to tell the Android device to start streaming (or even to transfer the byte stream itself).
There are a number of libraries (on both the client and server side) that implement socket communication. Two big players are:
SignalR
Socket.io
You can also use the Android Socket API to implement it yourself (if you don't want to use a third-party library):
Socket - Android Developers
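For the .wav part on the PC side, a minimal sketch in plain Java is below: it accepts a single TCP connection, reads raw PCM bytes until the client closes the socket, and wraps them in a .wav file. The port, sample rate, and format are assumptions and must match whatever the Android app actually sends.

```java
// Sketch: receive raw PCM over TCP and write it out as a .wav file.
import javax.sound.sampled.*;
import java.io.*;
import java.net.ServerSocket;
import java.net.Socket;

public class WavReceiver {
    public static void main(String[] args) throws Exception {
        // 44.1 kHz, 16-bit, mono, signed, little-endian -- adjust to the sender's format.
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);

        try (ServerSocket server = new ServerSocket(9000);
             Socket client = server.accept();
             InputStream in = client.getInputStream();
             ByteArrayOutputStream buffer = new ByteArrayOutputStream()) {

            byte[] chunk = new byte[4096];
            int read;
            while ((read = in.read(chunk)) != -1) { // client closing the socket ends the recording
                buffer.write(chunk, 0, read);
            }

            byte[] pcm = buffer.toByteArray();
            try (AudioInputStream audio = new AudioInputStream(
                    new ByteArrayInputStream(pcm), format, pcm.length / format.getFrameSize())) {
                AudioSystem.write(audio, AudioFileFormat.Type.WAVE, new File("recording.wav"));
            }
        }
    }
}
```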

Send stream data to Icecast

I want to develop a client that binds to an Icecast server, just like BUTT or Edcast, but using Java. I've found some libraries like jshout and libshout, but I can't make them work on Windows ;( so I'm thinking of not depending on any library. I got some information on how to stream to an Icecast server from this link: Icecast 2: protocol description, streaming to it using C#. My question is: how do I send the binary stream data to the Icecast server? Should I use a socket, or is there another way to do that?
Thanks
It's a simple HTTP/1.1 PUT request (for now without chunked encoding) if you are running Icecast 2.4.0 or newer.
Once the connection is established, you just keep sending data from your encoder/muxer.
If you want to know which headers to send, etc., then looking at the libshout sources should help.
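A rough sketch of such a source connection over a plain socket is below, assuming Icecast 2.4+ with the default source authentication; the host, port, mountpoint, password, and content type are placeholders.

```java
// Sketch: open a socket to Icecast, send an HTTP/1.1 PUT with Basic auth,
// then keep writing the encoder's output on the same connection.
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class IcecastSource {
    public static void main(String[] args) throws Exception {
        String host = "icecast.example.com";      // placeholder
        String mount = "/stream.mp3";             // placeholder mountpoint
        String auth = Base64.getEncoder()
                .encodeToString("source:hackme".getBytes(StandardCharsets.UTF_8)); // placeholder password

        try (Socket socket = new Socket(host, 8000);
             OutputStream out = socket.getOutputStream()) {

            String headers =
                "PUT " + mount + " HTTP/1.1\r\n" +
                "Host: " + host + ":8000\r\n" +
                "Authorization: Basic " + auth + "\r\n" +
                "Content-Type: audio/mpeg\r\n" +
                "Ice-Name: My Java Source\r\n" +
                "Ice-Public: 0\r\n" +
                "\r\n";
            out.write(headers.getBytes(StandardCharsets.UTF_8));

            // From here on, keep writing the encoded audio, e.g. MP3 frames:
            // out.write(mp3Frame);
            out.flush();
        }
    }
}
```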

Receiving Android sensor data and sending it over sockets

I have been unable to solve the problem of receiving data from sensors and sending it over a socket in its own thread without losing data.
I have an Android service that needs to receive data from sensors. I want to open a TCP socket that connects to a PC and sends the gathered data. My problem is that data arrives faster than the socket task can send it, so not all of the sensor data gets sent, since the socket service is also busy with other work.
Does anyone have an idea where I might find an example of a socket synchronized with the reception of data?
This may help: Using an iOS device to control a game on your browser
Since they are using standard HTML5, I think it is possible to implement the same thing on Android.
Plus, Node.js and Socket.io are great.
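One common way to keep the sensor callback from being blocked by the network is a producer-consumer queue: the listener only enqueues readings, and a separate thread owns the socket and drains the queue. A minimal sketch is below; the class and field names are illustrative.

```java
// Sketch: decouple sensor callbacks from the socket with a bounded BlockingQueue.
import java.io.PrintWriter;
import java.net.Socket;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class SensorSender {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>(1000);

    // Called from the SensorEventListener (producer): returns immediately.
    public void onReading(float x, float y, float z) {
        queue.offer(x + "," + y + "," + z); // drop the sample if the queue is full
    }

    // Runs on its own thread (consumer): owns the socket.
    public void runSender(String host, int port) {
        try (Socket socket = new Socket(host, port);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            while (!Thread.currentThread().isInterrupted()) {
                out.println(queue.take()); // blocks until a reading is available
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```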

Flex: How can I get the audio stream on the server side (Tomcat)?

One of my web applications is developed in Java and runs on a Tomcat server. Now I want to add one more feature to the application: peer-to-peer audio streaming. What I want is that when anyone speaks (using a microphone) on the client side, I hear their voice on my server's speakers, and vice versa. I also want to save the communication to a file and send the audio stream to an IP intercom.
For that I am trying to use Flex Builder. The Flex NetStream class is good for streaming, and we can also attach the microphone to it. But the problem is on the server side: how can I get the audio stream there?
Or is there any other way I can get a stream from the server to the client and vice versa?
I think the easiest way to do this would just be to run another client on your server. Maybe even a "special client".

How can I use multiple Players in J2ME to simulate streaming over HTTP

I need to connect to an HTTP server from a phone and play a movie while it is downloading. I understand that you can simulate this using multiple players (Manager.createPlayer(...)) in J2ME, but I don't know how.
Thanks
I just want to suggest some possible approaches. Note that these methods also depend on the device's implementation of MMAPI.
Use the RTSP protocol instead of HTTP. This needs server-side support.
Create a new DataSource class which downloads the video data using a socket.
Get the video data using a socket and append it to a file. While downloading, pass the file data to MMAPI via the file:/// protocol (a rough sketch of this approach follows below).
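Here is a rough sketch of the third approach, assuming the Generic Connection Framework plus FileConnection (JSR-75) are available on the device; the URL, file path, and content type are placeholders, and a real implementation would download on a background thread while playback catches up.

```java
// Sketch: download a movie over HTTP into a local file, then hand it to MMAPI
// via the file:/// protocol.
import java.io.InputStream;
import java.io.OutputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;
import javax.microedition.io.file.FileConnection;
import javax.microedition.media.Manager;
import javax.microedition.media.Player;

public class ProgressiveDownload {
    public void downloadAndPlay() throws Exception {
        HttpConnection http = (HttpConnection) Connector.open("http://example.com/movie.3gp");
        FileConnection file = (FileConnection) Connector.open("file:///root1/movie.3gp", Connector.READ_WRITE);
        if (!file.exists()) {
            file.create();
        }

        InputStream in = http.openInputStream();
        OutputStream out = file.openOutputStream();
        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read); // append downloaded data to the local file
        }
        out.close();
        in.close();
        http.close();
        file.close();

        // With the file on disk (a real app would start before the download finishes),
        // MMAPI can play it from the file locator.
        Player player = Manager.createPlayer("file:///root1/movie.3gp");
        player.realize();
        player.prefetch();
        player.start();
    }
}
```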
