I have developed one of my web applications in Java, running on a Tomcat server. Now I want to add one more feature: peer-to-peer audio streaming. When someone speaks into a microphone on the client side, I want to hear their voice through my server's speakers, and vice versa. I also want to save our communication to a file and send the audio stream to an IP intercom.
For that I am trying to use Flex Builder. The Flex NetStream class is good for streaming, and we can also attach the microphone to it. But the problem is on the server side. How can I get the audio stream on the server side?
Or is there any other way to get a stream from server to client and vice versa?
I think the easiest way to do this would just be to run another client on your server. Maybe even a "special client".
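For illustration, here is a minimal sketch of such a "special client" in Java. Note that Flex's NetStream normally speaks RTMP to a media server (Flash Media Server, Red5, and the like), so this sidesteps that question entirely: it only shows the playback half, reading raw 16-bit PCM over plain TCP and sending it to the server's speakers. The port and audio format are assumptions.

    import java.io.InputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.SourceDataLine;

    public class AudioPlaybackServer {
        public static void main(String[] args) throws Exception {
            // Assumed format: 16-bit signed PCM, mono, 44.1 kHz, little-endian.
            AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);
            SourceDataLine speakers = AudioSystem.getSourceDataLine(format);
            speakers.open(format);
            speakers.start();

            try (ServerSocket server = new ServerSocket(9000); // port is an assumption
                 Socket client = server.accept();
                 InputStream in = client.getInputStream()) {
                byte[] buffer = new byte[4096];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    speakers.write(buffer, 0, read); // play incoming audio on the server's speakers
                }
            }
            speakers.drain();
            speakers.close();
        }
    }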
I am working on an Android application that does real-time communication between an Android device and a PC.
I want to record an audio signal and send it to a server, where it will be saved to a .wav file on the fly.
So far I have made an application that streams audio and plays it, but I want to save those bytes to a file on the computer.
The question is: can I send a command from the server that first starts the streaming application on Android, and then send another command that stops the transfer? That would let me collect an array of bytes that can be saved to a .wav file.
I'm using the TCP protocol.
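For the .wav part: once the raw bytes have been collected on the PC, javax.sound.sampled can wrap them in a proper WAV header. A minimal sketch, assuming 16-bit signed mono PCM at 8 kHz (adjust the AudioFormat to whatever the Android side actually records):

    import java.io.ByteArrayInputStream;
    import java.io.File;
    import javax.sound.sampled.AudioFileFormat;
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;

    public class WavWriter {
        // Wrap raw PCM bytes in an AudioInputStream and let AudioSystem add the WAV header.
        public static void saveAsWav(byte[] pcm, File out) throws Exception {
            AudioFormat format = new AudioFormat(8000f, 16, 1, true, false); // assumed recording format
            long frames = pcm.length / format.getFrameSize();
            try (AudioInputStream stream =
                     new AudioInputStream(new ByteArrayInputStream(pcm), format, frames)) {
                AudioSystem.write(stream, AudioFileFormat.Type.WAVE, out);
            }
        }
    }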
You may use a socket connection between your computer and the mobile device, which the PC can use to notify the Android side that it should start streaming (you could even transfer the byte stream through it).
There are a bunch of libraries (on both the client and server side) that implement socket communication. Two big players are:
SignalR
Socket.io
You may also use the Android Socket API to implement this yourself (if you don't want a third-party library); a rough sketch follows below the link.
Socket - Android Developers
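As a sketch of that command channel using the plain Socket API (the port, host address, and command strings are made up for the example): the PC side pushes one-line text commands, and the Android side reads them and toggles streaming.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.ServerSocket;
    import java.net.Socket;

    // PC side: accepts the device's connection and sends line-based commands.
    class CommandServer {
        public static void main(String[] args) throws Exception {
            try (ServerSocket server = new ServerSocket(5000);  // port is an assumption
                 Socket device = server.accept();
                 PrintWriter out = new PrintWriter(device.getOutputStream(), true)) {
                out.println("START");      // tell the device to begin streaming
                Thread.sleep(10_000);      // ...record for ten seconds...
                out.println("STOP");       // tell the device to stop
            }
        }
    }

    // Android side (run off the main thread): reads commands and reacts to them.
    class CommandClient implements Runnable {
        @Override public void run() {
            try (Socket socket = new Socket("192.168.0.10", 5000); // server address is an assumption
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream()))) {
                String command;
                while ((command = in.readLine()) != null) {
                    if ("START".equals(command)) { /* start AudioRecord + upload loop */ }
                    if ("STOP".equals(command))  { /* stop recording, flush the collected bytes */ }
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }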
I want to make a setup where each client sends its audio stream to a server. The server then mixes the different audio streams and broadcasts the result to every client, and each client plays that sound. I used WebRTC to grab the microphone, but then I hit the problem of how to send the data to the server. I can send the blobs over a WebSocket, but then I face another issue: how do I mix the streams and keep them synchronized? I am planning to do this in Java. Most importantly, I have to store the entire audio conversation on the server for later use.
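On the mixing part: WebRTC delivers compressed (typically Opus-encoded) audio, so a real server has to decode first. But once every stream is decoded to the same raw format, mixing reduces to summing samples and clamping. A minimal sketch of that core idea in Java, assuming time-aligned buffers of 16-bit signed little-endian PCM:

    // Mix two equal-format PCM buffers by summing samples and clamping to the
    // 16-bit range (assumes already-decoded, time-aligned input streams).
    public class PcmMixer {
        public static byte[] mix(byte[] a, byte[] b) {
            byte[] out = new byte[Math.min(a.length, b.length)];
            for (int i = 0; i + 1 < out.length; i += 2) {
                int sampleA = (short) ((a[i] & 0xFF) | (a[i + 1] << 8));
                int sampleB = (short) ((b[i] & 0xFF) | (b[i + 1] << 8));
                int mixed = Math.max(Short.MIN_VALUE,
                                     Math.min(Short.MAX_VALUE, sampleA + sampleB));
                out[i] = (byte) (mixed & 0xFF);          // low byte (little-endian)
                out[i + 1] = (byte) ((mixed >> 8) & 0xFF); // high byte
            }
            return out;
        }
    }

The mixed buffers can be broadcast to the clients and appended to a file at the same time, which also covers the recording requirement.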
I'm new to GStreamer and need a push in the right direction. I need to create a chat application in Java that uses webcams and audio on both ends. I don't want to install anything on the client beyond my own application. The server can either relay the connection between the clients or sit on one end itself.
Is it possible, in Java, to read the data stream a C++ server program sends to a C++ client over a socket connection? I have details like the port number and the server IP.
Or do I need to decompile the whole C++ client into assembly and then somehow translate it into Java to do that?
I'm really not sure what kind of data it's transferring, though. Somebody told me to write an HTTP server and run it on my router, but I'm not sure that would work.
Here’s the diagrammatic way to look at it:
1. The server generates data.
2. It puts the data in a packet.
3. It encrypts the packet.
4. It sends the packet over the wire.
5. The packet arrives at the user's computer (= the client). (I should be in control from this point on. Could I somehow read the data here?)
6. The client reads the encrypted packet. (Could I somehow read the data here? The later, the better. :D)
7. The client decrypts the packet. (Could I somehow read the data here? The later, the better. :D)
8. The client does something with it.
As I said, the client is an .exe file written in C++, and I don't have its source code.
All you have to do is define your application protocol well, that is, the format of your data stream. As long as you use the same format on both ends, it doesn't really matter what language or program you are using. Think of your browser and a web server: they both speak the same application protocol (HTTP) but are completely different programs. What's more, there exist many different web servers and many different browsers.
Then all you need to do is use Java sockets to listen on a specific port, and use your C++ sockets to write to that port. Just make sure you know how the information is "organized".
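A minimal sketch of the Java end (the host, port, and the 4-byte length prefix are assumptions; the real layout is whatever the C++ program actually writes):

    import java.io.DataInputStream;
    import java.net.Socket;

    public class CppStreamReader {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("10.0.0.5", 4242); // server IP and port are assumptions
                 DataInputStream in = new DataInputStream(socket.getInputStream())) {
                // Assumed framing: 4-byte length, then that many payload bytes.
                int length = in.readInt();
                byte[] payload = new byte[length];
                in.readFully(payload);
                System.out.println("received " + length + " bytes");
            }
        }
    }

Note that DataInputStream reads integers in big-endian (network) byte order; if the C++ side writes raw native little-endian values, you will have to swap the bytes yourself.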
I am trying to send an image from an Android phone, process it on the server side, and then get it back to the phone. I have been able to send the file from the phone to the server, but it seems the server cannot send the image back over the same socket. I am using BufferedInputStream and BufferedOutputStream. Is this possible, or would I need two different ports? The code is in Java.
It is possible to have two-way communication over a single socket connection.
Put simply: the server's OutputStream feeds the client's InputStream, and vice versa.
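Here is a sketch of that round trip on a single socket from the phone's side. The length-prefix framing is an assumption, but some framing is needed so each side knows where an image ends before it starts reading the reply:

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.net.Socket;

    public class ImageRoundTrip {
        // Phone side: send the image, then block on the same socket for the reply.
        public static byte[] sendAndReceive(byte[] image, String host, int port) throws Exception {
            try (Socket socket = new Socket(host, port);
                 DataOutputStream out = new DataOutputStream(socket.getOutputStream());
                 DataInputStream in = new DataInputStream(socket.getInputStream())) {
                out.writeInt(image.length); // length prefix marks where the image ends
                out.write(image);
                out.flush();
                int replyLength = in.readInt(); // read the processed image back on the same socket
                byte[] reply = new byte[replyLength];
                in.readFully(reply);
                return reply;
            }
        }
    }

The server mirrors this on its end of the same socket: readInt/readFully to receive, process the image, then writeInt/write to reply. No second port is needed.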