Send Audio (WAV) to web and play using sockets in Java

I have been stuck on this for a while, and I have scoured the internet without finding a solution. I am trying to send a WAV file using https://github.com/mrniko/netty-socketio. I do this by reading the WAV in, converting it to a byte array, and pushing it to the front end over the socket.
The issue is that the data is sent and converted to a Blob, but the Blob won't play; the browser reports an Uncaught (in promise) DOMException: Failed to load because no supported source was found. error.
Any ideas? There are multiple possible points of failure, and I can't figure out which one it is.
Server.java
File file = new File("src/main/resources/test.wav");
AudioInputStream in;
try {
    in = AudioSystem.getAudioInputStream(file);
} catch (IOException e) {
    System.out.println("Audio io error");
    e.printStackTrace();
    return;
} catch (UnsupportedAudioFileException e) {
    System.out.println("Bad Audio File error");
    e.printStackTrace();
    return;
}

// CONVERT TO BYTE ARRAY
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[16384];
try {
    while ((nRead = in.read(data, 0, data.length)) != -1) {
        buffer.write(data, 0, nRead);
    }
} catch (IOException e) {
    System.out.println("Can't read into buffer");
    e.printStackTrace();
    return;
}
final byte[] audio = buffer.toByteArray();

// SERVER
Configuration config = new Configuration();
config.setHostname("localhost");
config.setPort(9092);
config.setMaxFramePayloadLength(1024 * 1024);
config.setMaxHttpContentLength(1024 * 1024);
final SocketIOServer server = new SocketIOServer(config);
server.addEventListener("updateCoordinates", byte[].class, new DataListener<byte[]>() {
    @Override
    public void onData(SocketIOClient client, byte[] data, AckRequest ackRequest) {
        client.sendEvent("sound", audio);
    }
});
server.start();
Thread.sleep(Integer.MAX_VALUE);
server.stop();
Client.js
var socket = io.connect('http://localhost:9092');
socket.emit('updateCoordinates');
socket.on('sound', function(file) {
    console.log(file);
    console.log("received");
    var arrayBuffer = new Uint8Array(file).buffer;
    console.log(arrayBuffer);
    var blob = new Blob([arrayBuffer], {type: 'audio/wav'});
    console.log(blob);
    const audioUrl = URL.createObjectURL(blob);
    const audio = new Audio(audioUrl);
    audio.play();
});

Alright, I figured it out. AudioSystem.getAudioInputStream strips the WAV file's header metadata, so the front end could not recognize the data as a playable file. The updated, working server code is:
Path path = Paths.get("src/main/resources/test.wav");
final byte[] audio;
try {
    audio = Files.readAllBytes(path);
} catch (IOException e) {
    System.out.println("Audio io error");
    e.printStackTrace();
    return;
}

// SERVER
Configuration config = new Configuration();
config.setHostname("localhost");
config.setPort(9092);
config.setMaxFramePayloadLength(1024 * 1024);
config.setMaxHttpContentLength(1024 * 1024);
final SocketIOServer server = new SocketIOServer(config);
server.addEventListener("updateCoordinates", byte[].class, new DataListener<byte[]>() {
    @Override
    public void onData(SocketIOClient client, byte[] data, AckRequest ackRequest) {
        client.sendEvent("sound", audio);
    }
});
server.start();
Thread.sleep(Integer.MAX_VALUE);
server.stop();
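The root cause can be demonstrated without a server at all: AudioSystem.getAudioInputStream yields raw PCM frames with the RIFF/WAVE header stripped, while AudioSystem.write puts that header back. A minimal sketch using synthetic silence instead of test.wav (the class and method names here are illustrative, not part of the original code):

```java
import javax.sound.sampled.*;
import java.io.*;
import java.nio.charset.StandardCharsets;

public class WavHeaderDemo {
    // Wrap raw PCM bytes back into a complete WAV container (RIFF header included)
    static byte[] toWav(byte[] pcm, AudioFormat fmt) throws IOException {
        AudioInputStream ais = new AudioInputStream(
                new ByteArrayInputStream(pcm), fmt, pcm.length / fmt.getFrameSize());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // AudioSystem.write re-adds the header that getAudioInputStream strips
        AudioSystem.write(ais, AudioFileFormat.Type.WAVE, out);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // One second of 8 kHz, 16-bit, mono, little-endian silence
        AudioFormat fmt = new AudioFormat(8000f, 16, 1, true, false);
        byte[] pcm = new byte[8000 * 2];
        byte[] wav = toWav(pcm, fmt);
        // A playable WAV starts with the ASCII magic "RIFF"; raw PCM does not
        System.out.println(new String(wav, 0, 4, StandardCharsets.US_ASCII));
        System.out.println(wav.length > pcm.length); // true: header bytes were added
    }
}
```

This is also why Files.readAllBytes works above: it ships the file verbatim, header and all, so the browser's audio/wav Blob is a valid container.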

Related

How can I stream .mp3 files in Java?

I want to stream mp3 files and access them using Spring, but I don't know how. I have already searched the internet but haven't found anything yet. I tried it using streams and it sort of worked, but every song starts at the beginning, and other listeners also start at the beginning of the song. My code:
Backend:
new Thread(() -> {
    stream = new ByteArrayOutputStream();
    while (true) {
        try {
            currentSong = files[rd.nextInt(files.length - 1)];
            InputStream is = new FileInputStream(new File(currentSong));
            int read = 0;
            byte[] bytes = new byte[1024];
            while ((read = is.read(bytes)) != -1) {
                stream.write(bytes, 0, read);
            }
            stream.flush();
            is.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();
Frontend:
public class DnBController {
    @GetMapping("/dnb")
    public String play(HttpServletResponse httpServletResponse) throws IOException {
        OutputStream os = httpServletResponse.getOutputStream();
        httpServletResponse.setContentType("audio/mpeg");
        DnbradioApplication.stream.writeTo(httpServletResponse.getOutputStream());
        return "site.html";
    }
}

How to store audio input in an array from recorder?

I have been working on a project that applies pattern recognition to breathing patterns as a form of communication for speech-impaired speakers.
I have an idea of how to do it, but I have only a very basic knowledge of Java, and I am stuck. I want to get the audio data from the microphone and store it in an array. I can then normalise the data, extract features from it, and store the new array in my database.
Please help. Thank you!
First, encode the audio to a Base64 string:
private void encodeAudio(String selectedPath) {
    byte[] audioBytes;
    try {
        // Just to check the file size; it is correct, i.e. not zero
        File audioFile = new File(selectedPath);
        long fileSize = audioFile.length();
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        FileInputStream fis = new FileInputStream(new File(selectedPath));
        byte[] buf = new byte[1024];
        int n;
        while (-1 != (n = fis.read(buf)))
            baos.write(buf, 0, n);
        audioBytes = baos.toByteArray();
        // Here goes the Base64 string
        _audioBase64 = Base64.encodeToString(audioBytes, Base64.DEFAULT);
    } catch (Exception e) {
        DiagnosticHelper.writeException(e);
    }
}
Then decode it on the receiving device:
private void decodeAudio(String base64AudioData, File fileName, String path, MediaPlayer mp) {
    try {
        FileOutputStream fos = new FileOutputStream(fileName);
        fos.write(Base64.decode(base64AudioData.getBytes(), Base64.DEFAULT));
        fos.close();
        try {
            mp = new MediaPlayer();
            mp.setDataSource(path);
            mp.prepare();
            mp.start();
        } catch (Exception e) {
            DiagnosticHelper.writeException(e);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
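The answer above uses Android's android.util.Base64. On the JVM side the same round trip can be sketched with the standard java.util.Base64 (the class and method names below are illustrative). Note that Base64 inflates the payload by roughly a third, so sending the raw byte[] is cheaper when the transport allows binary data:

```java
import java.util.Arrays;
import java.util.Base64;

public class AudioBase64 {
    // Encode raw audio bytes to a Base64 string for text-only transports
    static String encode(byte[] audio) {
        return Base64.getEncoder().encodeToString(audio);
    }

    // Decode the Base64 string back to the original bytes on the receiver
    static byte[] decode(String base64AudioData) {
        return Base64.getDecoder().decode(base64AudioData);
    }

    public static void main(String[] args) {
        byte[] audio = {0x52, 0x49, 0x46, 0x46, 0x00, 0x10};
        String text = encode(audio);
        byte[] back = decode(text);
        System.out.println(Arrays.equals(audio, back)); // true: round trip is lossless
        System.out.println(text.length() >= audio.length); // true: encoded form is larger
    }
}
```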

simple http server for sending files in android via wifi

I am developing an application which will send some files from one Android device to another. One device will create a hotspot, and the other device will connect to the hotspot over Wi-Fi and use its web browser to receive the files. I do not know much about HTTP. I have created a simple server on the sending device; the receiving device connects to it with its web browser. When I enter the (IP:port number) in the browser, it says "starting download" but the download never finishes. On the sending device I get IOException: sendto failed EPIPE (Broken pipe).
My code is:
boolean sending_done = false;
while (!sending_done)
{
    Socket socket = null;
    try
    {
        socket = serverSocket.accept();
        OutputStream outputStream = socket.getOutputStream();
        outputStream.write("HTTP/1.1 200 OK\r\n".getBytes());
        outputStream.write("Content-Type: application/zip\r\n".getBytes());
        outputStream.write("Content-Disposition: attachment; filename=files.zip\r\n".getBytes());
        outputStream.write("Content-Encoding: gzip\r\n".getBytes());
        outputStream.write("Transfer-Encoding: chunked\r\n".getBytes());
        outputStream.write("\r\n".getBytes());
        BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(outputStream);
        ZipOutputStream zipOutputStream = new ZipOutputStream(bufferedOutputStream);
        zipOutputStream.setLevel(ZipOutputStream.STORED);
        sendFiles(zipOutputStream, to_send_files_array_list);
        byte[] CRLF = "\r\n".getBytes();
        writ_header(0, zipOutputStream, CRLF);
        zipOutputStream.write(CRLF, 0, CRLF.length);
        zipOutputStream.finish();
        zipOutputStream.flush();
        zipOutputStream.close();
        sending_done = true;
    }
    catch (IOException e)
    {
        e.printStackTrace();
        display_toast(e.toString());
    }
    finally
    {
        try
        {
            if (socket != null)
            {
                socket.close();
            }
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
    }
}
private void sendFiles(ZipOutputStream zipOutputStream, ArrayList<File> filesToSend)
{
    byte[] CRLF = "\r\n".getBytes();
    for (File file : filesToSend)
    {
        BufferedInputStream bufferedInputStream = null;
        ZipEntry zipEntry = null;
        try
        {
            FileInputStream fileInputStream = new FileInputStream(file);
            bufferedInputStream = new BufferedInputStream(fileInputStream);
            zipEntry = new ZipEntry(file.getName());
            zipOutputStream.putNextEntry(zipEntry);
            int x;
            byte[] buffer = new byte[buffer_size];
            while ((x = bufferedInputStream.read(buffer)) != -1)
            {
                if (x > 0)
                {
                    writ_header(x, zipOutputStream, CRLF);
                    zipOutputStream.write(buffer, 0, x);
                    zipOutputStream.write(CRLF, 0, CRLF.length);
                }
            }
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
        finally
        {
            try
            {
                if (zipEntry != null)
                {
                    zipOutputStream.closeEntry();
                }
                if (bufferedInputStream != null)
                {
                    bufferedInputStream.close();
                }
            }
            catch (Exception e)
            {
                e.printStackTrace();
            }
        }
    }
}
private void writ_header(int length, OutputStream outputStream, byte[] CRLF) throws IOException
{
    byte[] header = Integer.toHexString(length).getBytes();
    outputStream.write(header, 0, header.length);
    outputStream.write(CRLF, 0, CRLF.length);
}
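One likely culprit here: writ_header is passed the ZipOutputStream, so the hex chunk-size lines get written into the zip entries rather than reaching the browser as HTTP framing, which corrupts both the zip and the chunked encoding. The chunk headers must go to the raw socket stream that the ZIP bytes also flow through. A minimal sketch of the Transfer-Encoding: chunked wire format over a plain OutputStream (the class and method names are illustrative):

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class ChunkedFraming {
    private static final byte[] CRLF = "\r\n".getBytes(StandardCharsets.US_ASCII);

    // One chunk: hex length, CRLF, the payload bytes, CRLF
    static void writeChunk(OutputStream raw, byte[] data, int off, int len) throws IOException {
        raw.write(Integer.toHexString(len).getBytes(StandardCharsets.US_ASCII));
        raw.write(CRLF);
        raw.write(data, off, len);
        raw.write(CRLF);
    }

    // The body ends with a zero-length chunk followed by a blank line
    static void finish(OutputStream raw) throws IOException {
        raw.write("0\r\n\r\n".getBytes(StandardCharsets.US_ASCII));
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream wire = new ByteArrayOutputStream();
        byte[] payload = "hello".getBytes(StandardCharsets.US_ASCII);
        writeChunk(wire, payload, 0, payload.length);
        finish(wire);
        // Show the framing with the CRLFs made visible
        System.out.println(wire.toString("US-ASCII").replace("\r\n", "\\r\\n"));
    }
}
```

In the code above, this would mean framing the output of the ZipOutputStream (for example, by buffering each compressed block) and writing the framed result to outputStream directly; the chunk markers must never pass through the zip layer.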

Receiving byte array over a socket

I have a program which accepts a connection from a phone that sends a byte array. I am able to tell when the connection is made, but how do I know that I am actually receiving something? How can I "see" whether anything is being sent over the socket? With my code below, the resulting file saved.jpg is never created. Does this mean that it did not receive anything?
public class wpsServer {
    //vars
    private int svrPort = 3334;
    private ServerSocket serverSocket;
    private Image image = null;

    public wpsServer()
    {
        try {
            serverSocket = new ServerSocket(svrPort);
            System.out.println("Server started on " + svrPort);
        }
        catch (IOException e) {
            System.out.println("Could not listen on port: " + svrPort);
            System.exit(-1);
        }
    }

    public void listenForClient()
    {
        Socket clientSocket = null;
        try {
            clientSocket = serverSocket.accept();
            if (clientSocket.isConnected())
                System.out.println("Connected");
            byte[] pic = getPicture(clientSocket.getInputStream());
            InputStream in = new ByteArrayInputStream(pic);
            BufferedImage image = ImageIO.read(in);
            File outputfile = new File("saved.jpg");
            ImageIO.write(image, "jpg", outputfile);
        }
        catch (IOException e) {
            System.out.println("Accept failed: " + svrPort);
            System.exit(-1);
        }
    }

    public byte[] getPicture(InputStream in) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] data = new byte[1024];
            int length = 0;
            while ((length = in.read(data)) != -1) {
                out.write(data, 0, length);
            }
            return out.toByteArray();
        } catch (IOException ioe) {
            //handle it
        }
        return null;
    }
}
The in.read call will only return -1 when the other end closes the socket. While the socket is open, that call blocks until more data is available.
What you need to do is change your "protocol": the client should send the array size first, then the data. The server should read that length and stop reading the file once that many bytes have arrived (then go back to waiting for the next file, for instance).
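The length-prefix protocol described above can be sketched with DataOutputStream/DataInputStream (the class and method names are illustrative); readFully blocks until exactly the promised number of bytes has arrived, so the receiver never has to wait for the sender to close the socket:

```java
import java.io.*;

public class LengthPrefixed {
    // Sender: a 4-byte big-endian length prefix, then the payload itself
    static void send(DataOutputStream out, byte[] payload) throws IOException {
        out.writeInt(payload.length);
        out.write(payload);
        out.flush();
    }

    // Receiver: read the announced length, then exactly that many bytes
    static byte[] receive(DataInputStream in) throws IOException {
        int len = in.readInt();
        byte[] buf = new byte[len];
        in.readFully(buf); // blocks until all len bytes have arrived
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Simulate the socket with in-memory streams
        ByteArrayOutputStream wire = new ByteArrayOutputStream();
        send(new DataOutputStream(wire), new byte[]{1, 2, 3});
        byte[] back = receive(new DataInputStream(new ByteArrayInputStream(wire.toByteArray())));
        System.out.println(back.length); // 3
    }
}
```

With real sockets, wrap clientSocket.getInputStream() in the DataInputStream and loop on receive for each incoming image.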

Axis2 File Upload by chunk

I'm trying to upload a file using an Axis2 web service in 1024-byte chunks.
My server side looks like this:
public void appendChunk(int count, byte[] buffer) {
    FileOutputStream fos = null;
    try {
        File destinationFile = new File("c:\\file1.exe");
        fos = new FileOutputStream(destinationFile, true);
        fos.write(buffer, 0, count);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            if (fos != null) {
                fos.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
My client side looks like this:
static int CHUNK_SIZE = 1024;

public static void main(String[] args) throws IOException, ServiceException {
    FileUploadService strub = new FileUploadServiceLocator();
    FileUploadServicePortType a = strub.getFileUploadServiceHttpSoap12Endpoint();
    byte[] buffer = new byte[CHUNK_SIZE];
    FileInputStream fis = null;
    File file = new File("C:\\install.exe");
    int count;
    try {
        fis = new FileInputStream(file);
        while ((count = fis.read(buffer, 0, CHUNK_SIZE)) > 0)
        {
            a.appendChunk(count, buffer);
        }
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } finally {
        fis.close();
    }
}
Afterwards the file size is incorrect: if the original file size is 500 KB, the uploaded size varies between 200 and 400 KB.
What am I doing wrong?
Update: I looked at the log4j file in Tomcat:
Nov 17, 2010 2:08:31 PM org.apache.tomcat.util.net.JIoEndpoint createWorkerThread
INFO: Maximum number of threads (200) created for connector with address null and port 80
It looks like all requests to the web server are made asynchronously, and I am also getting an IOException saying the file is being used by another process.
Try adding fos.flush(); before your fos.close(); in your server implementation.
Change
while((count = fis.read(buffer, 0, CHUNK_SIZE)) >0 )
to
while((count = fis.read(buffer, 0, CHUNK_SIZE)) != -1 )
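Both answers point at the same invariant: every write must use only the count that the preceding read returned, and buffered data must be flushed before close. A minimal copy loop that preserves the byte count exactly (the class and method names are illustrative):

```java
import java.io.*;

public class ChunkCopy {
    // Copy a stream in fixed-size chunks; returns the total number of bytes copied
    static long copyInChunks(InputStream in, OutputStream out, int chunkSize) throws IOException {
        byte[] buf = new byte[chunkSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n); // never write the whole buffer blindly
            total += n;
        }
        out.flush(); // push any buffered bytes before the caller closes the stream
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] src = new byte[2500]; // deliberately not a multiple of the chunk size
        ByteArrayOutputStream dst = new ByteArrayOutputStream();
        long copied = copyInChunks(new ByteArrayInputStream(src), dst, 1024);
        System.out.println(copied);     // 2500
        System.out.println(dst.size()); // 2500: no bytes lost or duplicated
    }
}
```

Note this alone does not fix the concurrency problem the update describes: if Axis2 dispatches appendChunk calls on multiple threads, chunks can still be appended out of order or collide on the file, so the calls must be serialized as well.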
