AudioSystem Write, AudioInputStream from modified bytes obtained from InputStream, Java

I am obtaining bytes from an InputStream, but I need to modify them and save them to a WAV file.
Here is my code.
Socket sending audio obtained from the microphone:
AudioFormat adfmt = new AudioFormat(8000.0f, 8, 1, true , true);
int bufferSize = (int) adfmt.getSampleRate()* adfmt.getFrameSize();
byte[] buffer = new byte[bufferSize];
Socket clientSocketO = new Socket(...);
OutputStream output = clientSocketO.getOutputStream();
DataLine.Info dlInfo = new DataLine.Info(TargetDataLine.class, adfmt);
TargetDataLine tdLine = (TargetDataLine) AudioSystem.getLine(dlInfo);
tdLine.open(adfmt);
tdLine.start(); // start capturing
boolean bRunningO = true;
while (bRunningO) {
int count = tdLine.read(buffer, 0, buffer.length);
if (count > 0) {
byte[] outgoingBytes = Arrays.copyOf(buffer, count);
output.write(outgoingBytes);
}
}
tdLine.flush();
On the other side, the socket receives the bytes:
AudioFormat adfmt = new AudioFormat(8000.0f, 8, 1, true , true);
int bufferSize = (int) adfmt.getSampleRate()* adfmt.getFrameSize();
byte[] buffer = new byte[bufferSize];
Socket clientSocketI = new Socket(...);
InputStream input = clientSocketI.getInputStream();
String fileName = System.getProperty("file.separator") + "SomeFile.wav";
File fileStreamedWav = new File((new File("")).getAbsolutePath() + fileName);
AudioInputStream ais;
ByteArrayInputStream bis;
DataLine.Info dlInfo = new DataLine.Info(SourceDataLine.class, adfmt);
//SourceDataLine sdLine = (SourceDataLine) AudioSystem.getLine(dlInfo);
//sdLine.open(adfmt);
//sdLine.start(); // start playback
AudioFileFormat.Type afType = AudioFileFormat.Type.WAVE;
boolean bRunningI = true;
while (bRunningI) {
try {
int read = input.read(buffer); //Socket Reading bytes
byte[] incomingBytes;
if (read > 0) {
incomingBytes = Arrays.copyOf(buffer, read);
if (incomingBytes != null) {
//sdLine.write(incomingBytes, 0, incomingBytes.length);
//Same size bytes; it isn't necessary to show the actual code
byte[] changedBytes = MethodChangerBytes(incomingBytes);
bis = new ByteArrayInputStream(changedBytes);
ais = new AudioInputStream(bis, adfmt,
changedBytes.length/adfmt.getFrameSize());
int W = AudioSystem.write(ais, afType, fileStreamedWav);
System.out.println("AudioSystem.write:" + W);
}
}
} catch (IOException e) {
bRunningI = false;
}
}
Here is the byte-modifying code; for now, assume it amplifies by two...
byte[] MethodChangerBytes(byte[] incoming) {
byte[] outgoing = new byte[incoming.length];
for (int i = 0; i < incoming.length; i ++) {
// What exactly happens here is not important
double Sample = (double)(short)(((incoming[i] - 128) & 0xFF) << 8);
Sample *= 2.0;
outgoing[i] = (byte)((((int) Sample >> 8) + 128) & 0xFF);
}
return outgoing;
}
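For the signed 8-bit format declared above, each byte is one sample, so a simpler, clipped version of the amplifier could look like this (a sketch, not the exact transform from the question; `Amplifier` and `amplify` are illustrative names):

```java
public class Amplifier {
    // Amplifies signed 8-bit PCM samples by the given gain, clipping the
    // result to the representable range instead of letting it wrap around.
    public static byte[] amplify(byte[] incoming, double gain) {
        byte[] outgoing = new byte[incoming.length];
        for (int i = 0; i < incoming.length; i++) {
            int sample = (int) (incoming[i] * gain);           // bytes are signed in Java
            if (sample > Byte.MAX_VALUE) sample = Byte.MAX_VALUE; // clip at +127
            if (sample < Byte.MIN_VALUE) sample = Byte.MIN_VALUE; // clip at -128
            outgoing[i] = (byte) sample;
        }
        return outgoing;
    }
}
```

Clipping matters here: doubling a sample near full scale would otherwise wrap to the opposite sign and produce loud clicks.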
When sdLine is uncommented, I can hear all the sound transmitted.
AudioInputStream(InputStream stream, AudioFormat format, long length)
AudioSystem.write(AudioInputStream stream, AudioFileFormat.Type fileType, File out)
The problem:
This code only saves the last bytes obtained from MethodChangerBytes.
Question:
How do I save all the processed WAV bytes until the socket connection is closed?
Thank you

Have a buffer:
ByteArrayOutputStream outputStream=new ByteArrayOutputStream();
Write to this buffer inside the loop, move the file writing outside the loop, and save once all bytes have been read:
boolean bRunningI = true;
try {
while (bRunningI) {
int read = input.read(buffer); //Socket Reading bytes
byte[] incomingBytes;
if (read > 0) {
incomingBytes = Arrays.copyOf(buffer, read);
if (incomingBytes != null) {
//sdLine.write(incomingBytes, 0, incomingBytes.length);
//Same size bytes; it isn't necessary to show the actual code
byte[] changedBytes = MethodChangerBytes(incomingBytes);
outputStream.write(changedBytes, 0, changedBytes.length);
}
}
}
byte[] allBytes=outputStream.toByteArray();
bis = new ByteArrayInputStream(allBytes);
ais = new AudioInputStream(bis, adfmt,
allBytes.length / adfmt.getFrameSize());
int W = AudioSystem.write(ais, afType, fileStreamedWav);
System.out.println("AudioSystem.write:" + W);
} catch (IOException e) {
bRunningI = false;
}
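As an aside, AudioSystem.write itself blocks and copies until its source AudioInputStream is exhausted, so another option is to wrap the socket's InputStream once, pass AudioSystem.NOT_SPECIFIED as the frame length, and make a single write call; the WAV header sizes are patched once the peer closes the connection. A minimal sketch, assuming the File overload (which can seek back to fix the header) rather than the OutputStream one; `OneShotWavWrite` is an illustrative name, and a ByteArrayInputStream can stand in for the socket stream when trying it out:

```java
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;

public class OneShotWavWrite {
    // Copies the raw PCM stream into a WAV file in one blocking call.
    // NOT_SPECIFIED frame length is usable with the File overload, which
    // patches the RIFF chunk sizes after the stream ends.
    public static int writeWav(InputStream raw, AudioFormat fmt, File dest)
            throws IOException {
        AudioInputStream ais =
                new AudioInputStream(raw, fmt, AudioSystem.NOT_SPECIFIED);
        return AudioSystem.write(ais, AudioFileFormat.Type.WAVE, dest);
    }
}
```

With this shape there is no per-chunk bookkeeping at all: the file is complete as soon as writeWav returns.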

Related

How to convert audio file into byte array

I want to convert an audio file into a byte array. I currently did it, and I want to know if it works:
private static AudioFormat getFormat() {
float sampleRate = 44100;
int sampleSizeInBits = 16;
int channels = 1;
boolean signed = true;
boolean bigEndian = true;
return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed,
bigEndian);
}
public static byte[] listenSound(File f) {
AudioInputStream din = null;
AudioInputStream outDin = null;
PCM2PCMConversionProvider conversionProvider = new PCM2PCMConversionProvider();
try {
AudioInputStream in = AudioSystem.getAudioInputStream(f);
AudioFormat baseFormat = in.getFormat();
AudioFormat decodedFormat = new AudioFormat(
AudioFormat.Encoding.PCM_SIGNED,
baseFormat.getSampleRate(),
16,
baseFormat.getChannels(),
baseFormat.getChannels() * 2,
baseFormat.getSampleRate(),
false);
din = AudioSystem.getAudioInputStream(decodedFormat, in);
if (!conversionProvider.isConversionSupported(getFormat(), decodedFormat)) {
System.out.println("Conversion Not Supported.");
System.exit(-1);
}
outDin = conversionProvider.getAudioInputStream(getFormat(), din);
ByteArrayOutputStream out = new ByteArrayOutputStream();
int n = 0;
byte[] buffer = new byte[1024];
while (true) {
n++;
if (n > 1000)
break;
int count = 0;
count = outDin.read(buffer, 0, 1024);
if (count > 0) {
out.write(buffer, 0, count);
}
}
in.close();
din.close();
outDin.close();
out.flush();
out.close();
//byte[] b=out.toByteArray();
//for(int i=0; i<b.length; i++)
//System.out.println("b = "+b[i]);
return out.toByteArray();
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
That byte array is actually time-domain data; I need to be sure it is correct before transforming it into the frequency domain with a Discrete Fourier transform.
Thanks for your help !
Typically the actual number of bytes processed is returned from a call like
count = outDin.read(buffer, 0, 1024);
so in addition to your current hard break after processing 1000 chunks, if the API does in fact return a byte count, you should check for it:
int size_chunk = 1024;
byte[] buffer = new byte[size_chunk];
boolean keep_streaming = true;
while (keep_streaming) {
n++;
if (n > 1000) { // troubleshooting ONLY remove later
keep_streaming = false;
}
int count = 0;
count = outDin.read(buffer, 0, size_chunk);
if (count > 0) {
out.write(buffer, 0, count);
}
if (count < size_chunk) { // input stream has been consumed
keep_streaming = false;
}
}
You did not supply a link to the API doc so I cannot confirm; however, assuming outDin.read returns the number of bytes actually processed, the above code will output only the bytes that match the input, and so will produce a smaller output when the input holds less than 1 MB of data. (Your original logic blindly generated a 1 MB output, stopping only after seeing 1000 chunks; it also assumes you intend to truncate the input after 1 MB of data, as per your lines:)
if (n > 1000) { // troubleshooting ONLY remove later
keep_streaming = false;
}
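One caveat about the `count < size_chunk` test: InputStream.read is allowed to return fewer bytes than requested even before end-of-stream (sockets do this routinely), so the only reliable terminator is the -1 return value. A sketch of the canonical drain loop (`StreamCopy` and `drain` are illustrative names):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamCopy {
    // Reads the stream to exhaustion. read() returns -1 only at
    // end-of-stream, and may return fewer bytes than requested at any
    // time, so loop on the return value rather than the buffer size.
    public static byte[] drain(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[1024];
        int count;
        while ((count = in.read(buffer)) != -1) {
            out.write(buffer, 0, count);
        }
        return out.toByteArray();
    }
}
```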

Decode mp3 to pcm, and play with audiotrack in Google Android

First of all, when not using the function decode_path, I can play a .wav file with my code and it works fine; I use JLayer and AudioTrack to play the song.
Second, if I use the function decode_path, it can decode the mp3 to a PCM file, pass the byte[] to the function PlayAudioTrack, and let it play.
The question is, I don't know where my code is wrong. I use a 320 kbps, 44.1 kHz stereo, Layer 3 mp3, but AudioTrack plays noise instead of music.
Can anyone help?
My code
public void PlayAudioTrack(String filePath) throws IOException{
int intSize = android.media.AudioTrack.getMinBufferSize(44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT);
AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT, intSize, AudioTrack.MODE_STREAM);
//Reading the file..
int count = 512 * 1024; // 512 kb
// byte[] byteData = null;
// byteData = new byte[(int)count];
//we can decode correct byte data here
byte[] byteData = null;
byteData = decode_path(filePath, 0, 20000);
File file = null;
file = new File(filePath);
FileInputStream in = null;
try {
in = new FileInputStream( file );
} catch (FileNotFoundException e) {
e.printStackTrace();
}
int bytesread = 0, ret = 0;
int size = (int) file.length();
at.play();
while (bytesread < size) {
Log.e("devon","write byte array with sizes");
ret = in.read( byteData,0, count);
if (ret != -1) {
Log.e("devon","Write the byte array to the track");
at.write(byteData,0, ret);
bytesread += ret;
}else break;
}
at.stop();
at.release();
}
public static byte[] decode_path(String path, int startMs, int maxMs)
throws IOException{
ByteArrayOutputStream outStream = new ByteArrayOutputStream(1024);
float totalMs = 0;
boolean seeking = true;
File file = new File(path);
InputStream inputStream = new BufferedInputStream(new FileInputStream(file), 8 * 1024);
try {
Bitstream bitstream = new Bitstream(inputStream);
Decoder decoder = new Decoder();
boolean done = false;
while (! done) {
Header frameHeader = bitstream.readFrame();
if (frameHeader == null) {
done = true;
} else {
totalMs += frameHeader.ms_per_frame();
if (totalMs >= startMs) {
seeking = false;
}
if (! seeking) {
SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
if (output.getSampleFrequency() != 44100
|| output.getChannelCount() != 2) {
throw new IllegalArgumentException("mono or non-44100 MP3 not supported");
}
short[] pcm = output.getBuffer();
for (short s : pcm) {
outStream.write(s & 0xff);
outStream.write((s >> 8 ) & 0xff);
}
}
if (totalMs >= (startMs + maxMs)) {
done = true;
}
}
bitstream.closeFrame();
}
return outStream.toByteArray();
} catch (BitstreamException e) {
throw new IOException("Bitstream error: " + e);
} catch (DecoderException e) {
Log.w(TAG, "Decoder error", e);
throw new IOException("Decoder error: " + e);
}
}
public void PlayAudioTrack(String filePath) throws IOException{
int intSize = android.media.AudioTrack.getMinBufferSize(44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT);
AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
AudioFormat.ENCODING_PCM_16BIT, intSize, AudioTrack.MODE_STREAM);
//Reading the file..
int count = 512 * 1024; // 512 kb
// byte[] byteData = null;
// byteData = new byte[(int)count];
//we can decode correct byte data here
byte[] byteData = null;
byteData = decode_path(filePath, 0, 20000);
int temp =0;
at.play();
while (temp < byteData.length)
{
int len = Math.min(count, byteData.length - temp);
at.write(byteData, temp, len);
temp += len;
}
at.stop();
at.release();
}

image receiving at android with no content- socket programming

I've created an application which sends an image from the server (desktop) to the client (Android) via socket programming. The problem is that I'm getting the file at the client side (Android), but with no content.
Can anyone please tell me what the problem is?
Client side (Android)
DataInputStream dis=new DataInputStream(socket.getInputStream());
receiveFile(dis); // call method receiveFile()
public Bitmap receiveFile(InputStream is) throws Exception{
String baseDir = Environment.getExternalStorageDirectory().getAbsolutePath();
String fileName = "myFile.png";
String imageInSD = baseDir + File.separator + fileName;
System.out.println("FILE----------------->"+imageInSD);
int filesize=6022386;
int bytesRead;
int current = 0;
byte [] data = new byte [filesize];
FileOutputStream fos = new FileOutputStream(imageInSD);
BufferedOutputStream bos = new BufferedOutputStream(fos);
bytesRead = is.read(data,0,data.length);
current = bytesRead;
int index = 0;
while (index < filesize)
{
bytesRead = is.read(data, index, filesize - index);
if (bytesRead < 0)
{
throw new IOException("Insufficient data in stream");
}
index += filesize;
}
bos.write(data, 0 , current);
bos.flush();
bos.close();
return null;
}
Server (Desktop)
send(socket.getOutputStream()); // call method send()
public void send(OutputStream os) throws Exception{
// sendfile
File myFile = new File ("C:/div.png");
System.out.println("the file is read");
byte [] mybytearray = new byte [(int)myFile.length()+1];
FileInputStream fis = new FileInputStream(myFile);
BufferedInputStream bis = new BufferedInputStream(fis);
bis.read(mybytearray,0,mybytearray.length);
System.out.println("Sending...");
os.write(mybytearray,0,mybytearray.length);
os.flush();
}
The correct way to copy a stream in Java is as follows:
while ((count = in.read(buffer)) > 0)
{
out.write(buffer, 0, count);
}
At present your code:
Assumes read() fills the buffer. There is nothing in the Javadoc that says so.
Ignores the result returned by read(), which in addition to being the invaluable count, could also be -1 indicating EOS.
Wastefully allocates a buffer the entire size of the file.
Assumes the size of the file fits into an int.
Relies on the receiver magically knowing the size of the incoming file.
The code above makes none of these assumptions, and works with any buffer size from 1 upwards.
Looking at your code, I see you want to receive a file, save it to external storage, and return the Bitmap of that file. That's what I guess you want to do, but your code, as it is, does not do that. If you want, you can use the following code to accomplish that task. First the server sends 4 bytes indicating the file's size, followed by the file's contents; the client reads those 4 bytes and then reads the whole file, saving each chunk to disk as it arrives. Finally, it converts the received file to a bitmap and returns it.
The client code:
public Bitmap receiveFile(InputStream is) throws Exception
{
String baseDir = Environment.getExternalStorageDirectory().getAbsolutePath();
String fileName = "myFile.png";
String imageInSD = baseDir + File.separator + fileName;
System.out.println("FILE----------------->" + imageInSD);
// read first 4 bytes containing the file size
byte[] bSize = new byte[4];
is.read(bSize, 0, 4);
int filesize;
filesize = (int) (bSize[0] & 0xff) << 24 |
(int) (bSize[1] & 0xff) << 16 |
(int) (bSize[2] & 0xff) << 8 |
(int) (bSize[3] & 0xff);
int bytesRead;
// You may but don't have to read the whole file in memory
// 8k buffer is good enough
byte[] data = new byte[8 * 1024];
int bToRead;
FileOutputStream fos = new FileOutputStream(imageInSD);
BufferedOutputStream bos = new BufferedOutputStream(fos);
while (filesize > 0)
{
// EDIT: just in case there is more data in the stream.
if (filesize > data.length) bToRead=data.length;
else bToRead=filesize;
bytesRead = is.read(data, 0, bToRead);
if (bytesRead > 0)
{
bos.write(data, 0, bytesRead);
filesize -= bytesRead;
}
}
bos.close();
// I guess you want to return the received image as a Bitmap
Bitmap bmp = null;
FileInputStream fis = new FileInputStream(imageInSD);
try
{
bmp = BitmapFactory.decodeStream(fis);
}
catch (Exception e)
{
// in case of an error set it to null
bmp = null;
}
finally
{
fis.close();
}
return bmp;
}
The server code:
public void send(OutputStream os) throws Exception
{
// sendfile
File myFile = new File("C:/div.png");
System.out.println("the file is read");
int fSize = (int) myFile.length();
byte[] bSize = new byte[4];
bSize[0] = (byte) ((fSize & 0xff000000) >> 24);
bSize[1] = (byte) ((fSize & 0x00ff0000) >> 16);
bSize[2] = (byte) ((fSize & 0x0000ff00) >> 8);
bSize[3] = (byte) (fSize & 0x000000ff);
// send 4 bytes containing the filesize
os.write(bSize, 0, 4);
byte[] mybytearray = new byte[(int) fSize];
FileInputStream fis = new FileInputStream(myFile);
BufferedInputStream bis = new BufferedInputStream(fis);
int bRead = bis.read(mybytearray, 0, mybytearray.length);
System.out.println("Sending...");
os.write(mybytearray, 0, bRead);
os.flush();
bis.close();
}
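Incidentally, the hand-packed 4-byte prefix above is big-endian, which is exactly what DataOutputStream.writeInt produces and DataInputStream.readInt consumes, so that pair can replace the manual shifting on both ends. A quick sketch showing the equivalence (class and method names here are illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class LengthPrefix {
    // Manual big-endian packing, as in the answer above.
    public static byte[] packManually(int fSize) {
        byte[] bSize = new byte[4];
        bSize[0] = (byte) ((fSize & 0xff000000) >> 24);
        bSize[1] = (byte) ((fSize & 0x00ff0000) >> 16);
        bSize[2] = (byte) ((fSize & 0x0000ff00) >> 8);
        bSize[3] = (byte) (fSize & 0x000000ff);
        return bSize;
    }

    // The same four bytes via DataOutputStream.writeInt.
    public static byte[] packWithDataStream(int fSize) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new DataOutputStream(bos).writeInt(fSize);
        return bos.toByteArray();
    }
}
```

Using the Data streams also gets rid of the shift-and-mask code on the receiving side: the client just calls readInt() instead of reassembling the size by hand.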

Socket Programming Java: How to receive multiple files from outputstream?

My server is sending several files to my client. However, the client only receives the first file, because I don't know how to iterate and get the second file.
The Server sends like this:
ListIterator iter = missingfiles.listIterator();
//missingfiles contain all the filenames to be sent
String filename;
while (iter.hasNext()) {
// System.out.println(iter.next());
filename=(String) iter.next();
File myFile = new File("src/ee4210/files/"+filename);
byte[] mybytearray = new byte[(int) myFile.length()];
FileInputStream fis = new FileInputStream(myFile);
BufferedInputStream bis = new BufferedInputStream(fis);
//bis.read(mybytearray, 0, mybytearray.length);
DataInputStream dis = new DataInputStream(bis);
dis.readFully(mybytearray, 0, mybytearray.length);
OutputStream os = _socket.getOutputStream();
//Sending file name and file size to the client
DataOutputStream dos = new DataOutputStream(os);
dos.writeUTF(myFile.getName());
dos.writeLong(mybytearray.length);
dos.write(mybytearray, 0, mybytearray.length);
dos.flush();
}
The client receives like this (it will only receive the first file, and I don't know how to make it loop to receive the next file):
int bytesRead;
int current = 0;
int filecount = 0;
InputStream in;
try {
in = _socket.getInputStream();
DataInputStream clientData = new DataInputStream(in);
String fileName = clientData.readUTF();
OutputStream output = new FileOutputStream(
"src/ee4210/files/"+ fileName);
long size = clientData.readLong();
byte[] buffer = new byte[1024];
while (size > 0
&& (bytesRead = clientData.read(buffer, 0,
(int) Math.min(buffer.length, size))) != -1) {
output.write(buffer, 0, bytesRead);
size -= bytesRead;
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
How to receive multiple files from outputstream?
The obvious answer is 'one at a time, with extra information to tell you where one stops and another starts'. The most usual technique is to send the file size ahead of the file, e.g. as a long via DataOutputStream.writeLong(), and at the receiver change your read loop to stop after exactly that many bytes, close the output file, and continue the outer loop that reads the next long or end-of-stream.
You can try this.
I've used a lazy method to check that all 3 files have been received.
int bytesRead;
int current = 0;
int filecount = 0;
InputStream in;
try
{
in = _socket.getInputStream();
DataInputStream clientData = new DataInputStream(in);
while(true)
{
String fileName = clientData.readUTF();
// will throw an EOFException when the end of file is reached. Exit loop then.
OutputStream output = new FileOutputStream("src/ee4210/files/"+ fileName);
long size = clientData.readLong();
byte[] buffer = new byte[1024];
while (size > 0
&& (bytesRead = clientData.read(buffer, 0,
(int) Math.min(buffer.length, size))) != -1)
{
output.write(buffer, 0, bytesRead);
size -= bytesRead;
}
output.close();
}
}
catch (EOFException e)
{
// means we have read all the files
}
catch (IOException e)
{
// TODO Auto-generated catch block
e.printStackTrace();
}

Wav file convert to byte array in java

My project is 'Speech Recognition of Azeri speech'. I have to write a program that converts wav files to byte array.
How to convert audio file to byte[]?
Basically as described by the snippet in the first answer, but instead of the BufferedInputStream use AudioSystem.getAudioInputStream(File) to get the InputStream.
Using the audio stream obtained from AudioSystem ensures that the headers are stripped and the input file is decoded to a byte[] that represents the actual sound frames/samples, which can then be used for FFT etc.
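A sketch of that approach, with illustrative names: because AudioSystem parses the WAV container, the header bytes are consumed during parsing and only the raw frames end up in the returned array:

```java
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

public class WavFrames {
    // Returns only the audio frames of a WAV stream; the RIFF header is
    // consumed by AudioSystem when it parses the file format. The stream
    // must support mark/reset (file- and byte-array-backed streams do).
    public static byte[] frames(InputStream wavStream) throws Exception {
        AudioInputStream ais = AudioSystem.getAudioInputStream(wavStream);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buff = new byte[1024];
        int read;
        while ((read = ais.read(buff)) > 0) {
            out.write(buff, 0, read);
        }
        return out.toByteArray();
    }
}
```

The plain BufferedInputStream copy below, by contrast, keeps the 44-byte header in the array, which would distort an FFT over the data.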
Write this file into ByteArrayOutputStream
ByteArrayOutputStream out = new ByteArrayOutputStream();
BufferedInputStream in = new BufferedInputStream(new FileInputStream(WAV_FILE));
int read;
byte[] buff = new byte[1024];
while ((read = in.read(buff)) > 0)
{
out.write(buff, 0, read);
}
out.flush();
byte[] audioBytes = out.toByteArray();
import java.io.*;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.LinkedHashMap;
import javax.sound.sampled.*;
/**
* This class reads a .wav file and converts it to a bunch of byte arrays.
*
* The info represented by these byte arrays is then printed out.
*
* An example of playing these byte arrays with the speakers is used.
*
* It also converts the byte arrays to a .wav file.
*
* An extension of this concept can record from a microphone.
* In this case, some values like sampling rate would need to be assumed.
*
* See https://ccrma.stanford.edu/courses/422/projects/WaveFormat/ for .wav file spec
*
* @author sizu
*/
public class WavFileHelper {
public static void main(String[] args) {
final String NEWLINE = "\n";
int recordingSampleRate = 22050;
short recordingBitsPerSample = 16;
short recordingNumChannels = 2;
String inputFile = "/input.wav"; // Place the wav file in the top level directory, ie S:/input.wav
String outputFile = "/output.wav";
String recordedFile = "/capture.wav";
System.out.println("START");
try {
WavData wavInputData = new WavData();
WavData wavRecordData = new WavData();
wavRecordData.put(WaveSection.SAMPLE_RATE, recordingSampleRate);
wavRecordData.put(WaveSection.BITS_PER_SAMPLE, recordingBitsPerSample);
wavRecordData.put(WaveSection.NUM_CHANNELS, recordingNumChannels);
System.out.println(NEWLINE+"CONVERT WAV FILE TO BYTE ARRAY");
wavInputData.read(inputFile);
System.out.println(NEWLINE+"CONVERT BYTE ARRAY TO WAV FILE");
wavInputData.write(outputFile);
System.out.println(NEWLINE+"DISPLAY BYTE ARRAY INFORMATION FOR INPUT FILE");
wavInputData.printByteInfo();
System.out.println(NEWLINE+"START RECORDING - You can connect the microphone to the speakers");
WavAudioRecorder recorder = new WavFileHelper.WavAudioRecorder(wavRecordData);
recorder.startRecording();
System.out.println(NEWLINE+"PLAY BYTE ARRAY (THIS WILL BE RECORDED)");
WavAudioPlayer player = new WavFileHelper.WavAudioPlayer(wavInputData);
player.playAudio();
System.out.println(NEWLINE+"STOP RECORDING FOR RECORDING");
recorder.stopRecording();
System.out.println(NEWLINE+"DISPLAY BYTE ARRAY INFORMATION");
wavRecordData.printByteInfo();
System.out.println(NEWLINE+"SAVE RECORDING IN WAV FILE");
wavRecordData.write(recordedFile);
} catch (Exception ex) {
ex.printStackTrace();
}
System.out.println("FINISH");
}
public static enum WaveSection {
// 12 Bytes
CHUNK_ID(4, ByteOrder.BIG_ENDIAN),
CHUNK_SIZE(4, ByteOrder.LITTLE_ENDIAN),
FORMAT(4, ByteOrder.BIG_ENDIAN),
// 24 Bytes
SUBCHUNK1_ID(4, ByteOrder.BIG_ENDIAN),
SUBCHUNK1_SIZE(4, ByteOrder.LITTLE_ENDIAN),
AUDIO_FORMAT(2, ByteOrder.LITTLE_ENDIAN),
NUM_CHANNELS(2, ByteOrder.LITTLE_ENDIAN),
SAMPLE_RATE(4, ByteOrder.LITTLE_ENDIAN),
BYTE_RATE(4, ByteOrder.LITTLE_ENDIAN),
BLOCK_ALIGN(2, ByteOrder.LITTLE_ENDIAN),
BITS_PER_SAMPLE(2, ByteOrder.LITTLE_ENDIAN),
// 8 Bytes
SUBCHUNK2_ID(4, ByteOrder.BIG_ENDIAN),
SUBCHUNK2_SIZE(4, ByteOrder.LITTLE_ENDIAN),
DATA(0, ByteOrder.LITTLE_ENDIAN),
;
private Integer numBytes;
private ByteOrder endian;
WaveSection(Integer numBytes, ByteOrder endian){
this.numBytes = numBytes;
this.endian = endian;
}
}
public static class WavData extends LinkedHashMap<WaveSection, byte[]>{
static int HEADER_SIZE = 44; // There are 44 bytes before the data section
static int DEFAULT_SUBCHUNK1_SIZE = 16;
static short DEFAULT_AUDIO_FORMAT = 1;
static short DEFAULT_BLOCK_ALIGN = 4;
static String DEFAULT_CHUNK_ID = "RIFF";
static String DEFAULT_FORMAT = "WAVE";
static String DEFAULT_SUBCHUNK1_ID = "fmt ";
static String DEFAULT_SUBCHUNK2_ID = "data";
public WavData(){
this.put(WaveSection.CHUNK_ID, DEFAULT_CHUNK_ID);
this.put(WaveSection.FORMAT, DEFAULT_FORMAT);
this.put(WaveSection.SUBCHUNK1_ID, DEFAULT_SUBCHUNK1_ID);
this.put(WaveSection.SUBCHUNK1_SIZE, DEFAULT_SUBCHUNK1_SIZE);
this.put(WaveSection.AUDIO_FORMAT, DEFAULT_AUDIO_FORMAT);
this.put(WaveSection.BLOCK_ALIGN, DEFAULT_BLOCK_ALIGN);
this.put(WaveSection.SUBCHUNK2_ID, DEFAULT_SUBCHUNK2_ID);
this.put(WaveSection.CHUNK_SIZE, 0);
this.put(WaveSection.SUBCHUNK2_SIZE, 0);
this.put(WaveSection.BYTE_RATE, 0);
}
public void put(WaveSection waveSection, String value){
byte[] bytes = value.getBytes();
this.put(waveSection, bytes);
}
public void put(WaveSection waveSection, int value) {
byte[] bytes = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN).putInt(value).array();
this.put(waveSection, bytes);
}
public void put(WaveSection waveSection, short value) {
byte[] bytes = ByteBuffer.allocate(2).order(ByteOrder.LITTLE_ENDIAN).putShort(value).array();
this.put(waveSection, bytes);
}
public byte[] getBytes(WaveSection waveSection) {
return this.get(waveSection);
}
public String getString(WaveSection waveSection) {
byte[] bytes = this.get(waveSection);
return new String(bytes);
}
public int getInt(WaveSection waveSection) {
byte[] bytes = this.get(waveSection);
return ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).getInt();
}
public short getShort(WaveSection waveSection) {
byte[] bytes = this.get(waveSection);
return ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).getShort();
}
public void printByteInfo() {
for (WaveSection waveSection : WaveSection.values()) {
if (waveSection.numBytes == 4
&& waveSection.endian == ByteOrder.BIG_ENDIAN) {
System.out.println("SECTION:" + waveSection + ":STRING:"
+ this.getString(waveSection));
} else if (waveSection.numBytes == 4
&& waveSection.endian == ByteOrder.LITTLE_ENDIAN) {
System.out.println("SECTION:" + waveSection + ":INTEGER:"
+ this.getInt(waveSection));
} else if (waveSection.numBytes == 2
&& waveSection.endian == ByteOrder.LITTLE_ENDIAN) {
System.out.println("SECTION:" + waveSection + ":SHORT:"
+ this.getShort(waveSection));
} else {
// Data Section
}
}
}
public void read(String inputPath) throws Exception {
// Analyze redundant info
int dataSize = (int) new File(inputPath).length() - HEADER_SIZE;
WaveSection.DATA.numBytes = dataSize; // Can't have two threads using this at the same time
// Read from File
DataInputStream inFile = new DataInputStream(new FileInputStream(inputPath));
for (WaveSection waveSection : WaveSection.values()) {
byte[] readBytes = new byte[waveSection.numBytes];
for (int i = 0; i < waveSection.numBytes; i++) {
readBytes[i] = inFile.readByte();
}
this.put(waveSection, readBytes);
}
inFile.close();
}
public void write(String outputPath) throws Exception {
// Analyze redundant info
int dataSize = this.get(WaveSection.DATA).length;
this.put(WaveSection.CHUNK_SIZE, dataSize+36);
this.put(WaveSection.SUBCHUNK2_SIZE, dataSize);
int byteRate = this.getInt(WaveSection.SAMPLE_RATE)*this.getShort(WaveSection.BLOCK_ALIGN);
this.put(WaveSection.BYTE_RATE, byteRate);
// Write to File
DataOutputStream dataOutputStream = new DataOutputStream(new FileOutputStream(outputPath));
for (WaveSection waveSection : WaveSection.values()) {
dataOutputStream.write(this.getBytes(waveSection));
}
dataOutputStream.close();
}
public AudioFormat createAudioFormat() {
boolean audioSignedSamples = true; // Samples are signed
boolean audioBigEndian = false;
float sampleRate = (float) this.getInt(WaveSection.SAMPLE_RATE);
int bitsPerSample = (int) this.getShort(WaveSection.BITS_PER_SAMPLE);
int numChannels = (int) this.getShort(WaveSection.NUM_CHANNELS);
return new AudioFormat(sampleRate, bitsPerSample,
numChannels, audioSignedSamples, audioBigEndian);
}
}
public static class WavAudioPlayer {
WavData waveData = new WavData();
public WavAudioPlayer(WavData waveData){
this.waveData = waveData;
}
public void playAudio() throws Exception {
byte[] data = waveData.getBytes(WaveSection.DATA);
// Create an audio input stream from byte array
AudioFormat audioFormat = waveData.createAudioFormat();
InputStream byteArrayInputStream = new ByteArrayInputStream(data);
AudioInputStream audioInputStream = new AudioInputStream(byteArrayInputStream,
audioFormat, data.length / audioFormat.getFrameSize());
// Write audio input stream to speaker source data line
DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class,
audioFormat);
SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);
sourceDataLine.open(audioFormat);
sourceDataLine.start();
// Loop through input stream to write to source data line
byte[] tempBuffer = new byte[10000];
int cnt;
while ((cnt = audioInputStream.read(tempBuffer, 0, tempBuffer.length)) != -1) {
sourceDataLine.write(tempBuffer, 0, cnt);
}
// Cleanup
sourceDataLine.drain();
sourceDataLine.close();
byteArrayInputStream.close();
}
}
public static class WavAudioRecorder implements Runnable {
WavData waveData = new WavData();
boolean recording = true;
Thread runningThread;
ByteArrayOutputStream byteArrayOutputStream;
public WavAudioRecorder(WavData waveData){
this.waveData = waveData;
}
public void startRecording(){
this.recording = true;
this.runningThread = new Thread(this);
runningThread.start();
}
public WavData stopRecording() throws Exception{
this.recording = false;
runningThread.stop();
waveData.put(WaveSection.DATA, byteArrayOutputStream.toByteArray());
return waveData;
}
public void run() {
try {
// Create an audio output stream for byte array
byteArrayOutputStream = new ByteArrayOutputStream();
// Write audio input stream to speaker source data line
AudioFormat audioFormat = waveData.createAudioFormat();
DataLine.Info info = new DataLine.Info(TargetDataLine.class, audioFormat);
TargetDataLine targetDataLine = (TargetDataLine) AudioSystem.getLine(info);
targetDataLine.open(audioFormat);
targetDataLine.start();
// Loop through target data line to write to output stream
int numBytesRead;
byte[] data = new byte[targetDataLine.getBufferSize() / 5];
while(recording) {
numBytesRead = targetDataLine.read(data, 0, data.length);
byteArrayOutputStream.write(data, 0, numBytesRead);
}
// Cleanup
targetDataLine.stop();
targetDataLine.close();
byteArrayOutputStream.close();
} catch (Exception ex) {
ex.printStackTrace();
}
}
}
}
Convert file to byte array
fileToByteArray("C:\..\my.mp3");
public static byte[] fileToByteArray(String name){
Path path = Paths.get(name);
try {
return Files.readAllBytes(path);
} catch (IOException e) {
e.printStackTrace();
return null;
}
}
