Float conversion for JVST input/output data - java

I am trying to process audio using VST plugins loaded via JVST.
Roughly speaking, what I am doing is the following:
1 open an audio input stream on a WAV file
2 until the file is finished
2.1 read a block of frames and store it as a byte[]
2.2 convert the byte[] to a float[]
2.3 process the float[] with a JVST call to the VST plugin
2.4 convert the float[] back to a byte[]
2.5 push the byte[] into the audio output stream
What happens now is that if I comment out 2.3, the audio is converted from bytes to floats and back and sounds perfect. If I instead perform the VST processing, borderline white noise comes out. I don't really know how to proceed. My intuition is that something might be wrong with the byte[] to float[] conversion, but I don't know what. I tried changing the endianness of the bytes, but that didn't help.
Does anybody have suggestions?
Here is the actual code file:
public class ByteConv
{
    public static void main(String[] args) throws Exception {
        AEffect effect = VST.load("G:/AnalogDelay");
        // Start up the plugin
        // Ask the plugin to display its GUI using the SWT window handle
        MiniHost miniHost = new MiniHost(effect);
        // miniHost.setBlockOnOpen(true);
        // miniHost.open();
        effect.open();
        effect.setSampleRate(44100.0f);
        effect.setBlockSize(512);
        File file = new File("C:\\Users\\Laimon\\Desktop\\wma-01.wav");
        try {
            AudioFormat format = AudioSystem.getAudioFileFormat(file).getFormat();
            System.out.println(format.toString());
            AudioInputStream inputStream = AudioSystem.getAudioInputStream(file);
            SourceDataLine sourceLine = AudioSystem.getSourceDataLine(format);
            sourceLine.open();
            sourceLine.start();
            int bytesPerFrame = inputStream.getFormat().getFrameSize();
            if (bytesPerFrame == AudioSystem.NOT_SPECIFIED) {
                // Some audio formats may have an unspecified frame size;
                // in that case we may read any number of bytes
                bytesPerFrame = 1;
            }
            // Set an arbitrary buffer size of 512 frames.
            int numBytes = 512 * bytesPerFrame;
            byte[] audioBytes = new byte[numBytes];
            int numBytesRead = 0;
            // Try to read numBytes bytes from the file.
            while ((numBytesRead = inputStream.read(audioBytes)) != -1) {
                // Convert byte[] into float[] for processing
                float[] monoInput = byteArrayToFloatArray(audioBytes, numBytesRead);
                // Prepare input array with the same wave on all channels
                float[][] vstInput = new float[effect.numInputs][];
                for (int i = 0; i < vstInput.length; i++)
                    vstInput[i] = monoInput;
                // Allocate output array of the same size
                float[][] vstOutput = new float[effect.numOutputs][monoInput.length];
                effect.processReplacing(vstInput, vstOutput, vstInput[0].length);
                audioBytes = floatArrayToByteArray(vstOutput[0]);
                sourceLine.write(audioBytes, 0, numBytesRead);
            }
        } catch (IOException | LineUnavailableException | UnsupportedAudioFileException ex) {
            ex.printStackTrace();
        }
        VST.dispose(effect);
    }
    private static float[] byteArrayToFloatArray(byte[] barray, int n) {
        // We assume 0 <= n <= barray.length and that n is a multiple of 4.
        // Wrap only the first n bytes (the old self-arraycopy was a no-op).
        ByteBuffer bb = ByteBuffer.wrap(barray, 0, n);
        FloatBuffer fb = bb.asFloatBuffer();
        float[] flush = new float[fb.remaining()];
        fb.get(flush);
        return flush;
    }
    private static byte[] floatArrayToByteArray(float[] farray) {
        ByteBuffer bb = ByteBuffer.allocate(farray.length * 4);
        for (float f : farray)
            bb.putFloat(f);
        return bb.array();
    }
}
Thanks in advance for any help!
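For what it's worth, one common cause of white noise in exactly this setup is a format mismatch: WAV data is usually 16-bit signed PCM, while a VST's processReplacing expects normalized floats in [-1, 1]. Wrapping the raw bytes in a FloatBuffer reinterprets the PCM bit patterns as IEEE floats instead of converting the sample values (the round trip without 2.3 still sounds fine because it is an identity mapping). A minimal sketch of an explicit conversion, assuming 16-bit little-endian mono PCM (the class and method names here are illustrative, not part of JVST):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Pcm16 {
    // Convert 16-bit little-endian signed PCM bytes to floats in [-1, 1].
    public static float[] toFloats(byte[] pcm, int numBytes) {
        ByteBuffer bb = ByteBuffer.wrap(pcm, 0, numBytes).order(ByteOrder.LITTLE_ENDIAN);
        float[] out = new float[numBytes / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = bb.getShort() / 32768f; // scale sample value to [-1, 1)
        }
        return out;
    }

    // Convert floats in [-1, 1] back to 16-bit little-endian PCM bytes.
    public static byte[] toPcm(float[] samples) {
        ByteBuffer bb = ByteBuffer.allocate(samples.length * 2).order(ByteOrder.LITTLE_ENDIAN);
        for (float s : samples) {
            // Clamp first: a plugin may overshoot slightly and wrap around otherwise.
            float clamped = Math.max(-1f, Math.min(1f, s));
            bb.putShort((short) (clamped * 32767f));
        }
        return bb.array();
    }
}
```

If this is the issue, the byte count per block also changes: 2 bytes per sample rather than 4, so the write length after conversion must match.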

Related

How to read and write Bytes without skipping the zeros in Java

Does anyone know how to read or write bytes without skipping the zeros?
I am trying to write a program that exports an array of ints as unsigned shorts.
I have written code to read and write wave files, but they aren't formatted right.
Read Example
// dwChunkSize
byteConvertedLong = extractBytes(4);
dwFormatChunkSize = convertBytesToLong(byteConvertedLong);
System.out.println("Format Chunk size: " + dwFormatChunkSize);
// wFormatTag
byteConvertedInt = extractBytes(2);
System.out.println("Format Tag: " + convertBytesToInt(byteConvertedInt));
functions for reading data:
// convert bytes to long
public long convertBytesToLong(byte[] values) {
    byte[] spliceToArray = {0, 0, 0, 0,
            values[0], values[1], values[2], values[3]};
    ByteBuffer debuffer = ByteBuffer.wrap(spliceToArray);
    return debuffer.getLong();
}
// convert bytes to int
public int convertBytesToInt(byte[] values) {
    byte[] spliceToArray = {0, 0, values[0], values[1]};
    ByteBuffer debuffer = ByteBuffer.wrap(spliceToArray);
    return debuffer.getInt();
}
// extract bytes from the DataInputStream
public byte[] extractBytes(int bytesToExtract)
        throws IOException {
    // define byte array
    byte[] extractedBytes = new byte[bytesToExtract];
    // extract bytes (the offset argument is an offset into the array,
    // not into the stream, so it should be 0 here)
    dis.read(extractedBytes, 0, bytesToExtract);
    return extractedBytes;
}
Write example
// dwChunkSize
byteConvertedLong = convertLongToBytes(dwFormatChunkSize);
appendBytes(byteConvertedLong, 4, 8);
// wFormatTag
byteConvertedInt = convertIntToByte(W_FORMAT_TAG);
appendBytes(byteConvertedInt, 2, 4);
Functions for writing:
// convert long to byte
public byte[] convertLongToBytes(long value) {
ByteBuffer buffer = ByteBuffer.allocate(8);
buffer.putLong(value);
return buffer.array();
}
// convert int to byte
public byte[] convertIntToByte(int value) {
ByteBuffer buffer = ByteBuffer.allocate(4);
buffer.putInt(value);
return buffer.array();
}
// append bytes to DataOutputStream
public void appendBytes(byte[] bytesToAppend, int start, int end)
throws IOException {
for (int i = start; i < end; i++) {
dos.writeByte(bytesToAppend[i]);
}
}
I have to use long and int variables to read and write ints and shorts respectively, so that they are treated as unsigned numbers.
I have been following the instructions on this site https://blogs.msdn.microsoft.com/dawate/2009/06/23/intro-to-audio-programming-part-2-demystifying-the-wav-format/ to make sure all the data is formatted right.
The main problem with both reading and writing is that if I read 1 as a short (0000000000000001), it skips the leading zeros and reads from the 1 onward (1000000000000000).
If that isn't the problem, I don't know what is.
It turned out that WAV files are written in little-endian and I was writing in big-endian. I needed to implement a function that reversed the bytes of the byte array.
I came up with this:
// bigToLittleEndian method
public byte[] bigToLittleEndian(byte[] oldArray) {
    // new array
    byte[] newArray = new byte[oldArray.length];
    // reverse the order of bytes
    for (int i = 0, j = oldArray.length - 1; i < oldArray.length; i++, j--) {
        newArray[i] = oldArray[j];
    }
    // return the new bytes
    return newArray;
}
I had some other problems that were small but I fixed them all.
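As an aside, java.nio can handle byte order directly, which avoids the manual reversal: set the buffer's order to little-endian before reading or writing multi-byte fields. A sketch for the 4-byte chunk-size case (the helper class name is mine, not from the original code):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class LittleEndianField {
    // Read a 4-byte little-endian unsigned field (e.g. dwChunkSize) into a long.
    public static long readUint32(byte[] bytes) {
        return ByteBuffer.wrap(bytes)
                .order(ByteOrder.LITTLE_ENDIAN)
                .getInt() & 0xFFFFFFFFL; // mask keeps the value unsigned
    }

    // Write a value back out as 4 little-endian bytes.
    public static byte[] writeUint32(long value) {
        return ByteBuffer.allocate(4)
                .order(ByteOrder.LITTLE_ENDIAN)
                .putInt((int) value)
                .array();
    }
}
```

This replaces both the zero-splicing and the byte-reversal steps in one go.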

How do I use the byte array given by targetdataline to gain information about the audio?

I am feeding audio from an electric cello into the mic port on my computer, and I would like my program to detect when no audio is being played and, when audio is being played, which note/frequency it is.
I am able to get the cello to play through the TargetDataLine and out of the SourceDataLine in Java. I also implemented live frequency detection using an FFT, following Java: How to get current frequency of audio input?, but it doesn't work that well with the cello. It does perform somewhat well when I whistle, however.
I would appreciate any guidance on how to use the information from the TargetDataLine and the cello output to see what is being played. Alternative approaches, such as using different applications, are welcome.
AudioFormat format = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED, 44100, 16, 2, 4, 44100, false);
try {
    // making SourceDataLine for writing
    DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
    final SourceDataLine sourceLine = (SourceDataLine) AudioSystem.getLine(info);
    sourceLine.open(format/*, 5800*/);
    // 5800 is the smallest buffer size I got to work so far
    // making TargetDataLine for getting in
    info = new DataLine.Info(TargetDataLine.class, format);
    final TargetDataLine targetLine = (TargetDataLine) AudioSystem.getLine(info);
    targetLine.open(format);
    final byte[] buf = new byte[2048]; // <--- increase this for higher frequency resolution
    final int numberOfSamples = buf.length / format.getFrameSize();
    final JavaFFT fft = new JavaFFT(numberOfSamples);
    Thread liveThread = new Thread() {
        @Override
        public void run() {
            int readBytes;
            try {
                while (true) {
                    readBytes = targetLine.read(buf, 0, buf.length);
                    sourceLine.write(buf, 0, readBytes);
                    final float[] samples = decode(buf, format);
                    final float[][] transformed = fft.transform(samples);
                    final float[] realPart = transformed[0];
                    final float[] imaginaryPart = transformed[1];
                    final double[] magnitudes = toMagnitudes(realPart, imaginaryPart);
                    System.out.println("length" + magnitudes.length);
                    System.out.println(ecello.findMaxMagnitude(magnitudes));
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    };
    targetLine.start();
    sourceLine.start();
    liveThread.start();
    System.out.println("Started recording...");
    Thread.sleep(3000000);
    targetLine.stop();
    targetLine.close();
    System.out.println("Ended recording");
    System.exit(0);
} catch (Exception e) {
    e.printStackTrace();
}
}
private int findMaxMagnitude(double[] input) {
    // Returns the index of the maximum magnitude in the array
    double max = input[0];
    double temp;
    int index = 0;
    for (int i = 1; i < input.length; i++) {
        temp = input[i];
        if (temp > max) {
            max = temp;
            index = i;
        }
    }
    return index;
}
Using this FFT on the cello input has not given good results. I think I can detect when no input is being played by checking the magnitude of the biggest frequency and seeing if it passes a threshold, but that is future work.
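For the silence-detection idea at the end, a simpler gate than inspecting FFT magnitudes is to compute the RMS level of each buffer of decoded samples and compare it against a threshold. A rough sketch (the class name and the threshold value are mine; the threshold would need tuning by ear against your mic's noise floor):

```java
public class SilenceGate {
    // Root-mean-square level of a buffer of normalized samples.
    public static double rms(float[] samples) {
        double sum = 0;
        for (float s : samples) {
            sum += (double) s * s;
        }
        return Math.sqrt(sum / samples.length);
    }

    // Treat the buffer as silence if its RMS falls below the threshold.
    public static boolean isSilent(float[] samples, double threshold) {
        return rms(samples) < threshold;
    }
}
```

Running only the pitch detection on buffers that pass the gate would also cut down on spurious frequency readings from background noise.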

Mixing wave files

I cannot mix two WAV audio files. Here is my work:
byte[] bufData1 = null;
byte[] bufData2 = null;
ArrayList<Byte> bufData3 = new ArrayList<Byte>();
Creating two arrays with raw audio data
public void bootloadInputData(String p1, String p2) throws IOException {
bufData1 = bootloadReadFileByte(p1);
bufData2 = bootloadReadFileByte(p2);
System.arraycopy(bufData1, 44, bufData1, 0, (bufData1.length - 44));
System.arraycopy(bufData2, 44, bufData2, 0, (bufData2.length - 44));
}
public byte[] bootloadReadFileByte(String path) throws IOException{
ByteArrayOutputStream out = null;
InputStream input = null;
try{
out = new ByteArrayOutputStream();
input = new BufferedInputStream(new FileInputStream(path));
int data = 0;
while((data = input.read()) != -1){
out.write(data);
}
}
finally{
if(null != input){
input.close();
}
if(null != out){
out.close();
}
}
return out.toByteArray();
}
Mixing the bytes of raw audio data
public void bootloadOutputData() throws IOException {
for(int i = 0; i < ((bufData1.length + bufData2.length) / 4); i += 4) {
if(i < bufData1.length){
bufData3.add(bufData1[i]);
bufData3.add(bufData1[i+1]);
bufData3.add(bufData1[i+2]);
bufData3.add(bufData1[i+3]);
}
if(i < bufData2.length){
bufData3.add(bufData2[i]);
bufData3.add(bufData2[i+1]);
bufData3.add(bufData2[i+2]);
bufData3.add(bufData2[i+3]);
}
}
}
Create a new file, fill in the header and raw audio data.
private void bootloadCreateWaveMix(String p1, String p2, String p3) throws IOException {
int size1 = 0;
int size2 = 0;
FileInputStream fis1 = null;
FileInputStream fis2 = null;
try {
fis1 = new FileInputStream(p1);
fis2 = new FileInputStream(p2);
size1 = fis1.available();
size2 = fis2.available();
} finally {
if(fis1 != null){
fis1.close();
}
if(fis2 != null){
fis2.close();
}
}
int mNumBytes = (size1 + size2);
DataOutputStream out = null;
try {
out = new DataOutputStream(new FileOutputStream(p3));
writeId(out, "RIFF");
writeInt(out, 36 + mNumBytes);
writeId(out, "WAVE");
writeId(out, "fmt ");
writeInt(out, 16);
writeShort(out, (short) 1);
writeShort(out, (short) 4);
writeInt(out, (int) 44100);
writeInt(out, 2 * 44100 * 16 / 8);
writeShort(out, (short)(2 * 16 / 8));
writeShort(out, (short) 16);
writeId(out, "data");
writeInt(out, mNumBytes);
out.write(toByteArray(bufData3));
} finally {
if(out != null){
out.close();
}
}
}
private static void writeId(OutputStream out, String id) throws IOException {
for (int i = 0; i < id.length(); i++) out.write(id.charAt(i));
}
private static void writeInt(OutputStream out, int val) throws IOException {
out.write(val >> 0);
out.write(val >> 8);
out.write(val >> 16);
out.write(val >> 24);
}
private static void writeShort(OutputStream out, short val) throws IOException {
out.write(val >> 0);
out.write(val >> 8);
}
public static byte[] toByteArray(ArrayList<Byte> in) {
byte[] data = new byte[in.size()];
for (int i = 0; i < data.length; i++) {
data[i] = (byte) in.get(i);
}
return data;
}
Question:
This code creates a file that the computer cannot play, but the device can.
Playback is bad: there is some kind of interference at the end of the merged
files. Also, playback ends when the first file ends, even if the second file
is longer than the first one. There is another problem with the channels: the
inputs are two stereo files, yet in the header I write 4 channels instead of 2.
The files will always be 44100 Hz / 16-bit / stereo.
If I understand correctly, you want to do the following:
Given 2 input WAV files, mix them together to a single WAV file.
The contents of the output will be the input files played at the same time, not one after the other.
The length of the new file will be the length of the longest of the input files.
All files, input and output, are 16 bit, stereo 44100Hz.
If that's the case, here are (some of) your mistakes:
You need to parse the incoming files so that you don't read their headers as audio data (Do not skip this step just because you already know the format of the audio. You need to read the headers to confirm the data format and accurately determine the number of samples in your input. Also, note that 2/16/44100 WAV files can have different size headers because they can contain various chunks, so you can't just skip over X bytes and then read the file -- you must parse the header!).
If the WAV files are all 16-bit, you need to convert the incoming data from bytes to shorts (note, this is not a simple typecasting -- you must pack 2 bytes into each short. I believe you can use a DataInputStream for this, but be sure to take endianness into account -- WAV files are little-endian and Java is big-endian). Once you've got the shorts representing your samples, average the shorts from the separate files to do the mixing. Your averaged values must then be converted back to bytes (DataOutputStream) to save the resulting file. When you've run out of data from one file, substitute zero.
Your calculation of numBytes is incorrect -- it is not the sum of raw bytes in both files, but a somewhat more complex calculation. In your case, you want it to be equal to something like this:
n1 = number of samples in file 1
n2 = number of samples in file 2
n = MAX(n1, n2)
numBytes = n * (number of channels) * (number of bytes per channel) = n * 2 * 2
I strongly urge you to consider using a library like JMF to tackle 1 & 2.
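The averaging step in point 2 can be sketched like this, assuming both inputs have already been decoded from little-endian bytes into short samples (the class name is illustrative):

```java
public class Mixer {
    // Mix two sample arrays by averaging. Past the end of the shorter
    // input we substitute zero (silence), so the result has the length
    // of the longer input.
    public static short[] mix(short[] a, short[] b) {
        int n = Math.max(a.length, b.length);
        short[] out = new short[n];
        for (int i = 0; i < n; i++) {
            int sa = i < a.length ? a[i] : 0;
            int sb = i < b.length ? b[i] : 0;
            out[i] = (short) ((sa + sb) / 2); // average of two shorts cannot overflow
        }
        return out;
    }
}
```

Averaging halves the level of each source but can never clip; summing with clamping is the usual alternative if the mix sounds too quiet.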

how to do framing for audio signal in java

I want to split my audio file (.wav format) into frames of 32 milliseconds each. Sampling frequency: 16 kHz, number of channels: 1 (mono), PCM signal, sample size = 93638.
After getting the data in byte format, I convert the byte array holding the WAV data to a double array, since I need to pass it to a method which accepts a double array. I am using the following code; can someone tell me how to proceed?
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.nio.ByteBuffer;
public class AudioFiles
{
public static void main(String[] args)
{
String file = "D:/p.wav";
AudioFiles afiles = new AudioFiles();
byte[] data1 = afiles.readAudioFileData(file);
byte[] data2 = afiles.readWAVAudioFileData(file);
System.out.format("data len1: %d\n", data1.length);
System.out.format("data len2: %d\n", data2.length);
/* for(int i=0;i<data2.length;i++)
{
System.out.format("\t"+data2[i]);
}*/
System.out.println();
/* for(int j=0;j<data1.length;j++)
{
System.out.format("\t"+data1[j]);
}*/
System.out.format("diff len: %d\n", data2.length - data1.length);
double[] d = new double[data1.length];
d = toDoubleArray(data1);
for (int j = 0; j < data1.length; j++)
{
System.out.format("\t" + d[j]);
}
daub a = new daub();
a.daubTrans(d);
}
public static double[] toDoubleArray(byte[] byteArray)
{
int times = Double.SIZE / Byte.SIZE;
double[] doubles = new double[byteArray.length / times];
for (int i = 0; i < doubles.length; i++)
{
doubles[i] = ByteBuffer.wrap(byteArray, i * times, times).getDouble();
}
return doubles;
}
public byte[] readAudioFileData(final String filePath)
{
byte[] data = null;
try
{
final ByteArrayOutputStream baout = new ByteArrayOutputStream();
final File file = new File(filePath);
final AudioInputStream audioInputStream = AudioSystem
.getAudioInputStream(file);
byte[] buffer = new byte[4096];
int c;
while ((c = audioInputStream.read(buffer, 0, buffer.length)) != -1)
{
baout.write(buffer, 0, c);
}
audioInputStream.close();
baout.close();
data = baout.toByteArray();
}
catch (Exception e)
{
e.printStackTrace();
}
return data;
}
public byte[] readWAVAudioFileData(final String filePath)
{
byte[] data = null;
try
{
final ByteArrayOutputStream baout = new ByteArrayOutputStream();
final AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(new File(filePath));
AudioSystem.write(audioInputStream, AudioFileFormat.Type.WAVE, baout);
audioInputStream.close();
baout.close();
data = baout.toByteArray();
}
catch (Exception e)
{
e.printStackTrace();
}
return data;
}
}
I want to pass the double array d, in frames of 32 milliseconds each, to the method performing the wavelet transform, since it accepts a double array.
In my previous question I was given this reply:
At 16 kHz sample rate you'll have 16 samples per millisecond. Therefore, each 32 ms frame would be 32*16 = 512 mono samples. Multiply by the number of bytes per sample (typically 2 or 4) and that will be the number of bytes per frame.
I want to know whether my frame size changes when I convert my array from byte format to double format, or whether it remains the same.
My Previous Question.
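Based on the quoted reply, a sketch of the framing step might look like the following. Note one assumption that differs from toDoubleArray above: this decodes each 16-bit little-endian PCM sample into one double, rather than reinterpreting 8 raw bytes as one IEEE double, which is what a 16 kHz mono PCM WAV would normally require. The frame size in samples (512) is the same whether the samples are stored as bytes, shorts, or doubles; only the bytes-per-sample changes.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Framer {
    static final int FRAME_SIZE = 512; // 32 ms * 16 samples/ms at 16 kHz

    // Decode 16-bit little-endian mono PCM bytes to doubles in [-1, 1].
    public static double[] toDoubles(byte[] pcm) {
        ByteBuffer bb = ByteBuffer.wrap(pcm).order(ByteOrder.LITTLE_ENDIAN);
        double[] out = new double[pcm.length / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = bb.getShort() / 32768.0;
        }
        return out;
    }

    // Split samples into consecutive 512-sample frames, zero-padding the last.
    public static double[][] frames(double[] samples) {
        int n = (samples.length + FRAME_SIZE - 1) / FRAME_SIZE;
        double[][] out = new double[n][FRAME_SIZE];
        for (int i = 0; i < samples.length; i++) {
            out[i / FRAME_SIZE][i % FRAME_SIZE] = samples[i];
        }
        return out;
    }
}
```

Each row of the result could then be passed to the wavelet transform in turn.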

(ask) XOR calculation issue on file in java

I would like to execute an XOR operation in my code. However, I get strange behavior in the output: sometimes the result is right, but sometimes it's not.
Here's the situation:
I have a file which I already split into two parts, and then I created one parity file using an XOR operation on both source files. So now I have three files. Then I deleted one of the source files. I would like to retrieve the missing file with an XOR operation between the parity file and the remaining source file. I use a hash function to check whether the output is correct or not. If the function is called only once, everything is fine, but when I run many operations retrieving missing files, my function sometimes generates the wrong result.
When it generates a wrong result, it is always the same file. BUT if I put a Thread.sleep of 1 second between operations, it always generates the correct result, even with more than 1000 operations.
Could somebody help me spot which part of my code is broken?
private byte[] parityByte(byte[] firstByte, byte[] secondByte) {
    int size1 = firstByte.length;
    int size2 = secondByte.length;
    byte[] parity = new byte[size1];
    for (int i = 0; i < size2; i++) {
        parity[i] = (byte) (firstByte[i] ^ secondByte[i]);
    }
    for (int i = size2; i < size1; i++) {
        parity[i] = firstByte[i];
    }
    return parity;
}
/**
 * get original chunks
 */
public Chunk getOriginal(Chunk parity, Chunk compare, String orig) throws FileNotFoundException, IOException {
    File par = new File(parity.getHash());
    // renamed from "parity", which collided with the Chunk parameter
    InputStream parityIn = new BufferedInputStream(new FileInputStream(parity.getHash()));
    InputStream source = new BufferedInputStream(new FileInputStream(compare.getHash()));
    int size = (int) par.length();
    int bufferSize = size;
    byte[] firstBuffer = new byte[size];
    byte[] secondBuffer = new byte[size];
    long remainSize;
    byte[] destByte = new byte[1];
    parityIn.read(destByte, 0, 1);
    Integer dest = new Integer(destByte[0]);
    remainSize = size - 1 - dest;
    OutputStream originalChunk = new FileOutputStream(orig);
    while (remainSize > 0) {
        if (remainSize > bufferSize) {
            remainSize -= bufferSize;
        } else {
            bufferSize = (int) remainSize;
            firstBuffer = new byte[bufferSize];
            secondBuffer = new byte[bufferSize];
            remainSize = 0;
        }
        parityIn.read(firstBuffer, 0, bufferSize);
        source.read(secondBuffer, 0, bufferSize);
        originalChunk.write(parityByte(firstBuffer, secondBuffer));
    }
    originalChunk.flush();
    parityIn.close();
    source.close();
    originalChunk.close();
    return Chunk.newChunk(orig);
}
Thank you, and sorry for my bad English.
You are assuming that all the reads fill the buffer. Check the Javadoc. The read(byte[] ...) method returns a value, and it is for a reason.
Have a look at DataInputStream.readFully() for a simple solution.
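To make that answer concrete: a partial read can be fixed either by looping on read() until the buffer is full, or by wrapping the stream in a DataInputStream and calling readFully, which blocks until the whole buffer is filled or throws EOFException. A small sketch (the helper class name is mine):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class FullReader {
    // Fill the buffer completely or throw EOFException -- unlike a bare
    // InputStream.read(), which may legally return fewer bytes than asked for.
    public static byte[] readExactly(InputStream in, int n) throws IOException {
        byte[] buf = new byte[n];
        new DataInputStream(in).readFully(buf);
        return buf;
    }
}
```

The intermittent failures (and the Thread.sleep "fix") are consistent with short reads: when data hasn't fully arrived, read() returns early and the rest of the buffer stays stale, so the XOR runs over garbage.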
