Play audio in multiple outputs - java

I need to play music on different audio outputs. For instance, I have two tracks, music1 and music2, and they have to play in separate threads on different speakers, assuming that I have more than one audio device able to play sound.
I found this method (it comes from the BasicPlayer library):
protected void createLine() throws LineUnavailableException
{
    log.info("Create Line");
    if (m_line == null)
    {
        AudioFormat sourceFormat = m_audioInputStream.getFormat();
        log.info("Create Line : Source format : " + sourceFormat.toString());
        int nSampleSizeInBits = sourceFormat.getSampleSizeInBits();
        if (nSampleSizeInBits <= 0) nSampleSizeInBits = 16;
        if ((sourceFormat.getEncoding() == AudioFormat.Encoding.ULAW)
                || (sourceFormat.getEncoding() == AudioFormat.Encoding.ALAW)) nSampleSizeInBits = 16;
        if (nSampleSizeInBits != 8) nSampleSizeInBits = 16;
        AudioFormat targetFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
                sourceFormat.getSampleRate(), nSampleSizeInBits, sourceFormat.getChannels(),
                sourceFormat.getChannels() * (nSampleSizeInBits / 8),
                sourceFormat.getSampleRate(), false);
        log.info("Create Line : Target format: " + targetFormat);
        // Keep a reference on encoded stream to progress notification.
        m_encodedaudioInputStream = m_audioInputStream;
        try
        {
            // Get total length in bytes of the encoded stream.
            encodedLength = m_encodedaudioInputStream.available();
        }
        catch (IOException e)
        {
            log.error("Cannot get m_encodedaudioInputStream.available()", e);
        }
        // Create decoded stream.
        m_audioInputStream = AudioSystem.getAudioInputStream(targetFormat, m_audioInputStream);
        AudioFormat audioFormat = m_audioInputStream.getFormat();
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat, AudioSystem.NOT_SPECIFIED);
        Mixer mixer = getMixer(m_mixerName);
        if (mixer != null)
        {
            log.info("Mixer : " + mixer.getMixerInfo().toString());
            m_line = (SourceDataLine) mixer.getLine(info);
        }
        else
        {
            m_line = (SourceDataLine) AudioSystem.getLine(info);
            m_mixerName = null;
        }
        log.info("Line : " + m_line.toString());
        log.debug("Line Info : " + m_line.getLineInfo().toString());
        log.debug("Line AudioFormat: " + m_line.getFormat().toString());
    }
}
With a little debugging, I've found out that the mixer is always null. Why is that?
Shouldn't the mixer be the device that outputs the sound through a target line?
This program always plays back on the default device set on my computer; what can I do to change that?

I've actually just started working with the Java Sound API for one of my own projects, but from what I understand, Mixer is just an interface, not a concrete class; you only ever obtain one through AudioSystem.getMixer(Mixer.Info). If getMixer(m_mixerName) finds no installed mixer whose name matches, it returns null, and the code above then falls through to AudioSystem.getLine(info), which is why playback always ends up on the default device.
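In case it helps, here is a minimal sketch (standard javax.sound.sampled only; "Speakers" is a placeholder, since device names are system-specific) that lists the installed mixers and opens a playback line on a chosen one:

import javax.sound.sampled.*;

public class MixerPicker {
    public static void main(String[] args) throws LineUnavailableException {
        AudioFormat format = new AudioFormat(44100f, 16, 2, true, false);
        DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
        // List every installed mixer and whether it can supply a playback line.
        for (Mixer.Info mixerInfo : AudioSystem.getMixerInfo()) {
            Mixer mixer = AudioSystem.getMixer(mixerInfo);
            System.out.println(mixerInfo.getName()
                    + " -> playback supported: " + mixer.isLineSupported(info));
        }
        // Open a line on a specific device picked by (partial) name.
        for (Mixer.Info mixerInfo : AudioSystem.getMixerInfo()) {
            if (mixerInfo.getName().contains("Speakers")
                    && AudioSystem.getMixer(mixerInfo).isLineSupported(info)) {
                SourceDataLine line =
                        (SourceDataLine) AudioSystem.getMixer(mixerInfo).getLine(info);
                line.open(format);
                // feed this line from its own thread, then drain() and close()
                line.close();
                break;
            }
        }
    }
}

Whatever name this loop prints for your second output device is what m_mixerName has to match. One line per device, each written from its own thread, gives you music1 and music2 on different speakers.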

Related

Audio Line is always unavailable

I have a problem with my code suddenly giving an error that it wasn't giving about 24 hours ago. No matter what input device I set as the default in Windows, I always get the following error:
javax.sound.sampled.LineUnavailableException: line with format PCM_SIGNED 8000.0 Hz, 8 bit, stereo, 2 bytes/frame not supported.
    at java.desktop/com.sun.media.sound.DirectAudioDevice$DirectDL.implOpen(DirectAudioDevice.java:484)
    at java.desktop/com.sun.media.sound.AbstractDataLine.open(AbstractDataLine.java:115)
    at java.desktop/com.sun.media.sound.AbstractDataLine.open(AbstractDataLine.java:142)
Code for the recorder class:
public class SoundRecorder {
    // record duration, in milliseconds
    static final long RECORD_TIME = 60000; // 1 minute
    // path of the wav file
    File wavFile = new File("Audio//RecordAudio.wav");
    // format of audio file
    AudioFileFormat.Type fileType = AudioFileFormat.Type.WAVE;
    // the line from which audio data is captured
    TargetDataLine line;

    /**
     * Defines an audio format.
     */
    AudioFormat getAudioFormat() {
        float sampleRate = 8000;
        int sampleSizeInBits = 8;
        int channels = 2;
        boolean signed = true;
        boolean bigEndian = true;
        AudioFormat format = new AudioFormat(sampleRate, sampleSizeInBits,
                channels, signed, bigEndian);
        return format;
    }

    /**
     * Captures the sound and records it into a WAV file.
     */
    void start() {
        try {
            AudioFormat format = getAudioFormat();
            DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
            System.out.println(format.properties());
            // checks if the system supports the data line
            if (!AudioSystem.isLineSupported(info)) {
                System.out.println(Arrays.toString(AudioSystem.getAudioFileTypes()));
                //System.out.println(line.getLineInfo());
                System.out.println("Line not supported");
                System.exit(0);
            }
            line = (TargetDataLine) AudioSystem.getLine(info);
            System.out.println(format.properties());
            //System.out.println(line.getLineInfo());
            // Error is here
            line.open(format);
            System.out.println(format.properties());
            System.out.println(line.getLineInfo());
            line.start(); // start capturing
            System.out.println("Start capturing...");
            AudioInputStream ais = new AudioInputStream(line);
            System.out.println("Start recording...");
            // start recording
            AudioSystem.write(ais, fileType, wavFile);
        } catch (LineUnavailableException ex) {
            ex.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }

    /**
     * Closes the target data line to finish capturing and recording.
     */
    void finish() {
        line.stop();
        line.close();
        System.out.println("Finished");
    }
}
Main:
SoundRecorder audio = new SoundRecorder();
long recordTime = 60000L;
Thread stopper = new Thread(() -> {
    try {
        Thread.sleep(recordTime);
    } catch (InterruptedException var4) {
        var4.printStackTrace();
    }
    audio.finish();
});
stopper.start();
System.out.println("stopper started");
audio.start();
System.out.println("audio started");
stopper.join();
I've tried restarting my computer and changing the encoding to signed, unsigned, float, A-law, and U-law. I've tried changing the sample rate, sample size, and endianness as well. I've also tried switching the default device from the audio mixer to the microphone.
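One thing worth checking, since the exception says the format itself is not supported: the hardware may simply not offer 8000 Hz / 8-bit / stereo capture. Here is a sketch, using only the standard javax.sound.sampled API, that prints the capture formats each installed mixer actually advertises, so you can pick one that line.open() will accept:

import javax.sound.sampled.*;

public class ListCaptureFormats {
    public static void main(String[] args) {
        for (Mixer.Info mixerInfo : AudioSystem.getMixerInfo()) {
            Mixer mixer = AudioSystem.getMixer(mixerInfo);
            // Target lines are the capture side (microphones, stereo mix, ...).
            for (Line.Info lineInfo : mixer.getTargetLineInfo()) {
                if (lineInfo instanceof DataLine.Info) {
                    System.out.println("Mixer: " + mixerInfo.getName());
                    for (AudioFormat fmt : ((DataLine.Info) lineInfo).getFormats()) {
                        System.out.println("    " + fmt);
                    }
                }
            }
        }
    }
}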

Google Speech-to-Text for streaming audio with Java

I am trying to use the Google Speech-to-Text API to do some voice-to-voice translation (also using the Translation and Text-to-Speech APIs). I would like a person to speak into the microphone and have that speech transcribed to text. I used the streaming audio tutorial found in the Google documentation as a base for this method. I would also like the audio stream to stop when the person has stopped speaking.
Here is the modified method:
public static String streamingMicRecognize(String language) throws Exception {
    ResponseObserver<StreamingRecognizeResponse> responseObserver = null;
    try (SpeechClient client = SpeechClient.create()) {
        responseObserver =
            new ResponseObserver<StreamingRecognizeResponse>() {
                ArrayList<StreamingRecognizeResponse> responses = new ArrayList<>();

                public void onStart(StreamController controller) {}

                public void onResponse(StreamingRecognizeResponse response) {
                    responses.add(response);
                }

                public void onComplete() {
                    SPEECH_TO_TEXT_ANSWER = "";
                    for (StreamingRecognizeResponse response : responses) {
                        StreamingRecognitionResult result = response.getResultsList().get(0);
                        SpeechRecognitionAlternative alternative = result.getAlternativesList().get(0);
                        System.out.printf("Transcript : %s\n", alternative.getTranscript());
                        SPEECH_TO_TEXT_ANSWER = SPEECH_TO_TEXT_ANSWER + alternative.getTranscript();
                    }
                }

                public void onError(Throwable t) {
                    System.out.println(t);
                }
            };
        ClientStream<StreamingRecognizeRequest> clientStream =
            client.streamingRecognizeCallable().splitCall(responseObserver);
        RecognitionConfig recognitionConfig =
            RecognitionConfig.newBuilder()
                .setEncoding(RecognitionConfig.AudioEncoding.LINEAR16)
                .setLanguageCode(language)
                .setSampleRateHertz(16000)
                .build();
        StreamingRecognitionConfig streamingRecognitionConfig =
            StreamingRecognitionConfig.newBuilder().setConfig(recognitionConfig).build();
        StreamingRecognizeRequest request =
            StreamingRecognizeRequest.newBuilder()
                .setStreamingConfig(streamingRecognitionConfig)
                .build(); // The first request in a streaming call has to be a config
        clientStream.send(request);
        // SampleRate: 16000 Hz, SampleSizeInBits: 16, Number of channels: 1, Signed: true,
        // bigEndian: false
        AudioFormat audioFormat = new AudioFormat(16000, 16, 1, true, false);
        DataLine.Info targetInfo =
            new Info(
                TargetDataLine.class,
                audioFormat); // Set the system information to read from the microphone audio stream
        if (!AudioSystem.isLineSupported(targetInfo)) {
            System.out.println("Microphone not supported");
            System.exit(0);
        }
        // Target data line captures the audio stream the microphone produces.
        TargetDataLine targetDataLine = (TargetDataLine) AudioSystem.getLine(targetInfo);
        targetDataLine.open(audioFormat);
        targetDataLine.start();
        System.out.println("Start speaking");
        playMP3("beep-07.mp3");
        long startTime = System.currentTimeMillis();
        // Audio Input Stream
        AudioInputStream audio = new AudioInputStream(targetDataLine);
        long estimatedTime = 0, estimatedTimeStoppedSpeaking = 0, startStopSpeaking = 0;
        int currentSoundLevel = 0;
        Boolean hasSpoken = false;
        while (true) {
            estimatedTime = System.currentTimeMillis() - startTime;
            byte[] data = new byte[6400];
            audio.read(data);
            currentSoundLevel = calculateRMSLevel(data);
            System.out.println(currentSoundLevel);
            if (currentSoundLevel > 20) {
                estimatedTimeStoppedSpeaking = 0;
                startStopSpeaking = 0;
                hasSpoken = true;
            } else {
                if (startStopSpeaking == 0) {
                    startStopSpeaking = System.currentTimeMillis();
                }
                estimatedTimeStoppedSpeaking = System.currentTimeMillis() - startStopSpeaking;
            }
            if ((estimatedTime > 15000) || (estimatedTimeStoppedSpeaking > 1000 && hasSpoken)) { // 15 seconds or stopped speaking for 1 second
                playMP3("beep-07.mp3");
                System.out.println("Stop speaking.");
                targetDataLine.stop();
                targetDataLine.drain();
                targetDataLine.close();
                break;
            }
            request =
                StreamingRecognizeRequest.newBuilder()
                    .setAudioContent(ByteString.copyFrom(data))
                    .build();
            clientStream.send(request);
        }
    } catch (Exception e) {
        System.out.println(e);
    }
    responseObserver.onComplete();
    String ans = SPEECH_TO_TEXT_ANSWER;
    return ans;
}
The output is supposed to be the transcribed text as a string, but it is very inconsistent. Most of the time it returns an empty string. Sometimes, however, the program does work and does return the transcribed text.
I have also tried recording the audio separately while the program is running. Although the method returned an empty string, when I saved that separately recorded audio to a file and sent it directly through the API, it returned the correct transcription.
I do not understand why/how the program only works some of the time.
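One possibility worth checking (an assumption on my part, since the rest of the streaming setup matches the Google sample): AudioInputStream.read(byte[]) may return fewer bytes than requested, and the loop above ignores its return value, so partially filled buffers, padded with stale or zero bytes, get sent to the recognizer. A sketch of a helper that fills the chunk and reports how much real audio arrived:

// Hypothetical helper: fill 'data' as far as possible, return the byte count.
static int readChunk(AudioInputStream audio, byte[] data) throws IOException {
    int total = 0;
    while (total < data.length) {
        int n = audio.read(data, total, data.length - total);
        if (n < 0) break; // end of stream
        total += n;
    }
    return total;
}

The request could then be built with ByteString.copyFrom(data, 0, n) so that only bytes actually captured are streamed.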

Java audio player cuts audio at the end

Hey there Stack Overflow.
I'm creating a playlist player in Java. So far so good: I have all the logic down and the project is nearing completion. We've been testing playback by creating some large playlists and just letting the thing run from start to end. The playback sounds good, but sometimes the audio is cut off at the end. This happens very rarely; the last x seconds (the time varies) are not played.
The files I'm testing with are all PCM wave files of 16- or 24-bit sample size. I'm using the Java sound engine in combination with JavaZoom's MP3 and Ogg SPIs to support other types of audio files.
So far I have logged this a couple of times, and my first thought was that the file might be corrupt. That is not the case: I've tried playing the file on its own and it played fully!
I've tried to find the problem, but I just can't find it. I don't think there's anything wrong with my audio player; I'm running out of ideas.
Here is how I create my audio input stream:
public static AudioInputStream getUnmarkableAudioInputStream(Mixer mixer, File file)
        throws UnsupportedAudioFileException
{
    if (!file.exists() || !file.canRead()) {
        return null;
    }
    AudioInputStream stream;
    try {
        stream = getAudioInputStream(file);
    } catch (IOException e) {
        logger.error("failed to retrieve stream from file", e);
        return null;
    }
    AudioFormat baseFormat = stream.getFormat();
    DataLine.Info info = new DataLine.Info(SourceDataLine.class, baseFormat);
    boolean supportedDirectly = false;
    if (mixer == null) {
        supportedDirectly = AudioSystem.isLineSupported(info);
    } else {
        supportedDirectly = mixer.isLineSupported(info);
    }
    // compare the AudioFormat with the desired one
    if (baseFormat.getEncoding() != AudioFormat.Encoding.PCM_SIGNED || !supportedDirectly) {
        AudioFormat decodedFormat = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED,
                baseFormat.getSampleRate(), 16, baseFormat.getChannels(),
                baseFormat.getChannels() * 2, baseFormat.getSampleRate(),
                false);
        // convert the audio format to the supported one
        if (AudioSystem.isConversionSupported(decodedFormat, baseFormat)) {
            stream = AudioSystem.getAudioInputStream(decodedFormat, stream);
        } else {
            logger.debug(
                    "Audio format {} is not supported "
                    + "and can not be converted to default format",
                    baseFormat.toString());
            return null;
        }
    }
    return stream;
}
And this is my audio player thread:
final class PlayerThread extends Thread
{
    private byte[] buffer;

    /**
     * Initialize the buffer
     */
    public void initBuffer()
    {
        linelock.lock();
        try {
            buffer = new byte[line.getBufferSize() / 5];
        } finally {
            linelock.unlock();
        }
    }

    public void run()
    {
        initBuffer();
        while (!isInterrupted()) {
            checkState();
            // if the line is just cleared go to the start of the loop
            if (line == null || isInterrupted()) {
                continue;
            }
            write();
        }
        // clean up all resources
        close();
        // change the state
        state = Player.State.STOPPED;
    }

    private void checkState()
    {
        if (state != Player.State.PLAYING) {
            if (line != null) {
                line.flush();
            }
            try {
                synchronized (this) {
                    this.wait();
                }
            } catch (InterruptedException e) {
                // reset the interrupt status
                interrupt();
            }
        }
    }

    private void write()
    {
        // how many bytes could be written on the line
        int available = line.available();
        // is the space on the line big enough to write the buffer to
        if (available >= buffer.length) {
            // fill the buffer array
            int read = 0;
            try {
                read = audioStream.read(buffer, 0, buffer.length);
            } catch (Throwable ball) {
                logger.error("Error in audio engine (read)", ball);
            }
            // if there was something to read, write it to the line
            // otherwise stop the player
            if (read >= 0) {
                try {
                    linelock.lock();
                    line.write(buffer, 0, read);
                } catch (Throwable ball) {
                    logger.error("Error in audio engine (write)", ball);
                } finally {
                    linelock.unlock();
                }
                bytesRead += read;
            } else {
                line.drain();
                MoreDefaultPlayer.this.stop();
            }
        }
    }

    private void close()
    {
        // invoke close on listeners
        invokePlayerClosedOnListeners();
        // destroy the volume chain
        vc.removeVolumeListener(MoreDefaultPlayer.this);
        // close the stream
        try {
            audioStream.close();
        } catch (IOException e) {
            logger.error("failed to close audio stream");
        }
        clearAllListeners();
        linelock.lock();
        try {
            // quit the line
            line.stop();
            line.close();
            line = null;
        } finally {
            linelock.unlock();
        }
    }
}
As you can see, I drain the line afterwards, so I don't think the problem is the line being closed before everything from the stream has been played.
Can anyone see what might be wrong with this code?
I don't see an obvious answer, but there are a couple of things that raise yellow flags for me. The common practice is to call line.write() inside a while loop and let it block, rather than guarding each call yourself. There is usually no need to test line.available() or to handle locking the line: line.write() does the necessary blocking when there is no space available on the line. I've always been cautioned not to lock or block audio lines unnecessarily.
Is the locking logic an integral part of handling the sequence of cues? The error you are describing could be in that handling. (Maybe there is an interaction between the available() test and the buffer size? Is the amount of cut-off audio roughly equal to the buffer size?)
I would consider implementing a LineListener to announce when a cue is finished, and making that event the trigger for playing back the next cue. A LineEvent of type STOP can be issued when the given file is done, notifying whatever handles the queue to proceed to the next file.
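A minimal sketch of that listener idea (playNextCue() is a hypothetical hook for whatever advances your playlist; note that for a SourceDataLine the STOP event fires once the line is stopped, so drain the line at end-of-stream before stopping it):

line.addLineListener(event -> {
    // STOP fires when the line stops, e.g. after drain() + stop() at end of file.
    if (event.getType() == LineEvent.Type.STOP) {
        playNextCue(); // hypothetical hook: advance to the next file in the playlist
    }
});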

Java audio Stream Closed error

I am trying to add sound to a game I am making, but every time I try to load the sound, I get a Stream Closed Exception. I don't understand why this is happening.
Loads the sound:
public class WavPlayer extends Thread {
    /**
     * @param s The path of the wav file.
     * @return The sound data loaded into the WavSound object
     */
    public static WavSound loadSound(String s) {
        // Get an input stream
        InputStream is = WavPlayer.class.getClassLoader().getResourceAsStream(s);
        AudioInputStream audioStream;
        try {
            // Buffer the input stream
            BufferedInputStream bis = new BufferedInputStream(is);
            // Create the audio input stream and audio format
            audioStream = AudioSystem.getAudioInputStream(bis); // !Stream Closed Exception occurs here
            AudioFormat format = audioStream.getFormat();
            // The length of the audio file
            int length = (int) (audioStream.getFrameLength() * format.getFrameSize());
            // The array to store the samples in
            byte[] samples = new byte[length];
            // Read the samples into the array to reduce disk access
            // (fast execution)
            DataInputStream dis = new DataInputStream(audioStream);
            dis.readFully(samples);
            // Create a sound container
            WavSound sound = new WavSound(samples, format, (int) audioStream.getFrameLength());
            // Don't start the sound on load
            sound.setState(SoundState.STATE_STOPPED);
            // Create a new player for each sound
            new WavPlayer(sound);
            return sound;
        } catch (Exception e) {
            // An error. Mustn't happen
        }
        return null;
    }

    // Private variables
    private WavSound sound = null;

    /**
     * Constructs a new player with a sound and with optional looping
     *
     * @param s The WavSound object
     */
    public WavPlayer(WavSound s) {
        sound = s;
        start();
    }

    /**
     * Runs the player in a separate thread
     */
    @Override
    public void run() {
        // Get the byte samples from the container
        byte[] data = sound.getData();
        InputStream is = new ByteArrayInputStream(data);
        try {
            // Create a line for the required audio format
            SourceDataLine line = null;
            AudioFormat format = sound.getAudioFormat();
            // Calculate the buffer size and create the buffer
            int bufferSize = sound.getLength();
            // System.out.println(bufferSize);
            byte[] buffer = new byte[bufferSize];
            // Create a new data line to write the samples onto
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
            line = (SourceDataLine) AudioSystem.getLine(info);
            // Open and start playing on the line
            try {
                if (!line.isOpen()) {
                    line.open();
                }
                line.start();
            } catch (Exception e) {}
            // The total bytes read
            int numBytesRead = 0;
            boolean running = true;
            while (running) {
                // Destroy this player if the sound is destroyed
                if (sound.getState() == SoundState.STATE_DESTROYED) {
                    running = false;
                    // Release the line and release any resources used
                    line.drain();
                    line.close();
                }
                // Write the data only if the sound is playing or looping
                if ((sound.getState() == SoundState.STATE_PLAYING)
                        || (sound.getState() == SoundState.STATE_LOOPING)) {
                    numBytesRead = is.read(buffer, 0, buffer.length);
                    if (numBytesRead != -1) {
                        line.write(buffer, 0, numBytesRead);
                    } else {
                        // The samples have ended, so reset the position of the stream
                        is.reset();
                        // If the sound is not looping, stop it
                        if (sound.getState() == SoundState.STATE_PLAYING) {
                            sound.setState(SoundState.STATE_STOPPED);
                        }
                    }
                } else {
                    // Not playing, so wait for a few moments
                    Thread.sleep(Math.min(1000 / Global.FRAMES_PER_SECOND, 10));
                }
            }
        } catch (Exception e) {
            // Do nothing
        }
    }
}
The error message I get is:

Exception in thread "main" java.io.IOException: Stream closed
    at java.io.BufferedInputStream.getInIfOpen(BufferedInputStream.java:134)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:237)
    at java.io.DataInputStream.readInt(DataInputStream.java:370)
    at com.sun.media.sound.WaveFileReader.getFMT(WaveFileReader.java:224)
    at com.sun.media.sound.WaveFileReader.getAudioInputStream(WaveFileReader.java:140)
    at javax.sound.sampled.AudioSystem.getAudioInputStream(AudioSystem.java:1094)
    at stm.sounds.WavPlayer.loadSound(WavPlayer.java:42)
    at stm.STM.<init>(STM.java:265)
    at stm.STM.main(STM.java:363)
Most probably the file path in this line is not correct:
WavPlayer sound1 = WavPlayer.loadSound("coin.wav");
You should pass the path of the coin.wav file instead of just its name.
For instance, if it's under a folder named sounds that sits directly under the project root, that parameter should be 'sounds/coin.wav'.
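Note that getResourceAsStream returns null, rather than throwing, when the path is wrong; wrapping that null in a BufferedInputStream is exactly what later surfaces as the "Stream closed" IOException in your trace (BufferedInputStream.getInIfOpen throws it when its underlying stream is null). A quick guard makes the real failure obvious (the sounds/coin.wav path is just an example):

InputStream is = WavPlayer.class.getClassLoader().getResourceAsStream("sounds/coin.wav");
if (is == null) {
    // Fail fast with the actual cause instead of a confusing "Stream closed" later.
    throw new IllegalArgumentException("Resource not found on classpath: sounds/coin.wav");
}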
The problem is in your static method loadSound. The method returns null whenever an exception is thrown, because you catch the exception but do nothing with it.
NEVER use an empty catch block.
Catch specific exceptions.
I would change the signature of loadSound to:
public static WavSound loadSound(String s) throws Exception // or, better, a specific exception
and then drop the try-catch from the method body, so the caller has to deal with the failure.
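A sketch of what the method might look like after that change (keeping the WavSound, SoundState, and WavPlayer types from the question):

public static WavSound loadSound(String s) throws IOException, UnsupportedAudioFileException {
    InputStream is = WavPlayer.class.getClassLoader().getResourceAsStream(s);
    if (is == null) {
        // Surface a bad path immediately instead of a "Stream closed" error later.
        throw new FileNotFoundException("Resource not found on classpath: " + s);
    }
    AudioInputStream audioStream = AudioSystem.getAudioInputStream(new BufferedInputStream(is));
    AudioFormat format = audioStream.getFormat();
    int length = (int) (audioStream.getFrameLength() * format.getFrameSize());
    byte[] samples = new byte[length];
    new DataInputStream(audioStream).readFully(samples);
    WavSound sound = new WavSound(samples, format, (int) audioStream.getFrameLength());
    sound.setState(SoundState.STATE_STOPPED);
    new WavPlayer(sound);
    return sound;
}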

How to tell when AudioTrack object has finished playing?

I'm trying to play a PCM file in Android using the AudioTrack class. I can get the file to play just fine, but I cannot reliably tell when playback has finished. AudioTrack.getPlayState says playback has stopped when it hasn't finished playing. I'm having the same problem with AudioTrack.setNotificationMarkerPosition, and I'm pretty sure my marker is set to the end of the file (although I'm not completely sure I'm doing it right). Likewise, playback continues when getPlaybackHeadPosition is at the end of the file and has stopped incrementing. Can anyone help?
I found that using audioTrack.setNotificationMarkerPosition(audioLength) and audioTrack.setPlaybackPositionUpdateListener worked for me. See the following code:
// Get the length of the audio stored in the file (16 bit so 2 bytes per short)
// and create a short array to store the recorded audio.
int audioLength = (int) (pcmFile.length() / 2);
short[] audioData = new short[audioLength];
DataInputStream dis = null;
try {
    // Create a DataInputStream to read the audio data back from the saved file.
    InputStream is = new FileInputStream(pcmFile);
    BufferedInputStream bis = new BufferedInputStream(is);
    dis = new DataInputStream(bis);
    // Read the file into the music array.
    int i = 0;
    while (dis.available() > 0) {
        audioData[i] = dis.readShort();
        i++;
    }
    // Create a new AudioTrack using the same parameters as the AudioRecord.
    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, RECORDER_SAMPLE_RATE, RECORDER_CHANNEL_OUT,
            RECORDER_AUDIO_ENCODING, audioLength, AudioTrack.MODE_STREAM);
    audioTrack.setNotificationMarkerPosition(audioLength);
    audioTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
        @Override
        public void onPeriodicNotification(AudioTrack track) {
            // nothing to do
        }

        @Override
        public void onMarkerReached(AudioTrack track) {
            Log.d(LOG_TAG, "Audio track end of file reached...");
            messageHandler.sendMessage(messageHandler.obtainMessage(PLAYBACK_END_REACHED));
        }
    });
    // Start playback
    audioTrack.play();
    // Write the music buffer to the AudioTrack object
    audioTrack.write(audioData, 0, audioLength);
} catch (Exception e) {
    Log.e(LOG_TAG, "Error playing audio.", e);
} finally {
    if (dis != null) {
        try {
            dis.close();
        } catch (IOException e) {
            // don't care
        }
    }
}
This works for me:
do { // Monitor playback to find out when it is done
    x = audioTrack.getPlaybackHeadPosition();
} while (x < pcmFile.length() / 2);
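That busy-wait pins a CPU core, though; the same idea works with a short sleep between polls (a sketch, reusing the two-bytes-per-frame math from above):

// Poll the playback head, yielding the CPU between checks.
int totalFrames = (int) (pcmFile.length() / 2); // 16-bit mono PCM: 2 bytes per frame
while (audioTrack.getPlaybackHeadPosition() < totalFrames) {
    try {
        Thread.sleep(10);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        break;
    }
}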
