I am developing a small real-time application to record sound waves. It has two modules: recording and listening.
Here is how it should work:
The program starts listening.
A sound wave arrives.
The program recognizes that a signal has arrived and starts recording it.
When the signal is over (no more loud sounds), the program stops recording and saves the result to a file.
So, in order to recognize when the signal is over, we need to listen to (capture) the wave while recording, so we can detect when the sound ends.
To implement this, I've used the Java Sound API, but I have one problem:
The TargetDataLine object is shared between the recording thread and the capture thread, i.e. two threads are working on the same TargetDataLine: the capture thread and the recorder thread.
This causes some real-time problems.
I have tried opening two TargetDataLines, one for recording and one for capturing, but the program throws an exception when trying to open the second one.
How can I fix this problem?
Please help.
You need to use a single thread which has exclusive access to the TargetDataLine. This thread can then generate events which your recording and listening threads can subscribe to.
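A minimal sketch of that single-reader idea, assuming a listener interface of your own design (the AudioChunkListener name and the buffer size are placeholders, not part of the Java Sound API):

```java
import javax.sound.sampled.TargetDataLine;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical callback interface; the recorder and the silence detector both implement it.
interface AudioChunkListener {
    void onAudio(byte[] data, int length);
}

public class CaptureLoop implements Runnable {
    private final TargetDataLine line;                 // opened once, owned by this thread only
    private final List<AudioChunkListener> listeners = new CopyOnWriteArrayList<>();
    private volatile boolean running = true;

    public CaptureLoop(TargetDataLine line) { this.line = line; }

    public void addListener(AudioChunkListener l) { listeners.add(l); }
    public void stopCapture() { running = false; }

    @Override
    public void run() {
        byte[] buffer = new byte[4096];
        line.start();
        while (running) {
            int n = line.read(buffer, 0, buffer.length);   // only this thread ever reads the line
            if (n > 0) {
                for (AudioChunkListener l : listeners) {
                    // listeners should copy the data if they keep it beyond this call
                    l.onAudio(buffer, n);
                }
            }
        }
        line.stop();
        line.close();
    }
}
```

The recorder (writing to a file) and the level/silence detector both receive the same chunks through the listener callbacks, so no second thread ever touches the TargetDataLine itself.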
I am using the Java Sound API to capture sound on a Windows machine by reading data from a TargetDataLine. It works fine if I open a line, read data from it, and then close it. However, if I reopen it once closed, I get a LineUnavailableException. Can someone explain to me what is going on? If I want to record multiple sound clips one after another, repeating start -> record -> stop several times, how can I do it?
Thanks
The API says:
Some lines, once closed, cannot be reopened. Attempts to reopen such
a line will always result in a LineUnavailableException.
I think the reason they say "some lines" is that it depends on external factors pertaining to the particular system.
You will need to create a line for each additional recording, it seems.
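As a hedged sketch of that approach (the format values and file names are just examples), you would request a fresh line for each clip rather than reopening the closed one:

```java
import javax.sound.sampled.*;
import java.io.File;

public class RepeatedRecorder {

    public static void recordClip(AudioFormat format, File target, long millis) throws Exception {
        // Ask the system for a new TargetDataLine for every clip instead of reopening a closed one.
        TargetDataLine line = AudioSystem.getTargetDataLine(format);
        line.open(format);
        line.start();

        // AudioSystem.write blocks, so stop and close the line from another thread after the duration.
        Thread stopper = new Thread(() -> {
            try { Thread.sleep(millis); } catch (InterruptedException ignored) {}
            line.stop();
            line.close();
        });
        stopper.start();

        AudioInputStream in = new AudioInputStream(line);
        AudioSystem.write(in, AudioFileFormat.Type.WAVE, target);  // returns once the line is closed
    }

    public static void main(String[] args) throws Exception {
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);  // example format
        for (int i = 0; i < 3; i++) {
            // start -> record -> stop, repeated several times
            recordClip(format, new File("clip" + i + ".wav"), 5000);
        }
    }
}
```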
I want to pass messages between two communicating channels in a call. My requirement is that two Java applications will act as two different users in a call. There should be some message that can be shared only between the two channels in a specific call, so that if one application is going to play something, it can send a message telling the other to record, and vice versa. I will be thankful if somebody can help me out.
You can use AMI (the Asterisk Manager Interface) to watch for UserA trying to share data, and then set it on UserB's channel.
Background to this...
Sounds like a similar problem to one I had. With two users in a call, I wanted User A to start recording the call. I wanted the recording to start on User B's channel so that if the call was transferred, that channel was not destroyed and the recording would continue. Simply calling MixMonitor starts the recording on the channel that calls MixMonitor, which would be User A's channel.
I wrote a small application that monitors UserA, listens for a UserEvent from UserA (see 'core show application UserEvent'), and then starts MixMonitor on UserB's channel. It also has to keep track of channels so that it knows which channel belongs to UserB.
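A rough sketch of that watcher, assuming the asterisk-java library; the AMI credentials, the channel-tracking helper, and the exact command used to start MixMonitor are placeholders you would adapt to your dialplan and Asterisk version:

```java
import org.asteriskjava.manager.ManagerConnection;
import org.asteriskjava.manager.ManagerConnectionFactory;
import org.asteriskjava.manager.ManagerEventListener;
import org.asteriskjava.manager.action.CommandAction;
import org.asteriskjava.manager.event.ManagerEvent;
import org.asteriskjava.manager.event.UserEvent;

public class RecordingWatcher {
    public static void main(String[] args) throws Exception {
        // Placeholder credentials for the AMI account configured in manager.conf.
        ManagerConnection connection =
                new ManagerConnectionFactory("localhost", "manager", "secret").createManagerConnection();
        connection.login();

        connection.addEventListener(new ManagerEventListener() {
            @Override
            public void onManagerEvent(ManagerEvent event) {
                // UserA signals via UserEvent() in the dialplan; react by recording UserB's channel.
                if (event instanceof UserEvent) {
                    String userBChannel = lookupUserBChannel(event);  // hypothetical channel tracking
                    try {
                        // One possible way to start the recording; adapt to your Asterisk version.
                        connection.sendAction(
                                new CommandAction("mixmonitor start " + userBChannel + " /tmp/call.wav"));
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        });
    }

    // Hypothetical: in practice you would track channel events to know which channel is UserB's.
    private static String lookupUserBChannel(ManagerEvent event) {
        return "SIP/userB-00000001";  // placeholder
    }
}
```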
I need to be able to launch a certain procedure only when some sound output is initiated, i.e. when some sound is "beeped" after complete silence. The way I imagine handling this is to somehow monitor the strength of the output channel (where all sound is usually output), and if it reaches a certain threshold, launch my "procedure". The sound can be initiated by any program in the system; I just need to monitor the output strength and react.
Is there a way to build such a "monitor" in Java?
Thanks.
I have two audio inputs of a concert.
The first is a WAV file and the second is taken from a microphone in real time.
I need to play the first file in sync with the microphone input.
What library can I use?
Is there any tutorial, guide, or example for doing this?
thanks
Take a look here
This is the entire Java Sound API documentation:
http://download.oracle.com/javase/1.5.0/docs/guide/sound/programmer_guide/
Also
Chapter 4: Synchronizing Playback on Multiple Lines
Chapter 6: Processing Audio with Controls
BUT
here is what I found in the jsresources FAQ:
How can I synchronize two or more playback lines ?
The synchronization functions in Mixer are not implemented. Nevertheless, playback typically stays in sync.
How can I synchronize playback (SourceDataLines) with recording (TargetDataLines)?
As with multiple playback lines from the same Mixer object, playback and recording lines from the same Mixer object stay in sync once they are started. In practice, this means that you can achieve synchronization this easy way only by using the "Direct Audio Device" mixers. Since the "Java Sound Audio Engine" only provides playback lines, but no recording lines, playback/recording sync is not as easy with the "Java Sound Audio Engine".
If playback and recording lines originate from different Mixer objects, you need to synchronize the soundcards that are represented by the Mixer objects. So the situation is similar to external synchronization.
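In code, keeping both lines on the same Mixer looks roughly like this (a sketch; the example format and the way you pick a "Direct Audio Device" mixer are assumptions you'd adapt to your system):

```java
import javax.sound.sampled.*;

public class SameMixerLines {
    public static void main(String[] args) throws Exception {
        AudioFormat format = new AudioFormat(44100f, 16, 2, true, false);  // example format
        DataLine.Info playbackInfo = new DataLine.Info(SourceDataLine.class, format);
        DataLine.Info captureInfo  = new DataLine.Info(TargetDataLine.class, format);

        // Find a mixer that supports both playback and capture lines (typically a "Direct Audio Device").
        for (Mixer.Info info : AudioSystem.getMixerInfo()) {
            Mixer mixer = AudioSystem.getMixer(info);
            if (mixer.isLineSupported(playbackInfo) && mixer.isLineSupported(captureInfo)) {
                SourceDataLine playback = (SourceDataLine) mixer.getLine(playbackInfo);
                TargetDataLine capture  = (TargetDataLine) mixer.getLine(captureInfo);
                playback.open(format);
                capture.open(format);
                // Started back to back, lines from the same mixer should stay in sync.
                capture.start();
                playback.start();
                System.out.println("Using mixer: " + info.getName());
                break;
            }
        }
    }
}
```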
AND
The main problem is buffering and processing the microphone audio hits and keeping the timing real-time; a practical way is to use an external clock.
And here is a bunch of Java sound resources. I think you should look at the sound-monitoring section in the API documentation and try to trigger a time delay based on hits and monitor the outputs. It's a little complicated; I'm also interested in this question and will try to find out more, and if I do, I will let you know.
Take a look at these links; it's going to be easy, as I found when I read the descriptions of these Processing libraries:
http://sonia.pitaru.com/
http://visualap.java.net/
http://www.softsynth.com/jsyn/ Check this out
http://jmetude.dihardja.de/
http://www.tree-axis.com/Ess/
http://www.abstract-codex.net/tactu5/index.html
http://code.google.com/p/echonestp5/
I'm trying to write a simple app that should mute my mobile phone for a given time. It's my first Android app, but after many hours of reading I think it is nearly complete. However, it still has one problem that I cannot fix.
I'm using an Activity to display the GUI. It has buttons to set the start and end times, and everything else needed. When the user has entered all the parameters, they are passed to a service. This service uses a Handler object to register two callbacks (with Handler.postDelayed) in SetMuteIntervall: one to start muting and one to end muting.
The first tests seemed to work, but if I try to mute it for something like 30 minutes, it never unmutes. I think it has something to do with the fact that the phone is or was in standby mode. I also tried to use Handler.postAtTime(), but that didn't work either (and specifying the time relative to uptime was somewhat confusing).
So, what should I do to guarantee that my callbacks are called, regardless of whether the phone is in standby or not?
Here's the source of my program:
http://pastebin.com/XAgCeAq9
http://pastebin.com/33nepFV5
Try using AlarmManager to schedule actions in the future. AlarmManager is not standby-mode-dependent and will fire even if the device is sleeping.
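A minimal sketch of that approach; the MuteScheduler/MuteReceiver names and request codes are hypothetical, and your service would register one alarm for the mute time and one for the unmute time:

```java
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.media.AudioManager;

public class MuteScheduler {

    // Call this from your service once the user has picked start and end times (epoch millis).
    public static void schedule(Context context, long startTimeMillis, long endTimeMillis) {
        AlarmManager alarmManager = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);

        PendingIntent mutePending = PendingIntent.getBroadcast(context, 0,
                new Intent(context, MuteReceiver.class).putExtra("mute", true), 0);
        PendingIntent unmutePending = PendingIntent.getBroadcast(context, 1,
                new Intent(context, MuteReceiver.class).putExtra("mute", false), 0);

        // RTC_WAKEUP wakes the device from standby when each alarm fires.
        alarmManager.set(AlarmManager.RTC_WAKEUP, startTimeMillis, mutePending);
        alarmManager.set(AlarmManager.RTC_WAKEUP, endTimeMillis, unmutePending);
    }

    // Hypothetical receiver; remember to declare it in AndroidManifest.xml.
    public static class MuteReceiver extends BroadcastReceiver {
        @Override
        public void onReceive(Context context, Intent intent) {
            AudioManager audio = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            boolean mute = intent.getBooleanExtra("mute", false);
            audio.setRingerMode(mute ? AudioManager.RINGER_MODE_SILENT : AudioManager.RINGER_MODE_NORMAL);
        }
    }
}
```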
Your threads are actually stopped when the phone is in standby mode. If you still want to use threads, you can use a WakeLock to prevent the CPU from going into standby (while still letting the screen switch off), but this is not the best way in your case.
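For completeness, a minimal WakeLock sketch (hypothetical class name; this needs the WAKE_LOCK permission in the manifest), though as said above AlarmManager is the better fit here:

```java
import android.content.Context;
import android.os.PowerManager;

public class WakeLockExample {
    // Keeps the CPU running (the screen may still turn off) while the runnable executes.
    public static void runHoldingCpu(Context context, Runnable work) {
        PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
        PowerManager.WakeLock wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "MuteApp:timer");
        wakeLock.acquire();
        try {
            work.run();
        } finally {
            wakeLock.release();  // always release, or the battery drains
        }
    }
}
```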