AudioManager not working - neither turns on speakerphone nor mutes - java

In a WebRTC app that does video chat, I want cell phones to play sound through the speakerphone, not the earpiece.
I've tried using the AppRTCAudioManager (which wraps AudioManager) to set mute and speakerphone, using the AudioManager directly to set mute, placing the code in different methods, etc.
android.permission.MODIFY_AUDIO_SETTINGS is granted.
private void gotRemoteStream(MediaStream stream) {
    ...
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            ...
            audioManager = AppRTCAudioManager.create(context);
            ...
            audioManager.start((x, y) -> {}); // a callback that does logging, omitted for clarity

            // First try setting the audio device with the AppRTCAudioManager, which wraps AudioManager.
            audioManager.selectAudioDevice(AppRTCAudioManager.AudioDevice.SPEAKER_PHONE);
            AppRTCAudioManager.AudioDevice ad = audioManager.getSelectedAudioDevice();
            // Shows SPEAKER_PHONE, but sound still plays from the earpiece.

            // Next try it with the AudioManager itself.
            AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            am.setMode(AudioManager.MODE_IN_COMMUNICATION);
            am.setMicrophoneMute(true);
            am.setSpeakerphoneOn(true);
            boolean mute = am.isMicrophoneMute(); // false
            boolean on = am.isSpeakerphoneOn();   // false
        }
    });
}
The audioManager.getSelectedAudioDevice() method reports the audio device as SPEAKER_PHONE, but am.isSpeakerphoneOn() reveals that the speakerphone is still off.
Also, muting the microphone with am.setMicrophoneMute(true) doesn't do anything; am.isMicrophoneMute() still returns false.
Thanks in advance for any help.
UPDATE: my cell phone mysteriously started playing through the speakerphone. But the AudioManager code still doesn't do anything: I changed it to select EARPIECE as the audio device, yet it still plays through the speakerphone.

Finally found the answer: I had mistyped the permission as <uses-permission android:name="audioid.permission.MODIFY_AUDIO_SETTINGS"/>! Apparently Android Studio doesn't validate permission names in the manifest, so it never warned me that I had typed "audioid" instead of "android".
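As a sanity check, a runtime query can catch this kind of typo, because the check below uses the correct permission constant while the manifest declares the misspelled one. A minimal sketch (not from the original post), assuming any Context is available as context:
// Minimal sketch: confirm MODIFY_AUDIO_SETTINGS was actually granted. With the
// misspelled manifest entry this logs "denied", which points straight at the typo.
int result = context.checkCallingOrSelfPermission(Manifest.permission.MODIFY_AUDIO_SETTINGS);
Log.d(TAG, "MODIFY_AUDIO_SETTINGS " +
        (result == PackageManager.PERMISSION_GRANTED ? "granted" : "denied"));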

Related

Why does isOperational() in the Mobile Vision text recognizer return true on one device and false on another?

Why does isOperational() in the Mobile Vision text recognizer return false?
At first, Mobile Vision only showed the camera preview; after many attempts to get a result, I saw that text was being recognized, but it works on one device and not on another.
What should I do?
For example, on one device isOperational() returns false, then it goes into readstate(), then into looper() and stays there!
On the other device it only returns false and doesn't go into looper().
I want to ask some other questions about it:
My first question is: how does isOperational() work? I can't understand it.
Maybe it goes into looper() to queue the download of the native library, and after many tries the download finally completes and it works. Could that be correct? Or is it just a bug that it goes into looper()? Either way, what should I do?
Can I keep working on this while it works on one device I tried and not on another, or must it work on every device before I can continue? Also, the .apk I build from the project can't be installed on devices. Why?
Should it check for network access?
Should it check for access to storage?
Note: it works with the camera API, which is deprecated. Maybe the problem is related to that!
TextRecognizer textRecognizer = new TextRecognizer.Builder(context).build();
textRecognizer.setProcessor(new OcrDetectorProcessor(graphicOverlay));
if (!textRecognizer.isOperational()) {
    // Note: The first time that an app using a Vision API is installed on a
    // device, GMS will download native libraries to the device in order to do detection.
    // Usually this completes before the app is run for the first time. But if that
    // download has not yet completed, then the above call will not detect any text,
    // barcodes, or faces.
    //
    // isOperational() can be used to check if the required native libraries are currently
    // available. The detectors will automatically become operational once the library
    // downloads complete on device.
    Log.w(TAG, "Detector dependencies are not yet available.");

    // Check for low storage. If there is low storage, the native library will not be
    // downloaded, so detection will not become operational.
    IntentFilter lowstorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
    boolean hasLowStorage = registerReceiver(null, lowstorageFilter) != null;
    if (hasLowStorage) {
        Toast.makeText(this, R.string.low_storage_error, Toast.LENGTH_LONG).show();
        Log.w(TAG, getString(R.string.low_storage_error));
    }
}

// Creates and starts the camera. Note that this uses a higher resolution in comparison
// to other detection examples to enable the text recognizer to detect small pieces of text.
cameraSource =
        new CameraSource.Builder(getApplicationContext(), textRecognizer)
                .setFacing(CameraSource.CAMERA_FACING_BACK)
                .setRequestedPreviewSize(1280, 1024)
                .setRequestedFps(2.0f)
                .setFlashMode(useFlash ? Camera.Parameters.FLASH_MODE_TORCH : null)
                .setFocusMode(autoFocus ? Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO : null)
                .build();
}
It doesn't produce any error and shows the camera preview, but it doesn't recognize text on some devices.
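One commonly suggested workaround (not from the original post) is to delay starting detection until the native libraries have finished downloading, by polling isOperational() before starting the camera source. A minimal sketch, assuming the textRecognizer, cameraSource and graphicOverlay fields from the code above, plus the sample's CameraSourcePreview held in a field called preview:
// Minimal sketch: retry until the Mobile Vision native libraries are available,
// then start the camera source. Field names match the snippet above.
private final Handler retryHandler = new Handler(Looper.getMainLooper());

private void startWhenOperational() {
    if (textRecognizer.isOperational()) {
        try {
            preview.start(cameraSource, graphicOverlay);
        } catch (IOException e) {
            Log.e(TAG, "Unable to start camera source.", e);
            cameraSource.release();
        }
    } else {
        Log.w(TAG, "Detector dependencies not yet available, retrying in 2s...");
        retryHandler.postDelayed(this::startWhenOperational, 2000);
    }
}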

android - Why does MediaPlayer delay playing mp3 audio?

I made a small app to practice audio playback on Android using MediaPlayer. The app works great, but there is a noticeable delay of about one second after clicking the play button. I noticed this happens only when starting the audio file; when paused, it resumes immediately with no delay. I googled around and saw people suggesting SoundPool instead of MediaPlayer, but SoundPool is recommended for short audio clips while my app plays a full song. What causes this delay? Is there a fix or workaround for this issue?
Here's my code:
private MediaPlayer mMediaPlayer;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    mMediaPlayer = MediaPlayer.create(this, R.raw.this_is_america);
    Button btnPlay = findViewById(R.id.btnPlay);
    Button btnPause = findViewById(R.id.btnPause);
    Button btnStop = findViewById(R.id.btnStop);
    btnPlay.setOnClickListener(this);
    btnPause.setOnClickListener(this);
    btnStop.setOnClickListener(this);
}

@Override
public void onClick(View v) {
    switch (v.getId()) {
        case R.id.btnPlay:
            Toast.makeText(getApplicationContext(), "Playing song",
                    Toast.LENGTH_SHORT).show();
            mMediaPlayer.start();
            break;
        case R.id.btnPause:
            Toast.makeText(getApplicationContext(), "Pausing song",
                    Toast.LENGTH_SHORT).show();
            mMediaPlayer.pause();
            break;
        case R.id.btnStop:
            Toast.makeText(getApplicationContext(), "Song stopped",
                    Toast.LENGTH_SHORT).show();
            mMediaPlayer.reset();
            mMediaPlayer = MediaPlayer.create(this, R.raw.this_is_america);
            break;
    }
}
Update: it turned out the mp3 audio file had a silent pause at the beginning. I tried another song and it works fine with no noticeable delay. Thank you greeble31 for suggesting to check that.
Your song has a silent pause at the beginning ;)
As @Lucefer mentioned, the Android platform has some small unavoidable latency due to the implementation of the audio stack. Or, at least it did a few years back; I'm not sure what the current state of affairs is. At any rate, this delay is generally far too small (~10ms) to notice at the beginning of an audio file; it has more to do with response times for apps that simulate musical instruments and the like.
There is an audio latency issue in the Android operating system: there is a delay of some milliseconds when recording and playing audio. You can learn more about this at the URLs below; the latency also depends on the device type.
https://developer.android.com/ndk/guides/audio/audio-latency
https://source.android.com/devices/audio/latency/measurements
You can use a native toolkit or native program (C/C++, NDK-based) to minimize this latency, but you cannot reduce it to zero seconds, only minimize it.
There is also https://superpowered.com/superpowered-android-media-server . You can get support from this, but as far as I know you need to pay for it. I didn't try it, so I don't know how much it reduces the latency.
If you want to remove the silent part from your mp3, you can use an ffmpeg wrapper for it. Go to https://github.com/WritingMinds/ffmpeg-android-java ; there is a working ffmpeg wrapper for Android that you can use easily.
See using FFMPEG with silencedetect to remove audio silence to find the relevant FFmpeg command.
Inside the wrapper you don't pass the leading "ffmpeg" part of the command; the arguments start from -y instead. You can get those details from the wrapper's documentation; a rough sketch follows.
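For illustration only, a minimal sketch of trimming leading silence with that wrapper; the input/output paths are hypothetical placeholders, and the exact handler class names and exception handling should be checked against the wrapper's README:
// Minimal sketch, assuming the WritingMinds ffmpeg-android-java wrapper and that
// loadBinary() has already completed. Paths are placeholders.
String[] cmd = {
        "-y",                                  // overwrite the output file
        "-i", "/sdcard/Music/input.mp3",       // hypothetical input path
        "-af", "silenceremove=start_periods=1:start_threshold=-50dB:start_duration=0.1",
        "/sdcard/Music/trimmed.mp3"            // hypothetical output path
};
try {
    FFmpeg.getInstance(context).execute(cmd, new ExecuteBinaryResponseHandler() {
        @Override public void onSuccess(String message) { Log.d(TAG, "trim ok: " + message); }
        @Override public void onFailure(String message) { Log.w(TAG, "trim failed: " + message); }
    });
} catch (Exception e) {
    // Some wrapper versions throw a checked FFmpegCommandAlreadyRunningException here.
    Log.e(TAG, "ffmpeg already running", e);
}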

How to use flash without stopping camera feed?

I am currently working on a barcode scanning app, which uses the mobile vision api for the majority of processes. I am trying to implement a flash button so that a user may scan in low light, but for some reason the activation of flash freezes the camera feed. Is there some way to start flash with a button while the feed is active? To activate flash without interfering with other threads? Thanks!
I used this code in my custom camera application: when the user clicks the flash-on button, the flash turns on. I think this code will help you.
Try this code (on button click):
private void btnFlashOnClick() {
    if (mCamera != null) {
        // First get the camera parameters.
        Camera.Parameters parameters = mCamera.getParameters();
        // Set the flash mode on the parameters.
        parameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
        // Apply the parameters to the camera.
        mCamera.setParameters(parameters);
        // Finally, start the camera preview.
        mCamera.startPreview(); // This line is useful for my app; remove it if you don't need it.
    }
}
This code works fine in my app. Hope this helps you!
It really depends which camera API you are using, as there are a few.
CameraManager has
void setTorchMode(String cameraId, boolean enabled)
which lets you operate the flash regardless of the current state of the camera (and without needing to restart it), but it can be overridden by other apps too; a short sketch follows.
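For reference, a minimal sketch (API 23+; not from the original answer) of turning the torch on with CameraManager:
// Minimal sketch, API 23+: enable the torch on the first camera that reports a flash unit.
CameraManager cm = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
    for (String id : cm.getCameraIdList()) {
        Boolean hasFlash = cm.getCameraCharacteristics(id)
                .get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
        if (Boolean.TRUE.equals(hasFlash)) {
            cm.setTorchMode(id, true); // pass false to turn the torch back off
            break;
        }
    }
} catch (CameraAccessException e) {
    Log.e(TAG, "Unable to toggle torch", e);
}
Note that if your own app already holds the camera open (as the Mobile Vision CameraSource does), setTorchMode may fail because the flash unit is in use, in which case the Camera.Parameters approach above is the fallback.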

LibVLC fails to show subtitles for streamed video

I tried to create a video player on an Android device which could stream videos from a local server. I found this example. I followed the guideline and at the end I was able to play a local video on my device. Then I started to modify the project: I set VideoActivity as the launcher activity and commented out all data received from the previous activity. I changed the initialization of the media from
Media m = new Media(libvlc, media);
mMediaPlayer.setMedia(m);
to
Uri uri = Uri.parse("http://samples.mplayerhq.hu/MPEG-4/embedded_subs/1Video_2Audio_2SUBs_timed_text_streams_.mp4");
Media m = new Media(libvlc, uri);
mMediaPlayer.setMedia(m);
I added two new methods (as you can see, they are dummy methods simply to see the result):
private void setSubtitles() {
    MediaPlayer.TrackDescription[] tds = mMediaPlayer.getSpuTracks();
    mMediaPlayer.setSpuTrack(tds[tds.length - 1].id);
}

private void seekTo(long time) {
    mMediaPlayer.setTime(time);
}
And called them on the Playing event:
@Override
public void onEvent(MediaPlayer.Event event) {
    VideoActivity player = mOwner.get();
    switch (event.type) {
        case MediaPlayer.Event.EndReached:
            Log.d(TAG, "MediaPlayerEndReached");
            player.releasePlayer();
            break;
        case MediaPlayer.Event.Playing:
            player.seekTo(1000);      // ---> added
            player.setSubtitles();    // ---> added
        case MediaPlayer.Event.Paused:
        case MediaPlayer.Event.Stopped:
        default:
            break;
    }
}
But whenever I try to play the video, I get 12401-12591/? A/libc﹕ ### ABORTING: INVALID HEAP ADDRESS IN dlfree. I get this error a few seconds after the subtitles are shown (also, the subtitles are too small). The error appears only when I try to show subtitles; without subtitles the video plays without any error.
I tried to play this video in the VLC application and it was able to show subtitles. I tried creating a new project and got the same error. I analysed the VLC Android source code and it looks like they use almost the same logic for loading subtitles (only setting different surfaceViews for video and subtitles, but that gave me the same error). So what am I missing? What causes the player to break, and how do I fix it? Maybe there is already a working example of code for showing subtitles.

Android: Playing sound over Sco Bluetooth headset

For the past few days I have been trying to play any sound over my SCO Bluetooth headset from my Android phone. My final goal with this project is to eventually make a garage door opener, but first I need to be able to play sound over the headset.
Here is the basis of the current code I'm using:
==Manifest==
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
==Code==
audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
audioManager.setMode(AudioManager.MODE_IN_CALL);
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);

short[] soundData = new short[8000 * 20];
for (int iii = 0; iii < 20 * 8000; iii++) {
    soundData[iii] = 32767;
    iii++;
    soundData[iii] = -32768;
}

audioTrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL,
        8000, AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT, soundData.length * Short.SIZE,
        AudioTrack.MODE_STATIC);
audioTrack.write(soundData, 0, soundData.length);
audioTrack.play();
Before I run this, I pair my bluetooth headset to my phone and have it connected. I have verified it works by calling my voicemail. When I run my code however, no sound comes from anywhere.
Here are the effects of the different lines of code:
When I'm just running my app:
audioManager.setMode(AudioManager.MODE_IN_CALL);
This line makes all of my sound stop working no matter what I do, so it is typically commented out.
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);
These two lines make the sound stop coming out of the front speaker and make my headset click and hiss like it's turned on but there's no output.
AudioManager.STREAM_VOICE_CALL
This is part of my call to the AudioTrack constructor, but it makes quite a difference. Since this is set to STREAM_VOICE_CALL the sound comes out of the front speaker, if I set this to STREAM_MUSIC, the sound comes out the back speaker instead.
When I open my app during a call:
audioManager.setMode(AudioManager.MODE_IN_CALL);
During a call, this line has no effect because MODE_IN_CALL was already set. What's different, however, is that my sound is mixed with the phone call, whereas normally it doesn't play at all.
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);
These, together with their counterpart "off" calls, control where the audio comes from. If I turn them off, my sound and the telephone call come from the front speaker; with them on, the telephone call comes from my headset and my sound is lost.
As to why my code is not working, I honestly have no idea. I believe I have fulfilled the checklist for using startBluetoothSco().
Even if a SCO connection is established, the following restrictions apply on audio output streams so that they can be routed to the SCO headset:
- the stream type must be STREAM_VOICE_CALL
- the format must be mono
- the sampling rate must be 16kHz or 8kHz
So, does anyone have an idea of what I am doing wrong? There was one time that I managed to get my sound to play through the headset, but it was only a short tone when I forgot to stop() my AudioTrack so I have to assume it was a glitch.
I found the solution on this page.
You have to call audioManager.setMode(AudioManager.MODE_IN_CALL); only after the SCO socket is connected, i.e., after you have received AudioManager.SCO_AUDIO_STATE_CONNECTED. I could hear the TTS on my Spica running Android 2.2.2.
Edit:
Here is my (old) implementation:
public class BluetoothNotificationReceiver extends BroadcastReceiver {

    public BluetoothNotificationReceiver(Handler h) {
        super();
        bnrHandler = h;
    }

    /* (non-Javadoc)
     * @see android.content.BroadcastReceiver#onReceive(android.content.Context, android.content.Intent)
     */
    @Override
    public void onReceive(Context arg0, Intent arg1) {
        String action = arg1.getAction();
        if (action.equalsIgnoreCase(AudioManager.ACTION_SCO_AUDIO_STATE_CHANGED)) {
            int l_state = arg1.getIntExtra(AudioManager.EXTRA_SCO_AUDIO_STATE, -1);
            Log.d("bnr", "Audio SCO: " + AudioManager.ACTION_SCO_AUDIO_STATE_CHANGED);
            switch (l_state) {
                case AudioManager.SCO_AUDIO_STATE_CONNECTED:
                    Log.i("bnr", "SCO_AUDIO_STATE_CONNECTED");
                    break;
                case AudioManager.SCO_AUDIO_STATE_DISCONNECTED:
                    Log.e("bnr", "SCO_AUDIO_STATE_DISCONNECTED");
                    break;
                default:
                    Log.e("bnr", "unknown state received:" + l_state);
            }
        } else {
            Log.e("bnr", "onReceive:action=" + action);
        }
    }
}
I don't know if it is still working with the new APIs.
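To tie the pieces together, a minimal sketch (not from the original answer) of registering for SCO state changes, starting SCO, and deferring setMode() until the connected state arrives:
// Minimal sketch: only call setMode(MODE_IN_CALL) once SCO_AUDIO_STATE_CONNECTED
// has actually been received, then start playback.
final AudioManager audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
registerReceiver(new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        int state = intent.getIntExtra(AudioManager.EXTRA_SCO_AUDIO_STATE, -1);
        if (state == AudioManager.SCO_AUDIO_STATE_CONNECTED) {
            audioManager.setMode(AudioManager.MODE_IN_CALL); // safe to set only now
            // ...start the AudioTrack here...
        }
    }
}, new IntentFilter(AudioManager.ACTION_SCO_AUDIO_STATE_CHANGED));
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);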
Try setting the AudioManager to AudioManager.MODE_NORMAL. That worked for me.
Add to manifest:
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
<uses-permission android:name="android.permission.BLUETOOTH" />
I actually managed to fix this. The problem was that the pitch I played over the headset was too high a frequency. I fixed it by simply doubling up the code that creates the audio data so that it goes HIGH HIGH LOW LOW instead of HIGH LOW HIGH LOW.
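In other words, at an 8 kHz sample rate, alternating HIGH/LOW every sample is a 4 kHz square wave, right at the Nyquist limit of that sampling rate; holding each level for two samples halves it to 2 kHz. A minimal sketch of the adjusted loop (not from the original answer), using the same soundData buffer as in the question:
// Minimal sketch: 2 kHz square wave at an 8 kHz sample rate (HIGH HIGH LOW LOW).
short[] soundData = new short[8000 * 20];
for (int i = 0; i + 3 < soundData.length; i += 4) {
    soundData[i]     = 32767;   // HIGH
    soundData[i + 1] = 32767;   // HIGH
    soundData[i + 2] = -32768;  // LOW
    soundData[i + 3] = -32768;  // LOW
}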
