I am new to Android, and I am developing an audiometer in Android Studio. One of the steps is to check whether the mic is receiving a sound that I am sending through the earphones, to verify that they are working properly.
I am doing both things in the same activity: sending the sound, and checking whether any sound is coming in.
I was able to send a 1 kHz tone for 2 seconds using the AudioTrack class, and the next step is to check whether the mic is receiving something near that frequency. Since I wasn't able to even make the mic work, I am lowering my goals to just checking whether the microphone is receiving anything.
I've checked several links and none helped me, either because I am not familiar with Android or because it wasn't what I needed, among others:
Detect sound level, How to detect when a user stops talking into the microphone and Detect 'Whistle' sound in android
I've already put the permissions in the Manifest:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
And my CalibrationActivity.java is:
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.SystemClock;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import java.io.IOException;
public class CalibrationActivity extends AppCompatActivity {
private MediaRecorder myRecorder;
private String outputFile = null;
private final int duration = 2; // seconds
private final int sampleRate = 4000;
private final int numSamples = duration * sampleRate;
private final double sample[] = new double[numSamples];
private final double freqOfTone = 1000; // hz
private final byte generatedSnd[] = new byte[2 * numSamples];
Handler handler = new Handler();
int getAmplitude = myRecorder.getMaxAmplitude();
public int result(){
if (getAmplitude != 0) {
return 1;
}else {
return 0;
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_calibration);
outputFile = Environment.getExternalStorageDirectory().
getAbsolutePath() + "/teste.3gpp";
myRecorder = new MediaRecorder();
myRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myRecorder.setAudioEncoder(MediaRecorder.OutputFormat.AMR_NB);
myRecorder.setOutputFile(outputFile);
Intent intent;
if(result()==1){
intent = new Intent(this, FirstTestActivity.class);
}else{
intent = new Intent(this, End1Activity.class);
}
}
void start_recording() {
try {
myRecorder.prepare();
myRecorder.start();
} catch (IllegalStateException e) {
// start:it is called before prepare()
// prepare: it is called after start() or before setOutputFormat()
e.printStackTrace();
} catch (IOException e) {
// prepare() fails
e.printStackTrace();
}
}
void stop_recording(){
try {
myRecorder.stop();
myRecorder.release();
myRecorder = null;
} catch (IllegalStateException e) {
// it is called before start()
e.printStackTrace();
} catch (RuntimeException e) {
// no valid audio/video data has been received
e.printStackTrace();
}
}
@Override
protected void onResume() {
super.onResume();
// Use a new thread as this can take a while
final Thread thread = new Thread(new Runnable() {
public void run() {
genTone();
handler.post(new Runnable() {
public void run() {
playSound();
}
});
}
});
thread.start();
start_recording();
SystemClock.sleep(3000);
stop_recording();
}
void genTone() {
// fill out the array
for (int i = 0; i < numSamples; ++i) {
sample[i] = Math.sin(2 * Math.PI * i / (sampleRate / freqOfTone));
}
// convert to 16 bit pcm sound array
// assumes the sample buffer is normalised.
int idx = 0;
for (final double dVal : sample) {
// scale to maximum amplitude
final short val = (short) ((dVal * 32767));
// in 16 bit wav PCM, first byte is the low order byte
generatedSnd[idx++] = (byte) (val & 0x00ff);
generatedSnd[idx++] = (byte) ((val & 0xff00) >>> 8);
}
}
void playSound() {
final AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate, AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT, generatedSnd.length,
AudioTrack.MODE_STATIC);
audioTrack.write(generatedSnd, 0, generatedSnd.length);
audioTrack.play();
}
}
I wrote that based on examples that I found online, mostly from here, here and here, so there are a few parts of the code that I don't really understand.
The idea here is to play the sound through the earphones, and the user will be told to hold the earphones close to the mic. The code should then let the mic record for 3 seconds and check whether the amplitude of the sound is different from 0; if it is, the application goes to FirstTestActivity, otherwise to End1Activity. But once I try running the code, the application immediately crashes, and I don't know why. I've been working on this for several weeks and I could not find a solution, which is probably pretty simple. Thanks in advance.
Given your lack of familiarity with the subject, it may be easier to just use a library instead, like the one here.
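Beyond that, one likely cause of the crash is the field initializer int getAmplitude = myRecorder.getMaxAmplitude();, which runs when the activity object is constructed, before onCreate() has assigned myRecorder, so it throws a NullPointerException. Also, setAudioEncoder is being passed MediaRecorder.OutputFormat.AMR_NB rather than MediaRecorder.AudioEncoder.AMR_NB. Finally, getMaxAmplitude() returns the maximum amplitude since the previous call (the first call returns 0), so it has to be read while recording. A minimal sketch of the intended flow, reusing your own helpers (a sketch, not a drop-in fix):
// Sketch: read the amplitude while the recorder is running,
// instead of in a field initializer that runs before onCreate().
void checkMicAndRoute() {
    start_recording();                            // prepare() + start()
    myRecorder.getMaxAmplitude();                 // first call always returns 0; discard it
    SystemClock.sleep(3000);                      // let the tone play into the mic
    int amplitude = myRecorder.getMaxAmplitude(); // max amplitude since the last call
    stop_recording();
    Intent intent = (amplitude != 0)
            ? new Intent(this, FirstTestActivity.class)
            : new Intent(this, End1Activity.class);
    startActivity(intent);                        // the original code builds the Intent but never starts it
}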
Related
Context: I am trying to use TensorFlow Lite with camera capture.
When I launch my app, it crashes and I get this error message on my phone (there is no error at build time):
java.lang.NoClassDefFoundError: Failed resolution of: Lcom/google/android/things/pio/GpioCallback;
    at com.google.android.things.contrib.driver.button.ButtonInputDriver.&lt;init&gt;(ButtonInputDriver.java:44)
    at com.google.android.things.contrib.driver.rainbowhat.RainbowHat.createButtonInputDriver(...)
    at com.google.android.things.contrib.driver.rainbowhat.RainbowHat.createButtonCInputDriver(...)
    at com.example.androidthings.imageclassifier.ImageClassifierActivity.initButton(ImageClassifierActivity.java:186)
    at com.example.androidthings.imageclassifier.ImageClassifierActivity.onCreate(ImageClassifierActivity.java:172)
    at android.app.Activity.performCreate(...)
    at android.app.Activity.performCreate(...)
    at android.app.Instrumentation.callActivityOnCreate(...)
    at android.app.ActivityThread.performLaunchActivity(...)
    at android.app.ActivityThread.handleLaunchActivity(...)
    at android.app.servertransaction.LaunchActivityItem.execute(...)
    at android.app.servertransaction.TransactionExecutor.executeCallbacks(...)
    at android.app.servertransaction.TransactionExecutor.execute(...)
    at android.app.ActivityThread$H.handleMessage(...)
    at android.os.Handler.dispatchMessage(Handler.java:106)
    at android.os.Looper.loop(Looper.java:201)
    at android.app.ActivityThread.main(...)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(...)
    at com.android.internal.os.ZygoteInit.main(...)
Caused by: java.lang.ClassNotFoundException: Didn't find class "com.google.android.things.pio.GpioCallback" on path: DexPathList[[zip file "/system/framework/org.apache.http.legacy.boot.jar", zip file "/data/app/com.example.androidthings.imageclassifier-n33NGwKNRyCZU-UICaS1Qw==/base.apk"],nativeLibraryDirectories=[/data/app/com.example.androidthings.imageclassifier-n33NGwKNRyCZU-UICaS1Qw==/lib/arm64, /data/app/com.example.androidthings.imageclassifier-n33NGwKNRyCZU-UICaS1Qw==/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
    at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:171)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:379)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:312)
Here is the ImageClassifierActivity:
package com.example.androidthings.imageclassifier;
import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.media.ImageReader;
import android.os.Bundle;
import android.util.Log;
import android.view.KeyEvent;
import android.view.WindowManager;
import android.widget.ImageView;
import android.widget.TextView;
import com.example.androidthings.imageclassifier.classifier.Recognition;
import com.example.androidthings.imageclassifier.classifier.TensorFlowHelper;
import com.google.android.things.contrib.driver.button.ButtonInputDriver;
import com.google.android.things.contrib.driver.rainbowhat.RainbowHat;
import org.tensorflow.lite.Interpreter;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Collection;
import java.util.Iterator;
import java.util.List;
public class ImageClassifierActivity extends Activity {
private static final String TAG = "ImageClassifierActivity";
/** Camera image capture size */
private static final int PREVIEW_IMAGE_WIDTH = 640;
private static final int PREVIEW_IMAGE_HEIGHT = 480;
/** Image dimensions required by TF model */
private static final int TF_INPUT_IMAGE_WIDTH = 224;
private static final int TF_INPUT_IMAGE_HEIGHT = 224;
/** Dimensions of model inputs. */
private static final int DIM_BATCH_SIZE = 1;
private static final int DIM_PIXEL_SIZE = 3;
/** TF model asset files */
private static final String LABELS_FILE = "labels.txt";
private static final String MODEL_FILE = "mobilenet_quant_v1_224.tflite";
private ButtonInputDriver mButtonDriver;
private boolean mProcessing;
private ImageView mImage;
private TextView mResultText;
private Interpreter mTensorFlowLite;
private List<String> mLabels;
private CameraHandler mCameraHandler;
private ImagePreprocessor mImagePreprocessor;
/**
* Initialize the classifier that will be used to process images.
*/
private void initClassifier() {
try {
mTensorFlowLite = new Interpreter(TensorFlowHelper.loadModelFile(this, MODEL_FILE));
mLabels = TensorFlowHelper.readLabels(this, LABELS_FILE);
} catch (IOException e) {
Log.w(TAG, "Unable to initialize TensorFlow Lite.", e);
}
}
/**
* Clean up the resources used by the classifier.
*/
private void destroyClassifier() {
mTensorFlowLite.close();
}
/**
* Process an image and identify what is in it. When done, the method
* {@link #onPhotoRecognitionReady(Collection)} must be called with the results of
* the image recognition process.
*
* @param image Bitmap containing the image to be classified. The image can be
* of any size, but preprocessing might occur to resize it to the
* format expected by the classification process, which can be time
* and power consuming.
*/
private void doRecognize(Bitmap image) {
// Allocate space for the inference results
byte[][] confidencePerLabel = new byte[1][mLabels.size()];
// Allocate buffer for image pixels.
int[] intValues = new int[TF_INPUT_IMAGE_WIDTH * TF_INPUT_IMAGE_HEIGHT];
ByteBuffer imgData = ByteBuffer.allocateDirect(
DIM_BATCH_SIZE * TF_INPUT_IMAGE_WIDTH * TF_INPUT_IMAGE_HEIGHT * DIM_PIXEL_SIZE);
imgData.order(ByteOrder.nativeOrder());
// Read image data into buffer formatted for the TensorFlow model
TensorFlowHelper.convertBitmapToByteBuffer(image, intValues, imgData);
// Run inference on the network with the image bytes in imgData as input,
// storing results on the confidencePerLabel array.
mTensorFlowLite.run(imgData, confidencePerLabel);
// Get the results with the highest confidence and map them to their labels
Collection<Recognition> results = TensorFlowHelper.getBestResults(confidencePerLabel, mLabels);
// Report the results with the highest confidence
onPhotoRecognitionReady(results);
}
/**
* Initialize the camera that will be used to capture images.
*/
private void initCamera() {
mImagePreprocessor = new ImagePreprocessor(PREVIEW_IMAGE_WIDTH, PREVIEW_IMAGE_HEIGHT,
TF_INPUT_IMAGE_WIDTH, TF_INPUT_IMAGE_HEIGHT);
mCameraHandler = CameraHandler.getInstance();
mCameraHandler.initializeCamera(this,
PREVIEW_IMAGE_WIDTH, PREVIEW_IMAGE_HEIGHT, null,
new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader imageReader) {
Bitmap bitmap = mImagePreprocessor.preprocessImage(imageReader.acquireNextImage());
onPhotoReady(bitmap);
}
});
}
/**
* Clean up resources used by the camera.
*/
private void closeCamera() {
mCameraHandler.shutDown();
}
/**
* Load the image that will be used in the classification process.
* When done, the method {@link #onPhotoReady(Bitmap)} must be called with the image.
*/
private void loadPhoto() {
mCameraHandler.takePicture();
}
// --------------------------------------------------------------------------------------
// NOTE: The normal codelab flow won't require you to change anything below this line,
// although you are encouraged to read and understand it.
@Override
protected void onCreate(final Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
setContentView(R.layout.activity_camera);
mImage = findViewById(R.id.imageView);
mResultText = findViewById(R.id.resultText);
updateStatus(getString(R.string.initializing));
initCamera();
initClassifier();
initButton();
updateStatus(getString(R.string.help_message));
}
/**
* Register a GPIO button that, when clicked, will generate the {@link KeyEvent#KEYCODE_ENTER}
* key, to be handled by {@link #onKeyUp(int, KeyEvent)} just like any regular keyboard
* event.
*
* If there's no button connected to the board, doRecognize can still be triggered by
* sending key events using a USB keyboard or `adb shell input keyevent 66`.
*/
private void initButton() {
try {
mButtonDriver = RainbowHat.createButtonCInputDriver(KeyEvent.KEYCODE_ENTER);
mButtonDriver.register();
} catch (IOException e) {
Log.w(TAG, "Cannot find button. Ignoring push button. Use a keyboard instead.", e);
}
}
private Bitmap getStaticBitmap() {
Log.d(TAG, "Using sample photo in res/drawable/sampledog_224x224.png");
return BitmapFactory.decodeResource(this.getResources(), R.drawable.sampledog_224x224);
}
@Override
public boolean onKeyUp(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_ENTER) {
if (mProcessing) {
updateStatus("Still processing, please wait");
return true;
}
updateStatus("Running photo recognition");
mProcessing = true;
loadPhoto();
return true;
}
return super.onKeyUp(keyCode, event);
}
/**
* Image capture process complete
*/
private void onPhotoReady(Bitmap bitmap) {
mImage.setImageBitmap(bitmap);
doRecognize(bitmap);
}
/**
* Image classification process complete
*/
private void onPhotoRecognitionReady(Collection<Recognition> results) {
updateStatus(formatResults(results));
mProcessing = false;
}
/**
* Format results list for display
*/
private String formatResults(Collection<Recognition> results) {
if (results == null || results.isEmpty()) {
return getString(R.string.empty_result);
} else {
StringBuilder sb = new StringBuilder();
Iterator<Recognition> it = results.iterator();
int counter = 0;
while (it.hasNext()) {
Recognition r = it.next();
sb.append(r.getTitle());
counter++;
if (counter < results.size() - 1) {
sb.append(", ");
} else if (counter == results.size() - 1) {
sb.append(" or ");
}
}
return sb.toString();
}
}
/**
* Report updates to the display and log output
*/
private void updateStatus(String status) {
Log.d(TAG, status);
mResultText.setText(status);
}
@Override
protected void onDestroy() {
super.onDestroy();
try {
destroyClassifier();
} catch (Throwable t) {
// close quietly
}
try {
closeCamera();
} catch (Throwable t) {
// close quietly
}
try {
if (mButtonDriver != null) mButtonDriver.close();
} catch (Throwable t) {
// close quietly
}
}
}
I am building an application which requires the mic to cancel any sound coming from the speaker. This issue seems almost like a conspiracy online, as others with the exact same problem have gone unanswered for long stretches.
Android's native hardware-accelerated AcousticEchoCanceler does not seem to work on most devices. Tests were made on many devices; the ones that seemed to work include the Nexus 5 and the Moto X, while almost all Samsung devices tested could not remove background sound. Note: all phones tested return true for AcousticEchoCanceler.isAvailable().
However, there must be a solution, since applications such as Skype or WhatsApp seem to cancel sounds outside their app context, i.e. when a call is on speaker, the microphone cancels any feedback it picks up.
This simplified recording app records sound to a file and plays it later when play is clicked.
MainActivity.java
public class MainActivity extends Activity {
Button startRec, stopRec, playBack;
int minBufferSizeIn;
AudioRecord audioRecord;
short[] audioData;
Boolean recording;
int sampleRateInHz = 48000;
private String TAG = "TAG";
/**
* Called when the activity is first created.
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
startRec = (Button) findViewById(R.id.startrec);
stopRec = (Button) findViewById(R.id.stoprec);
playBack = (Button) findViewById(R.id.playback);
startRec.setOnClickListener(startRecOnClickListener);
stopRec.setOnClickListener(stopRecOnClickListener);
playBack.setOnClickListener(playBackOnClickListener);
playBack.setEnabled(false);
startRec.setEnabled(true);
stopRec.setEnabled(false);
minBufferSizeIn = AudioRecord.getMinBufferSize(sampleRateInHz,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
audioData = new short[minBufferSizeIn];
audioRecord = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION,
sampleRateInHz,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT,
minBufferSizeIn);
}
OnClickListener startRecOnClickListener
= new OnClickListener() {
@Override
public void onClick(View arg0) {
playBack.setEnabled(false);
startRec.setEnabled(false);
stopRec.setEnabled(true);
Thread recordThread = new Thread(new Runnable() {
@Override
public void run() {
recording = true;
startRecord();
}
});
recordThread.start();
}
};
OnClickListener stopRecOnClickListener
= new OnClickListener() {
@Override
public void onClick(View arg0) {
playBack.setEnabled(true);
startRec.setEnabled(false);
stopRec.setEnabled(false);
recording = false;
}
};
OnClickListener playBackOnClickListener
= new OnClickListener() {
@Override
public void onClick(View v) {
playBack.setEnabled(false);
startRec.setEnabled(true);
stopRec.setEnabled(false);
playRecord();
}
};
@TargetApi(Build.VERSION_CODES.JELLY_BEAN)
private void startRecord() {
File file = new File(Environment.getExternalStorageDirectory(), "test.pcm");
try {
FileOutputStream outputStream = new FileOutputStream(file);
BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(outputStream);
DataOutputStream dataOutputStream = new DataOutputStream(bufferedOutputStream);
NoiseSuppressor ns;
AcousticEchoCanceler aec;
if (NoiseSuppressor.isAvailable()) {
ns = NoiseSuppressor.create(audioRecord.getAudioSessionId());
if (ns != null) {
ns.setEnabled(true);
} else {
Log.e(TAG, "AudioInput: NoiseSuppressor is null and not enabled");
}
}
if (AcousticEchoCanceler.isAvailable()) {
aec = AcousticEchoCanceler.create(audioRecord.getAudioSessionId());
if (aec != null) {
aec.setEnabled(true);
} else {
Log.e(TAG, "AudioInput: AcousticEchoCanceler is null and not enabled");
}
}
audioRecord.startRecording();
while (recording) {
int numberOfShort = audioRecord.read(audioData, 0, minBufferSizeIn);
for (int i = 0; i < numberOfShort; i++) {
dataOutputStream.writeShort(audioData[i]);
}
}
audioRecord.stop();
dataOutputStream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
void playRecord() {
File file = new File(Environment.getExternalStorageDirectory(), "test.pcm");
int shortSizeInBytes = Short.SIZE / Byte.SIZE;
int bufferSizeInBytes = (int) (file.length() / shortSizeInBytes);
short[] audioData = new short[bufferSizeInBytes];
try {
FileInputStream inputStream = new FileInputStream(file);
BufferedInputStream bufferedInputStream = new BufferedInputStream(inputStream);
DataInputStream dataInputStream = new DataInputStream(bufferedInputStream);
int i = 0;
while (dataInputStream.available() > 0) {
audioData[i] = dataInputStream.readShort();
i++;
}
dataInputStream.close();
AudioTrack audioTrack = new AudioTrack(
AudioManager.STREAM_MUSIC, sampleRateInHz,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT,
bufferSizeInBytes,
AudioTrack.MODE_STREAM);
while(audioTrack.getState() != AudioTrack.STATE_INITIALIZED){
}
audioTrack.play();
audioTrack.write(audioData, 0, bufferSizeInBytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
activity_main.xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:orientation="vertical" >
<TextView
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:text="#string/hello_world" />
<Button
android:id="#+id/startrec"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:text="Start Recording Test" />
<Button
android:id="#+id/stoprec"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:text="Stop Recording" />
<Button
android:id="#+id/playback"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:text="Play Back" />
</LinearLayout>
AndroidManifest.xml permissions
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
To verify whether the device works: play something in the background, click Start Recording, record a short section, then click Stop Recording. At this point, click Play Back and check whether you hear the background sound. If you can hear the background sound, then AEC is not working.
But why is this inconsistency occurring, and how do I achieve echo cancellation? (I am already using WebRTC for noise cancellation within my app's own context.)
Any help would be appreciated!
I was having the same problem on my S6 device. I played with a variety of settings and found a set that seems to enable AEC. The differences between my setup and yours seem to be:
16k sample rate
audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
For others, I'm not sure precisely what settings are needed to get AEC working. I do know that my same app with
48k sample rate
NO audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
NO android.permission.MODIFY_AUDIO_SETTINGS
does not successfully cancel echo.
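Putting those together, a rough sketch of the recording setup that worked for me (device-dependent, so treat it as a starting point rather than a guaranteed recipe; the VOICE_COMMUNICATION source is carried over from the question's own code):
// Sketch: settings that appeared to enable AEC on my S6.
AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION); // requires MODIFY_AUDIO_SETTINGS

int sampleRateInHz = 16000; // 16 kHz instead of 48 kHz
int minBufferSizeIn = AudioRecord.getMinBufferSize(sampleRateInHz,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION,
        sampleRateInHz, AudioFormat.CHANNEL_IN_MONO,
        AudioFormat.ENCODING_PCM_16BIT, minBufferSizeIn);

if (AcousticEchoCanceler.isAvailable()) {
    AcousticEchoCanceler aec = AcousticEchoCanceler.create(audioRecord.getAudioSessionId());
    if (aec != null) aec.setEnabled(true);
}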
Problem:
AudioTrack audioTrack = new AudioTrack(
AudioManager.STREAM_MUSIC, sampleRateInHz,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT,
bufferSizeInBytes,
AudioTrack.MODE_STREAM);
should be:
AudioTrack audioTrack = new AudioTrack(
AudioManager.STREAM_MUSIC, sampleRateInHz,
AudioFormat.CHANNEL_OUT_MONO,
AudioFormat.ENCODING_PCM_16BIT,
bufferSizeInBytes,
AudioTrack.MODE_STREAM,
sessionId); // this param is important: pass audioRecord.getAudioSessionId()
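The point is that the AudioTrack playback has to share the AudioRecord's audio session, so the echo canceler knows which output signal to subtract. A minimal sketch (the seven-argument AudioTrack constructor taking a session id exists, though it is deprecated on newer API levels):
// Sketch: tie playback to the recorder's audio session so AEC can reference it.
int sessionId = audioRecord.getAudioSessionId();
AudioTrack audioTrack = new AudioTrack(
        AudioManager.STREAM_MUSIC, sampleRateInHz,
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        bufferSizeInBytes,
        AudioTrack.MODE_STREAM,
        sessionId); // same session as the AudioRecord

AcousticEchoCanceler aec = AcousticEchoCanceler.create(sessionId);
if (aec != null) aec.setEnabled(true);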
I have built UDP server listener code to load onto my phone and listen for data packets. This code compiles, but for some reason it won't load onto my phone, and I don't see any errors anywhere. Why could this be? This is my whole main activity code:
package roman10.tutorial.udpcommserver;
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketException;
import java.net.InetSocketAddress;
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;
public class UdpServer extends Activity {
/** Called when the activity is first created. */
private TextView textView;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
textView = (TextView) findViewById(R.id.text1);
runUdpServer();
}
private static final int UDP_SERVER_PORT = 4000;
private static final int MAX_UDP_DATAGRAM_LEN = 1500;
private static final String ipAdd = "172.30.42.80";
private void runUdpServer() {
String lText;
// byte[] lMsg = new byte[MAX_UDP_DATAGRAM_LEN];
// DatagramPacket dp = new DatagramPacket(lMsg, lMsg.length);
// DatagramSocket ds = null;
byte buffer[] = new byte[MAX_UDP_DATAGRAM_LEN];
DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
try {
// ds = new DatagramSocket(UDP_SERVER_PORT);
// //disable timeout for testing
// //ds.setSoTimeout(100000);
// ds.receive(dp);
//lText = new String(lMsg, 0, dp.getLength());
// Log.i("UDP packet received", lText);
// textView.setText(lText);
DatagramSocket s = new DatagramSocket();
InetSocketAddress address = new InetSocketAddress(ipAdd, UDP_SERVER_PORT);
s.bind(address);
lText = new String(buffer,0,packet.getLength());
Log.i("UDP packet received", lText);
textView.setText(lText);
System.out.println("Waiting...");
s.receive(packet);
if (s != null) {
s.close();
}
// s.close();
System.out.println("Received!");
} catch (SocketException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
System.out.print("we are done ");
}
}
}
1)
You are not returning from your onCreate (until you have received something), so the activity can't continue starting (it certainly can't call onStart, onResume, etc.).
2)
You are also doing I/O on the main thread, network I/O even, which is not recommended; it is usually detected, disallowed on Android, and force-closed. You will get a NetworkOnMainThreadException.
Fortunately, both of these issues can be solved by creating a separate thread and executing runUdpServer() there.
Short example, instead of:
runUdpServer();
do:
Thread thread = new Thread() {
public void run() {
runUdpServer();
}
};
thread.start();
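One caveat with this: runUdpServer() also calls textView.setText(), which must happen on the UI thread. From the background thread you would post that update back, roughly like this:
// Sketch: hand the received text back to the UI thread before touching the view.
final String received = new String(buffer, 0, packet.getLength());
runOnUiThread(new Runnable() {
    public void run() {
        textView.setText(received);
    }
});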
I'm developing an app that takes a user-specified latitude, longitude, and altitude, then fakes this GPS location on the phone and shows that I am at that location in Google Maps. I have the required permission in the manifest file, and mock locations are enabled in the developer settings.
LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
//lm.clearTestProviderEnabled(mocLocationProvider);
lm.addTestProvider(mocLocationProvider, false, false, false, false, false, false, false, 0, 10);
lm.setTestProviderEnabled(mocLocationProvider, true);
mockLocation = new Location(mocLocationProvider); // a string
mockLocation.setLatitude(Integer.parseInt(latitude.getText().toString())); // double
mockLocation.setLongitude(Integer.parseInt(longitude.getText().toString()));
mockLocation.setAltitude(Integer.parseInt(altitude.getText().toString()));
mockLocation.setTime(System.currentTimeMillis());
lm.setTestProviderLocation( mocLocationProvider, mockLocation);
But it looks like my GPS location is not changed at all in Google Maps; what is the problem?
Update: I just installed an app called "fake GPS location" on my phone and that app works fine, but I still don't know what's wrong with my code, and I think mine is the formal way to achieve this.
Update #2: Although some similar applications run on my phone, I found some exceptions; this one, http://www.cowlumbus.nl/forum/MockGpsProvider.zip, is not working on my phone. Can someone help me with this issue? Millions of thanks! I'm not getting any error message when setting the location each time.
Update #3: I noticed that this app is fairly old, so it does not run on 4.1. If so, how do I do the same thing on the newer version? My phone is a Samsung Galaxy S3, hope that helps.
Update #4: for your information, the code from the app in my update #2 is:
package nl.cowlumbus.android.mockgps;
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import android.app.Activity;
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.AsyncTask;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;
public class MockGpsProviderActivity extends Activity implements LocationListener {
public static final String LOG_TAG = "MockGpsProviderActivity";
private static final String MOCK_GPS_PROVIDER_INDEX = "GpsMockProviderIndex";
private MockGpsProvider mMockGpsProviderTask = null;
private Integer mMockGpsProviderIndex = 0;
/** Called when the activity is first created. */
/* (non-Javadoc)
* @see android.app.Activity#onCreate(android.os.Bundle)
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
/** Use saved instance state if necessary. */
if(savedInstanceState instanceof Bundle) {
/** Let's find out where we were. */
mMockGpsProviderIndex = savedInstanceState.getInt(MOCK_GPS_PROVIDER_INDEX, 0);
}
/** Setup GPS. */
LocationManager locationManager = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
if(locationManager.isProviderEnabled(LocationManager.GPS_PROVIDER)){
// use real GPS provider if enabled on the device
locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, this);
}
else if(!locationManager.isProviderEnabled(MockGpsProvider.GPS_MOCK_PROVIDER)) {
// otherwise enable the mock GPS provider
locationManager.addTestProvider(MockGpsProvider.GPS_MOCK_PROVIDER, false, false,
false, false, true, false, false, 0, 5);
locationManager.setTestProviderEnabled(MockGpsProvider.GPS_MOCK_PROVIDER, true);
}
if(locationManager.isProviderEnabled(MockGpsProvider.GPS_MOCK_PROVIDER)) {
locationManager.requestLocationUpdates(MockGpsProvider.GPS_MOCK_PROVIDER, 0, 0, this);
/** Load mock GPS data from file and create mock GPS provider. */
try {
// create a list of Strings that can dynamically grow
List<String> data = new ArrayList<String>();
/** read a CSV file containing WGS84 coordinates from the 'assets' folder
* (The website http://www.gpsies.com offers downloadable tracks. Select
* a track and download it as a CSV file. Then add it to your assets folder.)
*/
InputStream is = getAssets().open("mock_gps_data.csv");
BufferedReader reader = new BufferedReader(new InputStreamReader(is));
// add each line in the file to the list
String line = null;
while ((line = reader.readLine()) != null) {
data.add(line);
}
// convert to a simple array so we can pass it to the AsyncTask
String[] coordinates = new String[data.size()];
data.toArray(coordinates);
// create new AsyncTask and pass the list of GPS coordinates
mMockGpsProviderTask = new MockGpsProvider();
mMockGpsProviderTask.execute(coordinates);
}
catch (Exception e) {}
}
}
@Override
public void onDestroy() {
super.onDestroy();
// stop the mock GPS provider by calling the 'cancel(true)' method
try {
mMockGpsProviderTask.cancel(true);
mMockGpsProviderTask = null;
}
catch (Exception e) {}
// remove it from the location manager
try {
LocationManager locationManager = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
locationManager.removeTestProvider(MockGpsProvider.GPS_MOCK_PROVIDER);
}
catch (Exception e) {}
}
@Override
public void onSaveInstanceState(Bundle savedInstanceState) {
// store where we are before closing the app, so we can skip to the location right away when restarting
savedInstanceState.putInt(MOCK_GPS_PROVIDER_INDEX, mMockGpsProviderIndex);
super.onSaveInstanceState(savedInstanceState);
}
@Override
public void onLocationChanged(Location location) {
// show the received location in the view
TextView view = (TextView) findViewById(R.id.text);
view.setText( "index:" + mMockGpsProviderIndex
+ "\nlongitude:" + location.getLongitude()
+ "\nlatitude:" + location.getLatitude()
+ "\naltitude:" + location.getAltitude() );
}
@Override
public void onProviderDisabled(String provider) {
// TODO Auto-generated method stub
}
@Override
public void onProviderEnabled(String provider) {
// TODO Auto-generated method stub
}
@Override
public void onStatusChanged(String provider, int status, Bundle extras) {
// TODO Auto-generated method stub
}
/** Define a mock GPS provider as an asynchronous task of this Activity. */
private class MockGpsProvider extends AsyncTask<String, Integer, Void> {
public static final String LOG_TAG = "GpsMockProvider";
public static final String GPS_MOCK_PROVIDER = "GpsMockProvider";
/** Keeps track of the currently processed coordinate. */
public Integer index = 0;
@Override
protected Void doInBackground(String... data) {
// process data
for (String str : data) {
// skip data if needed (see the Activity's savedInstanceState functionality)
if(index < mMockGpsProviderIndex) {
index++;
continue;
}
// let UI Thread know which coordinate we are processing
publishProgress(index);
// retrieve data from the current line of text
Double latitude = null;
Double longitude = null;
Double altitude= null;
try {
String[] parts = str.split(",");
latitude = Double.valueOf(parts[0]);
longitude = Double.valueOf(parts[1]);
altitude = Double.valueOf(parts[2]);
}
catch(NullPointerException e) { break; } // no data available
catch(Exception e) { continue; } // empty or invalid line
// translate to actual GPS location
Location location = new Location(GPS_MOCK_PROVIDER);
location.setLatitude(latitude);
location.setLongitude(longitude);
location.setAltitude(altitude);
location.setTime(System.currentTimeMillis());
location.setLatitude(latitude);
location.setLongitude(longitude);
location.setAccuracy(16F);
location.setAltitude(0D);
location.setTime(System.currentTimeMillis());
location.setBearing(0F);
// show debug message in log
Log.d(LOG_TAG, location.toString());
// provide the new location
LocationManager locationManager = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
locationManager.setTestProviderLocation(GPS_MOCK_PROVIDER, location);
// sleep for a while before providing next location
try {
Thread.sleep(200);
// gracefully handle Thread interruption (important!)
if(Thread.currentThread().isInterrupted())
throw new InterruptedException("");
} catch (InterruptedException e) {
break;
}
// keep track of processed locations
index++;
}
return null;
}
@Override
protected void onProgressUpdate(Integer... values) {
Log.d(LOG_TAG, "onProgressUpdate():"+values[0]);
mMockGpsProviderIndex = values[0];
}
}
}
Problem solved: I added the following code to set my current location, and it successfully shows up in the Google Maps application.
location.setLatitude(latitude);
location.setLongitude(longitude);
location.setBearing(bearing);
location.setSpeed(speed);
location.setAltitude(altitude);
location.setTime(new Date().getTime());
location.setProvider(LocationManager.GPS_PROVIDER);
location.setAccuracy(1);
Conclusion: if you want to use the mock location service on newer versions of Android, you have to set every attribute yourself.
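One more attribute is worth noting (an addition beyond the original post, hedged): from API 17 (Android 4.2) onward, a mock Location is rejected as incomplete unless its elapsed realtime is also set, so on newer versions you would additionally call:
// On API 17+ a mock Location must also carry an elapsed realtime timestamp,
// otherwise setTestProviderLocation() throws "Incomplete location object".
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1) {
    location.setElapsedRealtimeNanos(SystemClock.elapsedRealtimeNanos());
}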
I am trying to play a short sound bite after processing a scanned bar code. My code currently works fine for as many as twenty scans. Eventually, however, the MediaPlayer throws the following errors repeatedly, even after the app has been killed:
MediaPlayer: Error (-38, 0)
MediaPlayer: Attempt to perform seekTo in wrong state: mPlayer=0xXXXXXX, mCurrentState=0
(the X's represent a random 6-digit memory address)
I originally was playing the sound bite on the UI thread. Since then, I've created a handler in an attempt to mitigate the issue. This is how I access the handler:
try {
mHandler.post(mScanFeedback);
} catch (IllegalStateException e) {
System.out.println("Media player state error");
e.printStackTrace();
}
Here is the code for the handler:
private Runnable mScanFeedback = new Runnable(){
public void run() {
if(getString(R.string.working).equals(mStatusHourly)) {
final MediaPlayer mediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.bleep_working);
mediaPlayer.setOnErrorListener(new OnErrorListener() {
public boolean onError(MediaPlayer mp, int what, int extra) {
mediaPlayer.reset();
System.out.println("Media Player onError callback!");
return true;
}
});
mediaPlayer.start();
try {
Thread.sleep(150);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally {
mediaPlayer.release();
}
} else if(getString(R.string.not_working).equals(mStatusHourly)) {
final MediaPlayer mediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.bleep_not_working);
mediaPlayer.setOnErrorListener(new OnErrorListener() {
public boolean onError(MediaPlayer mp, int what, int extra) {
mediaPlayer.reset();
System.out.println("Media Player onError callback!");
return true;
}
});
mediaPlayer.start();
try {
Thread.sleep(275);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} finally {
mediaPlayer.release();
}
} else {
System.out.println("Audio feedback failed as status was indeterminate.");
}
}
};
In the beginning I didn't call release(), and adding it hasn't seemed to make things work any better or worse. The onError callback is never invoked when the problem occurs. I've tried to reset() the media player after each time it is played, but that throws an error. Right now I resort to restarting the phone to keep my Logcat usable against the onslaught of the same two error lines repeated continually.
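One thing I suspect, though I could not confirm it: releasing the player after a fixed Thread.sleep() can release it while it is still playing, or before start() has fully taken effect, which would match these wrong-state errors. A sketch that releases only once playback actually completes:
// Sketch: let the player report completion instead of sleeping for a fixed time.
final MediaPlayer mediaPlayer = MediaPlayer.create(getBaseContext(), R.raw.bleep_working);
if (mediaPlayer != null) { // create() returns null on failure
    mediaPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
        public void onCompletion(MediaPlayer mp) {
            mp.release(); // release exactly once, after playback finishes
        }
    });
    mediaPlayer.start();
}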
I'm using ZXing's bar code scanner, and there is a short beep played within that activity as confirmation that the bar code has been captured. A small part of me wonders if there isn't a conflict there.
I'm still new to programming and this is my first question on Stack Overflow. Let me know if I should have provided any additional information, or if I should try to keep things a little leaner.
Update:
I was unable to resolve the issue with the MediaPlayer. However, I was able to work around the issue by switching to a SoundPool implementation. The class below provides the needed functionality.
import java.util.HashMap;
import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;
public class SoundManager {
private SoundPool mSoundPool;
private HashMap<Integer, Integer> mSoundPoolMap; // maps our index to the SoundPool sound ID
private AudioManager mAudioManager;
private Context mContext;
public void initSounds(Context theContext) {
mContext = theContext;
mSoundPool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
mSoundPoolMap = new HashMap<Integer, Integer>();
mAudioManager = (AudioManager)mContext.getSystemService(Context.AUDIO_SERVICE);
}
public void addSound(int index, int SoundID) {
mSoundPoolMap.put(index, mSoundPool.load(mContext, SoundID, 1));
}
public void playSound(int index) {
float streamVolume = mAudioManager.getStreamVolume(AudioManager.STREAM_MUSIC);
streamVolume = streamVolume / mAudioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
mSoundPool.play(mSoundPoolMap.get(index), streamVolume, streamVolume, 1, 0, 1f); // play the loaded sound ID, not the map key
}
public void playLoopedSound(int index) {
float streamVolume = mAudioManager.getStreamVolume(AudioManager.STREAM_MUSIC);
streamVolume = streamVolume / mAudioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
mSoundPool.play(mSoundPoolMap.get(index), streamVolume, streamVolume, 1, -1, 1f); // -1 loops indefinitely
}
}
Which I then accessed from my Activity with:
mSoundManager = new SoundManager();
mSoundManager.initSounds(getBaseContext());
mSoundManager.addSound(1, R.raw.bleep_working);
mSoundManager.addSound(2, R.raw.bleep_not_working);
mSoundManager.playSound(1);
mSoundManager.playSound(2);
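As a side note, the SoundPool constructor used in initSounds() is deprecated from API 21 onward; a roughly equivalent construction on newer API levels (a sketch, not part of my original workaround) would be:
// Sketch: API 21+ replacement for new SoundPool(4, AudioManager.STREAM_MUSIC, 0).
AudioAttributes attrs = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
        .build();
SoundPool soundPool = new SoundPool.Builder()
        .setMaxStreams(4)
        .setAudioAttributes(attrs)
        .build();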