Android: how to make my app remember Google Drive Authorization? - java

I got this code from the Internet and modified it so that it uploads a specific file to Google Drive automatically when the activity starts.
It always asks me to select my Google account when I start this activity!
I want it to ask for the Google account only once, the first time it is started after installing the app. How can I do that?
package com.example.googledrive;
import android.os.Bundle;
import android.app.Activity;
import android.view.Menu;
import java.util.Arrays;
import java.io.IOException;
import android.accounts.AccountManager;
import android.content.ContentResolver;
import android.content.Context;
import android.content.Intent;
import android.net.Uri;
import android.widget.Toast;
import com.google.api.client.extensions.android.http.AndroidHttp;
import com.google.api.client.googleapis.extensions.android.gms.auth.GoogleAccountCredential;
import com.google.api.client.googleapis.extensions.android.gms.auth.UserRecoverableAuthIOException;
import com.google.api.client.http.FileContent;
import com.google.api.client.json.gson.GsonFactory;
import com.google.api.services.drive.Drive;
import com.google.api.services.drive.DriveScopes;
import com.google.api.services.drive.model.File;
public class MainActivity extends Activity {
static final int REQUEST_ACCOUNT_PICKER = 1;
static final int REQUEST_AUTHORIZATION = 2;
static final int REQUEST_DOWNLOAD_FILE = 3;
static final int RESULT_STORE_FILE = 4;
private static Uri mFileUri;
private static Drive mService;
private GoogleAccountCredential mCredential;
private Context mContext;
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
// setup for credentials for connecting to the Google Drive account
mCredential = GoogleAccountCredential.usingOAuth2(this, Arrays.asList(DriveScopes.DRIVE));
// start activity that prompts the user for their google drive account
startActivityForResult(mCredential.newChooseAccountIntent(), REQUEST_ACCOUNT_PICKER);
// mContext = getApplicationContext();
setContentView(R.layout.activity_main);
}
@Override
public boolean onCreateOptionsMenu(Menu menu)
{
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
protected void onActivityResult(final int requestCode, final int resultCode, final Intent data)
{
switch (requestCode) {
case REQUEST_ACCOUNT_PICKER:
if (resultCode == RESULT_OK && data != null && data.getExtras() != null) {
String accountName = data.getStringExtra(AccountManager.KEY_ACCOUNT_NAME);
if (accountName != null) {
mCredential.setSelectedAccountName(accountName);
mService = getDriveService(mCredential);
}
saveFileToDrive();
}
break;
case REQUEST_AUTHORIZATION:
if (resultCode == Activity.RESULT_OK) {
//account already picked
} else {
startActivityForResult(mCredential.newChooseAccountIntent(), REQUEST_ACCOUNT_PICKER);
}
break;
}
}
private Drive getDriveService(GoogleAccountCredential credential)
{
return new Drive.Builder(AndroidHttp.newCompatibleTransport(), new GsonFactory(), credential)
.build();
}
private void saveFileToDrive()
{
Thread t = new Thread(new Runnable() {
@Override
public void run() {
try {
// Create URI from real path
String path;
path = "/sdcard/DCIM/Camera/a.png";
mFileUri = Uri.fromFile(new java.io.File(path));
ContentResolver cR = MainActivity.this.getContentResolver();
// File's binary content
java.io.File fileContent = new java.io.File(mFileUri.getPath());
FileContent mediaContent = new FileContent(cR.getType(mFileUri), fileContent);
showToast("Selected " + mFileUri.getPath() + " to upload");
// File's meta data.
File body = new File();
body.setTitle(fileContent.getName());
body.setMimeType(cR.getType(mFileUri));
com.google.api.services.drive.Drive.Files f1 = mService.files();
com.google.api.services.drive.Drive.Files.Insert i1 = f1.insert(body, mediaContent);
File file = i1.execute();
if (file != null)
{
showToast("Uploaded: " + file.getTitle());
}
} catch (UserRecoverableAuthIOException e) {
startActivityForResult(e.getIntent(), REQUEST_AUTHORIZATION);
} catch (IOException e) {
e.printStackTrace();
showToast("Transfer ERROR: " + e.toString());
}
}
});
t.start();
}
public void showToast(final String toast)
{
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(getApplicationContext(), toast, Toast.LENGTH_SHORT).show();
}
});
}
}

It's been a while, so you have probably managed to do this by now, but I was just looking for the same thing and it's fairly easy. In the code you use (and as it is shown on several sites), you get an Intent by calling a static method on GoogleAccountCredential.
You can also create your own Intent by calling a static method on AccountPicker and passing a parameter so that the user is only required to pick an account once.
Intent accountPicker = AccountPicker.newChooseAccountIntent(null, null, new String[]{"com.google"}, false, null, null, null, null);
startActivityForResult(accountPicker, GOOGLE_ACCOUNT_PICKED);
The fourth parameter (alwaysPromptForAccount) is the one you want; you can read more about it here:
https://developers.google.com/android/reference/com/google/android/gms/common/AccountPicker?hl=en
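Building on that, a common way to make the question's code remember the authorization across launches is to persist the picked account name and hand it back to GoogleAccountCredential on later starts, so the chooser is only shown the first time. Below is a minimal sketch that would slot into the question's MainActivity; the preference file and key names (drive_prefs, account_name) and the chooseOrReuseAccount() helper are made up for illustration, not part of the original code.
private static final String PREFS_NAME = "drive_prefs";
private static final String KEY_ACCOUNT_NAME = "account_name";
private void chooseOrReuseAccount() {
    // Reuse the account picked on a previous launch, if any.
    String savedAccount = getSharedPreferences(PREFS_NAME, MODE_PRIVATE)
            .getString(KEY_ACCOUNT_NAME, null);
    if (savedAccount != null) {
        mCredential.setSelectedAccountName(savedAccount);
        mService = getDriveService(mCredential);
        saveFileToDrive();
    } else {
        // First start after install: show the account picker once.
        startActivityForResult(mCredential.newChooseAccountIntent(), REQUEST_ACCOUNT_PICKER);
    }
}
// In onActivityResult (REQUEST_ACCOUNT_PICKER), after calling setSelectedAccountName(accountName), store it:
// getSharedPreferences(PREFS_NAME, MODE_PRIVATE).edit()
//         .putString(KEY_ACCOUNT_NAME, accountName).apply();
With something like this in place, onCreate() would call chooseOrReuseAccount() instead of firing newChooseAccountIntent() unconditionally.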

Related

Custom object detection app crashes/stops after uploading a video file and converting it to images for detection (Android Studio)

Context:
I used a template of an android app from https://github.com/pramod722445/Custom_Object_Detection_App
and added Java files to detect objects in images extracted from a video. I trained the model on Google Colab, converted it to a TFLite file renamed bline_model.tflite, and added it to Android Studio together with a label file, bline_label.txt. When I run the application, it stops/crashes after the video-to-image phase is done. I think the error originates in either objectDetectorClass.java or PreviewImageActivity.java. How do I fix it?
CODE
MainActivity.java
package bline.detector.com.ffmpegvideoeditor.activity;
import android.Manifest;
import android.app.AlertDialog;
import android.app.ProgressDialog;
import android.content.ContentUris;
import android.content.Context;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.provider.DocumentsContract;
import android.provider.MediaStore;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;
import android.widget.VideoView;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.LoadBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;
import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegNotSupportedException;
import org.florescu.android.rangeseekbar.RangeSeekBar;
import org.opencv.android.OpenCVLoader;
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import bline.detector.com.ffmpegvideoeditor.R;
public class MainActivity extends AppCompatActivity {
static {
if (!OpenCVLoader.initDebug())
Log.d("ERROR", "Unable to load OpenCV");
else
Log.d("SUCCESS", "OpenCV loaded");
}
private Button uploadVideo;
private Button extractImages;
private VideoView videoView;
private String filePath;
private static final String FILEPATH = "filepath";
private Uri selectedVideoUri;
private FFmpeg ffmpeg;
private static final int REQUEST_TAKE_GALLERY_VIDEO = 100;
private static final String TAG = "amirah";
private int choice = 0;
private ProgressDialog progressDialog;
private RangeSeekBar rangeSeekBar;
private Context mContext;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
mContext = this;
uploadVideo = (Button) findViewById(R.id.uploadVideo);
extractImages = (Button) findViewById(R.id.extractImages);
videoView = (VideoView) findViewById(R.id.videoView);
rangeSeekBar = (RangeSeekBar) findViewById(R.id.rangeSeekBar);
progressDialog = new ProgressDialog(this);
progressDialog.setTitle(null);
progressDialog.setCancelable(false);
rangeSeekBar.setEnabled(false);
loadFFMpegBinary();
uploadVideo.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (Build.VERSION.SDK_INT >= 23)
getPermission();
else
uploadVideo();
}
});
extractImages.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
choice = 1;
if (selectedVideoUri != null) {
extractImagesVideo(rangeSeekBar.getSelectedMinValue().intValue() * 1000, rangeSeekBar.getSelectedMaxValue().intValue() * 1000);
} else
Toast.makeText(MainActivity.this, "Please Upload a Video", Toast.LENGTH_LONG).show();
}
});
}
private void getPermission() {
String[] params = null;
String writeExternalStorage = Manifest.permission.WRITE_EXTERNAL_STORAGE;
String readExternalStorage = Manifest.permission.READ_EXTERNAL_STORAGE;
int hasWriteExternalStoragePermission = ActivityCompat.checkSelfPermission(this, writeExternalStorage);
int hasReadExternalStoragePermission = ActivityCompat.checkSelfPermission(this, readExternalStorage);
List<String> permissions = new ArrayList<String>();
if (hasWriteExternalStoragePermission != PackageManager.PERMISSION_GRANTED)
permissions.add(writeExternalStorage);
if (hasReadExternalStoragePermission != PackageManager.PERMISSION_GRANTED)
permissions.add(readExternalStorage);
if (!permissions.isEmpty()) {
params = permissions.toArray(new String[permissions.size()]);
}
if (params != null && params.length > 0) {
ActivityCompat.requestPermissions(MainActivity.this,
params,
100);
} else
uploadVideo();
}
/**
* Handling response for permission request
*/
@Override
public void onRequestPermissionsResult(int requestCode,
String permissions[], int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
switch (requestCode) {
case 100: {
if (grantResults.length > 0
&& grantResults[0] == PackageManager.PERMISSION_GRANTED) {
uploadVideo();
}
}
break;
}
}
private void uploadVideo() {
try {
Intent intent = new Intent();
intent.setType("video/*");
intent.setAction(Intent.ACTION_GET_CONTENT);
startActivityForResult(Intent.createChooser(intent, "Select Video"), REQUEST_TAKE_GALLERY_VIDEO);
} catch (Exception e) {
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK) {
if (requestCode == REQUEST_TAKE_GALLERY_VIDEO) {
selectedVideoUri = data.getData();
videoView.setVideoURI(selectedVideoUri);
videoView.start();
}
}
}
private void loadFFMpegBinary() {
try {
if (ffmpeg == null) {
ffmpeg = FFmpeg.getInstance(this);
}
ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
@Override
public void onFailure() {
showUnsupportedExceptionDialog();
}
@Override
public void onSuccess() {
Log.d(TAG, "ffmpeg: successfully Loaded");
}
});
} catch (FFmpegNotSupportedException e) {
showUnsupportedExceptionDialog();
} catch (Exception e) {
Log.d(TAG, "exception: " + e);
}
}
private void showUnsupportedExceptionDialog() {
new AlertDialog.Builder(MainActivity.this)
.setIcon(android.R.drawable.ic_dialog_alert)
.setTitle("Not Supported")
.setMessage("Device Not Supported")
.setCancelable(false)
.setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
MainActivity.this.finish();
}
})
.create()
.show();
}
private void extractImagesVideo(int startMs, int endMs) {
File moviesDir = Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES
);
String filePrefix = "extract_picture";
String fileExtn = ".jpg";
String yourRealPath = getPath(MainActivity.this, selectedVideoUri);
File dir = new File(moviesDir, "VideoEditor");
int fileNo = 0;
while (dir.exists()) {
fileNo++;
dir = new File(moviesDir, "VideoEditor" + fileNo);
}
dir.mkdir();
filePath = dir.getAbsolutePath();
File dest = new File(dir, filePrefix + "%03d" + fileExtn);
Log.d(TAG, "startTrim: src: " + yourRealPath);
Log.d(TAG, "startTrim: dest: " + dest.getAbsolutePath());
String[] complexCommand = {"-y", "-i", yourRealPath, "-an", "-r", "1", "-ss", "" + startMs / 1000, "-t", "" + (endMs - startMs) / 1000, dest.getAbsolutePath()};
execFFmpegBinary(complexCommand);
}
//-y overwrite output files without asking
//-i ffmpeg reads from an arbitrary number of input "files" specified by the -i option
//-an disable audio recordings
//-r 1/2 will extract one image frame every 2 seconds of video; similarly, -r 1 will extract one image frame every second of the video
//remove -r option if u want to extract all video frames as images from the specified time duration
//-ss seeks to position
//-t limit the duration of data read from the input file
private void execFFmpegBinary(final String[] command) {
try {
ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
@Override
public void onFailure(String message) {
Log.d(TAG, "FAILED with output: " + message);
}
@Override
public void onSuccess(String s) {
Log.d(TAG, "SUCCESS with output: " + s);
Intent intent = new Intent(MainActivity.this, PreviewImageActivity.class);
intent.putExtra(FILEPATH, filePath);
startActivity(intent);
}
@Override
public void onProgress(String s) {
Log.d(TAG, "Started command : ffmpeg " + command);
if (choice == 1)
progressDialog.setMessage("progress: " + s);
Log.d(TAG, "progress : " + s);
}
@Override
public void onStart() {
Log.d(TAG, "Started command : ffmpeg " + command);
progressDialog.setMessage("Processing...");
progressDialog.show();
}
@Override
public void onFinish() {
Log.d(TAG, "Finished command : ffmpeg " + command);
if (choice == 1) {
progressDialog.dismiss();
}
}
});
} catch (FFmpegCommandAlreadyRunningException e) {
// do nothing for now
}
}
public static boolean deleteDir(File dir) {
if (dir.isDirectory()) {
String[] children = dir.list();
if (children != null) {
for (int i = 0; i < children.length; i++) {
boolean success = deleteDir(new File(dir, children[i]));
if (!success) {
return false;
}
}
}
}
return dir.delete();
}
private String getPath(final Context context, final Uri uri) {
final boolean isKitKat = Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT;
// DocumentProvider
if (isKitKat && DocumentsContract.isDocumentUri(context, uri)) {
// ExternalStorageProvider
if (isExternalStorageDocument(uri)) {
final String docId = DocumentsContract.getDocumentId(uri);
final String[] split = docId.split(":");
final String type = split[0];
if ("primary".equalsIgnoreCase(type)) {
return Environment.getExternalStorageDirectory() + "/" + split[1];
}
}
// DownloadsProvider
else if (isDownloadsDocument(uri)) {
final String id = DocumentsContract.getDocumentId(uri);
final Uri contentUri = ContentUris.withAppendedId(
Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));
return getDataColumn(context, contentUri, null, null);
}
// MediaProvider
else if (isMediaDocument(uri)) {
final String docId = DocumentsContract.getDocumentId(uri);
final String[] split = docId.split(":");
final String type = split[0];
Uri contentUri = null;
if ("image".equals(type)) {
contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
} else if ("video".equals(type)) {
contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
}
final String selection = "_id=?";
final String[] selectionArgs = new String[]{
split[1]
};
return getDataColumn(context, contentUri, selection, selectionArgs);
}
}
// MediaStore (and general)
else if ("content".equalsIgnoreCase(uri.getScheme())) {
return getDataColumn(context, uri, null, null);
}
// File
else if ("file".equalsIgnoreCase(uri.getScheme())) {
return uri.getPath();
}
return null;
}
/**
* Get the value of the data column for this Uri.
*/
private String getDataColumn(Context context, Uri uri, String selection,
String[] selectionArgs) {
Cursor cursor = null;
final String column = "_data";
final String[] projection = {
column
};
try {
cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs,
null);
if (cursor != null && cursor.moveToFirst()) {
final int column_index = cursor.getColumnIndexOrThrow(column);
return cursor.getString(column_index);
}
} finally {
if (cursor != null)
cursor.close();
}
return null;
}
private boolean isMediaDocument(Uri uri) {
return "com.android.providers.media.documents".equals(uri.getAuthority());
}
private boolean isDownloadsDocument(Uri uri) {
return "com.android.providers.downloads.documents".equals(uri.getAuthority());
}
private boolean isExternalStorageDocument(Uri uri) {
return "com.android.externalstorage.documents".equals(uri.getAuthority());
}
}
objectDetectorClass.java
package bline.detector.com.ffmpegvideoeditor.activity;
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import org.opencv.android.Utils;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.gpu.GpuDelegate;
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.lang.reflect.Array;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
public class objectDetectorClass {
// used to load model and predict
private Interpreter interpreter;
// store all label in array
private List<String> labelList;
private int INPUT_SIZE;
private int PIXEL_SIZE=3; // for RGB
private int IMAGE_MEAN=0;
private float IMAGE_STD=255.0f;
// use to initialize gpu in app
private GpuDelegate gpuDelegate;
private int height=0;
private int width=0;
objectDetectorClass(AssetManager assetManager,String modelPath, String labelPath,int inputSize) throws IOException{
INPUT_SIZE=inputSize;
// use to define gpu or cpu // no. of threads
Interpreter.Options options=new Interpreter.Options();
gpuDelegate=new GpuDelegate();
options.addDelegate(gpuDelegate);
options.setNumThreads(4); // set it according to your phone
// loading model
interpreter=new Interpreter(loadModelFile(assetManager,modelPath),options);
// load labelmap
labelList=loadLabelList(assetManager,labelPath);
}
private List<String> loadLabelList(AssetManager assetManager, String labelPath) throws IOException {
// to store label
List<String> labelList=new ArrayList<>();
// create a new reader
BufferedReader reader=new BufferedReader(new InputStreamReader(assetManager.open(labelPath)));
String line;
// loop through each line and store it to labelList
while ((line=reader.readLine())!=null){
labelList.add(line);
}
reader.close();
return labelList;
}
private ByteBuffer loadModelFile(AssetManager assetManager, String modelPath) throws IOException {
// use to get description of file
AssetFileDescriptor fileDescriptor=assetManager.openFd(modelPath);
FileInputStream inputStream=new FileInputStream(fileDescriptor.getFileDescriptor());
FileChannel fileChannel=inputStream.getChannel();
long startOffset =fileDescriptor.getStartOffset();
long declaredLength=fileDescriptor.getDeclaredLength();
return fileChannel.map(FileChannel.MapMode.READ_ONLY,startOffset,declaredLength);
}
public Mat recognizeVideo(Mat mat_image){
Bitmap bitmap=null;
bitmap=Bitmap.createBitmap(mat_image.cols(),mat_image.rows(),Bitmap.Config.ARGB_8888);
Utils.matToBitmap(mat_image,bitmap);
height=bitmap.getHeight();
width=bitmap.getWidth();
// scale the bitmap to input size of model
Bitmap scaledBitmap=Bitmap.createScaledBitmap(bitmap,INPUT_SIZE,INPUT_SIZE,false);
// convert bitmap to bytebuffer as model input should be in it
ByteBuffer byteBuffer=convertBitmapToByteBuffer(scaledBitmap);
// defining output
Object[] input=new Object[1];
input[0]=byteBuffer;
Map<Integer,Object> output_map=new TreeMap<>();
// create treemap of three array (boxes,score,classes)
float[][][]boxes =new float[1][10][4];
// 10: top 10 objects detected
// 4: coordinates in image
float[][] scores=new float[1][10];
// stores scores of 10 object
float[][] classes=new float[1][10];
// stores class of object
// add it to object_map;
output_map.put(0,boxes);
output_map.put(1,classes);
output_map.put(2,scores);
// now predict
interpreter.runForMultipleInputsOutputs(input,output_map);
Object value=output_map.get(0);
Object Object_class=output_map.get(1);
Object score=output_map.get(2);
// loop through each object
// as output has only 10 boxes
for (int i=0;i<10;i++){
float class_value=(float) Array.get(Array.get(Object_class,0),i);
float score_value=(float) Array.get(Array.get(score,0),i);
// define threshold for score
// Here you can change threshold according to your model
if(score_value>0.8){
Object box1=Array.get(Array.get(value,0),i);
// multiply it with Original height and width of frame
float top=(float) Array.get(box1,0)*height;
float left=(float) Array.get(box1,1)*width;
float bottom=(float) Array.get(box1,2)*height;
float right=(float) Array.get(box1,3)*width;
// draw rectangle in Original frame // starting point //ending point of box //color of box thickness
Imgproc.rectangle(mat_image,new Point(left,top),new Point(right,bottom),new Scalar(0, 255, 0, 255),2);
// write text on frame
// string of class name of object // starting point // color of text // size of text
Imgproc.putText(mat_image,labelList.get((int) class_value),new Point(left,top),3,1,new Scalar(255, 0, 0, 255),2);
}
}
return mat_image;
}
private ByteBuffer convertBitmapToByteBuffer(Bitmap bitmap) {
ByteBuffer byteBuffer;
int quant=1;
int size_images=INPUT_SIZE;
if(quant==0){
byteBuffer=ByteBuffer.allocateDirect(1*size_images*size_images*3);
}
else {
byteBuffer=ByteBuffer.allocateDirect(4*1*size_images*size_images*3);
}
byteBuffer.order(ByteOrder.nativeOrder());
int[] intValues=new int[size_images*size_images];
bitmap.getPixels(intValues,0,bitmap.getWidth(),0,0,bitmap.getWidth(),bitmap.getHeight());
int pixel=0;
for (int i=0;i<size_images;++i){
for (int j=0;j<size_images;++j){
final int val=intValues[pixel++];
if(quant==0){
byteBuffer.put((byte) ((val>>16)&0xFF));
byteBuffer.put((byte) ((val>>8)&0xFF));
byteBuffer.put((byte) (val&0xFF));
}
else {
byteBuffer.putFloat((((val >> 16) & 0xFF))/255.0f);
byteBuffer.putFloat((((val >> 8) & 0xFF))/255.0f);
byteBuffer.putFloat((((val) & 0xFF))/255.0f);
}
}
}
return byteBuffer;
}
}
PreviewImageActivity.java
package bline.detector.com.ffmpegvideoeditor.activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.drawable.BitmapDrawable;
import android.graphics.drawable.Drawable;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageSwitcher;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import android.widget.ViewSwitcher;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import org.opencv.android.Utils;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import bline.detector.com.ffmpegvideoeditor.R;
public class PreviewImageActivity extends AppCompatActivity {
private static final String FILEPATH = "filepath";
private objectDetectorClass objectDetectorClass;
private ImageSwitcher imageSwitcher;
private Button prev;
private Button next;
int placement = 0;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_preview_image);
TextView tvInstruction = (TextView) findViewById(R.id.tvInstruction);
imageSwitcher = (ImageSwitcher) findViewById(R.id.imageSwitcher);
prev = findViewById(R.id.prev);
next = findViewById(R.id.next);
ArrayList<Bitmap> bitmap = new ArrayList<Bitmap>();
ArrayList<Bitmap> bitmaps = new ArrayList<Bitmap>();
ArrayList<Drawable> drawables = new ArrayList<Drawable>();
try {
objectDetectorClass = new objectDetectorClass(getAssets(), "bline_model.tflite", "bline_label.txt", 300);
Log.d("PreviewImageActivity", "Model is successfully loaded");
} catch (IOException e) {
Log.d("PreviewImageActivity", "Getting some error");
e.printStackTrace();
}
imageSwitcher.setFactory(new ViewSwitcher.ViewFactory() {
@Override
public View makeView() {
ImageView imageView = new ImageView(getApplicationContext());
imageView.setScaleType(ImageView.ScaleType.FIT_CENTER);
return imageView;
}
});
prev.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (placement > 0) {
placement--;
imageSwitcher.setImageDrawable(drawables.get(placement));
} else {
Toast.makeText(PreviewImageActivity.this, "No Previous Images", Toast.LENGTH_SHORT).show();
}
}
});
next.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (placement < drawables.size() - 1) {
placement++;
imageSwitcher.setImageDrawable(drawables.get(placement));
} else {
Toast.makeText(PreviewImageActivity.this, "No More Images", Toast.LENGTH_SHORT).show();
}
}
});
String filePath = getIntent().getStringExtra(FILEPATH);
ArrayList<String> f = new ArrayList<String>();
File dir = new File(filePath);
tvInstruction.setText("Images stored at path " + filePath);
File[] listFile;
listFile = dir.listFiles();
for (File e : listFile) {
f.add(e.getAbsolutePath());
}
if (f != null) {
Log.d("PreviewImageAdapter", "filepath: " + f);
for (int position = 0; position < f.size(); position++) {
Bitmap bmp = BitmapFactory.decodeFile(f.get(position));
bitmap.add(bmp);
Log.d("PreviewImageAdapter", "bitmap: " + bitmap);
if (bitmap.size() == f.size()) {
for (int m = 0; m < bitmap.size(); m++) {
Mat selected_image = new Mat(bitmap.get(m).getHeight(), bitmap.get(m).getWidth(), CvType.CV_8UC4);
Utils.bitmapToMat(bitmap.get(m), selected_image);
selected_image = objectDetectorClass.recognizeVideo(selected_image);
Bitmap bitmap1 = Bitmap.createBitmap(selected_image.cols(), selected_image.rows(), Bitmap.Config.ARGB_8888);
Utils.matToBitmap(selected_image, bitmap1);
bitmaps.add(bitmap1);
Log.d("PreviewImageAdapter", "detected images: " + bitmaps);
}
if (bitmaps.size() == bitmap.size()) {
for (int n = 0; n < bitmaps.size(); n++) {
Drawable drawable = new BitmapDrawable(bitmaps.get(n));
drawables.add(drawable);
Log.d("PreviewImageAdapter", "drawables: " + drawables);
}
}
imageSwitcher.setImageDrawable(drawables.get(0));
placement = 0;
}
}
}
}
}
I am also new to Java, so I am sorry if my question is silly.

My application is running in an endless loop

So I have an application which has to monitor and range beacons and then calculate the position of the user. After calculating this, the value is passed to WayfindingOverlayActivity, where it should be put on the map as the blue dot.
I don't know how to assign the value to the blue dot, but before that, my application is running in an endless loop and opens the activity from ranging about 100 times.
RangingActivity:
package com.indooratlas.android.sdk.examples.wayfinding;
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.os.RemoteException;
import android.util.Log;
import android.widget.EditText;
import android.content.Context;
import com.google.android.gms.maps.model.LatLng;
import com.indooratlas.android.sdk.IALocationRequest;
import com.indooratlas.android.sdk.examples.R;
import org.altbeacon.beacon.Beacon;
import org.altbeacon.beacon.BeaconConsumer;
import org.altbeacon.beacon.BeaconManager;
import org.altbeacon.beacon.RangeNotifier;
import org.altbeacon.beacon.Region;
import java.util.ArrayList;
import java.util.Collection;
import java.util.concurrent.BlockingQueue;
public class RangingActivity extends Activity implements BeaconConsumer,Runnable{
protected static final String TAG = "RangingActivity";
public LatLng center;
private final BlockingQueue queue;
private BeaconManager beaconManager;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_ranging);
beaconManager = BeaconManager.getInstanceForApplication(this);
beaconManager.bind(this);
}
@Override
protected void onDestroy() {
super.onDestroy();
beaconManager.unbind(this);
}
@Override
protected void onPause() {
super.onPause();
}
@Override
protected void onResume() {
super.onResume();
}
@Override
public void onBeaconServiceConnect() {
RangeNotifier rangeNotifier = new RangeNotifier() {
@Override
public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
int beacon_number = beacons.size();
Beacon[] beacon_array = beacons.toArray(new Beacon[beacons.size()]);
Beacon device1 = null, device2 = null, device3 = null;
Constants constants = new Constants();
float txPow1 = 0;
double RSSI1Unfiltered = 0;
double RSSI2Unfiltered = 0;
float txPow2 = 0;
double RSSI3Unfiltered = 0;
float txPow3 = 0;
if (beacon_number == 4) {
if (beacon_array[0].getIdentifier(0).toString() == constants.DEVICE1_UUID) {
device1 = beacon_array[0];
} else if (beacon_array[1].getIdentifier(0).toString() == constants.DEVICE1_UUID) {
device1 = beacon_array[1];
} else {
device1 = beacon_array[2];
}
if (beacon_array[0].getIdentifier(0).toString() == constants.DEVICE2_UUID) {
device2 = beacon_array[0];
} else if (beacon_array[1].getIdentifier(0).toString() == constants.DEVICE2_UUID) {
device2 = beacon_array[1];
} else {
device2 = beacon_array[2];
}
if (beacon_array[0].getIdentifier(0).toString() == constants.DEVICE3_UUID) {
device3 = beacon_array[0];
} else if (beacon_array[1].getIdentifier(0).toString() == constants.DEVICE3_UUID) {
device3 = beacon_array[1];
} else {
device3 = beacon_array[2];
}
RSSI1Unfiltered = device1.getRssi();
RSSI2Unfiltered = device2.getRssi();
RSSI3Unfiltered = device3.getRssi();
txPow1 = device1.getTxPower();
txPow2 = device2.getTxPower();
txPow3 = device3.getTxPower();
} else if (beacon_number > 0) {
Log.d(TAG, "didRangeBeaconsInRegion called with beacon count: " + beacons.size());
for (int i = 0; i < beacon_number; i++) {
Beacon nextBeacon = beacon_array[i];
Log.d(TAG, "The next beacon " + nextBeacon.getIdentifier(0) + " is about " + nextBeacon.getDistance() + " meters away." + "RSSI is: " + nextBeacon.getRssi());
logToDisplay("The next beacon" + nextBeacon.getIdentifier(0) + " is about " + nextBeacon.getDistance() + " meters away." + "RSSI is: " + nextBeacon.getRssi());
}
}
Log.d(TAG, "FLOAT!!!!!!!!" + txPow1);
LocationFinder locationFinder = new LocationFinder();
//pass location
center = locationFinder.findLocation(RSSI1Unfiltered, txPow1, RSSI2Unfiltered, txPow2, RSSI3Unfiltered, txPow3);
Log.d(TAG, "Current coordinates: asta e asta e !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! " + center.toString());
Bundle args = new Bundle();
args.putParcelable("b", center);
Intent intent00 = new Intent(RangingActivity.this, WayfindingOverlayActivity.class);
intent00.putExtras(args);
startActivity(intent00);
}
private void logToDisplay(final String s) {
runOnUiThread(new Runnable() {
@Override
public void run() {
EditText editText = RangingActivity.this.findViewById(R.id.textView3);
editText.append(s+"\n");
}
});
}
};
try {
beaconManager.startRangingBeaconsInRegion(new Region("myRangingUniqueId", null, null, null));
beaconManager.addRangeNotifier(rangeNotifier);
} catch (RemoteException e) {
}
}
/* Blockinqueue try---not working
RangingActivity(BlockingQueue q)
{
queue = q;
}
public void run() {
LatLng res;
try
{
res = center;
queue.put(res);
}
catch (InterruptedException e)
{
e.printStackTrace();
}
}
*/
}
Everything works fine here, until I open the next class, WayfindingOverlayActivity, which holds my map:
package com.indooratlas.android.sdk.examples.wayfinding;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.drawable.Drawable;
import android.os.Bundle;
import com.google.android.material.snackbar.Snackbar;
import androidx.fragment.app.FragmentActivity;
import android.util.Log;
import android.view.View;
import com.google.android.gms.maps.CameraUpdateFactory;
import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.OnMapReadyCallback;
import com.google.android.gms.maps.SupportMapFragment;
import com.google.android.gms.maps.model.BitmapDescriptor;
import com.google.android.gms.maps.model.BitmapDescriptorFactory;
import com.google.android.gms.maps.model.Circle;
import com.google.android.gms.maps.model.CircleOptions;
import com.google.android.gms.maps.model.GroundOverlay;
import com.google.android.gms.maps.model.GroundOverlayOptions;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Marker;
import com.google.android.gms.maps.model.MarkerOptions;
import com.google.android.gms.maps.model.Polyline;
import com.indooratlas.android.sdk.IALocation;
import com.indooratlas.android.sdk.IALocationListener;
import com.indooratlas.android.sdk.IALocationManager;
import com.indooratlas.android.sdk.IALocationRequest;
import com.indooratlas.android.sdk.IAOrientationListener;
import com.indooratlas.android.sdk.IAOrientationRequest;
import com.indooratlas.android.sdk.IAPOI;
import com.indooratlas.android.sdk.IARegion;
import com.indooratlas.android.sdk.IARoute;
import com.indooratlas.android.sdk.IAWayfindingListener;
import com.indooratlas.android.sdk.IAWayfindingRequest;
import com.indooratlas.android.sdk.examples.R;
import com.indooratlas.android.sdk.examples.SdkExample;
import com.indooratlas.android.sdk.resources.IAFloorPlan;
import com.indooratlas.android.sdk.resources.IALatLng;
import com.indooratlas.android.sdk.resources.IALocationListenerSupport;
import com.indooratlas.android.sdk.resources.IAVenue;
import com.squareup.picasso.Picasso;
import com.squareup.picasso.RequestCreator;
import com.squareup.picasso.Target;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
@SdkExample(description = R.string.example_wayfinding_description)
public class WayfindingOverlayActivity extends FragmentActivity
implements GoogleMap.OnMapClickListener, OnMapReadyCallback ,Runnable{
private final BlockingQueue queue;
private static final String TAG = "IndoorAtlasExample";
/* used to decide when bitmap should be downscaled */
private static final int MAX_DIMENSION = 2048;
//kalman filter
private static final double KALMAN_R = 0.125d;
private static final double KALMAN_Q = 0.5d;
private GoogleMap mMap; // Might be null if Google Play services APK is not available.
private Circle mCircle;
private IARegion mOverlayFloorPlan = null;
private GroundOverlay mGroundOverlay = null;
private IALocationManager mIALocationManager;
private Target mLoadTarget;
private boolean mCameraPositionNeedsUpdating = true; // update on first location
private Marker mDestinationMarker;
private Marker mHeadingMarker;
private IAVenue mVenue;
private List<Marker> mPoIMarkers = new ArrayList<>();
private List<Polyline> mPolylines = new ArrayList<>();
private IARoute mCurrentRoute;
private IAWayfindingRequest mWayfindingDestination;
private IAWayfindingListener mWayfindingListener = new IAWayfindingListener() {
@Override
public void onWayfindingUpdate(IARoute route) {
mCurrentRoute = route;
if (hasArrivedToDestination(route)) {
// stop wayfinding
showInfo("You're there!");
mCurrentRoute = null;
mWayfindingDestination = null;
mIALocationManager.removeWayfindingUpdates();
}
updateRouteVisualization();
}
};
private IAOrientationListener mOrientationListener = new IAOrientationListener() {
@Override
public void onHeadingChanged(long timestamp, double heading) {
updateHeading(heading);
}
@Override
public void onOrientationChange(long timestamp, double[] quaternion) {
// we do not need full device orientation in this example, just the heading
}
};
private int mFloor;
// circle
private void showLocationCircle(LatLng center, double accuracyRadius) {
if (mCircle == null) {
// location can received before map is initialized, ignoring those updates
if (mMap != null) {
mCircle = mMap.addCircle(new CircleOptions()
.center(center)
.radius(accuracyRadius)
.fillColor(0x201681FB)
.strokeColor(0x500A78DD)
.zIndex(1.0f)
.visible(true)
.strokeWidth(5.0f));
mHeadingMarker = mMap.addMarker(new MarkerOptions()
.position(center)
.icon(BitmapDescriptorFactory.fromResource(R.drawable.map_blue_dot))
.anchor(0.5f, 0.5f)
.flat(true));
}
} else {
// move existing markers position to received location
mCircle.setCenter(center);
mHeadingMarker.setPosition(center);
mCircle.setRadius(accuracyRadius);
}
}
private void updateHeading(double heading) {
if (mHeadingMarker != null) {
mHeadingMarker.setRotation((float) heading);
}
}
private IALocationListener mListener = new IALocationListenerSupport() {
public void onLocationChanged(IALocation location) {
Log.d(TAG, "NEW" + location.getLatitude() + " " + location.getLongitude());
if (mMap == null) {
return;
}
final LatLng center = new LatLng(location.getLatitude(),location.getLongitude());
final int newFloor = location.getFloorLevel();
if (mFloor != newFloor) {
updateRouteVisualization();
}
mFloor = newFloor;
showLocationCircle(center, location.getAccuracy());
if (mCameraPositionNeedsUpdating) {
mMap.animateCamera(CameraUpdateFactory.newLatLngZoom(center, 15.5f));
mCameraPositionNeedsUpdating = false;
}
}
};
/**
* Listener that changes overlay if needed
*/
private IARegion.Listener mRegionListener = new IARegion.Listener() {
@Override
public void onEnterRegion(final IARegion region) {
if (region.getType() == IARegion.TYPE_FLOOR_PLAN) {
Log.d(TAG, "enter floor plan " + region.getId());
mCameraPositionNeedsUpdating = true; // entering new fp, need to move camera
if (mGroundOverlay != null) {
mGroundOverlay.remove();
mGroundOverlay = null;
}
mOverlayFloorPlan = region; // overlay will be this (unless error in loading)
fetchFloorPlanBitmap(region.getFloorPlan());
//setupPoIs(mVenue.getPOIs(), region.getFloorPlan().getFloorLevel());
} else if (region.getType() == IARegion.TYPE_VENUE) {
mVenue = region.getVenue();
}
}
@Override
public void onExitRegion(IARegion region) {
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_maps);
// prevent the screen going to sleep while app is on foreground
findViewById(android.R.id.content).setKeepScreenOn(true);
// instantiate IALocationManager
mIALocationManager = IALocationManager.create(this);
// Try to obtain the map from the SupportMapFragment.
((SupportMapFragment) getSupportFragmentManager()
.findFragmentById(R.id.map))
.getMapAsync(this);
Intent myIntent = new Intent(this, RangingActivity.class);
this.startActivity(myIntent);
Intent intent00 = getIntent();
LatLng center = intent00.getParcelableExtra("b");
Log.d(TAG,"Location!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" + center);
}
@Override
protected void onDestroy() {
super.onDestroy();
// remember to clean up after ourselves
mIALocationManager.destroy();
}
/*Some blockingqueue---does not work
public class BlockinQueueExample
{
public void main(String[] args) throws Exception
{
BlockingQueue q = new ArrayBlockingQueue(1000);
RangingActivity producer = new RangingActivity(q);
WayfindingOverlayActivity consumer = new WayfindingOverlayActivity(q);
new Thread(producer).start();
new Thread(consumer).start();
}
}
WayfindingOverlayActivity(BlockingQueue q)
{
this.queue = q;
}
public void run() {
try{
queue.take();
Log.d(TAG,"BIANCABICA"+queue.take());
}
catch (InterruptedException e)
{
e.printStackTrace();
}
}
*/
@Override
protected void onResume() {
super.onResume();
// start receiving location updates & monitor region changes
mIALocationManager.requestLocationUpdates(IALocationRequest.create(), mListener);
mIALocationManager.registerRegionListener(mRegionListener);
mIALocationManager.registerOrientationListener(
// update if heading changes by 1 degrees or more
new IAOrientationRequest(1, 0),
mOrientationListener);
if (mWayfindingDestination != null) {
mIALocationManager.requestWayfindingUpdates(mWayfindingDestination, mWayfindingListener);
}
}
EDIT: LAUNCHER ACTIVITY
package com.indooratlas.android.sdk.examples;
import android.app.Activity;
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.os.Build;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import com.indooratlas.android.sdk.examples.imageview.ImageViewActivity;
import com.indooratlas.android.sdk.examples.wayfinding.MonitoringActivity;
import com.indooratlas.android.sdk.examples.wayfinding.RangingActivity;
import com.indooratlas.android.sdk.examples.wayfinding.WayfindingOverlayActivity;
import org.altbeacon.beacon.Beacon;
import org.altbeacon.beacon.BeaconManager;
import org.altbeacon.beacon.Region;
import org.altbeacon.beacon.powersave.BackgroundPowerSaver;
import org.altbeacon.beacon.startup.BootstrapNotifier;
import org.altbeacon.beacon.startup.RegionBootstrap;
public class Bianca extends Activity implements BootstrapNotifier {
private static final String TAG = "RANGE";
private RegionBootstrap regionBootstrap;
private Button button;
private BackgroundPowerSaver backgroundPowerSaver;
private boolean haveDetectedBeaconsSinceBoot = false;
private MonitoringActivity monitoringActivity = null;
private String cumulativeLog = "";
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_one);
BeaconManager beaconManager = org.altbeacon.beacon.BeaconManager.getInstanceForApplication(this);
//--------------------------------meniu -------------------------------
button = findViewById(R.id.button);
button.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
openAct();
}
});
Button button2 = findViewById(R.id.button2);
button2.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
openAct2();
}
});
Button button3 = findViewById(R.id.button3);
button3.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
openAct3();
}
});
Button button4 = findViewById(R.id.button4);
button4.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
openAct4();
}
});
//-----------------------------meniu----------------------------------
Notification.Builder builder = new Notification.Builder(this);
Intent intent = new Intent(this,WayfindingOverlayActivity.class);
PendingIntent pendingIntent = PendingIntent.getActivity(this,0,intent,PendingIntent.FLAG_UPDATE_CURRENT);
builder.setContentIntent(pendingIntent);
if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.O)
{
NotificationChannel channel = new NotificationChannel("My Notification Channel ID",
"My Notification Name", NotificationManager.IMPORTANCE_DEFAULT);
channel.setDescription("My Notification Channel Description");
NotificationManager notificationManager = (NotificationManager) getSystemService(
Context.NOTIFICATION_SERVICE);
notificationManager.createNotificationChannel(channel);
builder.setChannelId(channel.getId());
}
beaconManager.enableForegroundServiceScanning(builder.build(), 456);
Log.d(TAG, "setting up background monitoring for beacons and power saving");
// wake up the app when a beacon is seen
Region region = new Region("backgroundRegion",
null, null, null);
regionBootstrap = new RegionBootstrap((BootstrapNotifier) this, region);
backgroundPowerSaver = new BackgroundPowerSaver(this);
}
public void openAct()
{
Intent intent = new Intent(this, WayfindingOverlayActivity.class);
startActivity(intent);
}
public void openAct2()
{
Intent intent2 = new Intent(this, RangingActivity.class);
startActivity(intent2);
}
public void openAct3()
{
Intent intent4 = new Intent(this, ImageViewActivity.class);
startActivity(intent4);
}
public void openAct4()
{
Intent intent5 = new Intent(this,RegionsActivity.class);
startActivity(intent5);
}
public void disableMonitoring() {
if (regionBootstrap != null) {
regionBootstrap.disable();
regionBootstrap = null;
}
}
public void enableMonitoring() {
Region region = new Region("backgroundRegion",
null, null, null);
regionBootstrap = new RegionBootstrap((BootstrapNotifier) this, region);
}
public void didEnterRegion(Region arg0) {
// In this example, this class sends a notification to the user whenever a Beacon
// matching a Region (defined above) are first seen.
Log.d(TAG, "did enter region.");
if (!haveDetectedBeaconsSinceBoot) {
Log.d(TAG, "auto launching MainActivity");
// The very first time since boot that we detect an beacon, we launch the
// MainActivity
Intent intent = new Intent(this, WayfindingOverlayActivity.class);
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
// Important: make sure to add android:launchMode="singleInstance" in the manifest
// to keep multiple copies of this activity from getting created if the user has
// already manually launched the app.
this.startActivity(intent);
haveDetectedBeaconsSinceBoot = true;
} else {
if (monitoringActivity != null) {
// If the Monitoring Activity is visible, we log info about the beacons we have
// seen on its display
Log.d(TAG, "I see a beacon again");
} else {
// If we have already seen beacons before, but the monitoring activity is not in
// the foreground, we send a notification to the user on subsequent detections.
Log.d(TAG, "Sending notification.");
}
}
}
public void didExitRegion(Region arg0) {
Log.d(TAG,"I no longer see a beacon.");
}
@Override
public void didDetermineStateForRegion(int i, Region region) {
}
}
The second class is not fully posted, only the parts where I made changes.
The intent in the second class is in the onCreate part.
The location is calculated (I can see it in the logcat); the only problem is that the application runs in a loop.
Please help me, I am stuck. Thanks.
I guess you only need to open RangingActivity from onCreate() of WayfindingOverlayActivity if center is null. This means we only open RangingActivity to get the value for center. Doing a null check for center also ensures that the application doesn't go into a loop and proceeds once we have the value for center. The code in onCreate() of WayfindingOverlayActivity may look like this:
// edited here
Bundle extras = getIntent().getExtras();
if(extras == null){
Intent myIntent = new Intent(this, RangingActivity.class);
this.startActivity(myIntent);
} else{
LatLng center = extras.getParcelable("b");
Log.d("Location!!", String.valueOf(center));
// call showLocationCircle() to show the blue dot
showLocationCircle(center, yourAccuracyRadius);
}
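On the RangingActivity side it can also help to stop ranging once a position has been computed, so that didRangeBeaconsInRegion does not keep relaunching the map activity. This is only a rough sketch using the AltBeacon API (stopRangingBeaconsInRegion / removeRangeNotifier), replacing the tail of the asker's didRangeBeaconsInRegion; it is not taken from the original post.
// Inside didRangeBeaconsInRegion, after `center` has been calculated:
try {
    // Stop further ranging callbacks so the map is only launched once.
    beaconManager.stopRangingBeaconsInRegion(region);
    beaconManager.removeRangeNotifier(this); // `this` is the anonymous RangeNotifier
} catch (RemoteException e) {
    e.printStackTrace();
}
Intent intent00 = new Intent(RangingActivity.this, WayfindingOverlayActivity.class);
intent00.putExtra("b", center);
startActivity(intent00);
finish(); // close RangingActivity so repeated launches don't stack up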

I am having issues connecting to and reading from a BLE device

Hi, I am currently facing issues with connecting to and reading from a BLE device. I think I have connected, as the code below prints a message on connection; however, reading values from the BLE device does not seem to work. The service UUID returns a null value.
package com.example.asp_sqllite;
import android.Manifest;
import android.app.Dialog;
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCallback;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattService;
import android.bluetooth.BluetoothManager;
import android.bluetooth.BluetoothProfile;
import android.bluetooth.le.BluetoothLeScanner;
import android.bluetooth.le.ScanCallback;
import android.bluetooth.le.ScanFilter;
import android.bluetooth.le.ScanResult;
import android.bluetooth.le.ScanSettings;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.os.Build;
import android.os.Bundle;
import android.os.Handler;
import android.os.ParcelUuid;
import android.util.Log;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;
import android.widget.Toast;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.UUID;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
public class PlayActivity extends AppCompatActivity {
private static final int REQUEST_ENABLE_BT =1 ;
private Button btnPlay;
private Button btnConnect;
private ListView btList;
SQLiteDatabase db;
private Handler handler;
private ArrayList<String> deviceList = new ArrayList<>();
private ArrayAdapter<String> testAdapter;
private ArrayAdapter<String> deviceAdapter;
private BluetoothAdapter bluetoothAdapter;
private BluetoothLeScanner bleScanner;
private BluetoothGatt bleGatt;
private ArrayList<ScanResult> results = new ArrayList<>();
private ScanSettings settings;
private Intent intent;
private ListView bluetoothList;
private boolean completed = false;
static final UUID HR_SERVICE_UUID = UUID.fromString("0000110a-0000-1000-8000-00805f9b34fb");
private static final UUID HEART_RATE_MEASUREMENT_CHARACTERISTIC_UUID = UUID.fromString("00002A37-0000-1000-8000-00805f9b34fb");
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_play);
this.handler= new Handler();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M ) {
checkPermission();
}
BluetoothManager bluetoothManager = (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
bluetoothAdapter = bluetoothManager.getAdapter();
this.btList = (ListView) findViewById(R.id.btlist);
deviceAdapter = new ArrayAdapter<String>(getApplicationContext(), android.R.layout.simple_list_item_1, android.R.id.text1);
testAdapter = new ArrayAdapter<String>(getApplicationContext(), android.R.layout.simple_list_item_1, android.R.id.text1);
intent = getIntent();
db = openOrCreateDatabase("myDB.db", MODE_PRIVATE, null);
checkBluetooth();
this.btnPlay = (Button) findViewById(R.id.btnPlay);
this.btnPlay.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if(bleGatt!=null) {
final String username = intent.getStringExtra("username");
System.out.println(username+"!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!");
String sqlStatement = "insert into records (Name) values( '" + username + "')";
String result = updateTable(sqlStatement);
/*
* Run query to get recid to be passed over to the next activity
*
* */
final Cursor cursor = db.rawQuery("SELECT recID From records", null);
int num = 0;
if (cursor != null) {
cursor.moveToLast();
num = cursor.getInt(0);
cursor.close();
db.close();
}
Intent intent = new Intent(PlayActivity.this, testPlayActivity.class);
intent.putExtra("ID", Integer.toString(num));
startActivity(intent);
}
else
Toast.makeText(getApplicationContext(),"Connect to BLE device", Toast.LENGTH_LONG).show();
//finish();
}
});
this.btnConnect = (Button) findViewById(R.id.connect);
this.btnConnect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
startScan();
Dialog d = new Dialog(PlayActivity.this); //open up dialog box with listview
d.setContentView(R.layout.bluetooth_device);
d.setTitle("Devices");
d.show();
//stopScan();
Button scanBtn = d.findViewById(R.id.scanBluetooth);
bluetoothList = d.findViewById(R.id.bluetoothDeviceList);
bluetoothList.setAdapter(deviceAdapter);
bluetoothList.setOnItemClickListener(new AdapterView.OnItemClickListener() {
@Override
public void onItemClick(AdapterView<?> adapterView, View view, int i, long l) {
ScanResult device = results.get(i);
Toast.makeText(getApplicationContext(), device.getDevice().getName(), Toast.LENGTH_LONG).show();
bleGatt = device.getDevice().connectGatt(getApplicationContext(), false, bleGattCallback);
System.out.println("##############################################testing 123");
//finish();
try {
BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
Method getUuidsMethod = BluetoothAdapter.class.getDeclaredMethod("getUuids", null);
ParcelUuid[] uuids = (ParcelUuid[]) getUuidsMethod.invoke(adapter, null);
if(uuids != null) {
for (ParcelUuid uuid : uuids) {
System.out.println(uuid.getUuid().toString()+"#############################");
}
}else{
System.out.println("fail");
}
} catch (NoSuchMethodException e) {
e.printStackTrace();
} catch (IllegalAccessException e) {
e.printStackTrace();
} catch (InvocationTargetException e) {
e.printStackTrace();
}
}
});
scanBtn.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) { //clear all list and adapters before scanning again
deviceList.clear();
deviceAdapter.clear();
results.clear();
startScan();
//stopScan();
handler.postDelayed(new Runnable() {
@Override
public void run() {
stopScan();
}
},5000);
}
});
handler.postDelayed(new Runnable() {
@Override
public void run() {
stopScan();
}
},5000);
}
});
}
public void checkPermission(){
if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION) != PackageManager.PERMISSION_GRANTED ||
ContextCompat.checkSelfPermission(this,Manifest.permission.ACCESS_COARSE_LOCATION) != PackageManager.PERMISSION_GRANTED
){//Can add more as per requirement
ActivityCompat.requestPermissions(this,
new String[]{Manifest.permission.ACCESS_FINE_LOCATION,Manifest.permission.ACCESS_COARSE_LOCATION},
123);
}
}
private void checkBluetooth()
{
if (bluetoothAdapter == null || !bluetoothAdapter.isEnabled()) {
Intent enableBtIntent = new Intent(BluetoothAdapter.ACTION_REQUEST_ENABLE);
startActivityForResult(enableBtIntent, REQUEST_ENABLE_BT);
}
}
private String updateTable(String sql) {
try {
db.beginTransaction();
db.execSQL(sql);
db.setTransactionSuccessful();
db.endTransaction();
} catch (Exception e) {
System.out.println(e.toString());
return ("Error");
}
Toast.makeText(this, "DB updated", Toast.LENGTH_LONG).show();
return ("Welcome");
}
private void stopScan(){
bleScanner = bluetoothAdapter.getBluetoothLeScanner();
bleScanner.stopScan(scanCallback);
}
private void startScan() {
bleScanner = bluetoothAdapter.getBluetoothLeScanner();
if (bleScanner != null) { //setting up of scanner
final ScanFilter scanFilter =new ScanFilter.Builder().build();
settings =new ScanSettings.Builder().setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY).build();
bleScanner.startScan(Arrays.asList(scanFilter), settings, scanCallback);
//stopScan();
}
else
checkBluetooth();
}
private ScanCallback scanCallback = new ScanCallback() { //scan and return device results
@Override
public void onScanResult(int callbackType, ScanResult result) {
System.out.println("######### "+callbackType + result);
if (bleScanner != null && !deviceList.contains(result.getDevice().getName())) {
deviceList.add(result.getDevice().getName());
String device = result.getDevice().getName() + "\n" + result.getDevice().getAddress();
deviceAdapter.add(device); //Store device name and address
results.add(result); //records found devices as ScanResult
}
}
public void onScanFailed(int errorCode) {
super.onScanFailed(errorCode);
Log.e("TAG","onScanFailed");
}
};
private BluetoothGattCallback bleGattCallback = new BluetoothGattCallback()
{
@Override
public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
if (newState== BluetoothProfile.STATE_CONNECTED){
System.out.println("#################################################################Connected");
}
else if (newState == BluetoothProfile.STATE_DISCONNECTED)
{
System.out.println("################################################################Not Connected");
}
gatt.discoverServices();
super.onConnectionStateChange(gatt, status, newState);
}
@Override
public void onServicesDiscovered(BluetoothGatt gatt, int status) {
BluetoothGattService service = gatt.getService(HR_SERVICE_UUID);
System.out.println(service+"!!!!!!!!!!!!!!!!!!!!!!");
BluetoothGattCharacteristic temperatureCharacteristic = service.getCharacteristic(HEART_RATE_MEASUREMENT_CHARACTERISTIC_UUID);
gatt.readCharacteristic(temperatureCharacteristic);
super.onServicesDiscovered(gatt, status);
}
@Override
public void onCharacteristicRead(BluetoothGatt gatt, final BluetoothGattCharacteristic characteristic, int status) {
final String value = characteristic.getStringValue(0);
runOnUiThread(new Runnable() {
@Override
public void run() {
if(HEART_RATE_MEASUREMENT_CHARACTERISTIC_UUID.equals(characteristic.getUuid())) {
//Toast.makeText(getApplicationContext(), "Correct Bluetooth: " + value, Toast.LENGTH_LONG).show();
System.out.println("###########################################################correct");
} else {
//Toast.makeText(getApplicationContext(), "Wrong Bluetooth", Toast.LENGTH_LONG).show();
System.out.println("##############################################################wrong");
}
}
});
BluetoothGattService service = gatt.getService(HR_SERVICE_UUID);
//readNextCharacteristic(gatt, characteristic);
super.onCharacteristicRead(gatt, characteristic, status);
}
};
}
The code below is the example from the Arduino library (BLE_Example/BLE_HRM):
/*
* Copyright (c) 2016 RedBear
*
* Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"),
* to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
* and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
* IN THE SOFTWARE.
*/
/**
* @note This demo is Nordic HRM example.
* You could use nRF toolbox tool to test it.
*/
#include <nRF5x_BLE_API.h>
#define DEVICE_NAME "Nordic_HRM"
BLE ble;
Ticker ticker_task1;
static uint8_t hrmCounter = 100;
static uint8_t bpm[2] = {0x00, hrmCounter};
static const uint8_t location = 0x03;
static const uint16_t uuid16_list[] = {GattService::UUID_HEART_RATE_SERVICE};
// Create characteristic and service
GattCharacteristic hrmRate(GattCharacteristic::UUID_HEART_RATE_MEASUREMENT_CHAR, bpm, sizeof(bpm), sizeof(bpm), GattCharacteristic::BLE_GATT_CHAR_PROPERTIES_NOTIFY);
GattCharacteristic hrmLocation(GattCharacteristic::UUID_BODY_SENSOR_LOCATION_CHAR,(uint8_t *)&location, sizeof(location), sizeof(location),GattCharacteristic::BLE_GATT_CHAR_PROPERTIES_READ);
GattCharacteristic *hrmChars[] = {&hrmRate, &hrmLocation, };
GattService hrmService(GattService::UUID_HEART_RATE_SERVICE, hrmChars, sizeof(hrmChars) / sizeof(GattCharacteristic *));
void disconnectionCallBack(const Gap::DisconnectionCallbackParams_t *params) {
Serial.println("Disconnected!");
Serial.println("Restarting the advertising process");
ble.startAdvertising();
}
void periodicCallback() {
if (ble.getGapState().connected) {
// Update the HRM measurement
// First byte = 8-bit values, no extra info, Second byte = uint8_t HRM value
// See --> https://developer.bluetooth.org/gatt/characteristics/Pages/CharacteristicViewer.aspx?u=org.bluetooth.characteristic.heart_rate_measurement.xml
hrmCounter++;
if (hrmCounter == 175)
hrmCounter = 100;
bpm[1] = hrmCounter;
ble.updateCharacteristicValue(hrmRate.getValueAttribute().getHandle(), bpm, sizeof(bpm));
}
}
void setup() {
// put your setup code here, to run once
Serial.begin(9600);
Serial.println("Nordic_HRM Demo ");
// Init timer task
ticker_task1.attach(periodicCallback, 1);
// Init ble
ble.init();
ble.onDisconnection(disconnectionCallBack);
// setup adv_data and srp_data
ble.accumulateAdvertisingPayload(GapAdvertisingData::BREDR_NOT_SUPPORTED | GapAdvertisingData::LE_GENERAL_DISCOVERABLE);
ble.accumulateAdvertisingPayload(GapAdvertisingData::COMPLETE_LIST_16BIT_SERVICE_IDS, (uint8_t*)uuid16_list, sizeof(uuid16_list));
ble.accumulateAdvertisingPayload(GapAdvertisingData::HEART_RATE_SENSOR_HEART_RATE_BELT);
ble.accumulateAdvertisingPayload(GapAdvertisingData::COMPLETE_LOCAL_NAME, (uint8_t *)DEVICE_NAME, sizeof(DEVICE_NAME));
// set adv_type
ble.setAdvertisingType(GapAdvertisingParams::ADV_CONNECTABLE_UNDIRECTED);
// add service
ble.addService(hrmService);
// set device name
ble.setDeviceName((const uint8_t *)DEVICE_NAME);
// set tx power,valid values are -40, -20, -16, -12, -8, -4, 0, 4
ble.setTxPower(4);
// set adv_interval, 100ms in multiples of 0.625ms.
ble.setAdvertisingInterval(160);
// set adv_timeout, in seconds
ble.setAdvertisingTimeout(0);
// start advertising
ble.startAdvertising();
}
void loop() {
// put your main code here, to run repeatedly:
ble.waitForEvent();
}
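For reference, the sketch packs each heart-rate update as two bytes (bpm[0] is a flags byte of 0x00, bpm[1] is the uint8 value from hrmCounter), so on the Android side the reading lives in the raw bytes of the characteristic. Below is a minimal sketch of how that payload could be decoded inside the bleGattCallback shown above, assuming notifications deliver those two bytes; the method body and log tag are illustrative and not part of the original code:
    // Hypothetical addition to bleGattCallback: decode the two-byte payload
    // described in the Arduino sketch (flags byte followed by a uint8 bpm value).
    @Override
    public void onCharacteristicChanged(BluetoothGatt gatt, BluetoothGattCharacteristic characteristic) {
        if (HEART_RATE_MEASUREMENT_CHARACTERISTIC_UUID.equals(characteristic.getUuid())) {
            byte[] raw = characteristic.getValue();
            if (raw != null && raw.length >= 2) {
                int flags = raw[0] & 0xFF; // 0x00 in the sketch: plain uint8 format, no extra fields
                int bpm = raw[1] & 0xFF;   // the hrmCounter value, cycling from 100 to 174
                Log.d("HRM", "flags=" + flags + " bpm=" + bpm);
            }
        }
    }
Note that hrmRate is declared notify-only in the sketch, so this callback would only fire after notifications are enabled on the characteristic (setCharacteristicNotification() plus writing its client characteristic configuration descriptor).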

Speak failed: TTS engine connection not fully set up in the TextToSpeech functionality

package com.example.trynot;
import java.io.IOException;
import java.util.Calendar;
import java.util.Locale;
import java.util.Properties;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.mail.Address;
import javax.mail.BodyPart;
import javax.mail.Flags;
import javax.mail.Folder;
import javax.mail.Message;
import javax.mail.MessagingException;
import javax.mail.Multipart;
import javax.mail.NoSuchProviderException;
import javax.mail.PasswordAuthentication;
import javax.mail.Session;
import javax.mail.Store;
import javax.mail.Flags.Flag;
import javax.mail.search.FlagTerm;
import com.example.trynot.MainActivity;
import com.example.trynot.R;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.ActivityManager;
import android.app.ActivityManager.RunningServiceInfo;
import android.app.Notification;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.database.sqlite.SQLiteDatabase;
import android.os.AsyncTask;
import android.os.Bundle;
import android.os.Handler;
import android.sax.StartElementListener;
import android.speech.tts.TextToSpeech;
import android.speech.tts.TextToSpeech.OnInitListener;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationCompat.Builder;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.widget.Toast;
public class MainActivity extends Activity {
private static final int MY_DATA_CHECK_CODE = 1234;
public static Context c,b;
public TextToSpeech tts;
public static Intent serviceIntent;
private static int myNotificationId;
public static class ReadMailSample extends AsyncTask<String,String,Void> implements TextToSpeech.OnInitListener{
Message message;
private TextToSpeech tts;
static String command, phoneNumber, type, priority, name, time_stamp, imei, opt1, opt2, opt3, fromSubString;
Properties properties = null;
private Session session = null;
private Store store = null;
private Folder inbox = null;
String userName="avarote1994@gmail.com" ; // PROVIDE RECIPIENT EMAIL ID
String password="amul11111994" ; // PROVIDE RECIPIENT PASSWORD
static SQLiteDatabase db;
boolean flag=false;
Context acn;
//private Bundle savedInstanceState;
protected Void doInBackground(String...params){ // SEPARATE THREAD TO RUN IN THE BACKGROUND
try{
readMails();
}
catch(Exception e){
Logger logger = Logger.getAnonymousLogger();
logger.log(Level.INFO, "an exception was thrown", e);
}
return null;
}
ReadMailSample(SQLiteDatabase db){
this.db = db;
}
ReadMailSample(){
}
ReadMailSample(Context cn){
acn=cn;
}
@Override
protected void onPreExecute() {
super.onPreExecute();
tts = new TextToSpeech(c,ReadMailSample.this);
}
@Override
protected void onProgressUpdate(String... values) {
try {
System.out.println("---------------------adasd-----------" + time_stamp);
showNotification();
speakOut();
}
catch(Exception e){
e.printStackTrace();
}
}
@Override
public void onInit(int status) {
if (status == TextToSpeech.SUCCESS) {
tts.speak(command, TextToSpeech.QUEUE_FLUSH, null);
int result = tts.setLanguage(Locale.US);
if (result == TextToSpeech.LANG_MISSING_DATA
|| result == TextToSpeech.LANG_NOT_SUPPORTED) {
Log.e("TTS", "This Language is not supported");
} else {
speakOut();
}
} else {
Log.e("TTS", "Initilization Failed!");
}
}
public void speakOut() {
tts.speak("hello hi", TextToSpeech.QUEUE_FLUSH, null);
}
public void showNotification() {
PendingIntent notificationIntent = preparePendingIntent();
Notification notification = createBasicNotification(notificationIntent);
displayNotification(notification);
}
@SuppressLint("InlinedApi")
private PendingIntent preparePendingIntent() {
Intent intent=new Intent(c,MainActivity.class);
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK | Intent.FLAG_ACTIVITY_CLEAR_TASK);
PendingIntent pendingIntent = PendingIntent.getActivity(
c,
0,
intent,
PendingIntent.FLAG_UPDATE_CURRENT);
return pendingIntent;
}
private Notification createBasicNotification(PendingIntent notificationIntent) {
NotificationCompat.Builder builder = new Builder(c);
long[] vibrationPattern = {0, 200, 800, 200, 600, 200, 400, 200, 200, 200, 100, 200, 50, 200, 50, 200, 50, 200, 50, 200};
Notification notification = builder
.setSmallIcon(R.drawable.ic_launcher)
.setContentTitle("Medication Reminder")
.setContentText(command)
.setAutoCancel(true)
.setContentIntent(notificationIntent)
.setWhen(Calendar.getInstance().getTimeInMillis() + 1000*60+60)
.setVibrate(vibrationPattern)
.build();
return notification;
}
private void displayNotification(Notification notification) {
NotificationManager notificationManager = (NotificationManager)c.getSystemService(Context.NOTIFICATION_SERVICE);
myNotificationId=(int) System.currentTimeMillis();
notificationManager.notify(myNotificationId , notification);
}
public void readMails() throws IOException{
System.out.println("READMAIL hi");
properties = new Properties();
// SETTING UP AN IMAP SERVER TO ACCESS THE RECIPIENT'S EMAIL
properties.setProperty("mail.host", "imap.gmail.com");
properties.setProperty("mail.port", "995");
properties.setProperty("mail.transport.protocol", "imaps");
while(true){// CONTINUOUSLY MONITOR INCOMING MAILS
//String cq = "select * from Login4";
//Cursor c = db.rawQuery(cq, null);
// c.moveToFirst();
//final String userName = c.getString(0);
//final String password = c.getString(1);
//String cloud = "avarote1994#gmail.com";
// AUTHENTICATE AND GET AN INSTANCE OF THE SESSION FROM THE SERVER
session = Session.getInstance(properties,new javax.mail.Authenticator() {
protected PasswordAuthentication getPasswordAuthentication() {
return new PasswordAuthentication(userName, password);
}
});
try {
store = session.getStore("imaps");
store.connect();
inbox = store.getFolder("INBOX"); // ACCESS THE INBOX OF THE RECIPIENT'S EMAIL ID
inbox.open(Folder.READ_WRITE); // OPEN THE INBOX IN READ-WRITE MODE
Message messages[] = inbox.search(new FlagTerm(new Flags(Flag.SEEN), false)); //SEARCH INBOX FOR ANY UNREAD MAILS
System.out.println("Number of mails = " + messages.length);
for (int i = 0; i < messages.length; i++) { // PROCESS ALL THE UNREAD MAILS
message = messages[i];
Address[] from = message.getFrom();
String from1 = from[0].toString();
System.out.println(from1);
if(from1.contains("<")){
int start = from1.indexOf("<");
int end = from1.indexOf(">");
fromSubString = from1.substring(start+1,end); // RETRIEVE THE SENDER'S EMAIL ID
} else{
fromSubString = from1;
}
System.out.println(fromSubString);
//if(fromSubString.equals(cloud)){ // CHECK WHETHER THE MAIL IS FROM THE CLOUD
String[] subject = message.getSubject().split(","); // SPLIT THE SUBJECT
System.out.println("hi");
type = subject[0]; // STORE THE DETAILS IN RESPECTIVE VARIABLES
phoneNumber =subject[1];
name = subject[2];
System.out.println(type);
System.out.println(phoneNumber);
System.out.println(name);
//String body=message.getContentType().toString();
// System.out.print(body);
processMessageBody(message);
//System.out.println("--------------------------------");
// }
}
inbox.close(true);
store.close();
}
catch (NoSuchProviderException e) {
e.printStackTrace();
}
catch (MessagingException e) {
e.printStackTrace();
}
}
}
public void processMessageBody(Message message) {
try {
Object content = message.getContent();
String msg=content.toString();
System.out.println(msg);
if (content instanceof Multipart) { // IF MAIL HAS MULTIPART MESSAGE
Multipart multiPart = (Multipart) content;
procesMultiPart(multiPart);
}
else{
System.out.println("Content = "+content);
processSinglepart(content.toString());
}
}
catch (IOException e) {
e.printStackTrace();
}
catch (MessagingException e) {
e.printStackTrace();
}
}
public void processSinglepart(String content){
String[] body = content.split(","); // SPLIT THE CONTENTS OF THE BODY
System.out.println('1');
time_stamp = body[0]; // STORE THE DETAILS IN RESPECTIVE VARIABLES
command = body[3];
System.out.println(time_stamp);
//tts.speak(time_stamp, TextToSpeech.QUEUE_FLUSH, null);
publishProgress(command);
}
public void procesMultiPart(Multipart content) {
System.out.println("amulya");
try {
BodyPart bodyPart = content.getBodyPart(0); // RETRIEVE THE CONTENTS FROM THE BODY
Object o;
o = bodyPart.getContent();
if (o instanceof String) {
System.out.println("Content Multipart= "+o.toString());
processSinglepart(o.toString());
}
}
catch (IOException e) {
e.printStackTrace();
}
catch (MessagingException e) {
e.printStackTrace();
}
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
c = this.getApplicationContext();
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
ReadMailSample rd=new ReadMailSample(getApplicationContext());
System.out.println("hello");
rd.execute();
System.out.println("------xvsdfsdfsd---------aefaefa-----------------");
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
}
I am trying to implement TextToSpeech functionality and I am getting the above error. The functionality runs in an AsyncTask: the task reads mail, and I want the mail it reads to be converted to speech.
Logcat:
02-08 21:45:26.009: I/System.out(12947): TESTING
02-08 21:45:26.709: I/System.out(12947): javax.mail.internet.MimeMultipart#21cdab10
02-08 21:45:26.709: I/System.out(12947): amulya
02-08 21:45:27.279: I/System.out(12947): Content Multipart= 1221235487821,ad:af:12:2a:d5:c8,High,IT IS THURSDAY TAKE TABLETS,future
02-08 21:45:27.279: I/System.out(12947): use,future use,future use
02-08 21:45:27.279: I/System.out(12947): 1
02-08 21:45:27.279: I/System.out(12947): 1221235487821
02-08 21:45:27.279: I/System.out(12947): ---------------------adasd-----------1221235487821
02-08 21:45:27.299: W/TextToSpeech(12947): speak failed: TTS engine connection not fully set up
02-08 21:45:29.689: D/dalvikvm(12947): GC_FOR_ALLOC freed 455K, 19% free 5727K/7068K, paused 7ms, total 7ms
02-08 21:45:30.499: I/System.out(12947): Number of mails = 0
02-08 21:45:36.029: I/System.out(12947): Number of mails = 0
I have solved this issue and now get audible speech:
import android.content.Context;
import android.speech.tts.TextToSpeech;
import android.util.Log;
import java.util.HashMap;
import java.util.Locale;
public class Speecher implements TextToSpeech.OnInitListener {
private static final String TAG = "TextToSpeechController";
private TextToSpeech myTTS;
private String textToSpeak;
private Context context;
private static Speecher singleton;
public static Speecher getInstance(Context ctx) {
if (singleton == null)
singleton = new Speecher(ctx);
return singleton;
}
private Speecher(Context ctx) {
context = ctx;
}
public void speak(String text) {
textToSpeak = text;
if (myTTS == null) {
// currently can't change Locale until speech ends
try {
// Initialize text-to-speech. This is an asynchronous operation.
// The OnInitListener (second argument) is called after
// initialization completes, and sayText() runs from there.
myTTS = new TextToSpeech(context, this);
} catch (Exception e) {
e.printStackTrace();
}
} else {
// engine already initialized, speak right away
sayText();
}
}
public void onInit(int initStatus) {
// status can be either TextToSpeech.SUCCESS or TextToSpeech.ERROR.
if (initStatus == TextToSpeech.SUCCESS) {
int result = myTTS.setLanguage(Locale.UK);
if (result == TextToSpeech.LANG_MISSING_DATA
|| result == TextToSpeech.LANG_NOT_SUPPORTED) {
// Language data is missing or the language is not supported.
Log.e(TAG, "TTS language missing or not supported (" + result + ")");
// showError(R.string.tts_lang_not_available);
} else {
myTTS.setPitch((float) 0.9);
// the engine is ready now, so speak the text that was queued in speak()
sayText();
}
} else {
// Initialization failed.
Log.e(TAG, "TTS initialization failed");
}
}
private void sayText() {
HashMap<String, String> myHash = new HashMap<String, String>();
myHash.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "done");
String[] splitspeech = this.textToSpeak.split("\\\\");
for (int i = 0; i < splitspeech.length; i++) {
if (i == 0) { // flush the audio stream for the first piece of split text
myTTS.speak(splitspeech[i].trim(), TextToSpeech.QUEUE_FLUSH, myHash);
} else { // queue the remaining pieces after the previous one
myTTS.speak(splitspeech[i].trim(), TextToSpeech.QUEUE_ADD, myHash);
}
// QUEUE_ADD so the short pause does not cancel the utterance queued above
myTTS.playSilence(100, TextToSpeech.QUEUE_ADD, null);
}
}
public void stopTTS() {
if (myTTS != null) {
myTTS.stop(); // stop any ongoing speech first
myTTS.shutdown(); // then release the engine's resources
myTTS = null;
}
}
}
Call the method from anywhere:
Speecher.getInstance(getApplicationContext()).speak(YOUR_TTS_MESSAGE_HERE);
From the API docs we see that your implementation of OnInitListener is necessary to receive a notification that TTS initialization is complete. This is described in the TextToSpeech docs as well, where it is stated that speech will fail until the engine is initialized. In the following, I will not attempt to fix all of your code so that it works, but I will point out how you're currently misusing the TTS framework and nudge you in the right direction.
Though you have implemented OnInitListener, you aren't doing anything useful with that implementation. Below, I show something more useful, in which the result of OnInitListener's onInit callback is captured in order to stay aware of the TTS initialization state.
...
private boolean mIsTTSInited = false; // member indicating TTS initialization state
...
// Receives notification of initialization
public void onInit(int status) {
if (status == TextToSpeech.SUCCESS) {
Log.i("TTS", "Initilization Succeeded!");
// save the state indicating success of TTS initialization
mIsTTSInited = true;
...
} else {
Log.e("TTS", "Initilization Failed!");
mIsTTSInited = false;
}
}
...
...
@Override
protected void onProgressUpdate(String... values) {
// A trivial example to show how you can check the state of initialization for TTS
if (mIsTTSInited) {
showNotification();
speakOut();
} else {
// code to deal with TTS not init'ed case
}
}
Additionally, I wouldn't run the TextToSpeech initialization as part of the AsyncTask. If I were you, I'd move
tts = new TextToSpeech(c,ReadMailSample.this);
into onCreate rather than onPreExecute, as sketched below. This is also how it is done in the tutorials here and here.
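A rough sketch of that rearrangement, assuming the Activity itself implements TextToSpeech.OnInitListener, creates the engine once in onCreate, and only speaks after onInit has reported success. The mIsTTSInited flag mirrors the one shown above; the rest is illustrative, and the task would call back into the Activity instead of owning its own TextToSpeech:
    // Imports as in the original file (android.speech.tts.TextToSpeech, java.util.Locale, ...)
    public class MainActivity extends Activity implements TextToSpeech.OnInitListener {
        private TextToSpeech tts;
        private boolean mIsTTSInited = false; // flips to true once onInit reports SUCCESS

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
            // Create the engine once, up front; onInit() fires when the connection is ready.
            tts = new TextToSpeech(getApplicationContext(), this);
            new ReadMailSample(getApplicationContext()).execute();
        }

        @Override
        public void onInit(int status) {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US);
                mIsTTSInited = true;
            } else {
                Log.e("TTS", "Initialization Failed!");
            }
        }

        // Called (for example from onProgressUpdate) whenever there is text to speak.
        public void speakOut(String text) {
            if (mIsTTSInited) {
                tts.speak(text, TextToSpeech.QUEUE_FLUSH, null);
            } else {
                Log.w("TTS", "Engine not initialized yet, dropping: " + text);
            }
        }

        @Override
        protected void onDestroy() {
            if (tts != null) {
                tts.stop();
                tts.shutdown();
            }
            super.onDestroy();
        }
    }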

How do I make the server run a program/do something when a client connects to my server, and how do I make the server return something to the client?

This is my working web server code on the PC:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Net;
using System.Threading;
namespace Youtube_Manager
{
class WebServer
{
private readonly HttpListener _listener = new HttpListener();
private readonly Func<HttpListenerRequest, string> _responderMethod;
public WebServer(string[] prefixes, Func<HttpListenerRequest, string> method)
{
if (!HttpListener.IsSupported)
throw new NotSupportedException(
"Needs Windows XP SP2, Server 2003 or later.");
// URI prefixes are required, for example
// "http://localhost:8080/index/".
if (prefixes == null || prefixes.Length == 0)
throw new ArgumentException("prefixes");
// A responder method is required
if (method == null)
throw new ArgumentException("method");
foreach (string s in prefixes)
_listener.Prefixes.Add(s);
_responderMethod = method;
_listener.Start();
}
public WebServer(Func<HttpListenerRequest, string> method, params string[] prefixes)
: this(prefixes, method) { }
public void Run()
{
ThreadPool.QueueUserWorkItem((o) =>
{
Console.WriteLine("Webserver running...");
try
{
while (_listener.IsListening)
{
ThreadPool.QueueUserWorkItem((c) =>
{
var ctx = c as HttpListenerContext;
try
{
string rstr = _responderMethod(ctx.Request);
byte[] buf = Encoding.UTF8.GetBytes(rstr);
ctx.Response.ContentLength64 = buf.Length;
ctx.Response.OutputStream.Write(buf, 0, buf.Length);
}
catch { } // suppress any exceptions
finally
{
// always close the stream
ctx.Response.OutputStream.Close();
}
}, _listener.GetContext());
}
}
catch { } // suppress any exceptions
});
}
public void Stop()
{
_listener.Stop();
_listener.Close();
}
}
}
Then in form1 constructor:
WebServer ws = new WebServer(SendResponse, "http://+:8098/");//"http://localhost:8080/test/");
ws.Run();
And a method in form1:
public static string SendResponse(HttpListenerRequest request)
{
return string.Format("<HTML><BODY>My web page.<br>{0}</BODY></HTML>", DateTime.Now);
}
And this is the client-side Java code:
package com.test.webservertest;
import android.content.Intent;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import android.speech.tts.TextToSpeech.OnInitListener;
import android.support.v7.app.ActionBarActivity;
import android.util.AndroidRuntimeException;
import android.util.Xml;
import android.view.Menu;
import android.view.MenuItem;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Path;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;
import android.content.Context;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.util.EncodingUtils;
import org.json.JSONObject;
import org.json.JSONTokener;
import java.io.DataOutputStream;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.Arrays;
import java.util.Locale;
import java.net.URLConnection;
import java.io.*;
import java.util.concurrent.Executor;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.logging.Logger;
//import com.adilipman.webservertest.DebugServer;
public class MainActivity extends ActionBarActivity
{
private static final int MY_DATA_CHECK_CODE = 0;
public static MainActivity currentActivity;
TextToSpeech mTts;
private String targetURL;
private String urlParameters;
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setContentView(new SingleTouchEventView(this, null));
currentActivity = this;
initTTS();
}
@Override
public boolean onCreateOptionsMenu(Menu menu)
{
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.menu_main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item)
{
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
//noinspection SimplifiableIfStatement
if (id == R.id.action_settings)
{
return true;
}
return super.onOptionsItemSelected(item);
}
/**
* Dispatch onStart() to all fragments. Ensure any created loaders are
* now started.
*/
@Override
protected void onStart()
{
super.onStart();
TextToSpeechServer.main(null);
}
@Override
protected void onStop() {
super.onStop();
}
public void initTTS() {
Intent checkIntent = new Intent();
checkIntent.setAction(TextToSpeech.Engine.ACTION_CHECK_TTS_DATA);
startActivityForResult(checkIntent, MY_DATA_CHECK_CODE);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if(requestCode == MY_DATA_CHECK_CODE) {
if(resultCode == TextToSpeech.Engine.CHECK_VOICE_DATA_PASS) {
mTts = new TextToSpeech(getApplicationContext(), new OnInitListener() {
@Override
public void onInit(int i) {
if(i == TextToSpeech.SUCCESS) {
int result = mTts.setLanguage(Locale.US);
if(result == TextToSpeech.LANG_AVAILABLE
|| result == TextToSpeech.LANG_COUNTRY_AVAILABLE) {
mTts.setPitch(1);
mTts.speak(textforthespeacch, TextToSpeech.QUEUE_FLUSH, null);
}
}
}
});
} else {
Intent installIntent = new Intent();
installIntent.setAction(TextToSpeech.Engine.ACTION_INSTALL_TTS_DATA);
startActivity(installIntent);
}
}
}
public static String textforthespeacch = "";
public static void TextToSpeak(String text) {
textforthespeacch = text;
}
public class SingleTouchEventView extends View {
private Paint paint = new Paint();
private Path path = new Path();
private Path circlePath = new Path();
public SingleTouchEventView(Context context, AttributeSet attrs) {
super(context, attrs);
paint.setAntiAlias(true);
paint.setStrokeWidth(6f);
paint.setColor(Color.BLACK);
paint.setStyle(Paint.Style.STROKE);
paint.setStrokeJoin(Paint.Join.ROUND);
}
@Override
protected void onDraw(Canvas canvas) {
canvas.drawPath(path, paint);
canvas.drawPath(circlePath, paint);
}
@Override
public boolean onTouchEvent(MotionEvent event)
{
float eventX = event.getX();
float eventY = event.getY();
float lastdownx = 0;
float lastdowny = 0;
switch (event.getAction())
{
case MotionEvent.ACTION_DOWN:
path.moveTo(eventX, eventY);
circlePath.addCircle(eventX, eventY, 50, Path.Direction.CW);
lastdownx = eventX;
lastdowny = eventY;
Thread t = new Thread(new Runnable()
{
@Override
public void run()
{
byte[] response = Get(null);
if (response!=null)
Logger.getLogger("MainActivity(inside thread)").info(response.toString());
}
});
t.start();
return true;
case MotionEvent.ACTION_MOVE:
path.lineTo(eventX, eventY);
break;
case MotionEvent.ACTION_UP:
// nothing to do
circlePath.reset();
break;
default:
return false;
}
// Schedules a repaint.
invalidate();
return true;
}
}
private byte[] Get(String urlIn)
{
URL url = null;
String urlStr="http://10.0.0.2:8098";
if (urlIn!=null)
urlStr=urlIn;
try
{
url = new URL(urlStr);
} catch (MalformedURLException e)
{
e.printStackTrace();
return null;
}
HttpURLConnection urlConnection = null;
try
{
urlConnection = (HttpURLConnection) url.openConnection();
InputStream in = new BufferedInputStream(urlConnection.getInputStream());
byte[] buf=new byte[10*1024];
int szRead = in.read(buf);
byte[] bufOut;
if (szRead==10*1024)
{
throw new AndroidRuntimeException("the returned data is bigger than 10*1024.. we don't handle it..");
}
else
{
bufOut = Arrays.copyOf(buf, szRead);
}
return bufOut;
}
catch (IOException e)
{
e.printStackTrace();
return null;
}
finally
{
if (urlConnection!=null)
urlConnection.disconnect();
}
}
}
First I run the server on the PC.
Then I run the program on Android; when I touch the screen of my smartphone, the client connects to the server on the PC.
Now what I want to do is two things:
1. When a client connects to the PC server, do something on the server side, for example show a message like:
MessageBox.Show("connected");
2. After connecting (and only if connected, using a flag or something), return a string to the client. On the PC I have a method called SendDataToServer that I used to send a string to a server on my Android device:
private void SendDataToServer()
{
// Create a request using a URL that can receive a post.
WebRequest request = WebRequest.Create("http://10.0.0.3:8080/?say=hello? im here ");
// Set the Method property of the request to POST.
request.Method = "POST";
// Create POST data and convert it to a byte array.
string postData = "This is a test that posts this string to a Web server.";
byte[] byteArray = Encoding.UTF8.GetBytes(postData);
// Set the ContentType property of the WebRequest.
request.ContentType = "application/x-www-form-urlencoded";
// Set the ContentLength property of the WebRequest.
request.ContentLength = byteArray.Length;
// Get the request stream.
Stream dataStream = request.GetRequestStream();
// Write the data to the request stream.
dataStream.Write(byteArray, 0, byteArray.Length);
// Close the Stream object.
dataStream.Close();
// Get the response.
WebResponse response = request.GetResponse();
// Display the status.
Console.WriteLine(((HttpWebResponse)response).StatusDescription);
// Get the stream containing content returned by the server.
dataStream = response.GetResponseStream();
// Open the stream using a StreamReader for easy access.
StreamReader reader = new StreamReader(dataStream);
// Read the content.
string responseFromServer = reader.ReadToEnd();
// Display the content.
Console.WriteLine(responseFromServer);
// Clean up the streams.
reader.Close();
dataStream.Close();
response.Close();
}
The question is whether I need to use this method if I want to return something from my PC server to the client side.
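For what it's worth, the bytes returned by Get() on the client already contain whatever string SendResponse produced on the PC, so getting data back is mostly a matter of decoding them. A small sketch of how the existing touch-handler thread could use that (only the String conversion and the call into TextToSpeak are new; both methods already exist above):
    // Inside the background thread started in ACTION_DOWN:
    byte[] response = Get(null); // existing helper, defaults to http://10.0.0.2:8098
    if (response != null) {
        // decode the body that SendResponse wrote on the PC side
        String body = new String(response, java.nio.charset.StandardCharsets.UTF_8);
        Logger.getLogger("MainActivity(inside thread)").info(body);
        MainActivity.TextToSpeak(body); // or update the UI via runOnUiThread(...)
    }
On the server side, reacting to a connection seems best done inside SendResponse (or the responder lambda in WebServer.Run), since that code runs once per incoming request; SendDataToServer would likely only be needed if the PC has to push data to a separate HTTP server running on the device, rather than to reply to this GET.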
