orientation change sound - java

Good morning,
We have developed an Android app, and I have been charged with finding out how to remove the undesired behavior of a notification sound every time the screen orientation changes. This behavior only occurs on devices running OS version 3.2.3 or later.
I have read several posts indicating that this can be turned off by unchecking USB Debugging under Settings --> Developer options; however, this option is not checked, and none of the other apps on any of our Android devices make this notification sound on orientation change.
The application does require a notification when a "message is received" (the app connects to a webservice and fetches new messages every so often), so any solution that disables notifications is ruled out.
Thus far, I have tried several potential solutions:
1) When a message is received, instantiate a new NotificationManager, and after the notification is sounded, destroy the NotificationManager.
if(MessageReceived == true) {
String ns = Context.NOTIFICATION_SERVICE;
messageNotifyManager = (NotificationManager) getSystemService(ns);
}
showNotification();
messageNotifyManager = null;
2) I realize that an orientation change is essentially the view being destroyed and re-created. I set a flag in the initial onCreate method and checked whether that flag already had a value before recreating the NotificationManager.
public static int Flag = 0;
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if(Flag == 0) {
String ns = Context.NOTIFICATION_SERVICE;
messageNotifyManager = (NotificationManager) getSystemService(ns);
Flag = 1;
}
}
3) In the application's main class, I created a public OrientationEventListener property and then set its value in the onCreate method, disabling it immediately. When that didn't disable the sound, I tried disabling the property in every class that referenced the application's main class.
public OrientationEventListener listener;
@Override
public void onCreate() {
super.onCreate();
appContext = getApplicationContext();
GetPreferences();
//...
listener = new OrientationEventListener(appContext){
public void onOrientationChanged(int Orientation){
}
};
listener.disable();
}
Now, as you can probably tell, I am very new to Android development. I assume that this solution is something so simple that everyone knows, and that is why there are no answers anywhere handy on the web. But any help with this simple problem would be greatly appreciated. Thanks!

This issue was solved by modifying the AndroidManifest, adding the following attribute to each activity: android:configChanges="orientation"
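For illustration only, a minimal sketch of such a manifest entry (the activity name here is hypothetical; note that from API 13 onward the screen size also changes on rotation, so screenSize usually has to be listed as well for the activity to keep handling the change itself):
<activity
    android:name=".MainActivity"
    android:configChanges="orientation|screenSize" />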

Related

How to connect paired bluetooth device on app startup in Android Studio?

Is there any way to automatically connect a specific device via Bluetooth LE on app startup?
I've been scrolling through Stack Overflow for the past few hours and have seen a number of similar questions, although the majority are quite outdated and deal with reflection or other complex methods that I can't quite comprehend (I've tried to implement these methods, but without success, as I didn't really understand what was going on).
So far, I've managed to find the device by its friendly name, although I have no clue what to execute in that if statement. This is within my MainActivity:
protected void onCreate(Bundle savedInstanceState) {
...
if (bluetoothAdapter == null) {
Toast.makeText(getApplicationContext(),"Bluetooth not supported",Toast.LENGTH_SHORT).show();
} else {
Set<BluetoothDevice> pairedDevices = bluetoothAdapter.getBondedDevices();
if(pairedDevices.size()>0){
for(BluetoothDevice device: pairedDevices){
if (deviceName.equals(device.getName())) {
//Device found!
//Now how do I pair it?
break;
}
...
Assuming you've successfully identified the BluetoothDevice, you now need to connect to GATT (Generic Attribute Profile), which allows you to transfer data.
Use the BluetoothDevice.connectGatt method. Using the first overload, the method takes a Context, a boolean autoConnect (false = connect directly, true = connect as soon as the device becomes available), and a BluetoothGattCallback. The callback receives info from the device.
BluetoothGatt bluetoothGatt = device.connectGatt(this, false, bluetoothGattCallback);
An example implementation of the callback:
BluetoothGattCallback bluetoothGattCallback =
        new BluetoothGattCallback() {
            @Override
            public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
                if (newState == BluetoothProfile.STATE_CONNECTED) {
                    /* do stuff */
                }
            }
        };
More details on the callbacks here.
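Not part of the original answer, but as a hedged follow-up sketch: once STATE_CONNECTED is reported you would typically start service discovery from the same callback and then read (or subscribe to) a characteristic in onServicesDiscovered. Both overrides below belong inside the BluetoothGattCallback shown above; SERVICE_UUID and CHARACTERISTIC_UUID are hypothetical placeholder constants you would define for your particular device.
@Override
public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
    if (newState == BluetoothProfile.STATE_CONNECTED) {
        // Ask the device for its services before touching characteristics
        gatt.discoverServices();
    } else if (newState == BluetoothProfile.STATE_DISCONNECTED) {
        gatt.close();
    }
}

@Override
public void onServicesDiscovered(BluetoothGatt gatt, int status) {
    if (status == BluetoothGatt.GATT_SUCCESS) {
        // SERVICE_UUID / CHARACTERISTIC_UUID are hypothetical placeholders
        BluetoothGattCharacteristic characteristic =
                gatt.getService(SERVICE_UUID).getCharacteristic(CHARACTERISTIC_UUID);
        gatt.readCharacteristic(characteristic);
    }
}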
Ended up scrolling through the source code for this app, particularly the SerialSocket, SerialService and SerialListener files which completely solved my problem.

Fix the 'background service' problem on Android 8+

I have code running a service behind the scenes. It is set to run when we copy the text to the phone.
This code works fine below Android 8.
But the problem is when I run the app on Android 8 and above.
In my searches, I realized that I have to use a foreground service and request specific permission for the project.
What solutions do you suggest?
Service Class:
public class AutoDownloadService extends Service {
private ClipboardManager mClipboardManager;
public static final String CHANNEL_ID = "ForegroundServiceChannel";
@Override
public void onCreate() {
super.onCreate();
mClipboardManager = (ClipboardManager) getSystemService(CLIPBOARD_SERVICE);
mClipboardManager.addPrimaryClipChangedListener(mOnPrimaryClipChangedListener);
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
String input = intent.getStringExtra("inputExtra");
createNotificationChannel();
Intent notificationIntent = new Intent(this, SettingsActivity.class);
PendingIntent pendingIntent = PendingIntent.getActivity(this, 0, notificationIntent, 0);
Notification notification = new NotificationCompat.Builder(this, CHANNEL_ID)
.setContentTitle("Foreground Service")
.setContentText(input)
.setSmallIcon(R.drawable.ic_launcher_background)
.setContentIntent(pendingIntent)
.build();
startForeground(1, notification);
// stopSelf();
return START_NOT_STICKY;
}
@Override
public void onDestroy() {
super.onDestroy();
if (mClipboardManager != null) {
mClipboardManager.removePrimaryClipChangedListener(mOnPrimaryClipChangedListener);
}
}
@Override
public IBinder onBind(Intent intent) {
return null;
}
private void createNotificationChannel() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
NotificationChannel serviceChannel = new NotificationChannel(
CHANNEL_ID,
"Foreground Service Channel",
NotificationManager.IMPORTANCE_DEFAULT
);
NotificationManager manager = getSystemService(NotificationManager.class);
manager.createNotificationChannel(serviceChannel);
}
}
private ClipboardManager.OnPrimaryClipChangedListener mOnPrimaryClipChangedListener =
new ClipboardManager.OnPrimaryClipChangedListener() {
@Override
public void onPrimaryClipChanged() {
ClipData clip = mClipboardManager.getPrimaryClip();
String textClipBoard = clip.getItemAt(0).getText().toString();
Toast.makeText(AutoDownloadService.this, textClipBoard, Toast.LENGTH_SHORT).show();
}
};
}
Manifest
<service
android:name=".services.AutoDownloadService"
android:exported="false"
android:permission="android.permission.BIND_JOB_SERVICE" />
and finally add the uses-permission:
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
I think you should use an IntentService instead of a Service.
What you can do is: if the system shuts down your service, you can trigger it again after some time using AlarmManager.
As stated in the documentation:
While an app is in the foreground, it can create and run both
foreground and background services freely. When an app goes into the
background, it has a window of several minutes in which it is still
allowed to create and use services. At the end of that window, the app
is considered to be idle. At this time, the system stops the app's
background services, just as if the app had called the services'
Service.stopSelf() methods.
So, your solution is to run a foreground service on Android >= 8.0 and do something like this:
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    startForegroundService(new Intent(this, AutoDownloadService.class));
} else {
    startService(new Intent(this, AutoDownloadService.class));
}
First of all, you should not do this.
Monitoring the clipboard in the background is not the right thing to do.
Android 8 added some protection around this, so you should run a foreground service to make the end user aware that your app is monitoring the clipboard.
In any case, from Android 10 clipboard access is only available to the default IME (and to the app that currently has input focus), so your app will not work on Android 10.
This example of my code is currently working fine, but I have problems with Chinese phones.
Tested on a Xiaomi 7.
public class AutoDownloadService extends IntentService {
private ClipboardManager mClipboardManager;
public AutoDownloadService() {
super("AutoDownloadService");
}
@Override
public void onCreate() {
super.onCreate();
mClipboardManager = (ClipboardManager) getSystemService(CLIPBOARD_SERVICE);
mClipboardManager.addPrimaryClipChangedListener(mOnPrimaryClipChangedListener);
}
@Override
protected void onHandleIntent(Intent intent) {
Toast.makeText(this, " service started.", Toast.LENGTH_SHORT).show();
}
private final ClipboardManager.OnPrimaryClipChangedListener mOnPrimaryClipChangedListener =
new ClipboardManager.OnPrimaryClipChangedListener() {
@Override
public void onPrimaryClipChanged() {
ClipData clip = mClipboardManager.getPrimaryClip();
String textClipBoard = clip.getItemAt(0).getText().toString();
if (textClipBoard.startsWith("https://www.instagram.com/")) {
Toast.makeText(AutoDownloadService.this, textClipBoard, Toast.LENGTH_SHORT).show();
}
}
};
}
And in the Manifest.xml file it is defined like this:
<service android:name="com.amirhf.inatasave.services.AutoDownloadService" />
When you swipe the app away from recent apps, a few Chinese OEMs force-stop the app.
Once the app is force-stopped, you can't post notifications, start services, receive broadcasts, etc.
The only workaround is: your app should be whitelisted, i.e. added to the auto-start list. Apps like WhatsApp, Facebook etc. are already added to the list by the OEMs.
This blog talks about the similar problem of not receiving notifications when the app is force-stopped: https://hackernoon.com/notifications-in-android-are-horribly-broken-b8dbec63f48a
You can take a similar approach and add a "Troubleshoot" section where you educate the user to add your app to the auto-start list.
If by any chance your app is very popular, you can get in touch with the Chinese manufacturers and request them to whitelist your app, but they only do it for very popular apps. For example, in my experience Microsoft and Hike Messenger got it done for their apps.
I didn't quite understand whether you're dealing with file downloads or something else, but I guess you're not going about it the right way. So here's what I can share.
From https://developer.android.com/training/scheduling/wakelock
If your app is performing long-running HTTP downloads, consider using DownloadManager.
If your app is synchronizing data from an external server, consider creating a sync adapter.
If your app relies on background services, consider using JobScheduler or Firebase Cloud Messaging to trigger these services
at specific intervals.
Also note that if you just have a task that has to be done often, use JobIntentService. It's compatible with Oreo and the versions below it:
Helper for processing work that has been enqueued for a job/service.
When running on Android O or later, the work will be dispatched as a
job via JobScheduler.enqueue. When running on older versions of the
platform, it will use Context.startService.
On Oreo and above there are limitations that help the device save resources (battery, RAM, ...), and even when using JobIntentService you must respect them; otherwise your app may be flagged as a battery-draining app.
If what you're about to do is heavy and important enough to be shown in the notification bar, do it using a foreground service. That way it is taken more seriously by the Android system and the chances of it being killed are lower.
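Not from the original answer, but a hedged sketch of the JobIntentService route described above (the class name, job id and extra key are hypothetical; it assumes the androidx JobIntentService):
public class ClipboardJobService extends JobIntentService {
    private static final int JOB_ID = 1001; // any unique id for this service

    // Convenience wrapper so callers don't deal with enqueueWork directly
    public static void enqueue(Context context, Intent work) {
        enqueueWork(context, ClipboardJobService.class, JOB_ID, work);
    }

    @Override
    protected void onHandleWork(@NonNull Intent intent) {
        // Runs on a background thread; dispatched via JobScheduler on O+,
        // via Context.startService on older platform versions.
        String input = intent.getStringExtra("inputExtra");
        // ... do the actual work here ...
    }
}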
Try using WorkManager; it's a Jetpack library.
Advantages:
Ensures task execution, even if the app or device restarts (Guaranteed Execution)
You don’t need to write device logic to figure out what capabilities the device has and choose an appropriate API; instead, you can just hand your task off to WorkManager and let it choose the best option. It is a wrapper on all the above concepts.
Uses JobScheduler on devices with API 23+
Uses a combination of BroadcastReceiver + AlarmManager on devices with API 14-22
Ref : WorkManager Docs
Ref : Medium Article
Ref : Medium Article(1)
[Update] - the stable version of WorkManager is out.
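As a hedged illustration (not from the original answer) of what the WorkManager route might look like for a recurring background task, assuming the stable androidx.work artifact; the worker class name and interval are hypothetical, and continuously listening to the clipboard is still not something WorkManager is meant for:
public class SyncWorker extends Worker {
    public SyncWorker(@NonNull Context context, @NonNull WorkerParameters params) {
        super(context, params);
    }

    @NonNull
    @Override
    public Result doWork() {
        // ... do the periodic background work here ...
        return Result.success();
    }
}

// Schedule it, e.g. from Application.onCreate():
PeriodicWorkRequest request =
        new PeriodicWorkRequest.Builder(SyncWorker.class, 15, TimeUnit.MINUTES).build();
WorkManager.getInstance(context).enqueue(request);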

System.exit(0) in Android Wear Watchface?

I've read that using System.exit(0) is frowned upon when it comes to Java and Android, but so far I can find no alternative for what I'm trying to accomplish. To be clear, this is for a Watchface, which is simply a service extending CanvasWatchFaceService. I cannot call finish() in this case. I've also tried stopService and startService with no success.
The issue I'm trying to solve: It's well known that changing timezones on your device will not be reflected on a watchface unless it is restarted. In my testing, I found that System.currentTimeMillis() quite literally does not respond to timezone changes in Android Wear. The watchface must be restarted in order for it to show the correct time after a timezone change.
So I built a system with the following code:
private final BroadcastReceiver timeChangeReceiver = new BroadcastReceiver() {
@Override
public void onReceive(Context context, Intent intent) {
final String action = intent.getAction();
if (!restarting) {
if (action.equals(Intent.ACTION_TIMEZONE_CHANGED)) {
if (upSeconds >= 15) {
System.exit(0);
} else {
restarting = true;
int delay = ((15 - upSeconds) * 1000);
new CountDownTimer(delay, delay) {
@Override
public void onTick(long millisUntilFinished) { }
@Override
public void onFinish() {
System.exit(0);
}
}.start();
}
}
}
}
};
The delay is in case a user triggers a time zone change more frequently than 15 seconds at a time. Android Wear seems to detect system exits that are too frequent and replace the watchface with the "Simple" watchface.
It seems to work great, Android Wear automatically boots the watchface back up on exit. But I would eventually like to put this app on the Google Play Store, so I thought I should make sure I'm not playing with fire here.
I can't believe I went through all that work when the proper solution was so simple. Thanks ianhanniballake for the link!
After looking at the Analog Watchface Sample, I found that all I needed to do was use mCalendar.setTimeZone(TimeZone.getDefault());. In many places I was directly comparing the time in milliseconds fetched with long now = System.currentTimeMillis();, so I simply did a now = mCalendar.getTimeInMillis() to take care of that.
Now the watchface changes time properly when the timezone is changed. I guess the other watchfaces I downloaded did not properly handle this!
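For reference, a hedged sketch of what that fix looks like inside the Engine inner class of a CanvasWatchFaceService, loosely following the Analog watchface sample (the registration bookkeeping is simplified here):
private final Calendar mCalendar = Calendar.getInstance();

private final BroadcastReceiver mTimeZoneReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        // Pick up the new default time zone instead of restarting the watchface
        mCalendar.setTimeZone(TimeZone.getDefault());
        invalidate();
    }
};

@Override
public void onVisibilityChanged(boolean visible) {
    super.onVisibilityChanged(visible);
    if (visible) {
        // registerReceiver resolves to the enclosing service's Context method
        registerReceiver(mTimeZoneReceiver,
                new IntentFilter(Intent.ACTION_TIMEZONE_CHANGED));
        mCalendar.setTimeZone(TimeZone.getDefault());
    } else {
        unregisterReceiver(mTimeZoneReceiver);
    }
}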

EyeGesture and EyeGestureManager clarity needed

The Google review team requires Glassware to:
Dim the screen if there isn't an expectation that a user is
looking at it.
This is consistent with the "in the here and now" experience of Glass.
Glassware should always dim the screen if there isn't an expectation
that a user is looking at it. Ideally it behaves like a timeline and
dims after 15s. A user can 'rewake' the screen by looking up.
Update to be made: If a user is not looking at the results set in the
card scroller, dim the screen.
This hints at using the EyeGesture, which doesn't seem to be mentioned anywhere on the Glass developer pages.
After some searching I found this EyeGesture library (github) which, according to this Stack Overflow post (Google Glass Eye Gesture Crashing (EyeGestureLib)), doesn't seem to work anymore (and hasn't been updated in 4+ months).
The accepted answer (from the Stack Overflow post) proposed using this revised EyeGesture library (github).
It was also mentioned (in the Stack Overflow post, as a comment) that:
Basically, you're trying to expose classes that exist in the Glass
environment itself, but not through the official APIs. By declaring
these stub classes (none of the methods are implemented) and by
putting them into the com.google.android.glass.eye package, we're
allowing our code to compile with these unimplemented classes. At
runtime, the system has implementations of those classes and the
application will instead use the system's implementations.
Here are my following questions:
Will there be (and when) an official API for EyeGestures any time soon?
I tried implementing the revised EyeGesture library in my activity by following the proposed guide, without any luck. What could I be doing wrong?
Is there something I'm missing for it to be detected? I know that with the GestureDetector I'm required to override onGenericMotionEvent(MotionEvent event); is there something similar for the EyeGesture?
Here is what I'm currently doing:
I have a package named com.google.android.glass and in this package I have the following:
EyeGesture enum that implements Parcelable
EyeGestureManager class
I have in the main package:
GestureIds class (this one differs from the GitHub version in that it's a public class and not private)
In my activity I have:
private void createEyeGestureDetector(ResultActivity resultActivity) {
final GestureIds gestureIds = new GestureIds();
//The github guide didn't mention any class names for
//mEyeGestureManager and mEyeGestureListener .. so I added some..
EyeGestureManager mEyeGestureManager = EyeGestureManager.from(resultActivity);
EyeGestureManager.Listener mEyeGestureListener = new EyeGestureManager.Listener() {
@Override
public void onDetected(EyeGesture gesture) {
Log.i("EyeGestureListener", "Gesture: " + gesture.getId());
int id = gesture.getId();
if(id == gestureIds.WINK_ID || id == gestureIds.DOUBLE_WINK_ID) {
Log.d("EyeGesture", "Wink");
} else if (id == gestureIds.BLINK_ID || id == gestureIds.DOUBLE_BLINK_ID){
Log.d("EyeGesture", "Blink");
} else if (id == gestureIds.LOOK_AT_SCREEN_ID || id == gestureIds.LOOK_AWAY_FROM_SCREEN_ID) {
Log.d("EyeGesture", "Screen");
}
}
};
}
In my onCreate I have:
//..
super.onCreate(bundle);
createEyeGestureDetector(this);
//..
Update Logcat:
When I do:
for (EyeGesture eg : EyeGesture.values()) {
boolean supported = mEyeGestureManager.isSupported(eg);
Log.w("yupyup", eg.name() + ":" + supported);
}
I get:
12-10 18:40:51.252 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ WINK:true
12-10 18:40:51.252 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ DOUBLE_WINK:false
12-10 18:40:51.252 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ BLINK:false
12-10 18:40:51.252 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ DOUBLE_BLINK:true
12-10 18:40:51.260 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ DON:true
12-10 18:40:51.268 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ DOFF:true
12-10 18:40:51.268 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ LOOK_AT_SCREEN:true
12-10 18:40:51.268 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ LOOK_AWAY_FROM_SCREEN:false
I also added (from the first github link):
@Override
protected void onStart(){
super.onStart();
createEyeGestureDetector(this);
for (EyeGesture eg : EyeGesture.values()) {
boolean supported = mEyeGestureManager.isSupported(eg);
Log.w("yupyup", eg.name() + ":" + supported);
}
mEyeGestureManager.register(EyeGesture.LOOK_AT_SCREEN, mEyeGestureListener);
mEyeGestureManager.register(EyeGesture.LOOK_AWAY_FROM_SCREEN, mEyeGestureListener);
mEyeGestureManager.register(EyeGesture.WINK, mEyeGestureListener);
}
and
@Override
protected void onStop(){
mEyeGestureManager.unregister(EyeGesture.LOOK_AT_SCREEN, mEyeGestureListener);
mEyeGestureManager.unregister(EyeGesture.LOOK_AWAY_FROM_SCREEN, mEyeGestureListener);
mEyeGestureManager.unregister(EyeGesture.WINK, mEyeGestureListener);
super.onStop();
}
This gives me:
12-10 18:46:11.314 2553-2553/com.google.android.glass.websurg.websurg I/EyeGestureManager﹕ Removing listener: com.google.android.glass.websurg.websurg.ResultActivity$1#41b8b908 for eye gesture: LOOK_AT_SCREEN
12-10 18:46:11.314 2553-2553/com.google.android.glass.websurg.websurg I/EyeGestureManager﹕ Removing listener: com.google.android.glass.websurg.websurg.ResultActivity$1#41b8b908 for eye gesture: LOOK_AWAY_FROM_SCREEN
12-10 18:46:11.314 2553-2553/com.google.android.glass.websurg.websurg I/EyeGestureManager﹕ Removing listener: com.google.android.glass.websurg.websurg.ResultActivity$1#41b8b908 for eye gesture: WINK
However, they do not get detected, not even WINK, even though it seems to be supported.
The Google team already answered some of these, but I will go ahead and provide more details about their answer and also provide an alternate way of doing the things you requested.
Dim the screen if there isn't an expectation that a user is looking at
it.
This is consistent with the "in the here and now" experience of Glass.
Glassware should always dim the screen if there isn't an expectation
that a user is looking at it. Ideally it behaves like a timeline and
dims after 15s. A user can 're-wake' the screen by looking up.
Update to be made: If a user is not looking at the results set in the
card scroller, dim the screen.
Glass handles that itself, but the problem is that if the user doesn't touch the Glass pad for about 10 seconds or more, Glass will go to sleep and your app will stop running.
A great way of fixing this is to keep the Glass screen always on and check when the user looks at the screen or when they remove the Glass.
If the user looks at the screen, increase the brightness of the screen; if they look away, decrease the brightness of the screen.
If they remove the Glass from their face, decrease the brightness to zero, turn off the screen and stop running all the big CPU-intensive code you have.
If they put the Glass back on their face, increase the brightness of the screen, turn the screen on and then re-enable all your CPU-intensive code.
You could just have a boolean variable to determine when to start or stop running. This method is recommended if you don't want your app to stop running after a few seconds without touch events. It also saves battery when running your app.
Code Examples for the things I said above are below:
To Get Screen Brightness:
//Get Screen Brightness
public float getScreenBrightness() {
WindowManager.LayoutParams wMLayout = getWindow().getAttributes();
return wMLayout.screenBrightness;
}
To Set Screen Brightness(0 to 1):
//Set Screen Brightness
public boolean setScreenBrightness(float sBrightness){
if(sBrightness>=0){
WindowManager.LayoutParams wMLayout = getWindow().getAttributes();
wMLayout.screenBrightness = sBrightness; //Modify Brightness
getWindow().setAttributes(wMLayout); //Apply changes
return true;
}else
{
return false;
}
}
To Keep the Screen On or Off:
//Turn Screen On/Off
public void keepScreenOn(boolean screenOn){
if(screenOn) {
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
}else{
getWindow().clearFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
}
}
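To connect these helpers to the don/doff idea above, here is a minimal hedged sketch (the boolean flag and the handleDonDoff method name are hypothetical; you would call it from the onDetected callback of the EyeGesture listener covered later in this answer):
private boolean isRunning = true; // hypothetical flag gating the CPU-intensive work

// Call this from the EyeGesture listener's onDetected(EyeGesture gesture):
private void handleDonDoff(EyeGesture gesture) {
    if (gesture == EyeGesture.DOFF) {
        setScreenBrightness(0);   // user took Glass off
        keepScreenOn(false);
        isRunning = false;        // pause the heavy work
    } else if (gesture == EyeGesture.DON) {
        setScreenBrightness(1);   // user put Glass back on
        keepScreenOn(true);
        isRunning = true;         // resume the heavy work
    }
}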
Don't forget to add the permission in the Manifest:
<uses-permission android:name="android.permission.WAKE_LOCK" />
If you are just developing and don't want to worry about permissions right now, you can use:
<uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT" />
and avoid having to look up which permission to use. I suggest you use that for now, as it will save you time during development and you don't have to worry about permissions while coding.
[EYE GESTURE]
There is no official Google API available for detecting those. Anything available now is a little hack to access the hidden Glass API for it. The Glass team is working on it, and they said the API will only be released when it is reliable. Right now, it is NOT perfect according to them.
NOTE
The answer I am about to post below SHOULD work but may NOT work after the next Glass update. When they do update, something magically changes and one function will STOP working. The Glass API and Glass itself are in beta, so expect things to keep changing until the official Eye Gesture API gets released.
There are two ways to detect an eye gesture. One way is to use an IntentFilter and wait for the "gesture" message. Another way is to use a stub library to access the hidden Glass API. I will talk about both here, as each method has its pros and cons.
Method 1 (Stub Lib):
This is the way you are currently trying to do it.
Pros:
Can detect more gestures
Cons:
Wink CANNOT be stopped from taking pictures.
You are using a different library than the one I used, which is still working. I will try to fix your problem; if that doesn't work, you should do it the way I did mine, with the library I used.
You got Step 1 wrong.
Step 1: Create the stubs:
Create a package called com.google.android.glass. In this package
create two classes: EyeGesture and EyeGestureManager
It should be
com.google.android.glass.eye
NOT
com.google.android.glass
com.google.android.glass may have worked in the past but there were too many updates.
So, EyeGesture and EyeGestureManager must be placed in your package called com.google.android.glass.eye
If Eye gesture is still not detected, forget about that library and use the one I am currently using. Close your project and create a new one.
Steps:
1) Download the library from here (last updated 4 months ago; the one you are currently using was probably last updated 8 months ago or even a year ago).
https://github.com/prt2121/EyeGestureLib
The zip file will have a long name like "EyeGestureLib-fwenindioniwenubwudew".
Rename the Zip file to "EyeGestureLib".
Extract the folder inside with a long name like "EyeGestureLib-f8a9fef3bde4396f947106e78cd0be7c7ecdd5a6"
Rename that folder to "EyeGestureLib"
The "EyeGestureLib" folder should have two folders inside it called "EyeGestureStub" and "EyeGestureDemoApp" plus other useless files.
2) Open Eclipse and create a new project.
create a simple MainActivity activity.
3) Inside your MainActivity class:
private EyeGestureManager mEyeGestureManager;
private EyeGestureListener mEyeGestureListener;
private EyeGesture target1 = EyeGesture.WINK;
private EyeGesture target2 = EyeGesture.DOUBLE_BLINK;
private EyeGesture target3 = EyeGesture.LOOK_AT_SCREEN;
Inside onCreate:
mEyeGestureManager = EyeGestureManager.from(this);
mEyeGestureListener = new EyeGestureListener();
Inside onStart:
mEyeGestureManager.register(target1, mEyeGestureListener);
mEyeGestureManager.register(target2, mEyeGestureListener);
mEyeGestureManager.register(target3, mEyeGestureListener);
Inside onStop:
mEyeGestureManager.unregister(target1, mEyeGestureListener);
mEyeGestureManager.unregister(target2, mEyeGestureListener);
mEyeGestureManager.unregister(target3, mEyeGestureListener);
Inside MainActivity (Not inside any function but just anywhere inside you MainActivity class):
private class EyeGestureListener implements Listener {
@Override
public void onEnableStateChange(EyeGesture eyeGesture, boolean paramBoolean) {
}
@Override
public void onDetected(final EyeGesture eyeGesture) {
//Show what we just detected
Log.i(eyeGesture.toString() , " is detected");
//Check which eye event occured
if (eyeGesture.name().equals(target1.name())) {
// Wink
Log.i("EyeGesture: ", " you just winked");
} else if (eyeGesture.name().equals(target2.name())) {
// Double blink
Log.i("EyeGesture: ", " you just double blinked");
} else if (eyeGesture.name().equals(target3.name())) {
// Look at Screen
Log.i("EyeGesture: ", " you looked at the screen");
}
}
}
}
4) You will get an error.
Import the EyeGestureStub that is inside the EyeGestureLib folder to fix it.
To fix the error:
a) Go to File -> Import -> Android -> Existing Android Code into Workspace
Click Next, then Browse to the EyeGestureStub folder inside the EyeGestureLib folder.
Make sure to exclude "EyeGestureDemoApp" if it is there. You ONLY need the EyeGestureStub folder, which contains EyeGesture and EyeGestureManager.
b) Right click on "EyeGestureStub" -> Properties -> Android ->
On the right side, under Project Build Target, make sure that the "Glass Development Kit Preview" check-box is checked.
Under Library, make sure that the "Is Library" check-box is checked.
Click Apply and Ok to exit the window.
c) Open the Android SDK Manager and check the version of Android SDK Build-tools installed. I have 21.1.1.
d) Open the project.properties of EyeGestureStub and change sdk.buildtools=18.1.1 to sdk.buildtools=21.1.1
Finish.
Done. It should work if you followed the instruction.
Run it and choose MainActivity as the Launch Activity.
<-------------------------------------------------------------------------------------------------------------------------------->
[STILL NOT WORKING? IMPORT EVERYTHING && work from there]
If you can't get it to work, delete the current project, import the whole downloaded project and work your way up from there. This is the easiest way. You may need to fix some errors before you can compile.
To import the project,
1) Go to File -> Other -> Android -> Android Project from Existing Code.
Next -> Browse
then choose the EyeGestureLib folder which contains both the EyeGestureStub and EyeGestureDemoApp.
Make sure that under Project to Import both the EyeGestureStub and EyeGestureDemoApp check-boxes are checked, then click Finish.
2) Right click on "EyeGestureStub" -> Properties -> Android ->
On the right side, under Project Build Target, make sure that the "Glass Development Kit Preview" check-box is checked.
Under Library, make sure that the "Is Library" check-box is checked.
Click Apply and Ok to exit the window.
3) Right click on "MainActivity" -> Properties -> Android ->
On the right side, under Project Build Target, make sure that the "Glass Development Kit Preview" check-box is checked.
4) You will get an invisible error that is not shown right away.
To see it, go to Window -> Show View -> Problems.
There you will see all the problems.
The next step to fix them is to match the Android SDK Build-tools version with the one listed in the project.properties of both EyeGestureStub and MainActivity.
a) Open the Android SDK Manager and check the version of Android SDK Build-tools installed. I have 21.1.1.
b) Open the project.properties of EyeGestureStub and change sdk.buildtools=18.1.1 to sdk.buildtools=21.1.1
c) Open the project.properties of MainActivity and change sdk.buildtools=18.1.1 to sdk.buildtools=21.1.1
Note: Changing the first project.properties may automatically change the second one.
Done. It should work if you followed the instruction.
Run it and choose MainActivity as the Launch Activity.
<-------------------------------------------------------------------------------------------------------------------------------->
Method 2 (IntentFilter)
Pros:
Wink CAN be stopped from taking pictures.
Cons:
Detects WINK ONLY
The first method can receive four events (WINK, DOUBLE_WINK, DOUBLE_BLINK, LOOK_AT_SCREEN), but this method can ONLY receive one event (WINK).
This method is useful if you just want to detect ONLY WINK without Glass taking a picture.
To listen to Intent, you have to extend BroadcastReceiver.
public class EyeGesture extends BroadcastReceiver {
@Override
public void onReceive(Context context, Intent intent) {
if (intent.getStringExtra("gesture").equals("WINK")) {
//Disable Camera Snapshot
abortBroadcast();
Log.e("WINKED ","");
}else {
Log.e("SOMETHING", "is detected " + intent.getStringExtra("gesture"));
}
}
}
You must register the intent in the Manifest as below:
<receiver android:name="com.inno.inno.glassplugin.EyeGesture">
<intent-filter>
<action android:name="com.google.android.glass.action.EYE_GESTURE" />
</intent-filter>
</receiver>
The name specified in the Manifest must match the name of the class listening to the intent which is EyeGesture.
Simple as that. No library required, but only WINK can be detected. It also stops Glass from taking a picture when a wink is detected. You can comment out abortBroadcast(); if you want Glass to take a picture when the event is detected.
This is for anyone looking to detect eye gestures on Glass at this moment. These are the only current solutions around until Google releases its official Eye Gesture API.
You should file a new Glass API feature request here. File it as a Glass Eye Gesture API request. If the Glass team receives enough of these feature requests, they will make it a top priority and release the API. I have already filed one.
Got a response from the Google Glass review team. Their response to:
Dim the screen if there isn't an expectation that a user is looking at
it.
This is consistent with the "in the here and now" experience of Glass.
Glassware should always dim the screen if there isn't an expectation
that a user is looking at it. Ideally it behaves like a timeline and
dims after 15s. A user can 'rewake' the screen by looking up.
Update to be made: If a user is not looking at the results set in the
card scroller, dim the screen.
was this:
The platform handles this, are you overriding this in some way, are
you holding a wake-lock when the result is shown?
So it seems that, as of now, it is not intended to work directly with the EyeGesture, since it apparently happens automatically (need confirmation on this part). In any case, there is no point in trying to handle LOOK_AT_SCREEN since LOOK_AWAY_FROM_SCREEN isn't handled:
12-10 18:40:51.268 2405-2405/com.google.android.glass.websurg.websurg W/yupyup﹕ LOOK_AWAY_FROM_SCREEN:false
For those interested in handling the WINK EyeGesture, here is what apparently works according to some information I've gathered (needs to be confirmed).
The idea is to use an EyeGesture and EyeGestureManager stub. Pretty much, they exist within the environment but not in the API. This means that to access them you need to create stubs that will work at runtime (at least that's how I understood it).
Apparently there is a known bug when handling the WINK EyeGesture: it will take a picture. This may be caused by the Google Glass setting where "Take a picture when a wink is detected" is activated (needs to be confirmed).
So how to actually handle it?
Step 1: Create the stubs:
Create a package called com.google.android.glass. In this package create two classes: EyeGesture and EyeGestureManager
EyeGesture:
package com.google.android.glass.eye;
import android.os.Parcel;
import android.os.Parcelable;
/**
* https://gist.github.com/victorkp/9094a6aea9db236a97f3
*
*/
public enum EyeGesture implements Parcelable {
BLINK, DOFF, DON, DOUBLE_BLINK, DOUBLE_WINK, LOOK_AT_SCREEN, LOOK_AWAY_FROM_SCREEN, WINK;
public int getId(){
return -1;
}
@Override
public int describeContents() {
return 0;
}
@Override
public void writeToParcel(Parcel dest, int flags) {
}
}
EyeGestureManager:
package com.google.android.glass.eye;
import android.content.Context;
/**
*
* If there are any updates required check: https://gist.github.com/victorkp/9094a6aea9db236a97f3
*
*/
public class EyeGestureManager {
public static final int INFINITE_TIMEOUT = -1;
public static final String SERVICE_NAME = "eye_gesture";
public interface Listener {
public void onDetected(EyeGesture gesture);
}
public static EyeGestureManager from(Context paramContext) {
return null;
}
public void activateGazeLogging(boolean paramBoolean) {
}
public boolean applyAndSaveCalibration(EyeGesture paramEyeGesture) {
return false;
}
public boolean clearCalibration(EyeGesture paramEyeGesture) {
return false;
}
public void enableGazeService(boolean paramBoolean) {
}
public boolean endCalibrationInterval(EyeGesture paramEyeGesture) {
return false;
}
public boolean isCalibrationComplete(EyeGesture paramEyeGesture) {
return false;
}
public boolean isGazeLogging() {
return false;
}
public boolean isRegistered() {
return false;
}
public boolean isSupported(EyeGesture paramEyeGesture) {
return false;
}
public boolean loadCalibration(EyeGesture paramEyeGesture) {
return false;
}
public boolean register(EyeGesture gesture, EyeGestureManager.Listener listener){
return false;
}
public boolean startCalibrationInterval(EyeGesture paramEyeGesture) {
return false;
}
public boolean unregister(EyeGesture gesture, EyeGestureManager.Listener listener) {
return false;
}
}
Great, you've got both stubs. The first thing you should notice is that they aren't 100% like the ones from the GitHub link. Some of the functions have been deprecated; I've kept the ones that work (I haven't tried every single one).
What is next? Well, you need (not really, but it's easier if you do) to create the GestureIds class (yes, you can put this anywhere you want).
GestureId:
package com.google.android.glass.websurg.websurg;
import com.google.android.glass.eye.EyeGesture;
/**
*
*
* For updates check out: https://gist.github.com/victorkp/9094a6aea9db236a97f3
*
*
*
*/
public class GestureIds {
public int BLINK_ID;
public int WINK_ID;
public int DOUBLE_BLINK_ID;
public int DOUBLE_WINK_ID;
public int LOOK_AT_SCREEN_ID;
public int LOOK_AWAY_FROM_SCREEN_ID;
public GestureIds() {
BLINK_ID = EyeGesture.BLINK.getId();
WINK_ID = EyeGesture.WINK.getId();
DOUBLE_BLINK_ID = EyeGesture.DOUBLE_BLINK.getId();
DOUBLE_WINK_ID = EyeGesture.DOUBLE_WINK.getId();
LOOK_AT_SCREEN_ID = EyeGesture.LOOK_AT_SCREEN.getId();
LOOK_AWAY_FROM_SCREEN_ID = EyeGesture.LOOK_AWAY_FROM_SCREEN.getId();
}
}
Great, now you have all the classes to actually get started. Keep in mind the stubs are the ones that will work correctly during runtime (at least I think so, since I don't get any errors).
Say you have a MainActivity and you want to add the EyeGesture:
private void createEyeGestureDetector(Context context) {
mGestureIds = new GestureIds();
mEyeGestureManager = EyeGestureManager.from(context);
mEyeGestureListener = new EyeGestureManager.Listener() {
@Override
public void onDetected(EyeGesture gesture) {
Log.w("EyeGestureListener", "Gesture: " + gesture.getId());
int id = gesture.getId();
if (id == mGestureIds.WINK_ID || id == mGestureIds.DOUBLE_WINK_ID) {
Log.d("EyeGesture", "Wink");
} else if (id == mGestureIds.BLINK_ID || id == mGestureIds.DOUBLE_BLINK_ID) {
Log.d("EyeGesture", "Blink");
} else if (id == mGestureIds.LOOK_AT_SCREEN_ID || id == mGestureIds.LOOK_AWAY_FROM_SCREEN_ID) {
Log.d("EyeGesture", "Screen");
}
runOnUiThread(new Runnable() {
@Override
public void run() {
Log.w("detected", "omg detected");
}
});
}
};
}
Notice I also have a runOnUiThread() with a run(). I've added it since one of the linked GitHub projects had it and the other didn't. I tried both, and the gestures still don't seem to get detected.
You call this function (the one right above) in your onCreate(). Then in your onResume():
mEyeGestureManager.register(EyeGesture.LOOK_AT_SCREEN, mEyeGestureListener);
mEyeGestureManager.register(EyeGesture.LOOK_AWAY_FROM_SCREEN, mEyeGestureListener);
mEyeGestureManager.register(EyeGesture.WINK, mEyeGestureListener);
Then in your onPause():
mEyeGestureManager.unregister(EyeGesture.LOOK_AT_SCREEN, mEyeGestureListener);
mEyeGestureManager.unregister(EyeGesture.LOOK_AWAY_FROM_SCREEN, mEyeGestureListener);
mEyeGestureManager.unregister(EyeGesture.WINK, mEyeGestureListener);
Now, all of this seems to work (no errors when the functions are called, and logs show that they were called; notice there are no logs in the stubs).
However, my Google Glass doesn't seem to be detecting the eye gestures. I'm pretty sure it's all good, or that I'm missing something minor. I won't accept this as an answer since it only answers part of my question. Feel free to try this out yourself and let me know how it works.

Android: Logout system and the android lifecycle

On the weekend I started to build my first Android app. As I need to ask the user of my app for credentials (which are used for further webservice usage), I want to simulate a "login system". At the start of my app the user should be asked to enter his credentials. When the user is inactive for too long, I want to dismiss the entered credentials and "log out" the user.
While coding, and afterwards while testing, I realized that the approach I had in mind doesn't work. After reading the documentation and several SO questions again and again, I question myself more and more whether I have fully understood the app/activity lifecycle and its possibilities. So I wanted to ask for help in understanding the lifecycle and its influence on my app. So yes, this might be several questions in one :/
For the moment my app consists of the following activities:
a search activity (which is opened once the app is started)
a settings activity (which can be accessed from the search dialog and has a link back to the search dialog)
After the user has entered an ID in the search dialog, I want to open an activity for the search result (NYI).
When starting to implement the user auth, my idea was the following:
Every time onResume() of an activity is called, I need to check a) whether user credentials are already stored and b) whether the last action of the user was less than X minutes ago. If one of these checks fails, I want to show a "login panel" where the user can enter his credentials, which are then stored in the SharedPreferences. For that I did the following:
I first built a parent activity which contains the check and a reference to the SharedPreferences:
public class AppFragmentActivity extends FragmentActivity {
protected SharedPreferences sharedPref;
protected SharedPreferences.Editor editor;
protected String WebServiceUsername;
protected String WebServicePassword;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_appfragmentactivity);
}
@Override
protected void onResume () {
super.onResume();
// Check if user is "logged in".
// Meaning: Are there given user credentials and are they valid of was the user inactive for too long?
// We only do this "onResume" because this callback is the only one, which is called everytime an user
// starts/restarts/resumes an application
checkForUserCredentials();
// Set new "last action" now "now"
setLastAction(new Date().getTime());
}
@Override
protected void onStart () {
// Fill content
super.onStart();
// Set global sharedPreferences
sharedPref = this.getSharedPreferences(getString(R.string.FILE_settings_file), Context.MODE_PRIVATE);
}
/*
* Checks if user credentials are valid meaning if they are set and not too old
*/
private void checkForUserCredentials() {
long TimeLastAction = sharedPref.getLong(getString(R.string.SETTINGS_USER_LAST_ACTION), 0);
long TimeNow = new Date().getTime();
// Ask for User credentials when last action is too long ago
if(TimeLastAction < (TimeNow - 30 * 60 * 1000)) { // times are in milliseconds, so 30 minutes
// Inactive for too long
// Set back credentials
setUsernameAndPassword("", "");
}
else
{
WebServiceUsername = sharedPref.getString(getString(R.string.SETTINGS_USER_USERNAME), "");
WebServicePassword = sharedPref.getString(getString(R.string.SETTINGS_USER_PASSWORD), "");
}
}
/*
* Saves the given last action in the sharedPreferences
* @param long LastAction - Time of the last action
*/
private void setLastAction(long LastAction) {
editor = sharedPref.edit();
editor.putLong(getString(R.string.SETTINGS_USER_LAST_ACTION), LastAction);
editor.commit();
}
/*
* Saves the given username and userpassword sharedPreferences
* @param String username
* @param String password
*/
private void setUsernameAndPassword(String username, String password) {
editor = sharedPref.edit();
editor.putString(getString(R.string.SETTINGS_USER_USERNAME), username);
editor.putString(getString(R.string.SETTINGS_USER_PASSWORD), password);
editor.commit();
WebServiceUsername = username;
WebServicePassword = password;
}
/*
* Method called when pressing the OK-Button
*/
public void ClickBtnOK(View view) {
// Save user credentials
EditText dfsUsername = (EditText) findViewById(R.id.dfsUsername);
String lvsUsername = dfsUsername.getText().toString();
EditText dfsPassword = (EditText) findViewById(R.id.dfsPassword);
String lvsPassword = dfsPassword.getText().toString();
if(lvsUsername.equals("") || lvsPassword.equals("")) {
TextView txtError = (TextView) findViewById(R.id.txtError);
txtError.setText(getString(R.string.ERR_Name_or_Password_empty));
}
else
{
// Save credentials
setUsernameAndPassword(lvsUsername, lvsPassword);
setLastAction(new Date().getTime());
// open Searchactivity
Intent intent = new Intent(this, SearchActivity.class);
startActivity(intent);
}
}
The "log in mask" is setContentView(R.layout.activity_appfragmentactivity);.
The two other activities I created then extend this parent class. This is one of them:
public class SearchActivity extends AppFragmentActivity {
SearchFragment searchfragment;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_search);
}
@Override
protected void onResume() {
super.onResume();
if(WebServiceUsername.equals("") && WebServicePassword.equals("")) {
// Username not set. Re"login".
Intent intent = new Intent(this, AppFragmentActivity.class);
startActivity(intent);
}
}
@Override
protected void onStart() {
super.onStart();
}
// ...
}
As far as I understand the lifecycle now, this should work as follows: when my app starts (the SearchActivity is set as the LAUNCHER activity), the app should step into the onResume() of my parent class. There it sees that the credentials are not yet stored and opens the layout of the AppFragmentActivity, which is the login. Once they are entered, the user is redirected to the SearchActivity, which now sees "OK, credentials are there, let's move forward". But this doesn't happen, as the login is never shown. So I think my onResume() might be wrong. Perhaps my whole idea is bad? Up to here I thought I also understood the lifecycle, but obviously I don't.
I then had a look around on SO for similar problems. One thing I saw here was a comment to a user who wanted to build a similar "logout" mechanism to mine, saying that he has to implement this in every activity. I thought about that and asked myself: "Why do I have to override onResume() in every one of my activities when they all have the same parent? When there's no onResume() in the child, the parent's one should be called." The user in that SO question was advised to use a service as a background thread to count down a timer for the logout. I then read the services article in the documentation and got fully disoriented:
There are two types of services: started and bound ones. A started service is started once by an activity and then runs in the background until hell freezes over if it doesn't get stopped. So it's fully independent of any app, but the programmer has to / should stop it when it's no longer needed. A bound service is bound to one or many app components and stops when all bound components end. I thought this might be a good alternative for me, but when I thought further I asked myself how: if one of my activities starts it (let's say the login dialog) and is then closed, the service is stopped, and the other activities would always start their own ones, which can't be the point of it. So this service must be bound not to any single component but to my app. But what's the lifecycle of an Android app? How can I keep information "global" inside my app? I know I can pass data between activities using Intents.
This increasingly "foggy cloud" led me to ask myself: "Shall I use only one activity and try to switch everything in and out using fragments?"
So my questions are (I think that's all of them, but I'm not sure anymore):
Is my idea of writing a parent class which does the checks for all extending children OK or bad, AND does it work the way I understood it?
Do I have to override onResume() in every child just to call the parent one for the checks?
Can you give me a tip why my "login systems" doesn't work?
What's the life cycle of an android app and how can I interact with it?
Shall I use only one activity and switch everything in and out using fragments, or is it a good way to have several activities, with some of them using fragments (to reuse frequently used parts)?
Thanks in advance.
What I've done in the end is the following:
I moved the "login" part from the parent class into a standalone activity. This activity is started whenever the credentials are not valid, together with a finish() of the calling one. That way I don't build a loop and I drop unused activities.
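A minimal hedged sketch of that final setup (the LoginActivity class name is hypothetical; the credential check and helpers are the ones from the parent class above):
// In the shared parent class's onResume() (or in each protected activity,
// with the helpers made accessible):
@Override
protected void onResume() {
    super.onResume();
    checkForUserCredentials();
    if (WebServiceUsername.equals("") || WebServicePassword.equals("")) {
        // Credentials missing or expired: hand off to a dedicated login activity
        startActivity(new Intent(this, LoginActivity.class));
        finish(); // drop this activity so we don't build a back-stack loop
        return;
    }
    setLastAction(new Date().getTime());
}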
