I'm currently attempting to make an AI that will be able to learn how to play games on Android by actively monitoring certain features pertaining to the pixels on the screen, and how the user interacts with them.
My issue is that I cannot seem to find any relevant information regarding detecting MotionEvents that other applications receive. Are there any standard means by which I could set a global OnTouchEvent hook, thus receiving all user inputs regardless of the application that is active? If there aren't any standard methods, any ideas as to how one could achieve this?
One application cannot know anything that is going on inside another application unless that other app has specifically shared that data. This is for security and privacy reasons. (Imagine how unsafe phones would be if any app could know what the user is entering into any other app, such as a password.)
You can detect external touch events in your application.
The WINDOW_SERVICE system service can give you information about touches on the device. For that you need a layout that is only one pixel in size, so that it doesn't consume your click events in external applications.
mWindowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        1, // width is equal to 1px
        1, // height is equal to 1px
        WindowManager.LayoutParams.TYPE_PHONE, // a non-application window providing user interaction with the phone
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE | // this window never gets key input focus
        WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH, // this window receives touches that happen outside it
        PixelFormat.TRANSPARENT // the view will be transparent
);
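The touchLayout view itself can be a one-pixel view with a listener that reacts to touches outside the window. A minimal sketch (touchLayout, the listener, and the log tag are illustrative, and the app needs the "draw over other apps" / SYSTEM_ALERT_WINDOW permission for this kind of window):
LinearLayout touchLayout = new LinearLayout(this);
touchLayout.setLayoutParams(new ViewGroup.LayoutParams(1, 1));
touchLayout.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // FLAG_WATCH_OUTSIDE_TOUCH delivers taps outside the window as ACTION_OUTSIDE
        if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
            Log.d("TouchWatcher", "touch detected outside the overlay window");
        }
        return false;
    }
});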
Now you have to add your layout to the window manager.
//Adding view to the window manager
mWindowManager.addView(touchLayout, params);
For a better explanation with code, you can refer to this link:
http://allinmyspace.com/2016/08/28/android-detect-touch-events-on-external-applications/
Related
I would like to take a photo with the MediaStore.ACTION_IMAGE_CAPTURE intent, but disable the settings button in the top-left corner. I was able to find an EXTRA_SHOW_ACTION_ICONS parameter, but it is not well documented. This is the description:
The name of an Intent-extra used to control the UI of a ViewImage. This is a boolean property that specifies whether or not to show action icons.
Whether I set it to true or false, nothing changes. What does this parameter do? I use it like this:
takePictureIntent.putExtra(MediaStore.EXTRA_SHOW_ACTION_ICONS, false);
I would like to take a photo with the MediaStore.ACTION_IMAGE_CAPTURE intent, but disable the settings button in the top-left corner
As I have pointed out previously, you do not have control over the UI of third-party camera apps. There are hundreds of such apps; none of them have to give you any ability to control their UI.
What does this parameter do?
Based on a search of the Android source code, it does nothing in Android itself. If the device happens to have the AOSP Gallery app installed, it appears to control something in the image viewer there. It is certainly possible that some non-AOSP apps use that extra for some particular reason, but that behavior would vary by app.
I'm using the dispatchGesture API from Android accessibility.
I've added an overlay to the screen and I'm looking for a way to dispatchGesture behind the overlay (the overlay is what's intercepting the original gesture), since otherwise the gesture is dispatched on my OverlayView and doesn't play back in the app.
Is there any way to do this with the accessibility API?
For context - I want to be able to help people record actions in Android and replay them for accessibility.
Alas, there is no way to do this generally. A touchable overlay will capture all touches, as you're observing. It's not possible to do general-purpose filtering of touch events.
You've probably already thought of this, but if you're playing back pre-recorded gestures, you can remove your overlay before you dispatch them.
The general purpose filtering API doesn't exist because it's very difficult to filter touch events outside the system process without introducing serious jank.
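A rough sketch of that remove-then-replay idea from inside the AccessibilityService could look like this (overlayView, overlayParams, and windowManager stand for whatever was used to add the overlay; the stroke timing is arbitrary):
private void replayGesture(Path path) {
    // take the overlay off the screen so it cannot intercept the replayed gesture
    windowManager.removeView(overlayView);

    GestureDescription gesture = new GestureDescription.Builder()
            .addStroke(new GestureDescription.StrokeDescription(path, 0, 100))
            .build();

    dispatchGesture(gesture, new GestureResultCallback() {
        @Override
        public void onCompleted(GestureDescription gestureDescription) {
            // put the overlay back once the gesture has been delivered
            windowManager.addView(overlayView, overlayParams);
        }

        @Override
        public void onCancelled(GestureDescription gestureDescription) {
            windowManager.addView(overlayView, overlayParams);
        }
    }, null);
}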
You can set the FLAG_NOT_TOUCHABLE flag on your view's window params and then dispatch your click.
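If you keep the overlay on screen, toggling that flag before dispatching could look roughly like this (assuming overlayView and overlayParams are your existing overlay and its layout params, and gesture is a prepared GestureDescription):
// let touches pass through the overlay to the window underneath
overlayParams.flags |= WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE;
windowManager.updateViewLayout(overlayView, overlayParams);

// with the overlay no longer touchable, the dispatched gesture reaches the app behind it
dispatchGesture(gesture, null, null);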
Can anyone kindly help me out?
I want to create an app like the User Account Control (UAC) prompt. When UAC is running, we cannot click anywhere else on the screen until we close the UAC window, nor can we press any key to do anything, such as the Windows key or any of the function keys. I want to create a similar application using C++ code to control the keyboard and mouse, so that the mouse and keyboard are enabled only in my application window and disabled outside of it, and until I close my app I cannot perform any other task. My application would be just a simple graphical window with a close button and the above-mentioned controls.
A long time ago Windows supported system modal dialogs. These would prevent the user from interacting with other windows including the desktop. Microsoft removed support for this a long time ago due to the problems that it caused.
Now when Windows needs to provide a system modal window for UAC, it uses a bit of desktop magic. To simulate a system modal window, UAC does something like this:
Create a bitmap and take a snapshot of the current desktop.
Darken the bitmap
Create a new desktop
Set the new desktop as the currently active one.
Create a window the size of the new desktop and draw the bitmap in it.
Now they have a desktop that looks like the old one and acts as if it were a system modal window. You are then free to create a child window to grab input from the user. The example below shows how to create a desktop and switch to it, and should be a good starting point for what you want to do.
// TODO: Make a copy of the current desktop
// Prepare a new desktop and activate it
HDESK oldDesktop = GetThreadDesktop(GetCurrentThreadId());
HDESK desktop = CreateDesktop(L"MyDesktop", NULL, NULL, 0, GENERIC_ALL, NULL);
SwitchDesktop(desktop);
// TODO: Create the window that draws the snapshot of the old desktop
// TODO: Create a dialog with buttons and stuff
// Since we don't have a window sit for 5 seconds staring at blank screen
Sleep(5000);
// Switch back to the old desktop and destroy the one we created
// ALERT: If we crash or exit without doing this you will need to
// restart Windows
SwitchDesktop(oldDesktop);
CloseDesktop(desktop);
You can find more information in the documentation for the desktop-related APIs.
Anyone have an idea how the Amazon MP3 app displays the player controls inside the keyguard? I've been playing with some of the system alert and system overlay window parameters with no luck. A system alert will display over everything except the lock screen, and a system overlay will display over the lock screen, but in 4.0.3 and above you cannot receive the touch events.
I have also seen options like WindowManager.LayoutParams.TYPE_KEYGUARD_DIALOG, but it only seems to work for an activity; anything else I use it with gives me an exception telling me that I cannot use that flag with this type of window.
What I have noticed is that the player controls either replace or cover the keyguard clock and the player controls change size depending on the type of lock (like pattern lock or swipe lock). I've also noticed that the keyguard does not display my wallpaper anymore...
Any thoughts are welcome!
My game app has many complex graphical elements and I am worried that having a banner ad continuously on screen will detract too much from the game. My plan is that the user will be able to play each "level" in the game using the full screen without any ads, but upon completion of each level an ad will slide in on top of the top part of the screen. Presumably this means I have to somehow create a layout programmatically that will appear on top of the existing screen layout. Can this be done? If so, how?
It can be done.
As far as I know, you simply have to create a layout with a transparent background (android:background="@android:color/transparent").
Alternatively, you could start a new activity on each level completion.
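A rough sketch of the first suggestion, adding a transparent banner container on top of the game's content view from the Activity when a level completes (adView here is a placeholder for whatever banner view your ad SDK provides):
FrameLayout root = (FrameLayout) findViewById(android.R.id.content);

FrameLayout.LayoutParams params = new FrameLayout.LayoutParams(
        FrameLayout.LayoutParams.MATCH_PARENT,
        FrameLayout.LayoutParams.WRAP_CONTENT,
        Gravity.TOP);

adView.setBackgroundColor(Color.TRANSPARENT); // the game stays visible around the banner
root.addView(adView, params);

// slide the banner in from just above its final position
adView.setTranslationY(-100f);
adView.setAlpha(0f);
adView.animate().translationY(0f).alpha(1f).setDuration(300).start();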