I'm working on an Android app that uses a game controller (at the moment a GameSir G3s gamepad on a Samsung Galaxy S8, connected through Bluetooth).
The application uses the standard Android API (level >= 19) for controller management.
I'm facing two major problems at this time:
how to distinguish D-pad presses from analog stick movements;
how to enumerate the controller's sources (i.e. axes, buttons, etc.) before any movement or button events arrive.
In particular, on this controller onGenericMotionEvent is triggered for both analog stick motion and D-pad presses:
@Override
public boolean onGenericMotionEvent(MotionEvent event)
{
    int source = event.getSource();
    if ((source & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK
            && event.getAction() == MotionEvent.ACTION_MOVE)
    {
        // ...
        return true;
    }
    return super.onGenericMotionEvent(event);
}
The source variable always contains 16777232 (i.e. SOURCE_JOYSTICK, according to the Android Developer docs).
If I attach the same controller to a standard Windows PC (through USB), D-pad events are shown (in a system dialog) differently from analog stick input. Of course this may depend on the controller electronics, which may behave differently depending on the connected host, but I'm wondering why anyway.
The only hint I found in the Android docs for digging into the gamepad's sources is InputDevice.getSources(), so I did:
@Override
public void onInputDeviceAdded(int deviceId)
{
    InputDevice device = InputDevice.getDevice(deviceId);
    if (null != device)
    {
        final int sources = device.getSources();
        Log.d("CONTROLLER", "Controller attached: " + device.getDescriptor()
                + ", sources=" + sources);
    }
}
The same check is also done at application startup (assuming the controller is already connected at this time).
In this case the sources variable contains 16786707 (0x1002513), which is SOURCE_JOYSTICK combined with the following constants:
SOURCE_GAMEPAD,
SOURCE_CLASS_BUTTON,
SOURCE_KEYBOARD,
SOURCE_CLASS_JOYSTICK,
SOURCE_CLASS_POINTER,
SOURCE_MOUSE.
Adding the individual values gives 0x2517 rather than the 0x2513 actually present, which seemed strange at first; the explanation is that source flags are combined with bitwise OR rather than arithmetic addition, so class bits shared by several sources (e.g. SOURCE_CLASS_BUTTON, which is part of both SOURCE_GAMEPAD and SOURCE_KEYBOARD) are only counted once.
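A rough sketch of the kind of code that decomposes the bitmask and enumerates the axes and a few common buttons before any event arrives (the dumpController helper below is just illustrative):

// Illustrative helper: log which SOURCE_* bits an InputDevice reports and
// enumerate its axes and a few common gamepad buttons up front.
public static void dumpController(InputDevice device) {
    final int sources = device.getSources();

    // Source flags are combined with bitwise OR, so shared class bits appear only once.
    if ((sources & InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD)
        Log.d("CONTROLLER", "reports SOURCE_GAMEPAD");
    if ((sources & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK)
        Log.d("CONTROLLER", "reports SOURCE_JOYSTICK");
    if ((sources & InputDevice.SOURCE_DPAD) == InputDevice.SOURCE_DPAD)
        Log.d("CONTROLLER", "reports SOURCE_DPAD");

    // Axes are available before any motion event arrives.
    for (InputDevice.MotionRange range : device.getMotionRanges()) {
        Log.d("CONTROLLER", "axis " + MotionEvent.axisToString(range.getAxis())
                + " min=" + range.getMin() + " max=" + range.getMax());
    }

    // Buttons can only be probed by key code (hasKeys requires API 19+).
    boolean[] present = device.hasKeys(KeyEvent.KEYCODE_BUTTON_A,
            KeyEvent.KEYCODE_BUTTON_B, KeyEvent.KEYCODE_DPAD_UP);
    Log.d("CONTROLLER", "A=" + present[0] + " B=" + present[1]
            + " DPAD_UP=" + present[2]);
}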
My doubts about the electronics' behaviour are confirmed: attached to Android, the axes are mapped differently than on Windows (on Windows the analog stick axes are mapped to X, Y, RX, RY, while on Android they are X, Y, Z, RZ respectively).
Does anyone have hints?
Best regards.
I finally found the solution. Digging into the official Android developer sources I found sample code that shows how to handle controller events (including D-pad ones); of course it wasn't working in my specific case.
The whole misunderstanding arises from the fact that my controller does not report itself as a SOURCE_DPAD device, so every check based on that distinction fails. In the specific case of the sample provided by Android Developers, the problem is this function:
public static boolean isDpadDevice(InputEvent event) {
    // Check that input comes from a device with directional pads.
    if ((event.getSource() & InputDevice.SOURCE_DPAD)
            != InputDevice.SOURCE_DPAD) {
        return true;
    } else {
        return false;
    }
}
That check prevents the caller method,
public int getDirectionPressed(InputEvent event) ...
which should handle the event, from ever doing so. In my case the source responsible for D-pad events is SOURCE_JOYSTICK.
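For completeness, here is a minimal sketch (not the official sample itself) of reading the D-pad through the hat axes of a joystick-sourced MotionEvent, assuming the controller reports the D-pad on AXIS_HAT_X / AXIS_HAT_Y, which is what the sample's getDirectionPressed() checks for motion events:

// Sketch: distinguish D-pad presses (hat axes) from analog stick movement
// on a controller that reports everything as SOURCE_JOYSTICK.
@Override
public boolean onGenericMotionEvent(MotionEvent event) {
    if ((event.getSource() & InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK
            && event.getAction() == MotionEvent.ACTION_MOVE) {
        float hatX = event.getAxisValue(MotionEvent.AXIS_HAT_X); // -1 = left, +1 = right
        float hatY = event.getAxisValue(MotionEvent.AXIS_HAT_Y); // -1 = up,   +1 = down
        if (hatX != 0f || hatY != 0f) {
            Log.d("CONTROLLER", "D-pad: hatX=" + hatX + " hatY=" + hatY);
        } else {
            Log.d("CONTROLLER", "Stick: x=" + event.getAxisValue(MotionEvent.AXIS_X)
                    + " y=" + event.getAxisValue(MotionEvent.AXIS_Y));
        }
        return true;
    }
    return super.onGenericMotionEvent(event);
}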
This is happening in several places for multiple class types, but I'll stick with a button example for now.
I have a button which I want TalkBack to announce as "Play". The content description is set to "Play". However, TalkBack also announces the class, so it reads "Play Button".
I tried a solution I found elsewhere: overriding the onInitializeAccessibilityNodeInfo method:
private void setupContentDescriptors() {
    mPlayPauseButton.setAccessibilityDelegate(new View.AccessibilityDelegate() {
        @Override
        public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
            super.onInitializeAccessibilityNodeInfo(host, info);
            // Blanked to prevent TalkBack from announcing the class/type
            info.setClassName("");
            info.setContentDescription("Play");
        }
    });
}
Setting the class name to "" worked perfectly, but I soon found out this solution only works on API 23 and above.
According to the docs, "Starting in API 23, delegate methods are called after host methods, which allows all properties to be modified without being overwritten by the host class."
I've tried several other approaches to no avail.
Ideas?
Prior to API 23, you will need to create a subclass and implement onInitializeAccessibilityNodeInfo() if you need to override the class name. You cannot override it by using a delegate.
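For example, a minimal sketch of that subclass approach (the PlayButton name and the "Play" description are just illustrative):

import android.content.Context;
import android.util.AttributeSet;
import android.view.accessibility.AccessibilityNodeInfo;
import android.widget.Button;

// Pre-API-23 approach: subclass the view and override
// onInitializeAccessibilityNodeInfo() directly so nothing overwrites the class name.
public class PlayButton extends Button {

    public PlayButton(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
        super.onInitializeAccessibilityNodeInfo(info);
        // Blank the class name so TalkBack does not also announce "Button".
        info.setClassName("");
        info.setContentDescription("Play");
    }
}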
That said, TalkBack is attempting to provide a consistent and high-quality experience for your user by speaking role descriptions. In the vast majority of cases, you should not attempt to override this behavior.
If you have a small and well-known user base, this may be another alternative to alanv's answer.
In TalkBack 5.2.1 (*) you can do this:
under Settings -> Accessibility -> TalkBack -> Settings -> Verbosity
there is a toggle for "Speak element type".
With this, users can decide for themselves whether they want to hear the element type or not. This is another argument NOT to tinker with the way TalkBack reads elements.
(*) I did not find any documentation about when the verbosity setting for speaking element types was introduced. On my Android devices with TalkBack 5.2.1 it works, while devices with TalkBack 5.0.3 don't have this setting, so it must have been introduced somewhere in between.
Have you tried

ViewCompat.setAccessibilityDelegate(mPlayPauseButton, new AccessibilityDelegateCompat() {
    @Override
    public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfoCompat info) {
        super.onInitializeAccessibilityNodeInfo(host, info);
        info.setClassName(null);
        info.setContentDescription("your label");
    }
});
ViewCompat should take care of the version handling.
My Android AccessibilityService is able to paste into other apps' EditText fields, but with browser text fields (the emulator's default browser or the Samsung default browser) it does not work and throws the error:
Cannot perform this action on a not sealed instance.
In the Android Chrome browser it works with some signup text fields, but not for all of them.
@Override
public void onAccessibilityEvent(AccessibilityEvent event) {
    AccessibilityNodeInfo source = event.getSource();
    if (source != null && event.getEventType() == AccessibilityEvent.TYPE_VIEW_FOCUSED) {
        // || event.getEventType() == AccessibilityEvent.TYPE_VIEW_CLICKED ) &&
        // event.getClassName().equals("android.widget.EditText")

        ctx = getApplicationContext();
        ClipboardManager clipboard = (ClipboardManager) ctx.getSystemService(Context.CLIPBOARD_SERVICE);
        ClipData clip = ClipData.newPlainText("label", "XYZ");
        clipboard.setPrimaryClip(clip);

        source.performAction(AccessibilityNodeInfo.ACTION_PASTE);
        // Not working, always returns false.

        // Tried with other options:
        Bundle argumentsTest = new Bundle();
        argumentsTest.putCharSequence(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE, "Bundle Test Data");
        source.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, argumentsTest);
        // Not working, throws java.lang.IllegalStateException:
        // "Cannot perform this action on a not sealed instance"
    }
}
I don't believe you're trying to do what you think you're trying to do.
When you set the "text" of an AccessibilityNodeInfo, what you're changing is the text property of that object as it pertains to your accessibility service. This does NOT mean that you are changing the text of the EditText box that the AccessibilityNodeInfo references. By the time your accessibility service gets this object, the two are quite separate from each other. Even if your code ran successfully, you would not get the results you expect.
Now, as for why you cannot perform this action: knowing the above, it should be obvious. It doesn't really make sense for an accessibility service to be able to modify the nodes it holds, so they become sealed (think of this as a run-time enforcement of a constant). Accessibility nodes become sealed and unsealed at various points in their lifetime. The parts of the framework that have access to unsealed node infos are View classes and private APIs; anything accessibility-service related is going to be dealing with sealed, read-only instances.
As for why your original solution is not working, I believe we do not have enough information. The ACTION_PASTE approach is (approximately) the correct one; HOWEVER, there are an abundance of issues when doing this with web browsers. Browser version, Android version, device, website, etc. all play a role, especially if your setup is old enough not to use the new WebView (pure Chromium, rather than the old 4.x approach of odd embedded mobile WebViews based on outdated versions of WebKit, which will now never be updated). I recommend testing your code on an up-to-date Nexus device running at least Android 5.0 and seeing if your code works. If you cannot do this, report version information for your setup. If you already are, on what website?
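If you keep experimenting with the paste approach, one thing worth trying (a hedged sketch, not a guaranteed fix; the PasteService name is illustrative) is to act on a freshly queried, input-focused node rather than on the node attached to an earlier event:

import android.accessibilityservice.AccessibilityService;
import android.content.ClipData;
import android.content.ClipboardManager;
import android.content.Context;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

// Sketch: re-query the currently input-focused node just before performing
// ACTION_PASTE, instead of holding on to a possibly stale event source.
public class PasteService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        if (event.getEventType() != AccessibilityEvent.TYPE_VIEW_FOCUSED) return;

        AccessibilityNodeInfo root = getRootInActiveWindow();
        if (root == null) return;

        AccessibilityNodeInfo focused = root.findFocus(AccessibilityNodeInfo.FOCUS_INPUT);
        if (focused == null || !focused.isEditable()) return;

        ClipboardManager clipboard =
                (ClipboardManager) getSystemService(Context.CLIPBOARD_SERVICE);
        clipboard.setPrimaryClip(ClipData.newPlainText("label", "XYZ"));
        Log.d("A11Y", "paste performed: "
                + focused.performAction(AccessibilityNodeInfo.ACTION_PASTE));
    }

    @Override
    public void onInterrupt() {
        // No-op for this sketch.
    }
}

Whether this helps with browser text fields will still depend on the browser and Android versions mentioned above.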
How can I get the current input device in my Java application? I want to know whether the remote or the game controller is being used.
It is an Android application that I want to run on Amazon Fire TV. Unlike the Amazon Kindle, there is no touchscreen, but you can use a remote or a game controller. I would like to know if it is possible to detect what kind of input device the user is currently using.
The code I have so far is a standard Cordova application; once I know how to detect the current input device, I will write a plugin to pass the value to the JavaScript code. That is not the problem.
As mentioned in the comments, you should describe the steps you have already taken or the code you have already written, as that will help us give the most appropriate answer.
As a general rule, the official docs explain how to identify controllers on Fire TV:
https://developer.amazon.com/public/solutions/devices/fire-tv/docs/identifying-controllers
Basically, you need to write the identification code in your Cordova plugin as follows:
int hasFlags = InputDevice.SOURCE_GAMEPAD | InputDevice.SOURCE_JOYSTICK;
boolean isGamepad = (inputDevice.getSources() & hasFlags) == hasFlags;
This will allow you to find out if it's a gamepad. For a Fire TV remote the code you need is:
int hasFlags = InputDevice.SOURCE_DPAD;
boolean isRemote = ((inputDevice.getSources() & hasFlags) == hasFlags)
        && inputDevice.getKeyboardType() == InputDevice.KEYBOARD_TYPE_NON_ALPHABETIC;
The InputDevice class is documented on the Android developer site:
http://developer.android.com/reference/android/view/InputDevice.html
You basically need to import it in your plugin class for the above code to work:
import android.view.InputDevice;
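Putting it together, a rough sketch (the describeDevices name is just illustrative) that walks the currently attached devices and classifies each one:

// Sketch: classify every attached input device as gamepad and/or remote.
public static void describeDevices() {
    for (int deviceId : InputDevice.getDeviceIds()) {
        InputDevice device = InputDevice.getDevice(deviceId);
        if (device == null) continue;

        int gamepadFlags = InputDevice.SOURCE_GAMEPAD | InputDevice.SOURCE_JOYSTICK;
        boolean isGamepad = (device.getSources() & gamepadFlags) == gamepadFlags;

        boolean isRemote =
                (device.getSources() & InputDevice.SOURCE_DPAD) == InputDevice.SOURCE_DPAD
                && device.getKeyboardType() == InputDevice.KEYBOARD_TYPE_NON_ALPHABETIC;

        Log.d("INPUT", device.getName() + ": gamepad=" + isGamepad + ", remote=" + isRemote);
    }
}

To know which device produced a particular event, you can run the same checks on event.getDevice() inside dispatchKeyEvent() or dispatchGenericMotionEvent().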
I'm writing a game for OUYA and Android, and I'm using the trackpad on the OUYA controller. Whenever you touch it, a mouse pointer appears, and I can't find a way to hide it. I imagine this would be a problem for games on an Android netbook as well.
Has anyone found a way to interact with the cursor instead of just listening for events?
This won't hide the mouse, but it will at least help prevent touch events from interfering with your joystick processing code -- not a proper solution, I know, but it might still help people who land on this page:
@Override
public boolean onGenericMotionEvent(MotionEvent event) {
    if ((event.getSource() & InputDevice.SOURCE_CLASS_JOYSTICK) != 0) {
        // handle the event
        return true;
    } else {
        return false;
    }
}
Android currently does not expose any functionality to hide the mouse cursor. Whenever you have an external pointing device (i.e. a USB/Bluetooth mouse, trackpad, etc.), a mouse pointer will appear on screen whenever you interact with the device.
Unfortunately (as of JB 4.2.2) this means it is impossible without a modified ROM.
It is now possible to request pointer capture (View.requestPointerCapture(), available since API 26). You need to explicitly request capture:
fun onClick(view: View) {
view.requestPointerCapture()
}
As documented:
Android delivers pointer events from sources other than the mouse normally, but the mouse pointer is not visible anymore.
You can either handle pointer events by overriding onCapturedPointerEvent:
override fun onCapturedPointerEvent(motionEvent: MotionEvent): Boolean {
// Get the coordinates required by your app
val verticalOffset: Float = motionEvent.y
// Use the coordinates to update your view and return true if the event was
// successfully processed
return true
}
or registering an event handler for OnCapturedPointerListener:
myView.setOnCapturedPointerListener { view, motionEvent ->
// Get the coordinates required by your app
val horizontalOffset: Float = motionEvent.x
// Use the coordinates to update your view and return true if the event was
// successfully processed
true
}
And it's up to you to release the pointer when you're done:
override fun onClick(view: View) {
view.releasePointerCapture()
}
I know that the context of this question (OUYA development) may not apply overall, but this was the first search result when I looked into how to do this myself, so I figured I'd update the answer!
How do I capture the mouse in a Java application so that all mouse events (even ones that happen if the mouse is moved outside the app window) are seen by the Java app? This is like the Windows SetCapture function.
You don't; the JVM, or more specifically AWT, only generates input events when Windows sends it input events, and the JVM only registers for events that occur within its window.
You might be able to pull it off using JNI, but then again you might not -- it will depend on whether you can get your hands on the information required by the underlying API. Since that's likely to be a window handle, you won't have what you need to invoke the API, even from JNI.
You have to hook the mouse at the operating-system level. Windows (Swing, AWT, MFC, etc.) are only aware of mouse movements within their bounds. If you need to access the current position of the mouse regardless of where it is on the screen, you need to write an input hook (see the Windows Input Hooks documentation). You can then use JNI, or read the STDOUT of a Win32 console application designed to use the input hook to forward mouse events/positions to your Java code. I use the latter method in some of my user-interface test cases with success.
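As a rough illustration of that STDOUT approach (the helper executable name mousehook.exe and its one-"x y"-pair-per-line output format are made up for this sketch):

import java.io.BufferedReader;
import java.io.InputStreamReader;

// Sketch: launch a hypothetical native helper that uses a low-level mouse hook
// and prints global "x y" coordinates to stdout, and read them from Java.
public class MouseHookReader {
    public static void main(String[] args) throws Exception {
        Process hook = new ProcessBuilder("mousehook.exe").start();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(hook.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] parts = line.trim().split("\\s+");
                if (parts.length < 2) continue;
                int x = Integer.parseInt(parts[0]);
                int y = Integer.parseInt(parts[1]);
                System.out.println("global mouse at " + x + "," + y);
            }
        } finally {
            hook.destroy();
        }
    }
}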
I needed to do that too!
After searching the web I found that it's possible to use mouseMove in java.awt.Robot.
Basically, use Robot to move the mouse to the center of your frame. If the user moves it, check by how much and move it back to the center.
No additional packages or JNI are needed for this (my demo uses JOGL and vecmath, but that's just for the graphics). Is it good enough? Try the demo; it's here:
http://www.eit.se/hb/misc/java/examples/FirstPersonJavaProtoGame/
If the above solution is not good enough, then perhaps LWJGL is what you need:
http://www.lwjgl.org/javadoc/org/lwjgl/input/Mouse.html
/Henrik Björkman
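For anyone who wants to see the re-centering trick in isolation, here is a compact sketch using plain AWT/Swing (no JOGL):

import java.awt.AWTException;
import java.awt.Point;
import java.awt.Robot;
import java.awt.event.MouseEvent;
import java.awt.event.MouseMotionAdapter;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

// Sketch of the Robot re-centering trick: accumulate mouse deltas relative to
// the window centre, then warp the pointer back so it never leaves the frame.
public class MouseCaptureDemo {
    public static void main(String[] args) throws AWTException {
        Robot robot = new Robot();
        JFrame frame = new JFrame("Capture demo");
        frame.setSize(640, 480);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

        frame.addMouseMotionListener(new MouseMotionAdapter() {
            @Override
            public void mouseMoved(MouseEvent e) {
                Point center = new Point(frame.getWidth() / 2, frame.getHeight() / 2);
                int dx = e.getX() - center.x;
                int dy = e.getY() - center.y;
                if (dx != 0 || dy != 0) {
                    System.out.println("mouse delta " + dx + "," + dy);
                    // Warp the pointer back to the centre (in screen coordinates).
                    Point screenCenter = new Point(center);
                    SwingUtilities.convertPointToScreen(screenCenter, frame);
                    robot.mouseMove(screenCenter.x, screenCenter.y);
                }
            }
        });
        frame.setVisible(true);
    }
}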
Just use the system-hook library available on GitHub: https://github.com/kristian/system-hook
This only applies to Windows-based systems, but it is really simple to implement.
Sample usage
import lc.kra.system.keyboard.GlobalKeyboardHook;
import lc.kra.system.keyboard.event.GlobalKeyAdapter;
import lc.kra.system.keyboard.event.GlobalKeyEvent;

public class GlobalKeyboardExample {
    private static boolean run = true;

    public static void main(String[] args) {
        // Might throw an UnsatisfiedLinkError if the native library fails to load,
        // or a RuntimeException if hooking fails
        GlobalKeyboardHook keyboardHook = new GlobalKeyboardHook();
        System.out.println("Global keyboard hook successfully started, press [escape] key to shutdown.");

        keyboardHook.addKeyListener(new GlobalKeyAdapter() {
            @Override
            public void keyPressed(GlobalKeyEvent event) {
                System.out.println(event);
                if (event.getVirtualKeyCode() == GlobalKeyEvent.VK_ESCAPE)
                    run = false;
            }

            @Override
            public void keyReleased(GlobalKeyEvent event) {
                System.out.println(event);
            }
        });

        try {
            while (run) Thread.sleep(128);
        } catch (InterruptedException e) {
            /* nothing to do here */
        } finally {
            keyboardHook.shutdownHook();
        }
    }
}