I have one Linux laptop and one Android tablet. I want to show my mouse movement on the Android tablet: I will move the mouse on the Linux laptop, and that movement will be shown on the Android tablet. Will libGDX work for this? Is there any good study material or sample program to start with?
There was a Libgdx Remote project, but that went in the other direction (pushing Android sensor data to your desktop so you can debug tilt code on your desktop). However, I believe that's been deprecated in the most recent builds.
The remnants of that project are the RemoteInput and RemoteSender classes. You may be able to adapt those for your case (setting up your Android device to be a network listener may be more work than forwarding mouse events).
This approach may be much lower-level than what you want (you may just want to get mouse coordinates from your laptop and map them directly to drawing something on the screen -- and not inject low-level events).
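If you go the do-it-yourself route, a very rough sketch of the desktop side could look like the one below. This is not the RemoteSender API; it just polls the system mouse position with java.awt.MouseInfo and streams it to the tablet as plain text over TCP (the IP address, port, and update rate are placeholders). The Android side would read those lines on a background thread and map them to its own screen coordinates before drawing.

    import java.awt.MouseInfo;
    import java.awt.Point;
    import java.io.PrintWriter;
    import java.net.Socket;

    public class MouseForwarder {
        public static void main(String[] args) throws Exception {
            // Placeholder address/port: replace with your tablet's IP and whatever port it listens on.
            try (Socket socket = new Socket("192.168.0.42", 8190);
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
                while (true) {
                    // Current mouse position on the laptop's screen.
                    Point p = MouseInfo.getPointerInfo().getLocation();
                    out.println(p.x + "," + p.y);   // e.g. "512,384"
                    Thread.sleep(16);               // roughly 60 updates per second
                }
            }
        }
    }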
Related
I just started learning to program for Android and sadly, after making one program (Hello World), I have already run into an issue. The gesture for "Swipe to Unlock" or for the camera is not functioning. I can bypass it on the stock emulator because it shows a notification I can click that will skip the screen, however I would much rather just figure out a solution. I have searched around on Google and Stack Overflow to no avail... maybe someone can help out.
Thanks in advance,
Jon
OK, solved! For those that might have the same problem, just press F2.
adb shell input keyevent 82
I had the same issue and grew tired of it, this always worked.
AVD_WXGA_TABLET emulator.
Android v 7
F2 does nothing.
Dragging with the left mouse button down tries to do a swipe up, but it bounces back down when I release the mouse.
I've tried fast and slow, release on the virtual device or off it. Short and long strokes. You name it...
The f'ing thing won't unlock. And for all the keystrokes available in the tool palette, including fingerprints, there is no simple swipe left/up/down/right. You'd think they would have had Ctrl-Alt-Arrow for that out of sheer common sense, or as a result of the developers trying to use it.
In my case, I just restarted the simulator by holding down the "Power" button in the simulator menu for a couple of seconds and then choosing "Restart." I was able to unlock the simulator easily after that.
I'm not sure exactly which step solved it, maybe all, but these were my steps
Use the SDK manager to update to the latest version of the emulator
Turn off the emulator (hold the power button and select Power Off)
Close Android Studio
Reopen Android Studio
Launch the emulator from the AVD manager
On Mac, you can keep holding the Command key and then do the swipe gesture with the mouse or trackpad of your computer.
Try one of these two ways:
Swipe up from top section of screen
Swipe up from bottom section of screen i.e. just above the navigation buttons.
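If the mouse gesture itself refuses to register, an alternative (in the same spirit as the adb command above) is to inject the swipe from the command line; the coordinates below assume roughly a 1080x1920 screen, so adjust them to your emulator's resolution:

    adb shell input swipe 500 1600 500 400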
I'm working on an application that will be released both for desktop and for mobile devices with touch screens. There is a feature which will require the user to double-click (for desktop).
How do I capture a "double-click" on a touch-screen device? Is that even possible?
I think you found a solution by now.
But this is how I did it,
I used the MousePressed event with the condition MouseEvent.getClickCount() == 2.
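For reference, a minimal sketch of that check, assuming a Swing/AWT component (the panel name and the println are just for illustration):

    import java.awt.event.MouseAdapter;
    import java.awt.event.MouseEvent;
    import javax.swing.JPanel;

    public class DoubleClickPanel extends JPanel {
        public DoubleClickPanel() {
            addMouseListener(new MouseAdapter() {
                @Override
                public void mousePressed(MouseEvent e) {
                    // getClickCount() returns 2 on the second click of a double-click.
                    if (e.getClickCount() == 2) {
                        System.out.println("Double-click at " + e.getX() + "," + e.getY());
                    }
                }
            });
        }
    }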
I have a game that's running on ouya. I want to make a game object follow the cursor, but mouseMoved() only gets called in the desktop version of libgdx. Similarly, Gdx.input.getX() and getY() only update when the mouse is clicked (ouya touchpad tapped).
badlogic recommends using their controller api over the official ouya one for various reasons, and it works perfectly with everything else. Is there a way for libgdx to return the cursor position in android? I really want to avoid rewriting all my other controls when the libgdx controller extension works in every other aspect and supports multiple controllers.
Thanks.
You might want to try a non-OUYA approach. I can plug a USB mouse and keyboard into my phone using a USB adapter, and I can pair a Bluetooth mouse with my phone too. In fact, you have to do this with a lot of those HDMI Android PC-on-a-stick devices. A mouse should cause all the touch events you normally detect in libGDX, but I think the OUYA touchpad functions as a mouse, so if you want to track it like a regular mouse you should be able to.
If you look at how the Android backend and PC backend are implemented, you'll see that getX and similar query the mouse on PC but return the last stored touch location on Android. But starting in Honeycomb you can get the cursor position with View.OnGenericMotionListener (see Android: Tracking mouse pointer movement).
So you can add in some android specific (but not OUYA specific) code to track the cursor position.
Oh, and Android 4 added even more mouse related support - https://developer.android.com/about/versions/android-4.0.html
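A rough sketch of that Android-specific glue, assuming API 14+ and that you wire it up in your AndroidApplication subclass (MyGame and its cursorX/cursorY fields are placeholders for your ApplicationListener and wherever you want to stash the position):

    import android.os.Bundle;
    import android.view.MotionEvent;
    import android.view.View;
    import com.badlogic.gdx.backends.android.AndroidApplication;
    import com.badlogic.gdx.backends.android.AndroidApplicationConfiguration;

    public class MainActivity extends AndroidApplication {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            AndroidApplicationConfiguration config = new AndroidApplicationConfiguration();
            // MyGame is your ApplicationListener; initializeForView gives you the libGDX View.
            View gameView = initializeForView(new MyGame(), config);
            gameView.setOnGenericMotionListener(new View.OnGenericMotionListener() {
                @Override
                public boolean onGenericMotion(View v, MotionEvent event) {
                    if (event.getAction() == MotionEvent.ACTION_HOVER_MOVE) {
                        // Hover events carry the cursor position even when no button is pressed.
                        MyGame.cursorX = event.getX();
                        MyGame.cursorY = event.getY();
                        return true;
                    }
                    return false;
                }
            });
            setContentView(gameView);
        }
    }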
I know it's been a while and you probably figured out how to do it, but I just wanted to let you know that I've submitted a patch to libGDX that allows for mouse tracking in the same way you do it on the desktop.
It's now done natively, starting with libGDX 1.4.1
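So on 1.4.1+ the same desktop-style tracking should work on Android as well; something along these lines (the class and field names are just illustrative):

    import com.badlogic.gdx.InputAdapter;

    public class CursorTracker extends InputAdapter {
        public int cursorX, cursorY;

        @Override
        public boolean mouseMoved(int screenX, int screenY) {
            // Per the change above, this is also called on Android when a cursor moves.
            cursorX = screenX;
            cursorY = screenY;
            return true;
        }
    }

    // In your ApplicationListener's create(): Gdx.input.setInputProcessor(new CursorTracker());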
I am working on a game with a virtual joystick in the bottom right corner of the screen and a "move" button in the bottom left. So you use the joystick to point the character in the right direction and press the "move" button to go forward in that direction. This was all working great until today, and now when I press on the screen to go forward, my joystick is being affected.

I know it is not a coding problem because I haven't opened the file that handles touches for 8 days and it has been working fine. Also, after I close my app and then use another app that is completely separate from libGDX, the multitouch is having the same issue. So do you guys think this is a problem with libGDX, my device, or am I just not coding the multitouch correctly for libGDX? I am using a Stage and the controls are Actors on the Stage.

I should note that after I restart my phone, the other app that is not using libGDX works correctly. It is only after I open my libGDX based game. I'm so frustrated with this that I am about to give up on libGDX. I am happy to post any code that is requested.
EDIT: Please see comments below. This seems to be a specific issue with the Galaxy Nexus and probably some other Samsung devices as well. This libGDX based app has had no issues on other devices such as the original Droid and the ASUS Transformer tablet.
Turns out this is not a libGDX problem at all... nor was it a problem with any of my code. There seems to be a bug in the way the Galaxy Nexus handles multitouch. You can view the bug report at this link. It seems that as of 4.0.4 there is still no fix for this bug. I am still currently running a VZW Galaxy Nexus with 4.0.2; I guess all we can do is wait and put a disclaimer in our games until (if ever) it is fixed.
By locking and unlocking the device, the problem does go away (sometimes), but will quickly come back if I go to my homescreen and re-open the app.
Well, I'm trying to hide the lower system bar on 3.0+ versions of Android, and I found this: Hiding Systembar in android 3.0 (Honeycomb).
Well, that thread was pretty old and 4.0 is even out now, so I was hoping there is a way to do this now that doesn't require the user to root the device.
This is not supported. Users need to be able to access the HOME and BACK buttons on screen when the device does not provide them as off-screen (physical) buttons.
Some Android devices that are running 2.x will get upgraded to 4.0. My understanding is that those devices will not have the bottom system bar, as those devices will have off-screen HOME and BACK buttons. Whether the top portion of the system bar (time, signal strength, notifications) will be removable via things like Theme.NoTitleBar.Fullscreen remains to be seen -- we will know more once the Nexus S and similar devices get upgraded.