I am making a racing game with two cars, each controlled by a different player on the same device. I use the onTouchEvent method to move the cars by drag and drop. I can't test multi-touch myself: my laptop only supports single-touch, so I can only control one car at a time, and I only have the Android Studio emulator, no physical Android device. So all I currently know is that this code lets the cars be moved one at a time. Does my code actually allow the cars to be moved at the same time (which is what I want), or only one at a time?
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        // press down
        case MotionEvent.ACTION_DOWN:
            // conditionals check which car is tapped on
            if (car.getCollisionShape().contains((int) event.getX(), (int) event.getY())) {
                car.setActionDown(true);
            } else if (car2.getCollisionShape().contains((int) event.getX(), (int) event.getY())) {
                car2.setActionDown(true);
            }
            break;
        // finger moved along screen
        case MotionEvent.ACTION_MOVE:
            // moves the cars horizontally to where the fingers are moved
            if (car.getActionDown()) {
                car.setPosition(event.getX(), car.y, true);
            } else if (car2.getActionDown()) {
                car2.setPosition(event.getX(), car2.y, true);
            }
            break;
        // finger released
        case MotionEvent.ACTION_UP:
            // conditionals check which car is tapped on
            if (car.getCollisionShape().contains((int) event.getX(), (int) event.getY())) {
                car.setActionDown(false);
            } else if (car2.getCollisionShape().contains((int) event.getX(), (int) event.getY())) {
                car2.setActionDown(false);
            }
            break;
    }
    return true;
}
Inside ACTION_DOWN, a car's boolean, which signals whether the car is currently being held down, becomes true.
Inside ACTION_MOVE, the car's x position is shifted to wherever event.getX() is (the y position stays fixed).
Unfortunately, your code is not designed to handle multi-touch. To make multi-touch work you need pointer IDs, which can be obtained with MotionEvent.getPointerId(); you also have to handle ACTION_POINTER_DOWN / ACTION_POINTER_UP (via event.getActionMasked()), because plain ACTION_DOWN only fires for the first finger. Note also that the else-if chain in your ACTION_MOVE guarantees that at most one car is updated per event.
The steps to handle multi-touch gestures are roughly as follows.
In ACTION_DOWN / ACTION_POINTER_DOWN, get the pointer ID and map it to a car instance according to the hit-test result.
In subsequent actions (ACTION_MOVE, ACTION_UP / ACTION_POINTER_UP), determine which car the MotionEvent targets based on the pointer ID, and then apply the change to that car instance.
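The bookkeeping those steps describe can be sketched framework-independently. Car, contains, and the field names below are stand-ins mirroring the question's code, not a real Android API; in a real onTouchEvent you would feed this from event.getActionMasked(), event.getPointerId(event.getActionIndex()), and a loop over event.getPointerCount() for ACTION_MOVE:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for the question's car (hit test on x only, for brevity).
class Car {
    float x, y, width;
    Car(float x, float y, float width) { this.x = x; this.y = y; this.width = width; }
    boolean contains(float px, float py) { return px >= x && px < x + width; }
}

class MultiTouchDispatcher {
    private final Car[] cars;
    // maps an active pointer ID to the car it grabbed on (POINTER_)DOWN
    private final Map<Integer, Car> grabbed = new HashMap<>();

    MultiTouchDispatcher(Car... cars) { this.cars = cars; }

    void onPointerDown(int pointerId, float x, float y) {
        for (Car c : cars) {
            // grab the first car under the finger that isn't already held
            if (c.contains(x, y) && !grabbed.containsValue(c)) {
                grabbed.put(pointerId, c);
                break;
            }
        }
    }

    void onPointerMove(int pointerId, float x, float y) {
        Car c = grabbed.get(pointerId);
        if (c != null) c.x = x; // each finger moves only its own car
    }

    void onPointerUp(int pointerId) {
        grabbed.remove(pointerId); // release by ID, not by hit test
    }
}
```

The essential change from the question's version is the Map: each finger's pointer ID keeps its own reference to a car, so two fingers can drag two cars simultaneously.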
See the official training documentation for handling multi-touch gestures.
By the way, the emulator has a small feature for simulating multi-touch with the mouse (mainly for testing pinch and spread gestures). You may be able to use it to test your app; see the Pinch and spread section in this table for how to use it.
I am developing a game for Android with libgdx and I have a little problem: I want to trigger a function when the user swipes, which I have achieved. The problem is that the function only runs when the swipe finishes (when you lift your finger). How can I make the function run while the swipe is in progress?
This is the current code for the gesture listener:
private static class DirectionGestureListener extends GestureAdapter {
    DirectionListener directionListener;

    public DirectionGestureListener(DirectionListener directionListener) {
        this.directionListener = directionListener;
    }

    @Override
    public boolean fling(float velocityX, float velocityY, int button) {
        if (Math.abs(velocityX) > Math.abs(velocityY)) {
            if (velocityX > 0) {
                directionListener.onRight();
            } else {
                directionListener.onLeft();
            }
        } else {
            if (velocityY > 0) {
                directionListener.onDown();
            } else {
                directionListener.onUp();
            }
        }
        return super.fling(velocityX, velocityY, button);
    }
}
And the game scene:
Gdx.input.setInputProcessor(new SimpleDirectionGestureDetector(new SimpleDirectionGestureDetector.DirectionListener() {
    @Override
    public void onUp() {
        /*something*/
    }

    @Override
    public void onRight() {
        /*something*/
    }

    @Override
    public void onLeft() {
        /*something*/
    }

    @Override
    public void onDown() {
        /*something*/
    }
}));
I looked into GestureDetector.java in the libgdx source itself. fling() is only executed when a touchUp() event happens, which matches what you experienced.
I think one option is to implement the gesture detection yourself by extending the InputAdapter class (or implementing the InputProcessor interface), then work in touchDown() and touchDragged() to get the effect you're aiming for.
The idea is to record the touch's pointer ID when it first lands on the screen in touchDown() (its third parameter, int pointer, is what you're looking for), then use that ID to check and operate inside touchDragged(). If the ID matches, a simple approach is to check whether the user has moved a pre-defined distance, comparing against the original touch position.
Let's say we want a swipe gesture only when the user first touches the screen and then moves at least a pre-defined distance within a pre-defined duration. If the user touches the screen, wanders around without covering that distance, and only later moves far enough, it should not count as a swipe: the conditions require the gesture to come from the first intention (the initial touch). That means we measure distance against the original touch point, not the last moved point. Of course, you can customize the conditions to suit your needs.
From the above, the conditions for treating a touch as a swipe can include the following (adapt them as you like):
distance moved compared to the original touch point
duration it took to move (150 ms, say); the longer the duration, the less the user has to rush for the gesture to register as a swipe
only one swipe at a time takes effect, i.e. if the user swipes with two fingers simultaneously, only the first (or the last) counts
Treating the x and y directions separately is up to you. If you do, add handling inside touchDragged() to check each direction and emit the corresponding event, i.e. swipe-x or swipe-y, then hook it up by calling one of your methods so the game knows.
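The conditions above can be sketched in plain Java. The method names mirror libgdx's InputProcessor, but timestamps are passed in explicitly so the logic is easy to test in isolation; in a real game you would wire in the current time yourself, and the thresholds are illustrative:

```java
// Sketch of in-progress swipe detection: fires during touchDragged(),
// not on touch-up, and measures against the ORIGINAL touch point.
class SwipeDetector {
    private final float minDistance;   // e.g. 80 px
    private final long maxDurationMs;  // e.g. 150 ms
    private int activePointer = -1;    // only one swipe at a time
    private float startX, startY;
    private long startTime;

    SwipeDetector(float minDistance, long maxDurationMs) {
        this.minDistance = minDistance;
        this.maxDurationMs = maxDurationMs;
    }

    void touchDown(float x, float y, int pointer, long timeMs) {
        if (activePointer != -1) return;  // ignore extra fingers
        activePointer = pointer;
        startX = x; startY = y; startTime = timeMs;
    }

    // Returns "right"/"left"/"down"/"up" as soon as the drag qualifies,
    // null otherwise -- the swipe is reported while the finger is still down.
    String touchDragged(float x, float y, int pointer, long timeMs) {
        if (pointer != activePointer) return null;
        if (timeMs - startTime > maxDurationMs) return null; // took too long
        float dx = x - startX, dy = y - startY;              // vs original point
        if (Math.abs(dx) < minDistance && Math.abs(dy) < minDistance) return null;
        activePointer = -1; // report each swipe only once
        if (Math.abs(dx) > Math.abs(dy)) return dx > 0 ? "right" : "left";
        return dy > 0 ? "down" : "up";
    }

    void touchUp(int pointer) {
        if (pointer == activePointer) activePointer = -1;
    }
}
```

From touchDragged() you would call the corresponding DirectionListener method (onRight(), onLeft(), etc.) whenever a non-null direction comes back.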
I've implemented a game where a player can move a sprite around in a tile-based maze. The player controls the sprite with the arrow keys. What I want is to restrict the speed at which the player can move around, e.g. I don't want them to be able to hold down the arrow key and fly across the screen. I tried fixing this by implementing a sleep:
switch (keyCode) {
    case KeyEvent.VK_UP: // Up arrow key
        if (running) {
            player.Move(1); // Move North
        }
        paintPlayer(getGraphics());
        // So the player can't hold down the arrow key and fly across the screen,
        // force them to wait between inputs.
        // BUT this leads to problems if you do hold it down, moves end up in a 'queue'...
        try {
            Thread.sleep(150);
        } catch (InterruptedException ex) {
            Logger.getLogger(MazeView.class.getName()).log(Level.SEVERE, null, ex);
        }
        break;
    // etc.
But this leads to problems if you hold the arrow key down - the moves seem to end up in some kind of "queue" and you end up constantly crashing into a wall until all the moves from your extended key press are done. Is there a better way of doing this?
Previously I was doing a keyPressed() event, but as Riyafa suggested, I changed it to only fire on keyReleased(), and that's done it. There's no need to include sleeps anymore either.
I advise investigating Swing Timer. Its action is performed on the same thread as your key event listener and all the other GUI work (the event dispatch thread), so you can easily paint and update game state there based on key/button states. You can also use SwingUtilities.invokeLater(new Runnable() {...}) if you are waiting in some other thread. I haven't done Swing in a long time and have forgotten most of it, but I am sure this is how dynamic games are supposed to be done.
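A minimal sketch of the Swing Timer approach, assuming hypothetical Player/GameLoop classes standing in for the game's own: key listeners only record state, and the Timer applies at most one move per tick, so holding the key down cannot queue up moves.

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import javax.swing.Timer;

// Stand-in for the game's player; the real one would repaint, check walls, etc.
class Player {
    int row = 0;
    void move(int dRow) { row += dRow; }
}

class GameLoop implements ActionListener {
    final Player player = new Player();
    volatile boolean upHeld = false; // set true in keyPressed, false in keyReleased

    // one tick = at most one move, no matter how long the key is held
    @Override
    public void actionPerformed(ActionEvent e) {
        if (upHeld) player.move(-1);
    }

    Timer start() {
        Timer t = new Timer(150, this); // fires on the EDT every 150 ms
        t.start();
        return t;
    }
}
```

Because the Timer fires on the event dispatch thread, the action listener can repaint and mutate game state directly, with no sleeps and no blocked key events.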
I'm very new to Android programming, and trying to understand touch events with nested views. To start, here's a description of my app:
I have a RelativeLayout that I've added via the GUI editor; everything is default. I've also created a class called ClipGrid that extends ScrollView. Nested inside that, I make a HorizontalScrollView; inside that, a TableLayout and its rows. The rows contain buttons.
The end result is a grid of buttons. It displays 4x4 at once, but can scroll either direction to display other buttons.
I call it to the screen from my main activity like this:
ClipGrid clip_grid = new ClipGrid(this);
setContentView(clip_grid);
I did that just for testing purposes, and I think I will have to change it later when I want to add other views to my RelativeLayout. But I think it might have implications for touch events.
In the end, I want to detect when the grid has been moved and, when the user lifts their finger, snap the newly visible 4x4 grid of buttons to the edge of my layout. I'm just not sure how to go about implementing this, and any help would be appreciated. Thanks.
The way touch events are handled is a kind of cascading effect that starts from the top view and works down through the nested views. Basically, Android passes the event to each view until true is returned.
The general way you could implement the onTouchEvent event of a View would be:
@Override
public boolean onTouchEvent(MotionEvent event) {
    boolean actionHandled = false;
    final int action = event.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            // user first touches view with pointer
            break;
        case MotionEvent.ACTION_MOVE:
            // user is still touching view and moving pointer around
            break;
        case MotionEvent.ACTION_UP:
            // user lifts pointer
            break;
    }
    // if the action was not handled here, let the superclass try
    if (!actionHandled)
        actionHandled = super.onTouchEvent(event);
    return actionHandled;
}
Hi everyone,
in the game I'm developing with AndEngine there are a lot of sprites running around. Every one of these sprites has a touch area registered on the scene, because I display some information about a sprite when it is touched. The scene itself has an OnSceneTouchListener that I use to move the camera around and to zoom.
My problem is that every time the user moves the camera (by touching the display somewhere and dragging a finger around), the onAreaTouched() method of any sprite that happens to be under the finger gets called when the movement finishes (the finger is lifted). I already limited the triggering events to event.getAction() == UP (before that it was a real mess of called touch areas), but this is not enough. If the user is zooming or moving the camera, the sprites' touch areas should not be activated.
Is there any way I can distinguish between an onAreaTouched event and an onSceneTouched event? Which one is called first, and can I suppress the other?
This is my OnSceneTouched() method (simplified):
public boolean onSceneTouchEvent(Scene scene, final TouchEvent event) {
    boolean isZooming = event.getMotionEvent().getPointerCount() >= 2;
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        // REMEMBER FIRST TOUCH POINT, TO KNOW IN WHICH DIRECTION TO MOVE
        this.touchPoint = new Point(event.getMotionEvent().getX(), event.getMotionEvent().getY());
    } else if (event.getAction() == MotionEvent.ACTION_MOVE) {
        if (isZooming) {
            // DO SOME ZOOM STUFF
        } else {
            // DO SOME MOVEMENT STUFF
        }
        return true;
    }
    return false;
}
OK, this is not very interesting in itself, but as you can see I always return true to signal that the touch event was handled. Still, onAreaTouched() gets called.
This is a typical OnAreaTouched() Method of a sprite:
public boolean onAreaTouched(final TouchEvent touchEvent, float touchAreaLocalX, float touchAreaLocalY) {
    if (touchEvent.getAction() == TouchEvent.ACTION_UP) {
        // DISPLAY INFORMATION ABOUT THE SPRITE
        return true;
    }
    return false;
}
You see, there is nothing special about it. So I hope someone can help me find a way to suppress the onAreaTouched event when the onSceneTouched event should be used. Maybe I can somehow catch event.getAction() == UP in the onSceneTouched() method?
I hope I have explained the problem well enough for you to understand (sorry, it's not that easy for me). Any help is much appreciated, and thank you for your time!
regards
Christoph
edit:
After experimenting with MahdeTo's suggestion to tag the event somehow, I found out the following:
the TouchEvent that triggers onSceneTouchEvent() is not the same one that triggers onAreaTouched()
onAreaTouched() gets called about 20 ms later than onSceneTouchEvent()
the event that ends up calling onAreaTouched() actually starts when the user puts a finger down on the display (then, while the finger moves around, onSceneTouchEvent() gets called multiple times); when the finger is lifted, that first event stops and gets handled (I verified this by measuring the time)
So I came up with the solution of measuring how long a touch event lasts. If it lasts longer than 200 ms, I assume the user did not want to simply click but to move or zoom (because those actions usually take longer). Now onAreaTouched() only reacts when someone really meant to click, not when they accidentally swiped over the area.
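The duration heuristic described above boils down to a few lines; this is a plain-Java sketch with no AndEngine types, and the 200 ms threshold and class name are illustrative:

```java
// Records the down-time of a touch and classifies the release:
// a quick press counts as a click (deliver the area-touch),
// a long press-and-drag counts as a camera move/zoom (suppress it).
class ClickVsDragFilter {
    private final long clickThresholdMs;
    private long downTimeMs = -1;

    ClickVsDragFilter(long clickThresholdMs) { this.clickThresholdMs = clickThresholdMs; }

    void onDown(long timeMs) { downTimeMs = timeMs; }

    // true only if the finger came up quickly enough to be a click
    boolean isClickOnUp(long timeMs) {
        boolean click = downTimeMs >= 0 && (timeMs - downTimeMs) <= clickThresholdMs;
        downTimeMs = -1;
        return click;
    }
}
```

In the scene's ACTION_DOWN you would call onDown() with the current time, and in the sprite's onAreaTouched() only display the information when isClickOnUp() returns true.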
But it's still not a good solution, and I would really appreciate it if anyone knows more about controlling such events.
thank you
I have recently written a fairly complicated application using touch controls - pressing buttons on the screen, pinch zooming, rotating and moving objects with two fingers and more. What I ended up doing was iteratively improving the application and adding rules controlling how to respond to different combinations of touch events. Whenever something was wrong, I recorded the sequence of TouchEvents and added rules so that the misbehaving action was handled.
You can also create boolean switches that prevent onAreaTouched from executing: just add a condition that, for example when you want to touch an object, checks whether doNotTouchObject is true or false.
Another thing I found useful was touch-mode switchers. I wrote several methods that completely change the behavior when the user touches the screen. I believe you can switch them on the go, so it should be possible to call scene.setOnAreaTouchListener(touchListener) while you are touching the screen.
Since un/registering touch areas is blazing fast, simply unregistering certain touch areas while you perform some action is also a possibility. After the action is completed, re-register them.
I used a combination of all of the above and it works rather well, but it is essential to keep the code clean; otherwise debugging or implementing new features will be a pain.
Maybe you can use the TouchDetector classes:
use the ClickDetector instead of processing the touch-up event yourself; then only detected click events are passed to the sprites;
use the ScrollDetector for the scene scrolling, too;
enabling SceneTouchListenerBinding and TouchAreaBinding will also help you bypass all the unintended in-progress events.
Anticipating the day when multi-touch interfaces become more pervasive, are there libraries in Java that can be used for developing touch applications? I'm looking for interfaces similar to MouseListener / MouseMotionListener / MouseWheelListener.
The MT4j project has everything you need to develop multitouch applications in Java.
All the well-known multitouch gestures are already built in and can be accessed as simply as listening to mouse events (for example: component.addGestureListener(..)).
It also features a hardware-accelerated scene graph, similar to JavaFX.
You can even simulate multitouch input by connecting one or more mice to your machine.
Check it out at http://www.mt4j.org
Sparsh is still in my bookmarks from the last time I was investigating multitouch java solutions.
While not as straightforward as the typical mouse listener or click listener, it still provides a reasonable interface.
You need your listening class to implement sparshui.client.Client, which requires the processEvent method definition.
public void processEvent(int groupID, Event event) {
    if (event instanceof TouchEvent) {
        TouchEvent e = (TouchEvent) event;
        if (e.getState() == TouchState.BIRTH) {
            // do initial touch stuff
        } else if (e.getState() == TouchState.MOVE) {
            // do dragging stuff
        }
    } else if (event instanceof DragEvent) {
        DragEvent e = (DragEvent) event;
        // do DragEvent specific stuff
    } else if (event instanceof RotateEvent) {
        RotateEvent e = (RotateEvent) event;
        // do RotateEvent specific stuff
    } else if (event instanceof ZoomEvent) {
        ZoomEvent e = (ZoomEvent) event;
        // do ZoomEvent specific stuff
    }
    // several other gesture types....
}
After that, you need to start up the gesture recognition server, passing in your component
new ServerConnection("localhost", objectImplementingClientInterface);
Looking at the code examples on the site should give you a pretty good idea of the framework.
How about this: http://kenai.com/projects/macmultitouch
I am primarily working in Processing and designing my UI from the ground up. I've been looking for a solution that doesn't prescribe a UI framework, which both MT4J and JavaFX appear to do. Furthermore, MT4J appears to be abandoned.
This looks like a promising solution at least for Windows but I'm unsure if it's actually released yet:
http://wiki.gestureworks.com/index.php/GestureWorksCore:Gestureworks_Core_Tutorials
This is specifically for Processing, cross-platform, open-source and active:
https://github.com/vialab/SMT
MT4J doesn't work with Windows 8.
If the application is only for one user, you can use JavaFX. There are various listeners for touch events. But it is not possible to process two gestures at the same time, because all touch points are merged into one gesture. For big multi-touch screens that is a disadvantage; for normal screens, where there is only one user, it's fine.
But there is also GestureWorks, where you can define new gestures or use the predefined ones. The gestures are defined in an XML file (a format called GML). Any object can handle its own gestures, but you have to implement the hit test and the point assignment manually. There is a great tutorial, though.
Another library, which I haven't tested, is the Multi Touch SDK by PQ Labs.