I have implemented a ZoomViewGroup, which is capable of scrolling infinitely in all directions and also zooming infinitely while still delivering all touch events correctly offset to its child Views.
But when it comes to multi-touch, only the first pointer is offset correctly; all the others point to a wrong location, because the scaling factor also has to be taken into account. (When scaling_factor != 1.0f, pointer 0 has a different offset from its original location than pointer 1 or 2.)
I am saving my transformation in a Matrix, so it's easy to calculate the coordinates from screen to workspace and back using matrix.mapPoints(..) and the matrix inverse.
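For illustration, a minimal sketch of that mapping (field names follow the question; the helper methods are my own, and which direction counts as screen-to-workspace depends on how the matrix is built):

Matrix inverse_matrix = new Matrix();
float[] point = new float[2];

// Map a point forward through the transformation matrix.
void mapForward(float x, float y) {
    point[0] = x;
    point[1] = y;
    transformation_matrix.mapPoints(point);   // point[] now holds the mapped coordinates
}

// Map a point back through the inverse of the transformation matrix.
void mapBackward(float x, float y) {
    point[0] = x;
    point[1] = y;
    if (transformation_matrix.invert(inverse_matrix)) {
        inverse_matrix.mapPoints(point);
    }
}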
When I draw the child views, I can just apply the transformation matrix like this:
@Override
protected void dispatchDraw(Canvas canvas)
{
    canvas.save();
    canvas.concat(transformation_matrix);   // draw the children in workspace coordinates
    super.dispatchDraw(canvas);
    canvas.restore();
}
Same with touch events:
float[] touch_array = new float[2];

@Override
public boolean dispatchTouchEvent(MotionEvent event)
{
    touch_array[0] = event.getX();
    touch_array[1] = event.getY();
    transformation_matrix.mapPoints(touch_array);   // map the touch point into workspace coordinates
    event.setLocation(touch_array[0], touch_array[1]);
    return super.dispatchTouchEvent(event);
}
But MotionEvent.setLocation(float x, float y) offsets all pointers by the same amount. If we zoomed by 2.0f, the offset is different for each pointer, so I would need something like MotionEvent.setLocation(int pointer_index, float x, float y) to set each one individually. Is there anything I can do to achieve this?
EDIT: The only solution I know (and searched for) so far is to create a new MotionEvent with the new coordinates.
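As a hedged sketch of that idea: on API 11+ a copied event can be remapped in one call with MotionEvent.transform(Matrix), which applies the matrix to every pointer, so no per-pointer setLocation is needed (transformation_matrix as in the question):

@Override
public boolean dispatchTouchEvent(MotionEvent event)
{
    MotionEvent copy = MotionEvent.obtain(event);    // keep the original event untouched
    copy.transform(transformation_matrix);           // remaps all pointers at once
    boolean handled = super.dispatchTouchEvent(copy);
    copy.recycle();
    return handled;
}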
--
You can get the positions of all pointers (fingers touching) through the methods getX(int pointerIndex)/getY(int pointerIndex).
You can get the number of on-screen pointers through getPointerCount().
So to handle multi-touch you can do something like:
public boolean dispatchTouchEvent(MotionEvent event)
{
    for (int i = 0; i < event.getPointerCount(); i++)
    {
        touch_array[0] = event.getX(i);
        touch_array[1] = event.getY(i);
        transformation_matrix.mapPoints(touch_array);
        // dispatch a transformed copy so the original event stays untouched
        MotionEvent copy = MotionEvent.obtainNoHistory(event);
        copy.setLocation(touch_array[0], touch_array[1]);
        boolean handled = super.dispatchTouchEvent(copy);
        copy.recycle();
        if (handled) return true;
    }
    return false;
}
Also note that I created a copy of the MotionEvent object, so children can modify it without breaking the positions throughout the for loop.
Sorry, I only just noticed your last statement: you must copy the event to a new MotionEvent so each child can parse it and do its work. Copying also keeps your code safe from modifications; it's a bad idea to change the MotionEvent directly, since it can break a for loop in any dispatch method.
Related
I have code where I move an actor from point A to point B. I don't want it to stop at point B but to keep moving forward in exactly the same direction, and I don't know how to do it.
I also want it to move at a fixed speed no matter what the distance between the points is. Can someone help?
The action is:
Gdx.input.setInputProcessor(new InputAdapter() {
    @Override
    public boolean touchUp(int screenX, int screenY, int pointer, int button) {
        touchposx = screenX;
        touchposy = screenY;
        MoveToAction action = new MoveToAction();
        action.setPosition(touchposx + 300, screenHeight - touchposy + 300);
        action.setDuration(5f);
        bullet.addAction(action);
        return true;
    }
});
You can achieve this by using a combination of moveBy (for fixed speed), sequencing actions, and a remove action, for example:
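A rough sketch of that combination, with assumed names (targetX/targetY would be the touch position converted to stage coordinates, and travel/speed are placeholder values):

// Keep moving in the touch direction at a fixed speed, then remove the bullet.
Vector2 dir = new Vector2(targetX - bullet.getX(), targetY - bullet.getY()).nor();
float travel = 2000f;               // far enough to leave the screen
float speed = 300f;                 // world units per second
bullet.addAction(Actions.sequence(
        Actions.moveBy(dir.x * travel, dir.y * travel, travel / speed),
        Actions.removeActor()));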
But in my opinion, an Action is too complicated an approach for this. Actions are mainly designed for UI animations. Instead, I would override the bullet's act() method, use setPosition, and remove the Actor from the stage once it has left the screen.
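A rough sketch of that approach, with assumed class and field names (the direction would come from the touch position; moveBy is equivalent to setPosition with the current position plus a delta):

// Hypothetical Bullet actor: moves at a fixed speed in one direction and
// removes itself once it has left the stage.
public class Bullet extends Image {
    private final Vector2 velocity = new Vector2();

    public Bullet(Texture texture, float dirX, float dirY, float speed) {
        super(texture);
        velocity.set(dirX, dirY).nor().scl(speed);   // fixed speed regardless of distance
    }

    @Override
    public void act(float delta) {
        super.act(delta);
        moveBy(velocity.x * delta, velocity.y * delta);
        if (getRight() < 0 || getX() > getStage().getWidth()
                || getTop() < 0 || getY() > getStage().getHeight()) {
            remove();   // drop the actor once it is off screen
        }
    }
}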
I'm making a game using LibGDX. I need to know if the user is using two fingers and if they are placed in the correct position: one finger should be on the right side of the screen and the other on the left side.
The easiest solution without using any listeners is to iterate over some number of pointers and call Gdx.input.isTouched(). You have to set some maximum pointer count, but hey, people usually have only 20 fingers :)
final int MAX_NUMBER_OF_POINTERS = 20;
int pointers = 0;

for (int i = 0; i < MAX_NUMBER_OF_POINTERS; i++)
{
    if (Gdx.input.isTouched(i)) pointers++;
}

System.out.println(pointers);
From the documentation:
Whether the screen is currently touched by the pointer with the given index. Pointers are indexed from 0 to n. The pointer id identifies the order in which the fingers went down on the screen, e.g. 0 is the first finger, 1 is the second and so on. When two fingers are touched down and the first one is lifted the second one keeps its index. If another finger is placed on the touch screen the first free index will be used.
You can also easily get the position of a touching pointer by using Gdx.input.getX(i) and Gdx.input.getY(i), like:
final int MAX_NUMBER_OF_POINTERS = 20;

for (int i = 0; i < MAX_NUMBER_OF_POINTERS; i++)
{
    if (Gdx.input.isTouched(i))
    {
        int x = Gdx.input.getX(i);
        int y = Gdx.input.getY(i);
        // store or process the position of pointer i here
    }
}
and then you can, for example, put them into an array or check them directly.
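A small sketch of the actual check from the question (one finger on each half of the screen; variable names are my own):

final int MAX_NUMBER_OF_POINTERS = 20;
boolean leftTouched = false, rightTouched = false;
int halfWidth = Gdx.graphics.getWidth() / 2;

for (int i = 0; i < MAX_NUMBER_OF_POINTERS; i++)
{
    if (Gdx.input.isTouched(i))
    {
        if (Gdx.input.getX(i) < halfWidth) leftTouched = true;
        else rightTouched = true;
    }
}

boolean correctPosition = leftTouched && rightTouched;   // two fingers, one per side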
How to detect number of fingers being used?
You can do it with MotionEvent's getPointerCount().
You can detect how many fingers are on the screen by doing this:
int pointerCount = event.getPointerCount();
I need to know if the user is using two fingers and if they are placed in the correct position
You can get the X and Y for each pointer and compare them.
@Override
public boolean onTouch(View v, MotionEvent event) {
    float x = event.getX();
    float y = event.getY();
    return true;
}
For more information you can check the MotionEvent documentation to find what you need.
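As a hedged illustration of that comparison (same onTouch listener as above; the per-pointer getX/getY overloads take a pointer index):

@Override
public boolean onTouch(View v, MotionEvent event) {
    if (event.getPointerCount() == 2) {
        float x0 = event.getX(0);
        float x1 = event.getX(1);
        float half = v.getWidth() / 2f;
        // true when one finger is on each half of the view
        boolean oneOnEachSide = (x0 < half && x1 >= half) || (x1 < half && x0 >= half);
    }
    return true;
}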
Both GestureListener and InputProcessor have the touchDown method, which gives you the x and y values for each finger on the screen (pointer). Implement either one of these and override touchDown to suit your needs, as in the sketch below. This is a great tutorial to start with. Hope this helps.
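A minimal sketch of the InputProcessor variant (names are assumptions):

Gdx.input.setInputProcessor(new InputAdapter() {
    @Override
    public boolean touchDown(int screenX, int screenY, int pointer, int button) {
        // touchDown reports screen coordinates plus the pointer index
        boolean leftSide = screenX < Gdx.graphics.getWidth() / 2;
        System.out.println("pointer " + pointer + " down on the " + (leftSide ? "left" : "right") + " half");
        return true;
    }
});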
I have a LibGDX scene with a couple of Images (the Actor subclass). I want to drag one Image and drop it on another. I started with the source code located at DragDropTest.java. Since I basically want the source to be the payload I've tried modifying payload.setDragActor to use the source Image. It kind of works, I need to add the code to place the payload actor back in the stage but that isn't my issue.
My problem is that the payload (whether it's the source actor or a separate actor) doesn't really get dragged. Instead, the payload actor positions itself slightly down and to the right of the mouse cursor. I want to place the payload, not point to where I want the payload placed. It doesn't feel like dragging at all; it feels like something is following the cursor. I see the same behavior on the Android emulator as I do on the desktop version of the app.
I went digging through the LibGDX source for com.badlogic.gdx.scenes.scene2d.utils.DragAndDrop and found the answer. The code places the payload +14 in the X direction from the cursor and (-20 - payLoadActor.getHeight()) in the Y direction, which is why I'm not able to visually drag the payload. There is a setDragActorPosition method that can be used to correct the position. If you always want the dragged payload to be centered under the cursor you can do:
final DragAndDrop dragAndDrop = new DragAndDrop();
dragAndDrop.setDragActorPosition(-(sourceImage.getWidth()/2), sourceImage.getHeight()/2);
If you want the dragged payload to maintain its placement under the cursor/finger, then you have to use the cursor position when calling setDragActorPosition in the dragStart method.
final DragAndDrop dragAndDrop = new DragAndDrop();
dragAndDrop.addSource(new DragAndDrop.Source(sourceImage) {
    public DragAndDrop.Payload dragStart(InputEvent event, float x, float y, int pointer) {
        DragAndDrop.Payload payload = new DragAndDrop.Payload();
        payload.setDragActor(sourceImage);
        dragAndDrop.setDragActorPosition(-x, -y + sourceImage.getHeight());
        return payload;
    }

    public void dragStop(InputEvent event, float x, float y, int pointer, Target target) {
        sourceImage.setBounds(50, 125, sourceImage.getWidth(), sourceImage.getHeight());
        if (target != null) {
            sourceImage.setPosition(target.getActor().getX(), target.getActor().getY());
        }
        virtualStage.addActor(sourceImage);
    }
});
You're not showing any code, so it's hard to know what you are actually doing. Based on your description, it sounds like you are placing the payload where the cursor is, rather than moving it by the same offset the cursor has moved since you started dragging.
Assuming you're using an ActorGestureListener, this is what I do in my app.
float touchX, touchY;

public void touchDown(InputEvent ev, float x, float y, int pointer, int button) {
    touchX = x;
    touchY = y;
    ...
}

public void pan(InputEvent ev, float x, float y, float dx, float dy) {
    moveBy(x - touchX, y - touchY);
    ...
}
This way you are moving your actor by the same amount as the mouse cursor moved, so it's like the pointer "glued" itself to wherever you touched your actor. Also, I believe the events in libgdx make it so that you will not receive the pan events until the cursor has moved a bit inside your actor, so be aware of that when testing (moving the cursor only a few pixels will not trigger the pan events).
I want to make a game where you can build stuff by dragging and dropping objects into place. I think LibGDX only supports DragNDrop on Actors, but I need physics on bricks in order to make them fall down if the construction is not stable.
So far, my approach to drag and drop is:
for (Brick b : map.getList()) {
    final Image im = new Image(b.ar);
    stage.addActor(im);
    im.setPosition(b.posX, b.posY);
    im.setOrigin(b.posX, b.posY);
    im.addListener(new DragListener() {
        public void touchDragged(InputEvent event, float x, float y, int pointer) {
            im.setOrigin(x, y);
            im.setPosition(x, y);
            //System.out.println("touchdragged ---> X=" + x + " , Y=" + y);
        }
    });
}
where map.getList() contains all the bricks to be painted, and b.ar is the texture to be painted.
With this approach, [this] is what happens. I don't know what may be causing it.
@Override
public void render(float delta) {
    spritebatch.begin();
    map.getWorld().step(1/60f, 6, 2);
    renderer.render(map.getWorld(), camera.combined);
    if (Gdx.input.justTouched()) {
        Vector3 touchPoint = new Vector3(Gdx.input.getX(), Gdx.input.getY(), 0);
        camera.unproject(touchPoint.set(Gdx.input.getX(), Gdx.input.getY(), 0));
        System.out.println(touchPoint);
    }
    stage.draw();
    spritebatch.end();
}
Of course I'd like to make the body fall (with the Box2D engine from LibGDX) if you drop the object and it has nothing under it.
Thanks in advance
You're setting the origin in your listener callback to a screen coordinate. That is not going to work.
The origin is used to define the "center" of your object, so when you reposition it, LibGDX knows which part of the actor to put where. Generally the origin is either the bottom left corner of the object (I think this is the default) or it's the center of the object.
I guess you may want to reset the origin so if someone taps on the left edge of a brick and then you reposition the object you'll reposition that point on the brick (and not reposition the bottom left corner of the brick). To do that you'll need to convert the screen coordinates into coordinates in the actor's space.
That's all somewhat icky though. I think you'd be better off just doing relative repositioning. Instead of trying to position the brick absolutely with setPosition just reposition it relatively:
im.setPosition(im.getX() + dx, im.getY() + dy);
Then it doesn't matter where the "origin" is.
You'll have to compute dx and dy in your listener based on the previous touch point, roughly as in the sketch below.
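Something along these lines (a sketch using the same DragListener setup as the question; moveBy here is equivalent to setPosition(getX() + dx, getY() + dy)):

im.addListener(new DragListener() {
    private float lastX, lastY;

    @Override
    public void dragStart(InputEvent event, float x, float y, int pointer) {
        // remember where inside the actor the drag started
        lastX = x;
        lastY = y;
    }

    @Override
    public void drag(InputEvent event, float x, float y, int pointer) {
        // move by the delta; because the actor follows the pointer,
        // the local coordinates stay close to the start point
        im.moveBy(x - lastX, y - lastY);
    }
});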
It appears that the drag listener gives coordinates relative to the origin of the actor that is raising the event. That is a bit strange when you are moving that actor in response to the drag events, because the origin keeps changing. Essentially, I found that if I just move the actor by the x and y values of the event, it will follow the mouse or finger.
One improvement is to record the position that the drag started at and use it as an offset, so the pointer stays the same distance from the actor's origin.
Another option might be to add the listener to the stage instead of the button. I expect the coordinates would then be relative to the stage's origin, which is not changing. I haven't tried that technique.
Here's the code I used to drag a button horizontally:
DragListener dragListener = new DragListener() {
    private float startDragX;

    @Override
    public void dragStart(InputEvent event, float x, float y, int pointer) {
        startDragX = x;
    }

    @Override
    public void drag(InputEvent event, float x, float y, int pointer) {
        insertButton.translate(x - startDragX, 0);
    }
};
dragListener.setTapSquareSize(2);
insertButton.addListener(dragListener);
If you want to drag something in two dimensions, just copy the x code for the y position.
I asked this at the libgdx forums but didn't get a response so I was hoping y'all could help me out:
I have Actors that represent game pieces. What I'm trying to do is make it so the player can click-and-drag the tile to move it around the screen and rotate it multiple times before submitting the placeTile command. From what I understand of DragAndDrop, it doesn't seem to be designed with my use case in mind, so I figured I'd instead attach a DragListener to each game piece (code below). It works well for dragging, except I can't figure out how to set the 'minimum distance before drag starts' to 0... but that's not my main question (though any insights would be appreciated).
Anyway, the big problem comes in when I rotate the actor and then try to drag it: at 30 degrees rotation, drag acts almost like normal; at 60 degrees, very small movements of the mouse send the actor moving in a tight circle very quickly. Another 30 degrees and the tile actor exits the screen in 1-2 frames, moving in a wide arc. If the actor is rotated clockwise, its movements are clockwise; same pattern for counter-clockwise.
It looks like the translation of the actor is taking rotation into account; I guess my question is, is it possible to rotate an Actor/Group without the rotation affecting future translations? Alternatively, is there a better way to drag an Actor around the screen based on touch/mouse input? I included some code below: I imagine I'm screwing up something basic, but I can't figure out what:
// during initial stage creation
tileActor.setOrigin(tileActor.getWidth() / 2, tileActor.getHeight() / 2);
tileActor.addListener(new DragListener() {
    public void dragStart(InputEvent event, float x, float y, int pointer) {
        chosenTileActor = event.getTarget();
    }

    public void drag(InputEvent event, float x, float y, int pointer) {
        Actor target = event.getTarget();
        target.translate(x, y);
    }
});
And for the listener that deals with rotation via scrolling mouse wheel:
multiplexer.addProcessor(new InputAdapter() {
    @Override
    public boolean scrolled(int amt) {
        if (chosenTileActor == null)
            return false;
        else
            chosenTileActor.rotate(amt * 30);
        return true;
    }
});
Any pointers? Am I even going the right direction by using DragListener?
Thanks for reading!
Instead of translating, just set the actor's position directly to the stage coordinates of your drag event:
tileActor.addListener(new DragListener() {
    private float offsetX, offsetY;

    @Override
    public void dragStart(InputEvent event, float x, float y, int pointer) {
        Actor target = event.getTarget();
        this.offsetX = event.getStageX() - target.getX();
        this.offsetY = event.getStageY() - target.getY();
    }

    @Override
    public void drag(InputEvent event, float x, float y, int pointer) {
        event.getTarget().setPosition(event.getStageX() - offsetX, event.getStageY() - offsetY);
    }
});
I'm computing the offsets in dragStart so that the actor doesn't immediately jump to wherever I clicked when I started dragging (making the drags relative to my mouse). Tested this and it works with any rotation.