I am making a bridge-building game for Android. As you know, a line is defined by two coordinates.
First, when we press the "put" button and select a dot, that dot becomes the first coordinate of the line. Second, while the finger stays on the screen, the point currently being touched is displayed as the provisional second coordinate. Finally, the point where the finger is released becomes the actual second coordinate of the line. And there will be more than one line.
I'd be glad if anyone could explain how to implement this.
You will have to override the onTouchEvent function of the respective activity:
@Override
public boolean onTouchEvent(MotionEvent event) {
    int action = event.getActionMasked();
    float x = event.getX();
    float y = event.getY();
    if (action == MotionEvent.ACTION_DOWN) {
        // save the coordinates somewhere as the start point of the line
    } else if (action == MotionEvent.ACTION_UP) {
        // save the coordinates as the end point of the line
    } else if (action == MotionEvent.ACTION_MOVE) {
        // display the coordinates as the provisional end point
    }
    return true;
}
Then you simply use the stored coordinates to draw a line between them, e.g. on a canvas inside a custom View in your activity.
You will find a sophisticated example here: http://www.vogella.com/tutorials/AndroidTouch/article.html
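For illustration, here is a minimal sketch (my own, not taken from that tutorial) of a custom View that follows the scheme above: it stores the pressed point, tracks the current finger position while moving, and commits the line on release. It ignores the "put" button and dot snapping for brevity, and the class and field names are made up for the example.

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;
import java.util.ArrayList;
import java.util.List;

public class BridgeView extends View {

    private final Paint paint = new Paint();
    private final List<float[]> lines = new ArrayList<float[]>(); // {x1, y1, x2, y2}
    private float startX, startY, currentX, currentY;
    private boolean dragging = false;

    public BridgeView(Context context, AttributeSet attrs) {
        super(context, attrs);
        paint.setStrokeWidth(5f);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                startX = event.getX();
                startY = event.getY();
                dragging = true;
                break;
            case MotionEvent.ACTION_MOVE:
                currentX = event.getX();
                currentY = event.getY();
                break;
            case MotionEvent.ACTION_UP:
                lines.add(new float[]{startX, startY, event.getX(), event.getY()});
                dragging = false;
                break;
        }
        invalidate(); // redraw with the updated coordinates
        return true;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        for (float[] line : lines) {
            canvas.drawLine(line[0], line[1], line[2], line[3], paint);
        }
        if (dragging) {
            // preview of the line that is currently being placed
            canvas.drawLine(startX, startY, currentX, currentY, paint);
        }
    }
}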
I've created a RelativeLayout that changes a button's position when I touch it, using an OnTouchListener. But when I release the button (ACTION_UP), I want it to go back to the position it had before this operation. Can anyone help me out, please?
My code (inside onCreate):
...
this.relative_layout.setOnTouchListener(new OnTouchListener()
{
    @Override
    public boolean onTouch(View view, MotionEvent event)
    {
        if (event.getAction() == MotionEvent.ACTION_MOVE)
        {
            float x = event.getX();
            float y = event.getY();
            my_button.setX(x);
            my_button.setY(y);
        }
        else if (event.getAction() == MotionEvent.ACTION_UP)
        {
            /* here's the issue that I'm
             * asking for help about:
             * what do I do to return the button
             * to the position it had before? */
        }
        return true;
    }
});
According to the doc, setX does:
Sets the visual x position of this view, in pixels. This is
equivalent to setting the translationX property to be the difference
between the x value passed in and the current left property.
So to reverse that, use setTranslationX(0);
Same thing for Y: setTranslationY(0);
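Put together, the ACTION_UP branch from the question could look something like this (a sketch against the question's own code, with the same relative_layout and my_button fields assumed):

this.relative_layout.setOnTouchListener(new OnTouchListener()
{
    @Override
    public boolean onTouch(View view, MotionEvent event)
    {
        if (event.getAction() == MotionEvent.ACTION_MOVE)
        {
            // follow the finger while it moves
            my_button.setX(event.getX());
            my_button.setY(event.getY());
        }
        else if (event.getAction() == MotionEvent.ACTION_UP)
        {
            // undo the visual offset so the button snaps back
            // to its original layout position
            my_button.setTranslationX(0);
            my_button.setTranslationY(0);
        }
        return true;
    }
});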
I have implemented a ZoomViewGroup, which is capable of scrolling infinitely in all directions and also zooming infinitely while still delivering all touch events correctly offset to its child Views.
But when it comes to multi-touch, only the first pointer is offset correctly; all others point to a wrong location, and that's because of the scaling factor I have to take care of (when scaling_factor != 1.0f, pointer 0 has a different offset from its original location than pointer 1 or 2).
I am saving my transformation in a Matrix, so it's easy to calculate the coordinates from screen to workspace and back using matrix.mapPoints(..) and the matrix inverse.
When I draw the child views, I can just apply the transformation matrix like this:
@Override
protected void dispatchDraw(Canvas canvas)
{
    canvas.save();
    canvas.concat(transformation_matrix);
    super.dispatchDraw(canvas);
    canvas.restore();
}
Same with touch events:
float[] touch_array = new float[2];

@Override
public boolean dispatchTouchEvent(MotionEvent event)
{
    touch_array[0] = event.getX();
    touch_array[1] = event.getY();
    transformation_matrix.mapPoints(touch_array);
    event.setLocation(touch_array[0], touch_array[1]);
    return super.dispatchTouchEvent(event);
}
But MotionEvent.setLocation(float x, float y) actually offsets all pointers by the same amount. If we zoomed by 2.0f, the offset is different for each pointer, so I would need something like MotionEvent.setLocation(int pointer_index, float x, float y) to set each one individually. Is there anything I can do to achieve this?
EDIT: The only solution I know (and searched for) so far is to create a new MotionEvent with the new coordinates.
--
You can get the positions of all pointers (fingers touching) through the methods getX(int pointerIndex) / getY(int pointerIndex).
You can get the number of pointers on screen through getPointerCount().
So to parse multi-touch you must do something like:
public boolean dispatchTouchEvent(MotionEvent event)
{
    for (int i = 0; i < event.getPointerCount(); i++) {
        touch_array[0] = event.getX(i);
        touch_array[1] = event.getY(i);
        transformation_matrix.mapPoints(touch_array);
        MotionEvent copy = MotionEvent.obtainNoHistory(event);
        copy.setLocation(touch_array[0], touch_array[1]);
        boolean handled = super.dispatchTouchEvent(copy);
        copy.recycle();
        if (handled) return true;
    }
    return false;
}
Also note that I created a copy of the MotionEvent object, so children can modify it without breaking the positions used later in the loop.
Sorry, I only just noticed your last statement: yes, you must copy the event into a new MotionEvent so the children can parse it and do their work. Copying also keeps your code safe from modifications; it's a bad idea to change the MotionEvent directly, since that can break a loop in any dispatch method.
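To expand on that last point, here is a rough sketch (my own, requiring API 14+ and not taken from either post above) that rebuilds the event with per-pointer mapped coordinates via MotionEvent.obtain(), which avoids the "same offset for every pointer" limitation of setLocation():

@Override
public boolean dispatchTouchEvent(MotionEvent event)
{
    int pointerCount = event.getPointerCount();
    MotionEvent.PointerProperties[] props = new MotionEvent.PointerProperties[pointerCount];
    MotionEvent.PointerCoords[] coords = new MotionEvent.PointerCoords[pointerCount];
    float[] point = new float[2];
    for (int i = 0; i < pointerCount; i++) {
        props[i] = new MotionEvent.PointerProperties();
        coords[i] = new MotionEvent.PointerCoords();
        event.getPointerProperties(i, props[i]);
        event.getPointerCoords(i, coords[i]);
        // map each pointer individually from screen to workspace coordinates
        point[0] = coords[i].x;
        point[1] = coords[i].y;
        transformation_matrix.mapPoints(point);
        coords[i].x = point[0];
        coords[i].y = point[1];
    }
    MotionEvent mapped = MotionEvent.obtain(event.getDownTime(), event.getEventTime(),
            event.getAction(), pointerCount, props, coords, event.getMetaState(),
            event.getButtonState(), event.getXPrecision(), event.getYPrecision(),
            event.getDeviceId(), event.getEdgeFlags(), event.getSource(), event.getFlags());
    try {
        return super.dispatchTouchEvent(mapped);
    } finally {
        mapped.recycle();
    }
}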
I am trying to draw an image that is larger than the screen, and let the user scroll around it. The image is some graphs that are calculated by my code from some data. (This is in Android using Java).
What I want is a View (actually a SurfaceView) that can be scrolled by the user. I have tried putting a SurfaceView inside a ScrollView but the whole image drawn by the SurfaceView is very large and so (a) there are performance issues as my app is drawing the whole SurfaceView when it should only need to do the part that is in the viewport and (b) my app crashes with "dimensions too large... out of memory" errors.
It would be best for me if the View could implement scrolling (vertical only) and then pass the y-value to my code so it could draw the visible part of my image in the viewport.
Alternatively I could capture the 'mouse' events and calculate the y-value myself.
Does anyone have any code so that I can do this?
Update: I should clarify that the image is drawn by my program code using Canvas.drawLine() and the like. I can calculate which parts of the image fit within my viewport, provided I have the scroll state as a y-value, and I can calculate the absolute y-coordinate of each point by subtracting that scroll y-value. What I need is a way to find out when the user scrolls the image and what the resulting y-value is. A scrollbar would also be nice.
Update: I am using SurfaceView so that I can update the image from another Thread and so improve user interface performance.
If you really have just drawn content, you don't need a scrollable client. Instead:
have a model (x, y, lines, etc.)
have a pan & zoom state
draw a scaled/zoomed instance of the model
just draw the content (sorry, no scroller in this solution)
@Override
public boolean onTouchEvent(MotionEvent event) {
    boolean isProcessed = scaleGestureDetector.onTouchEvent(event);
    if (isProcessed) {
        // Handle touch events here...
        switch (event.getAction() & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            start.set(event.getX(), event.getY());
            tap(event.getX(), event.getY());
            mode = DRAG_OR_TAP;
            break;
        case MotionEvent.ACTION_POINTER_DOWN:
            break;
        case MotionEvent.ACTION_UP:
            break;
        case MotionEvent.ACTION_POINTER_UP:
            mode = NONE;
            break;
        case MotionEvent.ACTION_MOVE:
            if (mode == DRAG_OR_TAP) {
                doPan(event.getX() - start.x, event.getY() - start.y);
                start.set(event.getX(), event.getY());
            }
            break;
        }
    }
    myView.scale(currentPan, currentScaleFactor);
    invalidate();
    return true;
}
private void doPan(float panX, float panY) {
    currentPan.x = currentPan.x + panX;
    currentPan.y = currentPan.y + panY;
}

private void tap(float x, float y) {
    ...
}
And last but not least, use the scale listener:
private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        float value = detector.getScaleFactor();
        currentScaleFactor = currentScaleFactor * value;
        // don't let the object get too small or too large
        boolean outOfRange = false;
        if (currentScaleFactor < 1f || currentScaleFactor > 20f) {
            currentScaleFactor = Math.max(1f, Math.min(currentScaleFactor, 20f));
            outOfRange = true;
        }
        if (!outOfRange) {
            // scale the viewport as well
            currentPan.x = currentPan.x * value;
            currentPan.y = currentPan.y * value;
        }
        return true;
    }
}
You can simply use a WebView to display the image:
it provides pan
it provides zoom
no library is required
it has a scrollbar
See: Android: Easiest way to make a WebView display a Bitmap?
I'm not sure if this really addresses your question, though...
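If you do try the WebView route, a minimal sketch could look like this (the cache-file name, the web_view layout id, and the graphBitmap variable are placeholders, not from the question):

try {
    // write the generated Bitmap to a cache file...
    File file = new File(getCacheDir(), "graph.png");
    FileOutputStream out = new FileOutputStream(file);
    graphBitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
    out.close();

    // ...then let a WebView show it with built-in pan, zoom and scrollbars
    WebView webView = (WebView) findViewById(R.id.web_view);
    webView.getSettings().setBuiltInZoomControls(true);
    webView.getSettings().setAllowFileAccess(true);
    webView.loadUrl("file://" + file.getAbsolutePath());
} catch (IOException e) {
    e.printStackTrace();
}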
I'm developing an application in which the user touches the screen vertically from top to bottom, and as the finger moves I change the background.
I've got that part of the app working.
The issue is that if the user touches the middle or bottom part (coordinates) of the screen, it also changes the background.
I want the user to start from the top and move toward the bottom; if the user touches the middle or bottom part (coordinates) of the screen instead, the application shouldn't react, but a toast should be shown saying
"Start touching from the top of the screen".
@Override
public boolean onTouchEvent(MotionEvent event) {
    int x = (int) event.getX();
    int y = (int) event.getY();
    if (MatchReservedPixels(x, y)) {
        return false;
    }
    return true;
}

// and here is your MatchReservedPixels function
private boolean MatchReservedPixels(int x, int y)
{
    // Here goes your pixel matching logic
}
And don't forget to override onTouchEvent in your activity.
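For completeness, here is one hypothetical way to fill in MatchReservedPixels; the top-10%-of-the-screen threshold is an assumption, not something from the question:

private boolean MatchReservedPixels(int x, int y) {
    // treat everything below roughly the top 10% of the screen as reserved
    int topThreshold = getResources().getDisplayMetrics().heightPixels / 10;
    if (y > topThreshold) {
        // in practice, only show the toast on ACTION_DOWN to avoid spamming it
        Toast.makeText(this, "Start touching from the top of the screen",
                Toast.LENGTH_SHORT).show();
        return true;  // reserved area: onTouchEvent() above returns false
    }
    return false;     // touch started near the top: handle it normally
}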
I found this article: http://android-developers.blogspot.ca/2010/06/making-sense-of-multitouch.html which helped my understanding but I'm still not sure how to do what I'm trying to do.
In my game, I have a virtual analog stick and some buttons. Only 2 fingers will ever register at once. This is what I want. One for the analog stick and one for a button.
The main thing I'm unsure of is this: say I put a finger down on the analog stick and move it around, then put a second finger on the button, then release the button; the analog stick should keep following my first finger.
And vice versa: if the button is touched first and then the analog stick, letting go of the analog stick should keep the button pressed.
Do touch pointers work in this fashion on Android? That is, once I put my finger down, regardless of any other fingers I put up or down, will it keep tracking my first finger and give it down, move, and up events?
Thanks
Ideally I wish I had a function like this:
void onTouch(int fingerID, int action, int x, int y)
{
}
where each finger that is put down receives down and move events, and an up event when that finger is lifted.
The game is a racing game so they need to be able to steer and push gas at the same time.
My problem is similar to this
identified multi touch pointer in action_move
Edit:
I have this code:
private void onTouch(int finger, int action, float x, float y)
{
    if (action == MotionEvent.ACTION_DOWN || action == MotionEvent.ACTION_POINTER_DOWN)
    {
        createInput(finger, x, y);
    }
    else if (action == MotionEvent.ACTION_MOVE)
    {
        inputMove(finger, x, y);
    }
    else if (action == MotionEvent.ACTION_UP || action == MotionEvent.ACTION_POINTER_UP)
    {
        destroyInput(finger, x, y);
    }
}

public void onTouch(MotionEvent ev)
{
    final int pointerCount = ev.getPointerCount();
    for (int p = 0; p < pointerCount; p++) {
        onTouch(ev.getPointerId(p), ev.getAction(), ev.getX(p), ev.getY(p));
    }
}
But it only works for the first finger.
So from the examples given right in the MotionEvent class:
public boolean onTouch(MotionEvent ev) {
    final int pointerCount = ev.getPointerCount();
    for (int p = 0; p < pointerCount; p++) {
        onTouch(ev.getPointerId(p), ev.getAction(), ev.getX(p), ev.getY(p));
    }
    return true;
}
You could take a look at the way the pinch is handled in AChartEngine for zoom. See this code, starting at line 80. It may look complex at the beginning, but it may help you on handling more use cases.
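As a rough sketch (my own, following the multi-touch article linked in the question rather than the code above): the likely reason only the first finger works is that getAction() still carries pointer-index bits, so the ACTION_POINTER_DOWN/UP comparisons fail. Using getActionMasked() and getActionIndex() avoids that; ACTION_MOVE has no action index, so moves are reported for every pointer:

public boolean onTouch(MotionEvent ev) {
    int action = ev.getActionMasked();
    switch (action) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_POINTER_DOWN:
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_POINTER_UP: {
            // only the pointer at getActionIndex() actually went down or up
            int index = ev.getActionIndex();
            onTouch(ev.getPointerId(index), action, ev.getX(index), ev.getY(index));
            break;
        }
        case MotionEvent.ACTION_MOVE:
            // a move event batches the positions of all pointers
            for (int p = 0; p < ev.getPointerCount(); p++) {
                onTouch(ev.getPointerId(p), action, ev.getX(p), ev.getY(p));
            }
            break;
    }
    return true;
}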