Imagine a layout with four buttons:

 _______________________________
|              |                |
|      A       |       B        |
|______________|________________|
|              |                |
|      C       |       D        |
|______________|________________|
I'd like to detect the fling gesture over the whole layout, but when the fling starts over a button it is not detected.
I'm using:
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    gesturedetector = new GestureDetector(this, this);
    findViewById(R.id.touchContainer).setOnTouchListener(new OnTouchListener() {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            Log.e("", "TouchEvent");
            return gesturedetector.onTouchEvent(event);
        }
    });
}
It works when there are no clickable items, but fails if the fling starts over a clickable item.
How can I solve that? I'm offering a bounty of 50 points for a complete working answer.
One way I have achieved this is to override the following method:
@Override
public boolean onInterceptTouchEvent(MotionEvent event) {
    super.onInterceptTouchEvent(event);
    ...
}
You can override this method in your layout container (e.g. your ViewGroup, or whatever you're holding the buttons with) and continue to return false from it in order to 'intercept' touch events that would otherwise be consumed by child Views (i.e. your buttons). Within that overridden method you can then call your gesture detector object with the MotionEvents. This method also 'sees' events that target the ViewGroup itself, which means - if I remember correctly - you would only need to call your gesture detector from within that method; in doing so, the gesture detector will 'see' all events, no matter whether they're over the buttons or not. So if you drag your finger starting over a button and ending at some point on the layout background, the gesture detector should see the entire swipe. You would not need to feed the gesture detector with the events from the layout's own onTouchEvent(), because it will have already seen them.
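As a sketch of that first approach: a custom container (the class name `FlingRelativeLayout` and the `setGestureDetector()` method are assumptions introduced here, not part of the answer above) that feeds every MotionEvent it intercepts to a shared GestureDetector while still letting the child buttons handle their clicks.

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.widget.RelativeLayout;

public class FlingRelativeLayout extends RelativeLayout {
    private GestureDetector gestureDetector;

    public FlingRelativeLayout(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Called once from the Activity so the container and Activity share one detector.
    public void setGestureDetector(GestureDetector detector) {
        this.gestureDetector = detector;
    }

    @Override
    public boolean onInterceptTouchEvent(MotionEvent event) {
        // Let the detector see every event, but return false so the
        // children (the buttons) still receive it and stay clickable.
        if (gestureDetector != null) {
            gestureDetector.onTouchEvent(event);
        }
        return false;
    }
}
```

Because the method returns false, the buttons behave exactly as before; the detector merely observes the full event stream.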
A second way:
I just looked at my project where I used this, and realised that I switched to a different way of doing it. What I actually did was design all of my child Views so that the parent Activity (or the containing ViewGroup) could register the same gesture detector object with all of those child Views (each of my special Views has a method called registerGestureDetector()). Then, in the overridden onTouchEvent() of my child Views, I pass the MotionEvents to the gesture detector that has been registered with that View. In other words, the parent ViewGroup layout and all the child Views simply share the same gesture detector.
I realise that this may sound like a bit of a hassle, and unnecessary considering it could be done using onInterceptTouchEvent(), but my application deals with some pretty complicated rules regarding how my Views need to respond to touch events and gestures, and it allowed me to apply some additional logic specific to my application. However, both of these methods achieve the same basic objective: to channel the MotionEvents that targeted various Views into the same gesture detector object.
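A minimal sketch of that second pattern (the class name `GestureAwareButton` is an assumption; registerGestureDetector() is the method the answer describes): each child View holds a reference to the shared detector and feeds it from its own onTouchEvent().

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.widget.Button;

public class GestureAwareButton extends Button {
    private GestureDetector sharedDetector;

    public GestureAwareButton(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Called by the parent Activity/ViewGroup so every child shares one detector.
    public void registerGestureDetector(GestureDetector detector) {
        this.sharedDetector = detector;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Forward the event to the shared detector first...
        if (sharedDetector != null) {
            sharedDetector.onTouchEvent(event);
        }
        // ...then let the button handle it normally so clicks still work.
        return super.onTouchEvent(event);
    }
}
```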
Related
I created a system overlay app this way, but I have a problem: when I move my button to the corner of the screen, I can't touch the system's views, like the Call button in the following image.
[image]
How can I disable any touch on my button (i.e. ignore my view's touches and let the system's views receive them)?
Somewhere I found this code, but it doesn't work:
bl.setOnTouchListener(new View.OnTouchListener() {
    public boolean onTouch(View v, MotionEvent event) {
        return true;
    }
});
Always visit developer.android.com first; it's really well documented, with the fundamental concepts explained.
The onTouch method will pass the event on to the layer below it if it returns false.
If you've extended a default touchable View class, you should return super.onTouchEvent(event) instead.
Here's the link you're looking for:
https://developer.android.com/reference/android/view/View.OnTouchListener.html
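For the specific overlay problem above, returning true from onTouch consumes the event rather than letting it through. One approach (an assumption on my part, not stated in the answer: the variable names `windowManager` and `overlayView` are placeholders) is to mark the overlay window itself as not touchable via WindowManager.LayoutParams, so events fall through to whatever is underneath:

```java
import android.graphics.PixelFormat;
import android.view.WindowManager;

WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        // TYPE_APPLICATION_OVERLAY on API 26+; older overlays used TYPE_PHONE
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
        // NOT_FOCUSABLE: keys go elsewhere; NOT_TOUCHABLE: touches pass through
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
        PixelFormat.TRANSLUCENT);

windowManager.updateViewLayout(overlayView, params);
```

With FLAG_NOT_TOUCHABLE set, the overlay view never receives touch events at all, so the system's Call button underneath remains tappable.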
I have created a transparent Activity; when a different app opens, my Activity starts and opens on top of that app's Activity (Intent.FLAG_ACTIVITY_NEW_TASK).
What I am trying to achieve is that whatever action happens on my Activity is reflected in the other Activity. I mean, when I scroll down, the underlying view should scroll too.
I could NOT do that; I have tried some flag combinations, but none of them worked.
I could not pass touch events to both Activities at the same time. It only works on one view; I need the underlying Activity to receive the same events that happen on the top (transparent) Activity.
Window window = getWindow();
window.addFlags(WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL);
window.addFlags(WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE);
// window.addFlags(WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL);
window.addFlags(WindowManager.LayoutParams.FLAG_SPLIT_TOUCH);
// window.addFlags(WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH);
//window.addFlags(WindowManager.LayoutParams.TYPE_PHONE);
setContentView(R.layout.trans);
final View v = getWindow().getDecorView().findViewById(android.R.id.content);
v.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View view, MotionEvent motionEvent) {
        Log.i("TAG", "View !!!!!!!!");
        return false;
    }
});
This is not possible. You can neither inject input events into other apps' Activities nor receive their input events (that would enable tapjacking).
You cannot do that with touch events, and I am also not sure why you would want to.
That said, you could use local broadcasts or an EventBus to catch touch events in one Activity and forward them to the other, but it is a really bad idea.
First off, I've been reading this site for years now and it's helped me out of a bind several times, so thank you to the community here who contribute, and hopefully you can help me with a problem of my own.
I'm just starting out with Android development at my company and I'm attempting to port an existing application from Windows Mobile C# to Android Java. Most of it is going smoothly, but one area I'm having some difficulty is the UI.
The Windows Mobile application reads in a survey specification from a file when the WinForm is created. In the case of a closed-ended question (such as multiple choice), I need to populate the screen with either a CheckBox or RadioButton control for each applicable answer in the spec. Creating the layout and controls required is no problem, but we also have a requirement that the screen does not scroll. Because of this, our software needs to be able to calculate the best possible fit within the available screen space without overflow (i.e. 1-4 columns used for display) before anything is displayed.
I have written my UI (at least the layout) both as an XML resource and in Java code, but because methods like getWidth() and getHeight() return 0 in onCreate(), I haven't yet been able to add this required pre-processing.
All of this needs to happen prior to the screen showing. If anyone can point me in the right direction, I would be immensely grateful.
When Android builds the UI from a layout, the root of the layout asks all of its children to report their desired size by calling onMeasure(). It does this in a recursive fashion, bottom up. If necessary, the parent view will then set the size of the children so that they fit. As you have found, this measuring pass has not finished during onCreate().
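If you need sizes strictly before the layout is shown, one sketch (assuming `contentView` is your inflated root view; measure() and MeasureSpec.makeMeasureSpec() are the real framework APIs) is to trigger a measure pass yourself inside onCreate():

```java
import android.graphics.Point;
import android.view.Display;
import android.view.View;

// Get the screen size to use as the measurement constraint.
Display display = getWindowManager().getDefaultDisplay();
Point size = new Point();
display.getSize(size);

// Force a measure pass; after this, getMeasuredWidth()/getMeasuredHeight()
// return real values even though the view has not been laid out yet.
contentView.measure(
        View.MeasureSpec.makeMeasureSpec(size.x, View.MeasureSpec.AT_MOST),
        View.MeasureSpec.makeMeasureSpec(size.y, View.MeasureSpec.AT_MOST));

int desiredWidth = contentView.getMeasuredWidth();
int desiredHeight = contentView.getMeasuredHeight();
```

This gives you the sizes the views would request, which you can use to pick the 1-4 column arrangement before calling setContentView().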
Try a global layout listener.
public void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);
    // inflate your main layout here (use RelativeLayout or whatever your root ViewGroup type is)
    LinearLayout mainLayout = (LinearLayout) this.getLayoutInflater().inflate(R.layout.main, null);
    // set a global layout listener which will be called when the layout pass is completed and the view is drawn
    mainLayout.getViewTreeObserver().addOnGlobalLayoutListener(
        new ViewTreeObserver.OnGlobalLayoutListener() {
            public void onGlobalLayout() {
                // measure your views here
            }
        }
    );
    setContentView(mainLayout);
}
For testing purposes, I need to get the coordinates of all visible views on the screen. However, when checking the output, it seems the UI Thread is not done drawing/positioning/applying settings to all the views yet. Some views are 0x0 pixels while they should be (and they are on both the emulator and a physical device) visible. Some bottom aligned buttons are positioned like stairs, et cetera.
Q: how can i wait for the UI Thread to complete drawing (or at least wait for like a second, that should be more than enough), so the coordinates of all visible views are accurate?
I suspect it's something to do with threads, but I couldn't find any definitive answers. As of yet, I do not have any self-declared threads.
Edit: I use onBackPressed to make a bunch of views visible, then capture that in XML, make the previous views invisible and other views visible, capture that in XML, and so on. I iterate through a few different combinations of views and take an "XML screenshot" of each combination.
final LinearLayout layout = (LinearLayout) findViewById(R.id.your_view_id);
ViewTreeObserver vto = layout.getViewTreeObserver();
vto.addOnGlobalLayoutListener(new OnGlobalLayoutListener() {
    @Override
    public void onGlobalLayout() {
        // 'this' here is the listener itself, so remove it via the layout's observer
        layout.getViewTreeObserver().removeGlobalOnLayoutListener(this);
        int width = layout.getMeasuredWidth();
        int height = layout.getMeasuredHeight();
    }
});
You'll need to adjust this to work with your layout, and add an ID to the basic parent layout.
By using a ViewTreeObserver, you can get to know when your layout has finished drawing, and run your required code then
I'm very new to Android programming, and trying to understand touch events with nested views. To start, here's a description of my app:
I have a RelativeLayout that I've added via the GUI editor. Everything is default. I've also created a class called ClipGrid that extends ScrollView. Nested inside that, I make a HorizontalScrollView. Inside of that, I make a TableLayout and its rows. The rows contain buttons.
The end result is a grid of buttons. It displays 4x4 at once, but can scroll either direction to display other buttons.
I call it to the screen from my main activity like this:
ClipGrid clip_grid = new ClipGrid(this);
setContentView(clip_grid);
I did that just for testing purposes, and I think I will have to change it later when I want to add other views to my RelativeLayout. But I think it might have implications for touch events.
in the end, I want to detect when the grid has been moved and snap the newly viewable 4x4 grid of buttons to the edge of my layout when the user lifts their finger. I'm just not sure how to go about implementing this and any help would be appreciated. Thanks.
The way touch events are handled is kind of a cascading effect that starts from the top view and goes down to the lower nested views. Basically, Android will pass the event to each view until true is returned.
The general way you could implement the onTouchEvent event of a View would be:
@Override
public boolean onTouchEvent(MotionEvent event) {
    boolean actionHandled = false;
    final int action = event.getAction();
    switch (action & MotionEventCompat.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN:
            // user first touches view with pointer
            break;
        case MotionEvent.ACTION_MOVE:
            // user is still touching view and moving pointer around
            break;
        case MotionEvent.ACTION_UP:
            // user lifts pointer
            break;
    }
    // if the action was not handled above, let the default implementation see it
    if (!actionHandled)
        actionHandled = super.onTouchEvent(event);
    return actionHandled;
}