Anticipating the day when multi-touch interfaces become more pervasive, are there libraries in Java that can be used for developing touch applications? I'm looking for interfaces similar to MouseListener / MouseMotionListener / MouseWheelListener.
The MT4j project has everything you need to develop multitouch applications in Java.
All the well-known multitouch gestures are already built in and can be used as simply
as listening for mouse events (for example: component.addGestureListener(..)).
It also features a hardware-accelerated scene graph, similar to JavaFX.
You can even simulate multitouch input by connecting one or more mice to your machine.
Check it out at http://www.mt4j.org
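For a flavor of what that looks like, here is a rough sketch from memory; the exact constructor and class signatures (MTRectangle, DragProcessor, MTGestureEvent) vary between MT4j versions, so treat this as an approximation of the real API, with app being your MTApplication:

// Rough MT4j sketch: register a drag processor on a component and listen
// for its gesture events. Signatures may differ by MT4j version.
MTRectangle rect = new MTRectangle(app, 0, 0, 200, 200);
rect.registerInputProcessor(new DragProcessor(app));
rect.addGestureListener(DragProcessor.class, new IGestureEventListener() {
    @Override
    public boolean processGestureEvent(MTGestureEvent ge) {
        DragEvent de = (DragEvent) ge;
        // the drag delta for this update
        System.out.println("dragged: " + de.getTranslationVect());
        return false;
    }
});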
Sparsh is still in my bookmarks from the last time I was investigating multitouch Java solutions.
While not as straightforward as the typical mouse listener or click listener, it still provides a reasonable interface.
You need your listening class to implement sparshui.client.Client, which requires defining the processEvent method:
public void processEvent(int groupID, Event event) {
    if (event instanceof TouchEvent) {
        TouchEvent e = (TouchEvent) event;
        if (e.getState() == TouchState.BIRTH) {
            // do initial touch stuff
        } else if (e.getState() == TouchState.MOVE) {
            // do dragging stuff
        }
    } else if (event instanceof DragEvent) {
        DragEvent e = (DragEvent) event;
        // do DragEvent specific stuff
    } else if (event instanceof RotateEvent) {
        RotateEvent e = (RotateEvent) event;
        // do RotateEvent specific stuff
    } else if (event instanceof ZoomEvent) {
        ZoomEvent e = (ZoomEvent) event;
        // do ZoomEvent specific stuff
    }
    // several other gesture types....
}
After that, you need to connect to the gesture recognition server, passing in your client object:
new ServerConnection("localhost", objectImplementingClientInterface);
Looking at the code examples on the site should give you a pretty good idea of the framework.
How about this: http://kenai.com/projects/macmultitouch
I am primarily working in Processing and designing my UI from the ground up. I've been looking for a solution which doesn't prescribe a UI framework which both MT4J and JavaFX appear to do. Furthermore, MT4J appears to be abandoned.
This looks like a promising solution, at least for Windows, but I'm unsure whether it's actually released yet:
http://wiki.gestureworks.com/index.php/GestureWorksCore:Gestureworks_Core_Tutorials
This is specifically for Processing, cross-platform, open-source and active:
https://github.com/vialab/SMT
MT4J doesn't work with Windows 8.
If the application is only for one user, you can use JavaFX. There are different listeners for touch events. However, it is not possible to process two gestures at the same time, because all touch points are merged into one gesture. For big multi-touch screens this is a disadvantage; for normal screens with only one user, it is fine.
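As a quick illustration of those JavaFX listeners, here is a minimal sketch using the standard touch and gesture events from javafx.scene.input (JavaFX 8 syntax):

import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.layout.StackPane;
import javafx.scene.shape.Rectangle;
import javafx.stage.Stage;

public class TouchDemo extends Application {
    @Override
    public void start(Stage stage) {
        Rectangle rect = new Rectangle(200, 200);
        // raw touch points
        rect.setOnTouchPressed(e ->
            System.out.println("touch point " + e.getTouchPoint().getId()));
        // pinch gesture
        rect.setOnZoom(e -> {
            rect.setScaleX(rect.getScaleX() * e.getZoomFactor());
            rect.setScaleY(rect.getScaleY() * e.getZoomFactor());
        });
        // two-finger rotate gesture
        rect.setOnRotate(e -> rect.setRotate(rect.getRotate() + e.getAngle()));
        stage.setScene(new Scene(new StackPane(rect), 400, 400));
        stage.show();
    }

    public static void main(String[] args) { launch(args); }
}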
But there is also GestureWorks, where you can define new gestures or use the predefined ones. The gestures are defined in an XML file (called GML). Every object can handle its own gestures, but you have to implement the hit test and the point assignment manually. There is a great tutorial, though.
Another library, which I haven't tested, is the Multi-Touch SDK by PQ Labs.
Related
I'm trying to build an onscreen keyboard in JavaFX. The only issue I am having is that when you click a button on the JavaFX window, focus is redirected to the JavaFX scene, so the button click doesn't actually type a letter into a browser, text document, etc.
Here is my code for the button click. I'm using the Robot class.
private void handleButtonAction(ActionEvent event) throws AWTException {
    Robot a = new Robot();
    a.keyPress(KeyEvent.VK_Y);   // testing a keypress of "Y"
    a.keyRelease(KeyEvent.VK_Y); // release, so the key doesn't stay held down
}
I've seen in Swing how you can set a focus property to false, but I'm set on using JavaFX. I've seen quite a few people ask this same question, but no one has a correct answer.
I am afraid there is currently no way to achieve this purely with JavaFx.
Here is an example of how you could achieve it by wrapping your JavaFX app in a Swing app. An architectural monstrosity ;).
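A minimal sketch of that wrapper idea, assuming Java 8 with JFXPanel: a non-focusable JFrame hosts the JavaFX UI, so clicks never steal focus from the application being typed into (the Robot call is stubbed out with a println):

import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import javafx.application.Platform;
import javafx.embed.swing.JFXPanel;
import javafx.scene.Scene;
import javafx.scene.control.Button;

public class OskFrame {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("OSK");
            // The crucial line: the window can never become the focused
            // window, so clicks don't pull focus away from the target app.
            frame.setFocusableWindowState(false);
            frame.setAlwaysOnTop(true);
            JFXPanel fxPanel = new JFXPanel(); // also bootstraps the FX runtime
            frame.add(fxPanel);
            frame.setSize(600, 200);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
            Platform.runLater(() -> {
                Button y = new Button("Y");
                y.setFocusTraversable(false);
                // fire java.awt.Robot keyPress/keyRelease from here
                y.setOnAction(e -> System.out.println("would type Y"));
                fxPanel.setScene(new Scene(y));
            });
        });
    }
}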
Alternatively, I could imagine that using a native layer like JNI/JNA is possible in this scenario.
However, in this case I would question whether Java is technically the right tool for the job. One might argue that an OSK is by definition tightly coupled to the OS, and as such more native toolkits are in order. This of course depends on the scope of your project: as a side project this might be fine, but as a product, not so much.
I was wondering whether it is possible for a MouseEvent that occurs on a specific Stage to pass through to the window of another program/application and be fired there.
To be more specific, I want to write a HUD-like program for informational purposes, which should not interfere with normal desktop usage (such as operating scrollbars/buttons that lie underneath the "HUD").
One approach to accomplish such behaviour could be to use the Robot class in combination with a MouseListener.
Example:
scene.setOnMouseClicked(new EventHandler<MouseEvent>() {
    @Override
    public void handle(MouseEvent event) {
        stage.hide();
        // a click is a press followed by a release
        robot.mousePress(InputEvent.BUTTON1_MASK);
        robot.mouseRelease(InputEvent.BUTTON1_MASK);
        stage.show();
    }
});
That should work (I have not tried it yet), but it feels like a workaround, and this approach would likely cause problems if one tries to handle drag & drop. Are there any other ways to pass MouseEvents on to other windows? Is it possible to make a Stage mouse-transparent, as is possible for Nodes?
I am making a sort of remote control program for a Lego EV3 robot, but that part is irrelevant. I made a GUI and I want to control the robot when I press keys. I understand that I have to use something called a KeyListener, and I even saw a tutorial which is supposed to work.
The GUI class code is right here. It's kinda long, but it has the KeyPressed event at the end.
http://pastebin.com/QK639BDs
I am not sure what I am doing wrong, but the program doesn't detect any key press at all.
I would really appreciate any help on how to make that work.
EDIT:
keyManager = KeyboardFocusManager.getCurrentKeyboardFocusManager();
keyManager.addKeyEventDispatcher(new KeyEventDispatcher() {
    // UP:38 DOWN:40 LEFT:37 RIGHT:39
    @Override
    public boolean dispatchKeyEvent(KeyEvent e) {
        if (e.getID() == KeyEvent.KEY_PRESSED && e.getKeyCode() == KeyEvent.VK_UP) {
            System.out.println("UP");
            return true;
        }
        if (e.getID() == KeyEvent.KEY_RELEASED && e.getKeyCode() == KeyEvent.VK_UP) {
            System.out.println("RELEASED");
            return true;
        }
        return false;
    }
});
So I browsed around and found KeyboardFocusManager, which is sort of working for me. I am testing it with a println. I am having only one problem: while I hold down the UP key, I want it to print UP only once, because the UP key will basically start the motor, which will keep moving until the key release stops it.
Any ideas on how to do that?
KeyListener has issues, particularly with focus: when any of your text fields has focus, for example, your KeyListener won't respond.
A better solution is to make use of the key bindings API, which allows you to control the level of focus required in order to trigger the key event. In combination with the Action API, you can define common actions for both your keys and buttons, for example.
Take a look at How to Use Key Bindings and How to Use Actions for more details.
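For the "fire once while held" requirement, a key bindings sketch might look like the following; a boolean flag filters out the OS key auto-repeat, and names like upHeld are made up here for illustration:

import java.awt.event.ActionEvent;
import java.awt.event.KeyEvent;
import javax.swing.AbstractAction;
import javax.swing.InputMap;
import javax.swing.JComponent;
import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.KeyStroke;

public class MotorKeyBindings {
    private static boolean upHeld; // guards against auto-repeat

    public static void main(String[] args) {
        JFrame frame = new JFrame("EV3 remote");
        JPanel panel = new JPanel();

        InputMap im = panel.getInputMap(JComponent.WHEN_IN_FOCUSED_WINDOW);
        // third argument: false = on key press, true = on key release
        im.put(KeyStroke.getKeyStroke(KeyEvent.VK_UP, 0, false), "up.pressed");
        im.put(KeyStroke.getKeyStroke(KeyEvent.VK_UP, 0, true), "up.released");

        panel.getActionMap().put("up.pressed", new AbstractAction() {
            @Override
            public void actionPerformed(ActionEvent e) {
                if (!upHeld) {                    // ignore OS auto-repeat
                    upHeld = true;
                    System.out.println("UP");     // start the motor here
                }
            }
        });
        panel.getActionMap().put("up.released", new AbstractAction() {
            @Override
            public void actionPerformed(ActionEvent e) {
                upHeld = false;
                System.out.println("RELEASED");   // stop the motor here
            }
        });

        frame.add(panel);
        frame.setSize(300, 200);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}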
Ps- I'm jealous and I wish you luck ;)
Do this:
frame.getContentPane().addKeyListener(this);
You may have to add it to different components, depending on which one you want the key listener on.
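If you do stay with a KeyListener, keep in mind that the component also needs to be focusable and actually hold focus, e.g.:

Component content = frame.getContentPane();
content.addKeyListener(this);
content.setFocusable(true);      // KeyListeners only fire on a focusable component...
content.requestFocusInWindow();  // ...that currently has focus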
I have developed many applications for desktop and web in Java, but never for Android.
What little I have practiced was weird because I am used to Swing. Since Android uses .xml files for application layouts, I was wondering if you need them in games too?
For example, there is a good example at http://developer.android.com/resources/samples/JetBoy/index.html. I was hoping someone could explain two things for me.
How is the image drawn to the screen?
How do you listen for user input? (E.g. clicks, drawing, swiping, etc...)
I have made games on the desktop, so you don't have to go in depth, but I found the code hard to understand. (What am I supposed to do specially for Android?)
(Another way to put it:)
I guess basically what I am asking is: how do I draw an image that will in turn listen for clicks and such, without adding a button? Or do I need to add an invisible button? And if I did have to add a button, how would I listen for swiping and drawing?
Also, I saw some methods like doDraw(), but did not see where they put the image so that it would appear on the Android device.
JetBoy is rather complicated for a beginner; check these simple examples, which are based on the same principles as JetBoy: How can I use the animation framework inside the canvas?
Everything is done by drawing bitmaps on the view's canvas, plus an event which detects when and where the screen was touched.
I think this will tell you everything you need to know, in a comprehensive 31-part series detailing the creation of Light Racer 3D for Android.
If you are not going to use a game engine,
you basically extend android.view.View (or SurfaceView, as JetBoy does)
and then override these methods:
public boolean onTouchEvent(MotionEvent event)
protected void onDraw(Canvas canvas)
Go through
these articles. They teach everything you need from scratch.
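To make that concrete, here is a minimal sketch of such a View; R.drawable.player is a placeholder for whatever bitmap resource your project actually defines:

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.view.MotionEvent;
import android.view.View;

public class GameView extends View {
    private final Bitmap player;
    private float x, y; // current position of the bitmap

    public GameView(Context context) {
        super(context);
        // R.drawable.player is a hypothetical resource name
        player = BitmapFactory.decodeResource(getResources(), R.drawable.player);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        canvas.drawBitmap(player, x, y, null); // draw the image at (x, y)
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Move the bitmap to wherever the screen was touched;
        // event.getAction() distinguishes DOWN, MOVE and UP for swipes.
        x = event.getX();
        y = event.getY();
        invalidate();  // request a redraw
        return true;   // consume the event so we keep receiving MOVE/UP
    }
}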
In my SWT based application, I have a Canvas-derived custom Widget, which is displaying a bunch of "items". The whole purpose of these items is for the user to drag them out of the widget. I had no trouble implementing a DragSource, DragDetectListener and all that stuff to make DND work. The problem I am trying to solve is that I want the drag to be detected much earlier, i.e. after a much shorter mouse drag distance, than the default platform behavior.
I know I can override dragDetect() of the Widget class. However, this only allows me to veto the superclass implementation, not to signal that a drag already happened before the superclass would think it has.
Basically, if I could generate the drag event myself, say by calling Widget.postEvent(SWT.DragDetect, eventWhichIAllocatedAndFilledOut) (which is package-private), that would seem like my solution. I've looked at the code for drag detection in Widget, and it doesn't seem to be designed for this use case. Is there a workaround that lets me initiate drags anytime I want?
I have figured it out. It is possible to generate a custom event and distribute it to the DragDetect listener mechanism. The code below does the same as the internal implementation, but can be called at will from within the Widget implementation, for example from a MouseMoveListener's mouseMove(MouseEvent e) hook:
Event event = new Event();
event.type = SWT.DragDetect;
event.display = getDisplay();
event.widget = this;
event.button = e.button;
event.stateMask = e.stateMask;
event.time = e.time;
event.x = e.x;
event.y = e.y;
notifyListeners(SWT.DragDetect, event);
It is noteworthy that the built-in drag detection has to be disabled for this to work as intended. The default implementation is exposed via the dragDetect(MouseEvent e) method, which can be called from a mouseDown() handler (as explained in the documentation for dragDetect()). It works by busy-looping in the event thread until the drag is detected; on the GTK backend, at least, it simply consumes mouse-move events from the native event queue. When a DragDetectListener is registered with the Widget, this is done automatically, so unless one disables the mechanism via setDragDetect(false), a custom drag detection would only run after the built-in detection, which imposes the delay because it blocks the event thread, besides detecting the drag a second time, of course.
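For completeness, here is a sketch of how the pieces might fit together inside the Canvas subclass; downX, downY, armed and the 3-pixel threshold are made-up names and values for illustration:

// Inside the Canvas subclass constructor; downX, downY and armed are
// hypothetical int/boolean fields of the subclass.
setDragDetect(false); // disable the blocking built-in detection

addMouseListener(new MouseAdapter() {
    @Override
    public void mouseDown(MouseEvent e) { downX = e.x; downY = e.y; armed = true; }
    @Override
    public void mouseUp(MouseEvent e) { armed = false; }
});

addMouseMoveListener(e -> {
    // Fire once the pointer moves more than 3 pixels, well below the
    // platform's default drag hysteresis.
    if (armed && (Math.abs(e.x - downX) > 3 || Math.abs(e.y - downY) > 3)) {
        armed = false;
        Event event = new Event();
        event.type = SWT.DragDetect;
        event.display = getDisplay();
        event.widget = this;
        event.button = e.button;
        event.stateMask = e.stateMask;
        event.time = e.time;
        event.x = e.x;
        event.y = e.y;
        notifyListeners(SWT.DragDetect, event);
    }
});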