```
import javax.imageio.ImageIO;
import javax.swing.*;
import java.awt.*;
import java.awt.event.InputEvent;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

public class Movement {

    public static void main(String[] args) throws AWTException, IOException {
        Robot mineMove = new Robot();
        mineMove.delay(4000);
        moveMouse(mineMove, 500, 0);
        placeBlock(mineMove);
        moveMouse(mineMove, -500, 0);
        mineMove.delay(500);
        moveMouse(mineMove, 0, -600);
        interactWithVill(mineMove);
        moveMouse(mineMove, 0, 600);
        // moveMouse(mineMove, 500, 0);
        // breakAndPickBlock(mineMove);
        // moveMouse(mineMove, -500, 0);
    }
```
```
    public void saveScreen(Robot mineMove) throws IOException {
        Rectangle screen = new Rectangle();
        screen.x = 0;
        screen.y = 0;
        screen.height = 1200;
        screen.width = 800;
        BufferedImage i = mineMove.createScreenCapture(screen);
        // System.out.println(i.getType());
        // Graphics g = i.getGraphics();
        // JFrame frame = new JFrame();
        // Image img = i;
        // g.drawImage(img, 20, 20, null);
        File file = new File("Tester-Screen-Grab.png");
        ImageIO.write(i, "png", file);
    }
```
```
    public static void interactWithVill(Robot mineMove) {
        mineMove.mousePress(InputEvent.BUTTON3_DOWN_MASK);
        mineMove.delay(100);
        mineMove.mouseMove(731, 302);
        // screenshot and look
        mineMove.delay(1000);
        mineMove.mouseMove(731, 384);
        // screenshot and check enchant
        mineMove.delay(1000);
        mineMove.keyPress(27); // keycode 27 = ESC
        mineMove.delay(10);
        mineMove.keyRelease(27);
        mineMove.delay(1000);
    }
```
This is the method where I think the problem lies.
```
    public static void moveMouse(Robot mineMove, int mouseMoveCordY, int mouseMoveCordX) {
        Point mousePos = MouseInfo.getPointerInfo().getLocation();
        int currPosX = (int) mousePos.getX();
        int currPosY = (int) mousePos.getY();
        currPosY += mouseMoveCordY;
        currPosX += mouseMoveCordX;
        mineMove.mouseMove(currPosX, currPosY);
        mineMove.delay(50);
    }
```
```
    public static void placeBlock(Robot mineMove) {
        mineMove.keyPress(50); // keycode 50 = '2'
        mineMove.delay(10);
        mineMove.keyRelease(50);
        mineMove.delay(10);
        mineMove.mousePress(InputEvent.BUTTON3_DOWN_MASK);
        mineMove.delay(100);
        mineMove.mouseRelease(InputEvent.BUTTON3_DOWN_MASK);
        mineMove.delay(1000);
    }
```
```
    public static void breakAndPickBlock(Robot mineMove) {
        mineMove.mousePress(InputEvent.BUTTON1_DOWN_MASK);
        mineMove.delay(500);
        mineMove.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
        mineMove.keyPress(87); // keycode 87 = W
        mineMove.delay(1000);
        mineMove.keyRelease(87);
        mineMove.keyPress(83); // keycode 83 = S
        mineMove.delay(1000);
        mineMove.keyRelease(83);
    }
}
```
I am trying to make a simple bot to practice building my own projects, and I wanted to use the Robot class to make mouse movements and keystrokes in the game.
Everything worked until the "interactWithVill" method. When it runs it appears to work fine, but afterwards the computer is stuck on whatever screen I am on, whether that is the game or the IDE: no matter what I do I cannot interact with other windows, although the window it is stuck on still works fine. The only fix is closing the window I was stuck on. Any help spotting mistakes I made with the Robot class would be great, and if there is a better approach or class to use, I would appreciate that too, as I am new to this.
Another smaller issue is that when I try to use Robot.mouseMove(int x, int y) it won't work unless I wiggle the mouse a bit before that command runs. Why would this happen?
Thank you!
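One thing that may be worth checking (this is my assumption, not a confirmed diagnosis): interactWithVill calls mousePress(InputEvent.BUTTON3_DOWN_MASK) but never the matching mouseRelease, so as far as the OS is concerned the right button stays held down, which could explain why the focused window keeps grabbing all input. Below is a minimal sketch of the paired version; the waitForIdle() call is likewise only a guess at the "wiggle the mouse first" issue, since it flushes pending events before the synthetic moves run:
```
// Sketch only: same structure as interactWithVill, but the right-button press is
// explicitly released, and waitForIdle() flushes pending events first.
// Needs import java.awt.event.KeyEvent for VK_ESCAPE.
public static void interactWithVillPaired(Robot mineMove) {
    mineMove.waitForIdle();
    mineMove.mousePress(InputEvent.BUTTON3_DOWN_MASK);
    mineMove.delay(100);
    mineMove.mouseRelease(InputEvent.BUTTON3_DOWN_MASK); // release so the button is not left held down
    mineMove.mouseMove(731, 302);
    mineMove.delay(1000);
    mineMove.mouseMove(731, 384);
    mineMove.delay(1000);
    mineMove.keyPress(KeyEvent.VK_ESCAPE);
    mineMove.delay(10);
    mineMove.keyRelease(KeyEvent.VK_ESCAPE);
    mineMove.delay(1000);
}
```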
I'm creating a program that displays animated GIFs. Because some animated GIF files only store the pixels that changed from the previous frame, before each frame is displayed it is drawn onto a master BufferedImage object, named master, and then that BufferedImage is drawn. The problem is that drawing the frames (stored as BufferedImage objects themselves) onto master reduces their quality.
I know it's not a problem with the frames themselves: if I just draw the frames individually without drawing them to master, they look fine. It's also not a problem with lots of frames being layered on top of each other; even the first frame shows quality reduction. I've tried setting every RenderingHint to every possible value, but it changes nothing.
Below is my code, with unnecessary parts for solving this problem omitted:
import java.awt.image.BufferedImage;
import java.awt.Component;
import java.awt.Dimension;
import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import javax.activation.MimetypesFileTypeMap;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageReader;
import javax.imageio.metadata.IIOMetadata;
import javax.imageio.metadata.IIOMetadataNode;
import javax.imageio.stream.FileImageInputStream;
import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;
import javax.swing.Timer;
@SuppressWarnings("serial")
class A extends javax.swing.JPanel{
public static final String PATH = "C:/Users/Owner/Desktop/test.gif";
public B i;
public A() throws java.io.IOException{
i = new B(new java.io.File(PATH));
i.registerComponent(this);
}
@Override
public java.awt.Dimension preferredSize(){
return i.getSize();
}
@Override
public void paintComponent(java.awt.Graphics g){
i.draw(g);
}
public static void main(String[] args){
javax.swing.SwingUtilities.invokeLater(new Runnable(){
public void run(){
javax.swing.JFrame f = new javax.swing.JFrame();
f.setDefaultCloseOperation(javax.swing.JFrame.EXIT_ON_CLOSE);
try{
f.add(new A());
}catch(Exception e){
}
f.pack();
f.setVisible(true);
}
});
}
}
class B{
private final static String META_FORMAT = "javax_imageio_gif_image_1.0";
// instance variables
private final BufferedImage[] frames;
private BufferedImage master;// Because Gif images can store only the changing
// pixels, the first frame is drawn to this image, then the next one *on top of it*, etc.
private final short[] frameDurations; // in 100ths of a second
private final short[] xOffsets;
private final short[] yOffsets;
private int frame = 0;
private final Dimension size;// the size of the gif (calculated in findSize)
private final Timer animationTimer;
// constructor from a File (checked to be a gif)
public B(File src) throws IOException{
if (!(new MimetypesFileTypeMap().getContentType(src.getPath()).equals("image/gif"))){
throw new IOException("File is not a gif. It's Mime Type is: " +
new MimetypesFileTypeMap().getContentType(src.getAbsolutePath()));
}
FileImageInputStream stream = new FileImageInputStream(src);
Iterator<ImageReader> readers = ImageIO.getImageReaders(stream);
ImageReader reader = null;
// loop through the available ImageReaders, find one for .gif
while (readers.hasNext()){
reader = readers.next();
String metaFormat = reader.getOriginatingProvider().getNativeImageMetadataFormatName();
// if it's a gif
if ("gif".equalsIgnoreCase(reader.getFormatName()) && META_FORMAT.equals(metaFormat)){
break;
}else{
reader = null;
continue;
}
}// while (readers.hasNext())
// if no reader for gifs was found
if (reader == null){
throw new IOException("File could not be read as a gif");
}
reader.setInput(stream, false, false);
// Lists to be converted to arrays and set as the instance variables
ArrayList<BufferedImage> listFrames = new ArrayList<BufferedImage>();
ArrayList<Short> listFrameDurs = new ArrayList<Short>();
ArrayList<Short> listXs = new ArrayList<Short>();
ArrayList<Short> listYs = new ArrayList<Short>();
boolean unknownMeta = false;// assume that the metadata can be read until proven otherwise
// loop until there are no more frames (since that isn't known, break needs to be used)
for (int i = 0;true;i++){// equivalent of while(true) with a counter
IIOImage frame = null;
try{
frame = reader.readAll(i, null);
}catch(IndexOutOfBoundsException e){
break;// this means there are no more frames
}
listFrames.add((BufferedImage)frame.getRenderedImage());
if (unknownMeta){// if the metadata has already proven to be unreadable
continue;
}
IIOMetadata metadata = frame.getMetadata();
IIOMetadataNode rootNode = null;
try{
rootNode = (IIOMetadataNode) metadata.getAsTree(META_FORMAT);
}catch(IllegalArgumentException e){
// means that the metadata can't be read, it's in an unknown format
unknownMeta = true;
continue;
}
// get the duration of the current frame
IIOMetadataNode graphicControlExt = (IIOMetadataNode)rootNode.getElementsByTagName("GraphicControlExtension").item(0);
listFrameDurs.add(Short.parseShort(graphicControlExt.getAttribute("delayTime")));
// get the x and y offsets
try{
IIOMetadataNode imageDescrip = (IIOMetadataNode)rootNode.getElementsByTagName("ImageDescriptor").item(0);
listXs.add(Short.parseShort(imageDescrip.getAttribute("imageLeftPosition")));
listYs.add(Short.parseShort(imageDescrip.getAttribute("imageTopPosition")));
}catch(IndexOutOfBoundsException e){
e.printStackTrace();
listXs.add((short) 0);
listYs.add((short) 0);
}
}// for loop
reader.dispose();
// put the values in the lists into the instance variable arrays
frames = listFrames.toArray(new BufferedImage[0]);
// looping must be used because the ArrayList can't contain primitives
frameDurations = new short[listFrameDurs.size()];
for (int i = 0;i < frameDurations.length;i++){
frameDurations[i] = (short)(listFrameDurs.get(i) * 10);
}
xOffsets = new short[listXs.size()];
for (int i = 0;i < xOffsets.length;i++){
xOffsets[i] = listXs.get(i);
}
yOffsets = new short[listYs.size()];
for (int i = 0;i < yOffsets.length;i++){
yOffsets[i] = listYs.get(i);
}
size = findSize();
animationTimer = new Timer(frameDurations[0], null);
clearLayers();
}
// finds the size of the image in constructors
private final Dimension findSize(){
int greatestX = -1;
int greatestY = -1;
// loop through the frames and offsets, finding the greatest combination of the two
for (int i = 0;i < frames.length;i++){
if (greatestX < frames[i].getWidth() + xOffsets[i]){
greatestX = frames[i].getWidth() + xOffsets[i];
}
if (greatestY < frames[i].getHeight() + yOffsets[i]){
greatestY = frames[i].getHeight() + yOffsets[i];
}
}// loop
return new Dimension(greatestX, greatestY);
}// findSize
private BufferedImage getFrame(){
/* returning frames[frame] gives a perfect rendering of each frame (but only changed
* pixels), but when master is returned, even the first frame shows quality reduction
* (seen by slowing down the framerate). The issue is with drawing images to master
*/
Graphics2D g2d = master.createGraphics();
g2d.drawImage(frames[frame], xOffsets[frame], yOffsets[frame], null);
g2d.dispose();
return master;
}
public Dimension getSize(){
return size;
}
// adds a FrameChangeListener associated with a component to the Timer
public void registerComponent(Component c){
FrameChangeListener l = new FrameChangeListener(c);
animationTimer.addActionListener(l);
if (!animationTimer.isRunning()){
animationTimer.start();
}
}
// draws the image to the given Graphics context (registerComponent must be used for the image
// to animate properly)
public void draw(Graphics g){
g.drawImage(getFrame(), 0, 0, null);
}
// resets master
private void clearLayers(){
master = new BufferedImage((int)size.getWidth(), (int)size.getHeight(), frames[0].getType());
}
// class that listens for the Swing Timer.
private class FrameChangeListener implements ActionListener{
private final Component repaintComponent;
// the Component's repaint method will be invoked whenever the animation changes frame
protected FrameChangeListener(Component c){
repaintComponent = c;
}
public void actionPerformed(ActionEvent e){
frame++;
int delay;
try{
delay = frameDurations[frame] * 10;
}catch(ArrayIndexOutOfBoundsException x){
frame = 0;
clearLayers();
delay = frameDurations[frame] * 10;
}
animationTimer.setDelay(delay);
repaintComponent.repaint();
}// actionPerformed
}// FrameChangeListener
}
And here is the image file I've been using to test:
And here is how it displays:
It would be much appreciated if anyone could help me solve this issue
The problem is this line from the clearLayers() method:
master = new BufferedImage((int)size.getWidth(), (int)size.getHeight(), frames[0].getType());
As the GIF uses a palette, the BufferedImage type will be TYPE_BYTE_INDEXED. However, if you pass this type to the BufferedImage constructor, it will use a default IndexColorModel (a built-in, fixed 256-color palette), not the palette from your GIF. Thus, the frames from the GIF have to be dithered into the destination, as the colors don't match.
Instead, use TYPE_INT_RGB/TYPE_INT_ARGB for type, or use the constructor that also takes an IndexColorModel parameter and pass the IndexColorModel from the frames of the GIF.
In code:
master = new BufferedImage((int)size.getWidth(), (int)size.getHeight(), BufferedImage.TYPE_INT_ARGB);
Alternatively, the following should also work if all frames of the GIF use the same palette (which is not necessarily the case):
master = new BufferedImage((int)size.getWidth(), (int)size.getHeight(), frames[0].getType(), (IndexColorModel) frames[0].getColorModel());
However, as the OP reports that the latter option doesn't work for him, the first option is probably safer. :-)
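Applied to the clearLayers() method from the question, the first option is just a one-line change:
```
// resets master; a fixed ARGB type avoids the default indexed palette and the dithering it causes
private void clearLayers(){
    master = new BufferedImage((int) size.getWidth(), (int) size.getHeight(), BufferedImage.TYPE_INT_ARGB);
}
```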
Recently I decided to start learning how to make 2D games with Java (Eclipse), so I found a tutorial online that shows how to make a Super Mario game with Java. I wrote the same code he wrote and followed step by step what he did, which wasn't a big thing to talk about. Unfortunately, his code shows, after executing, a window with two images in it, while mine shows just the window with no images. I assure you that I imported the two images and put them in one package to avoid all kinds of problems, but it still shows nothing.
My code has two classes, "Main" and "Scene"; here it is. Hopefully someone will find a solution for me, thank you guys!
Main.java :
package AiMEUR.AMiN.jeu;
import javax.swing.JFrame;
public class Main {
public static Scene scene;
public static void main(String[] args) {
JFrame fenetre = new JFrame("Naruto in mario World!!");
fenetre.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
fenetre.setSize(700, 360);
fenetre.setLocationRelativeTo(null);
fenetre.setResizable(false);
fenetre.setAlwaysOnTop(true);
scene = new Scene();
fenetre.setContentPane(scene);
fenetre.setVisible(true);
}
}
Scene.java :
package AiMEUR.AMiN.jeu;
import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.Image;
import javax.swing.ImageIcon;
import javax.swing.JPanel;
@SuppressWarnings("serial")
public class Scene extends JPanel{
private ImageIcon icoFond;
private Image imgFond1;
private ImageIcon icoMario;
private Image imgMario;
private int xFond1;
public Scene(){
super();
this.xFond1 = -50;
icoFond = new ImageIcon(getClass().getResource("/Images/fond.gif"));
this.imgFond1 = this.icoFond.getImage();
icoMario = new ImageIcon(getClass().getResource("/Images/1.png"));
this.imgMario = this.icoMario.getImage();
// paintComponent(this.getGraphics());
}
public void paintCompenent(Graphics g){
super.paintComponent(g);
Graphics g2 = (Graphics2D)g;
g2.drawImage(this.imgFond1, this.xFond1, 0, null);
g2.drawImage(imgMario, 300, 245, null);
}
}
You have not named the paintComponent method correctly, and therefore it is not being overridden.
The correct name is paintComponent not paintCompenent:
public class Example extends JPanel {
@Override
public void paintComponent(final Graphics g) {
super.paintComponent(g);
}
}
You can determine the loading status of an ImageIcon by doing something like this:
public Scene(){
super();
this.xFond1 = -50;
icoFond = new ImageIcon(getClass().getResource("/Images/fond.gif"));
int status = icoFond.getImageLoadStatus();
switch (status) {
case (MediaTracker.COMPLETE): {
System.out.println("icoFond image has successfully loaded");
break; // without this break, execution falls through into the ERRORED case
}
case (MediaTracker.ERRORED): {
System.out.println("The icoFond image didn't load successfully");
// probably because the image isn't actually at "/Images/fond.gif"
break;
}
}
this.imgFond1 = this.icoFond.getImage();
icoMario = new ImageIcon(getClass().getResource("/Images/1.png"));
this.imgMario = this.icoMario.getImage();
// paintComponent(this.getGraphics());
}
I've been searching for a solution for accessing a built-in webcam from a Java or JavaFX application. I've seen loads of other posts pointing to OpenCV and JavaCV, Sarxos's library and quite a few others.
I've run into difficulties such as newer versions of OpenCV not working with older code posted on various sites, while newer code that uses OpenCV 3.0 is hard to find or doesn't do what I need, which is simply a custom application that saves an image taken from the webcam to a variable (or file).
Hope someone can point me in the right direction.
Thanks in advance
You're in luck. I toyed around with OpenCV last weekend and ran into the same problems as you. Here's an example of how to do it. The example opens the camera, uses an AnimationTimer (a bit overkill, but it was a quick solution for prototyping) to grab a Mat image periodically, converts the Mat image to a JavaFX image, performs face detection and paints it on a canvas.
Here's what you need:
Download OpenCV, e.g. in my case the Windows version. Rename opencv-3.0.0.exe to opencv-3.0.0.exe.zip and open it. Extract the contents of build/java.
Create a new JavaFX project. Put the jar and the dlls into a lib folder, e.g.:
lib/opencv-300.jar
lib/x64/opencv_java300.dll
Add the jar to your build path.
In your src folder create a path opencv/data/lbpcascades and put the file lbpcascade_frontalface.xml in there (found in etc/lbpcascades). That's only for face detection; you can comment that code out in case you don't need it.
Create the application class, code:
import java.io.ByteArrayInputStream;
import java.lang.reflect.Field;
import java.net.URISyntaxException;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import javafx.animation.AnimationTimer;
import javafx.application.Application;
import javafx.event.EventHandler;
import javafx.geometry.Rectangle2D;
import javafx.scene.Group;
import javafx.scene.Scene;
import javafx.scene.canvas.Canvas;
import javafx.scene.canvas.GraphicsContext;
import javafx.scene.image.Image;
import javafx.scene.paint.Color;
import javafx.stage.Stage;
import javafx.stage.WindowEvent;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfByte;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.objdetect.CascadeClassifier;
import org.opencv.videoio.VideoCapture;
public class Camera extends Application {
private static final int SCENE_W = 640;
private static final int SCENE_H = 480;
CascadeClassifier faceDetector;
VideoCapture videoCapture;
Canvas canvas;
GraphicsContext g2d;
Stage stage;
AnimationTimer timer;
@Override
public void start(Stage stage) {
this.stage = stage;
initOpenCv();
canvas = new Canvas(SCENE_W, SCENE_H);
g2d = canvas.getGraphicsContext2D();
g2d.setStroke(Color.GREEN);
Group group = new Group(canvas);
Scene scene = new Scene(group, SCENE_W, SCENE_H);
stage.setScene(scene);
stage.setResizable(false);
stage.show();
timer = new AnimationTimer() {
Mat mat = new Mat();
@Override
public void handle(long now) {
videoCapture.read(mat);
List<Rectangle2D> rectList = detectFaces(mat);
Image image = mat2Image(mat);
g2d.drawImage(image, 0, 0);
for (Rectangle2D rect : rectList) {
g2d.strokeRect(rect.getMinX(), rect.getMinY(), rect.getWidth(), rect.getHeight());
}
}
};
timer.start();
}
public List<Rectangle2D> detectFaces(Mat mat) {
MatOfRect faceDetections = new MatOfRect();
faceDetector.detectMultiScale( mat, faceDetections);
System.out.println(String.format("Detected %s faces", faceDetections.toArray().length));
List<Rectangle2D> rectList = new ArrayList<>();
for (Rect rect : faceDetections.toArray()) {
int x = rect.x;
int y = rect.y;
int w = rect.width;
int h = rect.height;
rectList.add(new Rectangle2D(x, y, w, h));
}
return rectList;
}
private void initOpenCv() {
setLibraryPath();
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
videoCapture = new VideoCapture();
videoCapture.open(0);
System.out.println("Camera open: " + videoCapture.isOpened());
stage.setOnCloseRequest(new EventHandler<WindowEvent>() {
public void handle(WindowEvent we) {
timer.stop();
videoCapture.release();
System.out.println("Camera released");
}
});
faceDetector = new CascadeClassifier(getOpenCvResource(getClass(), "/opencv/data/lbpcascades/lbpcascade_frontalface.xml"));
}
public static Image mat2Image(Mat mat) {
MatOfByte buffer = new MatOfByte();
Imgcodecs.imencode(".png", mat, buffer);
return new Image(new ByteArrayInputStream(buffer.toArray()));
}
private static void setLibraryPath() {
try {
System.setProperty("java.library.path", "lib/x64");
Field fieldSysPath = ClassLoader.class.getDeclaredField("sys_paths");
fieldSysPath.setAccessible(true);
fieldSysPath.set(null, null);
} catch (Exception ex) {
ex.printStackTrace();
throw new RuntimeException(ex);
}
}
public static String getOpenCvResource(Class<?> clazz, String path) {
try {
return Paths.get( clazz.getResource(path).toURI()).toString();
} catch (URISyntaxException e) {
throw new RuntimeException(e);
}
}
public static void main(String[] args) {
launch(args);
}
}
Of course you can do whatever you wish (e.g. saving) with the JavaFX image once you have it.
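For example (just a sketch, assuming the javafx.embed.swing classes are available in your setup), the Image returned by mat2Image can be written to a PNG by converting it back to a BufferedImage; the file name here is arbitrary:
```
// Requires: javafx.embed.swing.SwingFXUtils, javax.imageio.ImageIO,
// java.awt.image.BufferedImage, java.io.File, java.io.IOException
try {
    BufferedImage buffered = SwingFXUtils.fromFXImage(image, null); // image is the JavaFX Image
    ImageIO.write(buffered, "png", new File("snapshot.png"));       // "snapshot.png" is just an example path
} catch (IOException e) {
    e.printStackTrace();
}
```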
For Sarxos, pseudo code (I can't publish the whole class):
import com.sleepingdumpling.jvideoinput.Device;
import com.sleepingdumpling.jvideoinput.VideoFrame;
import com.sleepingdumpling.jvideoinput.VideoInput;
Device choosenDevice;
for (Device device : VideoInput.getVideoDevices()) {
// select your choosenDevice webcam here
if (isMyWebcam(device)) {
choosenDevice = device;
break;
}
}
// eg. VideoInput(640,480,25,choosenDevice );
VideoInput videoInput = new VideoInput(frameWidth, frameHeigth,
frameRate, choosenDevice );
VideoFrame vf = null;
while (grabFrames) {
vf = videoInput.getNextFrame(vf);
if (vf != null) {
frameReceived(vf.getRawData());
// or vf.getBufferedImage();
}
}
videoInput.stopSession();
QUESTION SUMMARY: I generated a .JAR file using Netbeans 7.2.1 for a Java Swing/AWT program, developed on Windows 7 (Java version 1.8.0_40), that helps collect user handwriting from the screen. It runs fine on a Windows 7 notebook PC, but for some reason it captures handwriting data only in a specific region of the screen on a Windows 8.1 tablet (Java version 1.8.0_45). Could someone kindly tell me why this is happening?
DETAILS: I have a requirement to collect online handwriting samples (i.e. those acquired from electronic devices like a tablet PC using a pen/stylus and writing surface) for some analysis.
Being new to developing programs of this nature, I read up about it on the web and decided to use the Java Swing/AWT toolkits.
A person's handwriting is composed of strokes, which are in turn composed of points. My objective was to capture:
- the X- and Y-coordinates of a point on the screen
- the timestamp of creation of this point
- the stroke's start time, end time and color (color not too important)
To this end I wrote the following program using the Netbeans 7.2.1 IDE with Java 1.8.0_40 on Windows 7 Home Basic.
/*
* To change this template, choose Tools | Templates
* and open the template in the editor.
*/
package handwritingsamplerawt;
import java.awt.AWTException;
import java.awt.Color;
import java.awt.Dimension;
import java.awt.FlowLayout;
import java.awt.Graphics;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.text.DecimalFormat;
import java.text.SimpleDateFormat;
import java.util.*;
import javax.imageio.ImageIO;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;
public class HandwritingSamplerAWT {
static JFrame frame;
public static void main(String[] args) {
SwingUtilities.invokeLater(new Runnable() {
public void run() {
CreateAndShowGUI();
}
});
}
private static void CreateAndShowGUI() {
frame = new JFrame("Writing Surface v0.1");
frame.getContentPane().setLayout(new FlowLayout(FlowLayout.RIGHT));
frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
frame.setBackground(Color.LIGHT_GRAY);
frame.pack();
frame.add(new MyPanel());
frame.setVisible(true);
}
}
class MyPanel extends JPanel{
private int x,y;
static int strokeIndex;
private long reducedMillis;
private ArrayList<StrokeInfo> strokes;
private JButton btnSave;
public MyPanel() {
MyPanel.strokeIndex=0;
this.reducedMillis = 1435800000000L;
this.strokes = new ArrayList<>();
this.btnSave = new JButton("SAVE SAMPLE");
this.btnSave.addActionListener(new ActionListener() {
public void actionPerformed(ActionEvent e) {
WriteCoordinates();
}
});
this.add(this.btnSave);
addMouseListener(new MouseAdapter() {
public void mousePressed(MouseEvent e) {
x=e.getX();
y=e.getY();
SaveCoordinates(x,y,"PRESSED");
repaint();
}
});
addMouseMotionListener(new MouseAdapter() {
public void mouseDragged(MouseEvent e) {
x=e.getX();
y=e.getY();
SaveCoordinates(x,y,"DRAGGED");
repaint();
}
});
addMouseListener(new MouseAdapter() {
public void mouseReleased(MouseEvent e) {
x=e.getX();
y=e.getY();
SaveCoordinates(x,y,"RELEASED");
repaint();
}
});
}
void SaveCoordinates(int xCoordinate, int yCoordinate, String actionIndicator){
try {
Calendar cal = Calendar.getInstance();
Date currDate = cal.getTime();
double timeStamp=(double)(currDate.getTime()-reducedMillis);
PointInfo pointObj = new PointInfo(xCoordinate, yCoordinate, timeStamp);
switch (actionIndicator) {
case "PRESSED":
StrokeInfo newStroke = new StrokeInfo();
newStroke.points.add(pointObj);
strokes.add(newStroke);
break;
case "DRAGGED":
strokes.get(strokeIndex).points.add(pointObj);
break;
case "RELEASED":
strokeIndex+=1;
break;
}
} catch (Exception ex){
String errMsg = ex.getMessage();
System.out.println(errMsg);
}
}
void WriteCoordinates() {
try {
Calendar cal = Calendar.getInstance();
SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd-HH-mm-ss");
String currTimeString = dateFormat.format(cal.getTime());
DecimalFormat decFormat = new DecimalFormat("#");
decFormat.setMaximumFractionDigits(2);
FileWriter writer = new FileWriter("D:\\HandwritingCaptures\\HandwritingText\\"+currTimeString+".txt");
SetStrokeAttributes(strokes);
ListIterator<PointInfo> pointItr;
if (strokes.isEmpty()==false) {
for (int index = 0; index < strokeIndex; index++) {
writer.write(strokes.get(index).colour);
writer.append('\t');
writer.write(decFormat.format( strokes.get(index).startTime));
writer.append('\t');
writer.write(decFormat.format( strokes.get(index).endTime));
writer.append('\n');
pointItr = strokes.get(index).points.listIterator();
while (pointItr.hasNext()) {
PointInfo currPoint = pointItr.next();
writer.write(String.valueOf(currPoint.x));
writer.append('\t');
writer.write(String.valueOf(currPoint.y));
writer.append('\t');
writer.write(decFormat.format(currPoint.timestamp));
writer.append('\n');
}
writer.append('#');
writer.append('\n');
}
}
writer.close();
SaveScreenshot("D:\\HandwritingCaptures\\Screenshots\\"+currTimeString+".png");
} catch (IOException ex) {
System.out.println(ex.getMessage());
}
}
void SetStrokeAttributes(ArrayList<StrokeInfo> strokeList) {
double startTime, endTime;
String colour;
StrokeInfo tmpStroke;
ArrayList<PointInfo> points;
if (strokeList.isEmpty() == false) {
for (int index = 0; index < strokeList.size(); index++) {
tmpStroke = strokeList.get(index);
points = tmpStroke.points;
tmpStroke.colour = "black";
tmpStroke.startTime=points.get(0).timestamp;
tmpStroke.endTime=points.get(points.size()-1).timestamp;
strokeList.set(index, tmpStroke);
}
}
}
void SaveScreenshot(String imgFilePath){
try {
Rectangle screenRect = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
BufferedImage capture = new Robot().createScreenCapture(screenRect);
ImageIO.write(capture, "png", new File(imgFilePath));
} catch (IOException | AWTException ex) {
System.out.println(ex.getMessage());
}
}
public Dimension getPreferredSize() {
return new Dimension(1366,768);
}
protected void paintComponent(Graphics g) {
super.paintComponents(g);
g.setColor(Color.BLACK);
g.drawLine(x, y, x, y);
}
}
class PointInfo {
int x,y;
double timestamp;
public PointInfo(int px, int py, double ts) {
this.x=px;
this.y=py;
this.timestamp=ts;
}
}
class StrokeInfo {
ArrayList<PointInfo> points;
double startTime, endTime;
String colour;
public StrokeInfo() {
points= new ArrayList<>();
}
}
I generated the .jar file using the IDE itself (Project Properties -> Build -> Packaging -> Compress JAR file).
I then copied the .jar file to an HP EliteBook 2730P notebook PC with JRE 1.7.0.800 and Windows 7 Pro (32-bit), where it ran fine, collecting handwriting strokes from all areas of the screen.
But when I copied the same .jar to an HP Elite x2 1011 G1 tablet with JRE 1.8.0_45 and Windows 8.1 (64-bit) and ran it, I found that, strangely, it captures stylus input ONLY in a specific region of the screen, more specifically towards the upper right. It is completely non-responsive in the other areas.
Could someone please help me understand why this is happening? I would have posted a couple of screenshots here, but my low reputation prevents me from doing so.
ADDITIONAL THOUGHTS: Would it be better to use .NET or JavaFX for developing such a tool for use in a Windows 8.1 environment?
Your panel seems to have a fixed size:
return new Dimension(1366,768);
Is the resolution of your tablet bigger than that?
Edit:
This should help:
private static void CreateAndShowGUI() {
frame = new JFrame("Writing Surface v0.1");
// Using the default BorderLayout here.
frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
frame.setBackground(Color.LIGHT_GRAY);
frame.getContentPane().add(new MyPanel(), BorderLayout.CENTER);
frame.pack();
frame.setVisible(true);
}
However there are still problems with your layout, like the "Save" button jumping around. You should take a look at this tutorial:
https://docs.oracle.com/javase/tutorial/uiswing/layout/index.html
The fact that the problem could be related to the hardcoded dimensions became clear on seeing how the .jar file behaved when run.
The tablet had a higher resolution (1920 x 1080 pixels, found by checking the screen resolution through a right click on the desktop) than the machine I had initially run the program on (1366 x 768 pixels). The dimensions of the writable area towards the upper right corner were in fact true to the hardcoded dimensions.
Ergo, I modified the overridden method getPreferredSize() in the following manner:
public Dimension getPreferredSize() {
Dimension screenSize = Toolkit.getDefaultToolkit().getScreenSize();
return new Dimension((int)screenSize.getWidth(),(int)screenSize.getHeight());
}
This causes the preferred size to take on the effective width and height of the screen of the device the .jar is executed on. This is important because layout managers differ in how they treat getPreferredSize().
The "SAVE SAMPLE" button still intermittently seems to cast non-clickable images of itself on other areas of the screen; this probably has something to do with how the layout manager handles the UI components. I shall add an edit for that issue too once I find a solution.
Actually, I have already asked this question here, but I made a mistake and still haven't found the solution.
First, as in the earlier question, I can get a Rectangle with
Rectangle rectangle = textArea.modelToView( textArea.getCaretPosition() );
and from it I also get the X and Y position.
I'm creating an editor that adds a new text area each time I press the Enter key. The X/Y position from the code above always gives the same result in every text area. Look at my code:
import java.awt.Container;
import java.awt.Font;
import java.awt.Rectangle;
import java.awt.event.ActionEvent;
import java.awt.event.KeyEvent;
import java.util.LinkedList;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.swing.AbstractAction;
import javax.swing.Action;
import javax.swing.Box;
import javax.swing.JFrame;
import javax.swing.JTextArea;
import javax.swing.KeyStroke;
import javax.swing.text.BadLocationException;
import javax.swing.text.JTextComponent;
public class forquestion extends JFrame {
Container textAreaBox;
LinkedList<JTextComponent> textArea;
int nameTA;
public forquestion() {
int nameTA = 0;
textArea = new LinkedList<>();
textAreaBox = Box.createVerticalBox();
textAreaBox.add(Box.createVerticalGlue());
addLine();
this.add(textAreaBox);
this.setVisible(true);
}
public static void main(String[] args) {
forquestion app = new forquestion();
app.setDefaultCloseOperation( JFrame.EXIT_ON_CLOSE );
}
public void addLine () {
JTextComponent temp_ta = createTextComponent();
textArea.add(temp_ta);
textAreaBox.add(textArea.getLast());
textAreaBox.add(Box.createVerticalGlue());
}
protected JTextComponent createTextComponent() {
JTextArea ta = new JTextArea("test");
/*if (count%2==0)
ta.setForeground(Color.red);
else
ta.setForeground(Color.GREEN);*/
ta.setFont(new Font("Courier New",Font.PLAIN,16));
ta.setLineWrap(true);
ta.setWrapStyleWord(true);
ta.setName(Integer.toString(nameTA));
nameTA+=1;
basicKey("ENTER", enter, ta);
ta.addMouseListener(new java.awt.event.MouseAdapter() {
public void mousePressed(java.awt.event.MouseEvent ev) {
try {
taMousePressed(ev);
} catch (BadLocationException ex) {
Logger.getLogger(forquestion.class.getName()).log(Level.SEVERE, null, ex);
}
}
});
return ta;
}
public void basicKey(String s, Action a, JTextArea ta) {
ta.getInputMap().put(KeyStroke.getKeyStroke(s), s);
ta.getActionMap().put(s, a);
}
Action enter = new AbstractAction() {
@Override
public void actionPerformed(ActionEvent e) {
addLine();
}
};
private void taMousePressed(java.awt.event.MouseEvent ev) throws BadLocationException {
int now_focus = Integer.parseInt(ev.getComponent().getName());
int _caret;
_caret = textArea.get(now_focus).getCaretPosition();
Rectangle rectangle = textArea.get(now_focus).modelToView(_caret);
double x = rectangle.getX();
//int xc = textArea.get(now_focus).getLocation().x;
double y = rectangle.getY();
//int yc = textArea.get(now_focus).getLocation().y;
//double h = rectangle.getHeight();
//double w = rectangle.getWidth();
System.out.println(x);
System.out.println(y);
//System.out.println(xc);
//System.out.println(yc);
//System.out.println(h);
//System.out.println(w);
System.out.println("");
}
}
My code will print the X/Y position each time you press in a text area, but the output is always the same in every text area. (Try making several text areas and giving them some text.) By the way, it's just simple code; you need to change the window frame size to make a new text area show up after you press the Enter key... hahaha.
So, my question is: how can I get the X/Y position of the caret (text cursor) in any text area? I want to display a JPopupMenu there. :)
I hope this question is clear for you. Thanks in advance.
The Rectangle reported back is relative to the text area, where its 0x0 position is the top-left corner of the component.
If you use something like...
popup.show(textArea.get(now_focus), rectangle.x, rectangle.y + rectangle.height);
Where popup is a JPopupMenu, it will make the required translations to the screen itself.
Now, having said that: personally, I would prefer to use the popup API support provided by Swing. This means creating a custom component that extends from JTextArea to achieve it...
public class MyPopupTextArea extends JTextArea {
    /*...*/
    @Override
    public Point getPopupLocation(MouseEvent evt) {
        try {
            // place the popup just below the caret of this text area
            Rectangle rectangle = modelToView(getCaretPosition());
            Point p = rectangle.getLocation();
            p.y += rectangle.height;
            return p;
        } catch (BadLocationException ex) {
            return null; // null falls back to the default popup location
        }
    }
}
Then, based on your needs, you can use setComponentPopupMenu to provide a shared instance of the JPopupMenu or, if required, create a custom JPopupMenu for each instance of the custom editor, as you see fit... no messing about with mouse listeners ;)
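For example (a minimal sketch; the menu items are just placeholders):
```
JPopupMenu popup = new JPopupMenu();
popup.add(new JMenuItem("Cut"));   // placeholder items
popup.add(new JMenuItem("Paste"));

MyPopupTextArea ta = new MyPopupTextArea();
ta.setComponentPopupMenu(popup);   // Swing will show it at the point returned by getPopupLocation(...)
```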