How can I change coordinates to bottom left corner?
I know that in Java the coordinates begin from the top-left corner, but can someone help me change it so that (0,0) is at the bottom-left corner instead?
I know this is late, but for people like me who are new to Android development: the above answers are correct, but here is a more detailed one.
If you get the coordinates with respect to the top-left corner as (a, b),
then the coordinates with respect to the bottom-left corner are simply
(a, h - b), where h is the height of the view.
Example:
float x = getXcoordinatesonTouch();
float y = getYcoordinatesonTouch();
//should return height
float h = getHeightoftheView();
float transformY = h - y;
//"x" should be as it is
//Now you can show "x" and "transformY"
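For instance, here is a concrete version of that idea as my own sketch using the standard View and MotionEvent APIs (the placeholder function names above are replaced with the real calls); it is not code from the original answer:
// Inside a custom View; MotionEvent reports touch coordinates from the top-left.
@Override
public boolean onTouchEvent(MotionEvent event) {
    float x = event.getX();               // unchanged
    float y = event.getY();               // measured from the top edge
    float transformY = getHeight() - y;   // measured from the bottom edge
    // (x, transformY) is the touch point in bottom-left coordinates
    return true;
}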
getHeight() will give you the view's height, so (0, getHeight()) is the bottom-left point. But take into consideration the height of the object you want to place, so you may want to use
(0, getHeight() - heightOfObject)
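As a small illustration (my own sketch; obj and parent are assumed views, not names from the answer):
// Place obj flush with the parent's bottom-left corner, compensating
// for the object's own height.
obj.setX(0);
obj.setY(parent.getHeight() - obj.getHeight());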
Use the value (x, HEIGHT - y).
What I'm trying to do:
Create a compass when given a direction in degrees (0 - 360). Like so:
What I have done:
I have managed to get the SVG image to point in the right direction, but I can't seem to get it to rotate around the circle. To attempt a solution, I decided to work out the positioning using the ellipse tool and this formula. This is what it looks like at the moment:
(Notice how the arrow faces a different direction from the ellipse drawn on the circle, given that the centre point is the middle of the green circle.)
void setDirection(float value, int radius) {
  fill(secondaryColour);
  float origin_x = (1280 - (width-400)/2);
  float origin_y = height/2;
  float x = origin_x + radius * cos(radians(value));
  float y = origin_y + radius * sin(radians(value));
  // grey circle
  ellipse(x, y, 20, 20);
  // arrow SVG
  pushMatrix();
  translate(200, 300);
  rotate(radians(value));
  scale(0.5);
  shape(arrow);
  popMatrix();
}
Please note: value is in degrees and radius is the radius I want the arrow to sit on. What am I doing wrong with the ellipse, and how can I bring them both together?
I found that the starting angle was measured from the white line, whereas I was assuming it would start on the red one (similar to the angle of the arrow). To resolve the issue, I needed to subtract 90 degrees from the variable value before converting it into radians.
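For reference, a minimal sketch of that fix applied to the original setDirection code; exactly where the 90-degree shift is applied, and translating the arrow to the computed point, are my reading of the fix rather than something spelled out in the answer:
void setDirection(float value, int radius) {
  fill(secondaryColour);
  float origin_x = (1280 - (width-400)/2);
  float origin_y = height/2;
  float angle = radians(value - 90);         // shift the start by 90 degrees
  float x = origin_x + radius * cos(angle);  // point on the ring
  float y = origin_y + radius * sin(angle);
  ellipse(x, y, 20, 20);                     // grey circle
  pushMatrix();
  translate(x, y);                           // draw the arrow at the same point
  rotate(radians(value));
  scale(0.5);
  shape(arrow);
  popMatrix();
}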
I am trying to create a Shape with the centre of the ship in the middle.
one.x and one.z are the X and Z positions of the ship. The ship's size is about 100 on the X axis and 50 on the Z axis.
Shape my = new Rectangle(
(int) one.x - disToLeft, // upper-left corner X
(int) one.z - disToTop, // upper-left corner Y
disToLeft + disToRight, // width
disToTop + disToBottom // height
);
I'm then rotating the Shape, to of course be facing the correct way. This appears to work:
int rectWidth = (disToLeft + disToRight);
int rectHeight = (disToTop + disToBottom);
AffineTransform tr = new AffineTransform();
// rotating in central axis
tr.rotate(
    Math.toRadians(one.rotation),
    x + (disToLeft + disToRight) / 2,
    z + (disToTop + disToBottom) / 2
);
my = tr.createTransformedShape(my);
I am then doing the exact same thing with another Shape, and testing for intersection. This works.
However, it feels like the Shape has the wrong dimensions, or something. My ship is colliding very far out on one side (outside where it graphically exists), yet on the other side I can almost go right through the ship before any collision is detected!
Basically the Shapes are simply intersecting at the wrong location. And I cannot work out why. Either the shape, the location, or the rotation must be wrong.
int disToLeft = 100;
int disToRight = 100;
int disToTop = 150;
int disToBottom = 100;
These are the distance from the centre to the left, right, top, and bottom sides.
I am using Z instead of Y because my game is in a 3D world and the sea level is pretty much constant (hence I don't need to worry about Y-axis collisions!).
Update:
After doing a lot of testing, I have discovered that the top of the rectangle is in the middle! I have done a lot of messing around, but without being able to graphically see the squares, it's been very hard to test.
This means that the box is on the side of the ship, like this:
Obviously when the ship on the left rotates to what it's like in this picture, a collision is detected.
It seems that your rotation centre is wrong. From my understanding of the math it should be
tr.rotate(Math.toRadians(one.rotation), x + (disToRight - disToLeft) / 2, z + (disToBottom - disToTop) / 2);
Note the signs and the order of the variables
Edit:
Let's take apart the formula:
Your Rectangle is defined like this:
x-coordinate (x): one.x - disToLeft
y-coordinate (y): one.z - disToTop
width: disToLeft + disToRight
height: disToTop + disToBottom
The centre of the Rectangle (where you are rotating) is therefore:
(x+width/2)
(y+height/2)
if you replace x, width, y and height with the declarations above you get
(one.x - disToLeft + (disToLeft + disToRight)/2)
(one.z - disToTop + (disToTop + disToBottom)/2)
This is already the point you need, but it can be simplified:
one.x - disToLeft + (disToLeft + disToRight)/2
is equal to
one.x - (2*disToLeft/2) + (disToLeft/2) + (disToRight/2)
is equal to
one.x - (disToLeft/2) + (disToRight/2)
is equal to
one.x + (disToRight - disToLeft)/2
The other coordinate works exactly the same.
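For example, with the distances given in the question (disToLeft = disToRight = 100, disToTop = 150, disToBottom = 100), the rotation centre works out to (one.x + (100 - 100)/2, one.z + (100 - 150)/2) = (one.x, one.z - 25), i.e. 25 units above one.z.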
I believe this is more of a logic question than a java question, sorry.
My intent is rather straightforward: I want the ship to move and rotate with a matrix, with the centre of the bitmap ship1 being the pivot of the rotation. The code works great except that the pivot is off by a strange offset (picture of the conundrum linked at the bottom).
The default rotation value of 0 works, but all the other values seem to slide away from the centre, with 180 being the furthest from it.
centerX = playerValues[Matrix.MTRANS_X] + ship1.getWidth() / 2;
centerY = playerValues[Matrix.MTRANS_Y] + ship1.getHeight() / 2;

newRotation = (float) Math.toDegrees(Math.atan2(fingery1 - centerY, fingerx1 - centerX));
matrix.postRotate(newRotation - prevRotation, centerX, centerY);
prevRotation = newRotation;

if (fingerx1 > playerX) {
    xspeed = 1;
} else if (fingerx1 < playerX) {
    xspeed = 0;
} else if (fingery1 > playerY) {
    yspeed = 1;
} else if (fingery1 < playerY) {
    yspeed = 0;
}

matrix.postTranslate(xspeed, yspeed);
matrix.getValues(playerValues);
I tried to draw how the bitmap looks at different angles (the blue dot is the point I intend to rotate the bitmap around; the arrow pointing right is the only correct one):
http://i.stack.imgur.com/2Yw76.png
Please let me know if you see any errors; any feedback helps! I just need a second pair of eyes on this because mine are going to explode soon.
Consider studying a good computer graphics text on matrix math; Foley and van Dam is always a safe bet.
A matrix A is applied to a point x by the multiplication Ax. You have A = RT, a rotation with a translation post-multiplied. The result is RTx, which is R(Tx), meaning the point is translated first and then rotated, when you probably meant the opposite.
Additionally, it appears you are concatenating incremental changes repeatedly. Floating-point errors will accumulate, visible as worsening distortions. Instead, maintain orientation parameters x, y, theta for each ship, controlled by the UI, and set the matrix from these on each rendering. The transform will be a rotation about the point (w/2, h/2) followed by a translation to (x, y), but the matrix that effects this is the translation post-multiplied by the rotation! Also, you must reset the matrix for each ship.
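A minimal sketch of that approach (shipX, shipY, shipAngle are assumed state variables, not names from the question), using the standard android.graphics.Matrix calls:
// Rebuild the matrix from the ship's state every frame instead of
// concatenating small increments onto the previous matrix.
// setRotate replaces the matrix contents, so it also serves as the per-ship reset.
matrix.setRotate(shipAngle, ship1.getWidth() / 2f, ship1.getHeight() / 2f);
matrix.postTranslate(shipX, shipY);   // translation applied after the rotation
canvas.drawBitmap(ship1, matrix, paint);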
I'm developing a simple game. I have roughly 50 rectangles arranged in 10 columns and 5 rows. It wasn't a problem to lay them out to fit the whole screen. But when I rotate the canvas, let's say by a 7° angle, the old coordinates don't match the new positions. In the constructor I already create and define the positions of the rectangles, and in the onDraw method I draw them (already rotated, of course), but I need some way to detect which rectangle a touch collides with. I tried something like this (I did the rotation around the centre point of the screen):
int newx = (int) ((x * Math.cos(ROTATE_ANGLE) - (y * Math.sin(ROTATE_ANGLE))) + width / 2);
int newy = (int) ((y * Math.cos(ROTATE_ANGLE) + (x * Math.sin(ROTATE_ANGLE))) + height / 2);
but it doesn't work (it gives me completely wrong new coordinates). x and y are the coordinates of the touch whose new position I'm trying to calculate after the rotation. ROTATE_ANGLE is the angle by which the screen is rotated.
Does anybody know how to solve this problem? I have already gone through many articles, wikis, and Wolfram Alpha categories but had no luck. Maybe I just need some link to understand the problem better.
Thank you
You use a rotation matrix.
Matrix mat = new Matrix(); //mat is identity
mat.postRotate(ROTATE_ANGLE); //mat is a rotation matrix of ROTATE_ANGLE degrees
float[] point = {10.0f, 20.0f}; //create a new float array representing the point (10, 20)
mat.mapPoints(point); //rotate the point by the requested amount
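If the canvas was rotated about the screen centre rather than the origin, postRotate also accepts a pivot point (width and height here are assumed to be the screen dimensions from the question):
mat.postRotate(ROTATE_ANGLE, width / 2f, height / 2f); //rotate about the screen centre instead of (0, 0)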
OK, I found the solution.
First, it is important to convert the angle from degrees to radians.
Then I personally needed to negate that radian value.
That's all; this solution is correct.
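For completeness, a sketch of how that fix might look applied to the snippet from the question; subtracting the screen centre before rotating is my own addition, the self-answer only mentions the radian conversion and the negation:
// Map a touch point back into the unrotated grid: negate the angle,
// convert it to radians, and rotate the point about the screen centre.
double angle = -Math.toRadians(ROTATE_ANGLE);
double cx = width / 2.0, cy = height / 2.0;
int newx = (int) ((x - cx) * Math.cos(angle) - (y - cy) * Math.sin(angle) + cx);
int newy = (int) ((x - cx) * Math.sin(angle) + (y - cy) * Math.cos(angle) + cy);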
I'm not all that good with maths, so I was hoping some of you could help.
I'm trying to make a function to convert mouse coordinates into a particular tile in an isometric view.
It won't let me post images for a stupid reason, so I'll just link the image:
Link
All of the algorithms I have seen so far work with the X and Y axes going diagonally. My game is currently set up like this, and I would like to keep it that way.
Is there an algorithm so that if the mouse is at the red dot, it returns the coordinates of the tile it is sitting on, i.e. (6, 2)?
Thanks in advance!
There is a good start here: http://www.java-gaming.org/index.php?topic=23656.0
Enjoy :)
EDIT
The fully trusted Dr. Dobb's website has a full article on this: http://www.drdobbs.com/parallel/designing-isometric-game-environments/184410055
<0;4>
x <0;3> <1;4>
<0;2> <1;3> <2;4>
<0;1> <1;2> <2;3> <3;4>
<0;0> <1;1> <2;2> <3;3> <4;4>
<1;0> <2;1> <3;2> <4;3>
<2;0> <3;1> <4;2>
y <3;0> <4;1>
<4;0>
I rendered the tiles as shown above.
The solution is VERY simple!
First things first:
My tile width and height are both 32, which means that in the isometric view the width = 32 and the height = 16. MapHeight in this case is 5 (the maximum Y value).
y_iso and x_iso are both 0 when y_mouse = (MapHeight*tilewidth)/2 and x_mouse = 0;
when x_mouse += 1, y_iso -= 1.
So first of all I calculate the "per-pixel transformation":
TileY = ((y_mouse*2) - ((MapHeight*tilewidth)/2) + x_mouse)/2;
TileX = x_mouse - TileY;
To find the tile coordinates I just divide both by tilewidth:
TileY = TileY/32; TileX = TileX/32;
DONE!! never had any problems!
It's quite easy actually, once you get your head wrapped around it. All you do is find out where your mouse is relative to the map and then reverse how you are drawing the tiles.
I draw my map in a double "for" loop like this:
For x coord: x * (TileWidth / 2) - (y * (TileWidth / 2))
For y coord: x * (TileHeight / 2) + (y * (TileHeight / 2))
So my x goes from top left to bottom right and my y goes from top right to bottom left. Mind, though, that for the first tile the world coordinate will be (0, 0) but its top pixel starts at x = 0 + (tileWidth / 2), so we have to compensate for that when finding which tile the mouse is over (or we could do that for the whole world by giving it an offset).
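For illustration, here is a rough sketch of that draw loop using the formulas above (drawTile and the map dimensions are assumed names, not from the answer):
// Screen position of each tile, drawn column by column.
for (int x = 0; x < mapWidth; x++) {
    for (int y = 0; y < mapHeight; y++) {
        int screenX = x * (tileWidth / 2) - y * (tileWidth / 2);
        int screenY = x * (tileHeight / 2) + y * (tileHeight / 2);
        drawTile(x, y, screenX, screenY); // hypothetical drawing helper
    }
}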
Now, first we have to find the mouse position in relation to the world, since you probably want a moving camera. My camera's centre starts at (0, 0), so I have to compensate the mouse by half the screen size, like so:
mouseWorldPosX = mouse.x + cam.x - (screen.width / 2)
mouseWorldPosY = mouse.y + cam.y - (screen.height / 2)
This is all we need to calculate the mouse position back to tile position.
For X:
tileX = (mouseWorldPosX + (2 * mouseWorldPosY) - (tileWidth / 2)) / tileWidth
As you can see, we divide the whole thing by the tile width since we multiplied by it in the draw method. The (tileWidth / 2) is just there to compensate for the offset I mentioned earlier.
For Y:
tileY = (mouseWorldPosX - (2 * mouseWorldPosY) - (tileHeight / 2)) / -tileWidth
It's practically the same but the other way around. We subtract the Y world position since the Y axis runs the other way around. This time we compensate the offset for the height of the tile and we divide the whole thing by negative tilewidth, again since it runs the other way.
I hope this helps. Below is a working example of a method I looked up; it returns a vector with the tile coordinates:
public Vector2 MouseTilePosition(Camera cam, GraphicsDevice device)
{
    float mPosX = newMouseState.X + (cam.Position.X - (device.Viewport.Width / 2));
    float mPosY = newMouseState.Y + (cam.Position.Y - (device.Viewport.Height / 2));
    float posx = (mPosX + (2 * mPosY) - (Map.TileWidth / 2)) / Map.TileWidth;
    float posy = (mPosX - (2 * mPosY) - (Map.TileHeight / 2)) / -Map.TileWidth;
    return new Vector2((int)posx, (int)posy);
}