Setting the thrust position on a moving and rotating object - java

I have a spaceship that moves through space and can rotate a full 360 degrees.
The spaceship needs a 2D thrust animation, and that animation needs to sit at the bottom middle of the spaceship. I have the following variables:
_Rotation
_Spaceship.width
_Spaceship.height
_Spaceship.Position(x,y)
I've uploaded an image of my problem too, in case my explanation isn't clear:
http://imgur.com/Lgchc
Both animations render like so:
this._itemAnimation.render(this.getPosition(), canvas, this._degrees);
this._thrustAnimation.render(this.thrustPosition(), canvas, this._degrees);
This is what I have tried so far, without success:
_thurstPosition.set(((int)_object_x + Math.cos(_degrees) * _itemAnimation.getWidth() / 2) ,
((int)_object_y + Math.sin(_degrees) * _itemAnimation.getWidth() / 2));
I'm stuck; can somebody help me?
--- UPDATE ---
I've updated the code so it's easier to follow:
int SpaceshipCenterX = getPosition().x + (_SpaceshipAnimation.getWidth() / 2) - (_thrustAnimation.getWidth() / 2);
int SpaceshipCenterY = getPosition().y + ((_SpaceshipAnimation.getHeight() / 2) - (_thrustAnimation.getHeight() / 2));
double OffSetCos = Math.cos(_spaceshipDegrees);
double OffSetSin = Math.sin(_spaceshipDegrees);
_thurstPosition.set(
    (int) (SpaceshipCenterX + (_SpaceshipAnimation.getWidth() / 2) * OffSetCos),
    (int) (SpaceshipCenterY + (_SpaceshipAnimation.getHeight() / 2) * OffSetSin)
);
I still can't get it to work. The thrust goes around the spaceship, but very fast and flashing everywhere.
--- UPDATE 2 ---
This is almost working but it's going too far out:
int xOffset = -1 * (_itemAnimation.getWidth() / 2);
double DegreeToRadien = Math.toRadians(_degrees);
_thurstPosition.set(
    (int) ((xOffset * Math.cos(DegreeToRadien)) + getPosition().x),
    (int) ((xOffset * Math.sin(DegreeToRadien)) + getPosition().y)
);

Assuming that you are using this coordinate/angle system:
                90 (pi/2) - Up
                      ^
                      |
Left - 180 (pi) <-----|-----> 0 - Right
                      |
                      v
              Down - 270 (3pi/2)
And that your spaceship is pointing to the right at 0 degrees:
>[ } 0
Then for any direction you need to translate the thrust relative to the centre of the spaceship. Let's say we translate in the x direction by
offset = -1 * width/2;
Then rotate it by the angle of the spaceship and finally translate it by the position of the spaceship.
To compute this transformation, write out the 3 transformations as matrices in reverse order and multiply them out, transforming a point starting at (0,0):
[1 0 spaceshipX]   [cos(dir) -sin(dir) 0]   [1 0 -width/2]   [0]
[0 1 spaceshipY] * [sin(dir)  cos(dir) 0] * [0 1    0    ] * [0]
[0 0     1     ]   [   0         0     1]   [0 0    1    ]   [1]
So that would give you the position of the thrust as
thrustX = (-width/2)cos(dir) + spaceshipX
thrustY = (-width/2)sin(dir) + spaceshipY
So I suppose you just missed the fact you need to subtract width/2, not add it.
I've edited this with a correct and more readable syntax. Using underscores everywhere really hurts readability. I am assuming you have a spaceship class, and the spaceship has a width, height, x position, y position and rotation. You have a thruster class which also has a width, height, x position, y position and rotation. (They could inherit from a Sprite abstract class). To set the position of a thruster object we call thruster.setPosition(x,y);
int xOffset = -1 * (spaceship.width / 2);
thruster.setPosition(
    (int) ((xOffset * Math.cos(spaceship.rotation)) + spaceship.x),
    (int) ((xOffset * Math.sin(spaceship.rotation)) + spaceship.y)
);
Hopefully this makes it obvious to you which values you need to be setting where. I can't decipher your code without seeing more of it, where these variables are declared and what they actually mean.
Update
Just to conclude: as I think you may have discovered, Math.cos and Math.sin require angles in radians, not degrees. The solution I have given here is correct, and I have shown how to compute the position of any relatively positioned object by performing the matrix calculation. You just have to remember that spaceship.rotation must be in radians, or you must convert it from degrees before passing it to Math.cos() or Math.sin().
int xOffset = -1 * (spaceship.width / 2);
double radians = Math.toRadians(spaceship.rotation);
thruster.setPosition(
    (int) ((xOffset * Math.cos(radians)) + spaceship.x),
    (int) ((xOffset * Math.sin(radians)) + spaceship.y)
);

Your code looks close to me. You need to be careful about what _degrees means. If _degrees is the direction in which the rocket points (counterclockwise from the +x axis), then you'll need to put some minus signs in there, because the thrust is at the back of the rocket, not the front. Also, I would think the thrust is on the height end of the rocket, so use getHeight instead of getWidth. You may need to add some extra for the height of the thrust animation as well (depending on where its anchor is). So something like
_thurstPosition.set(
    // convert degrees to radians before passing the angle to Math.cos/Math.sin
    (int) (_object_x - Math.cos(Math.toRadians(_degrees)) * (_itemAnimation.getHeight() / 2 + _thrustAnimation.getHeight() / 2)),
    (int) (_object_y - Math.sin(Math.toRadians(_degrees)) * (_itemAnimation.getHeight() / 2 + _thrustAnimation.getHeight() / 2))
);
You'll also need to set thrust orientation if it isn't a circle.
(I'm assuming _degrees == _Rotation)

double DegreeToRadien = Math.toRadians(_degrees);
int ObjectXCenter = (int) (_object_x + ((_itemAnimation.getWidth() / 2)) - _thrustAnimation.getWidth() / 2);
int ObjectYCenter = (int) (_object_y + ((_itemAnimation.getHeight() / 2)) - _thrustAnimation.getHeight() / 2);
int xOffset = -1 * (_itemAnimation.getWidth() / 2);
_thurstPosition.set(
    (int) ((xOffset * Math.cos(DegreeToRadien)) + ObjectXCenter),
    (int) ((xOffset * Math.sin(DegreeToRadien)) + ObjectYCenter)
);

Related

Scale coordinates on radar

I'm currently developing a radar for Android. (Following this tutorial: http://www.androidph.com/2009/02/app-10-beer-radar.html )
I'm getting all users within a range of 5 km around my current location from the server. After that I draw them in my custom view like this:
float xU = (float)(userLocation.getLongitude() + getWidth() / 2 - currentLong);
float yU = (float)(getHeight() / 2 - userLocation.getLatitude() + currentLat);
canvas.drawBitmap(bmpUser,xU,yU,radarPaint);
Now my problem is that I need to scale the points/coordinates I draw on the radar, because all users within a distance below 5 km will be drawn only 2-3 pixels away from the center. How would I manage to do that?
This is one way to do it. All you need to do is set the value of the "scale" variable to the required value:
float scale = 3.0f; //set this to any number to change the drawing scale
float xU = (float)(getWidth() / 2 + (userLocation.getLongitude() - currentLong) * scale);
float yU = (float)(getHeight() / 2 - (userLocation.getLatitude() - currentLat) * scale);
canvas.drawBitmap(bmpUser,xU,yU,radarPaint);
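If you want the full 5 km range to exactly fill the radar instead of hard-coding a number, you could derive the scale from the view size. A rough sketch of that idea (assuming the view is roughly square, and using the approximation that one degree of latitude is about 111 km; longitude distances would also need a cos(latitude) correction for better accuracy):
// Hedged sketch: derive "scale" so the 5 km search radius maps to the radar's radius.
float rangeKm = 5f;
float rangeDegrees = rangeKm / 111f;          // ~0.045 degrees of latitude per 5 km
float radarRadiusPx = getWidth() / 2f;        // radius of the radar view in pixels
float scale = radarRadiusPx / rangeDegrees;   // pixels per degree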

Isometric Tile Selection

I'm not all that good with maths, so I was hoping some of you could help.
I'm trying to make a function to convert mouse coordinates into a particular tile in an isometric view.
It won't let me post images for a stupid reason, so I'll just link the image:
Link
All of the algorithms I have seen so far work with the X and Y axes going diagonally. My game is currently set up like this, and I would like to keep it that way.
Is there an algorithm so that if the mouse was at the red dot, it would return the coordinates of the tile that it is sitting on? (6,2)
Thanks in advance!
Here is a good start: http://www.java-gaming.org/index.php?topic=23656.0
Enjoy :)
EDIT
There is also a full article on this on the trusted Dr. Dobb's website: http://www.drdobbs.com/parallel/designing-isometric-game-environments/184410055
            <0;4>
x        <0;3> <1;4>
      <0;2> <1;3> <2;4>
   <0;1> <1;2> <2;3> <3;4>
<0;0> <1;1> <2;2> <3;3> <4;4>
   <1;0> <2;1> <3;2> <4;3>
      <2;0> <3;1> <4;2>
y        <3;0> <4;1>
            <4;0>
I rendered the tiles like above.
The solution is VERY simple!
First of all:
My tile width and height are both 32; this means that in isometric view the width = 32 and the height = 16. MapHeight in this case is 5 (the maximum Y value).
y_iso and x_iso are both 0 when y_mouse = (MapHeight * tilewidth) / 2 and x_mouse = 0;
when x_mouse += 1, y_iso -= 1.
so first of all I calculate the "per-pixel transformation"
TileY = ((y_mouse*2)-((MapHeight*tilewidth)/2)+x_mouse/2);
TileX = x_mouse-TileY;
To find the tile coordinates I just divide both by tilewidth:
TileY = TileY/32; TileX = TileX/32;
DONE!! never had any problems!
It's quite easy actually once you get your head wrapped around it. All you do is find out where your mouse is relative to the map and then reverse to how you are drawing the tiles.
I draw my map in the double "for" loop like this:
For x coord: x * (TileWidth / 2) - (y * (TileWidth / 2))
For y coord: x * (TileHeight / 2) + (y * (TileHeight / 2))
So my x goes from top left to bottom right and my y goes from top right to bottom left. Mind though: for the first tile the world coordinate will be (0,0), but its top pixel starts at x = 0 + (tileWidth / 2), so we have to compensate for that when finding which tile the mouse is over (or we could do that for the whole world by giving it an offset).
Now, first we have to find the mouse position in relation to the world, since you probably want a moving camera. My camera's centre starts at 0,0, so I have to offset the mouse by half the screen size, like so:
mouseWorldPosX = mouse.x + cam.x - (screen.width / 2)
mouseWorldPosY = mouse.y + cam.y - (screen.height / 2)
This is all we need to calculate the mouse position back to tile position.
For X:
tileX = (mouseWorldPosX + (2 * mouseWorldPosY) - (tileWidth / 2)) / tileWidth
As you can see we divide the whole thing by the tilewidth since we multiplied it in the draw method. The (tileWidth / 2) is just there to compensate for the offset i mentioned earlier.
For Y:
tileY = (mouseWorldPosX - (2 * mouseWorldPosY) - (tileHeight / 2)) / -tileWidth
It's practically the same but the other way around. We subtract the Y world position since the Y axis runs the other way around. This time we compensate the offset for the height of the tile and we divide the whole thing by negative tilewidth, again since it runs the other way.
I hope this helps. Below is a working example of a method I looked up; it returns a vector with the tile coordinates:
public Vector2 MouseTilePosition(Camera cam, GraphicsDevice device)
{
    float mPosX = newMouseState.X + (cam.Position.X - (device.Viewport.Width / 2));
    float mPosY = newMouseState.Y + (cam.Position.Y - (device.Viewport.Height / 2));
    float posx = (mPosX + (2 * mPosY) - (Map.TileWidth / 2)) / Map.TileWidth;
    float posy = (mPosX - (2 * mPosY) - (Map.TileHeight / 2)) / -Map.TileWidth;
    return new Vector2((int)posx, (int)posy);
}
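If you need the same thing in Java rather than XNA/C#, a rough equivalent might look like this (a sketch only; the camera fields, screen size and tile sizes here are assumed parameters, not names from the code above):
import java.awt.Point;

// Hedged Java translation of the screen-to-tile conversion above.
public final class IsoPicker {
    public static Point mouseTilePosition(int mouseX, int mouseY,
                                          float camX, float camY,
                                          int screenWidth, int screenHeight,
                                          int tileWidth, int tileHeight) {
        // Mouse position relative to the world (camera centred on the screen).
        float worldX = mouseX + camX - screenWidth / 2f;
        float worldY = mouseY + camY - screenHeight / 2f;
        // Reverse the drawing transform, compensating for the half-tile offset.
        float tileX = (worldX + 2f * worldY - tileWidth / 2f) / tileWidth;
        float tileY = (worldX - 2f * worldY - tileHeight / 2f) / -tileWidth;
        return new Point((int) tileX, (int) tileY);
    }
}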

Java: Rotate a Point around an other using Google Maps Coordinates

From Google Earth I got a box with coordinates for a picture, like the following:
<LatLonBox>
<north>53.10685</north>
<south>53.10637222222223</south>
<east>8.853144444444444</east>
<west>8.851858333333333</west>
<rotation>-26.3448</rotation>
</LatLonBox>
Now I want to test whether a point intersects with this LatLonBox.
My basic idea for checking whether a point intersects with the LatLonBox was to rotate the point back by the given angle, and then to test whether the point intersects with a regular (not rotated) rectangle.
I tried to calculate the rotation manually:
public static MyGeoPoint rotatePoint(MyGeoPoint point, MyGeoPoint origion, double degree)
{
    double x = origion.getLatitude()
            + (Math.cos(Math.toRadians(degree)) * (point.getLatitude() - origion.getLatitude())
            - Math.sin(Math.toRadians(degree)) * (point.getLongitude() - origion.getLongitude()));
    double y = origion.getLongitude()
            + (Math.sin(Math.toRadians(degree)) * (point.getLatitude() - origion.getLatitude())
            + Math.cos(Math.toRadians(degree)) * (point.getLongitude() - origion.getLongitude()));
    return new MyGeoPoint(x, y);
}
public boolean intersect(MyGeoPoint geoPoint)
{
    geoPoint = MyGeoPoint.rotatePoint(geoPoint, this.getCenter(), -this.getRotation());
    return (geoPoint.getLatitude() < getTopLeftLatitude()
            && geoPoint.getLatitude() > getBottomRightLatitude()
            && geoPoint.getLongitude() > getTopLeftLongitude()
            && geoPoint.getLongitude() < getBottomRightLongitude());
}
And it seems that the results are wrong.
LatLonBox box = new LatLonBox(53.10685, 8.851858333333333, 53.10637222222223, 8.853144444444444, -26.3448);
MyGeoPoint point1 = new MyGeoPoint(53.106872, 8.852311);
MyGeoPoint point2 = new MyGeoPoint(53.10670378322918, 8.852967186822669);
MyGeoPoint point3 = new MyGeoPoint(53.10652664993972, 8.851994565566875);
MyGeoPoint point4 = new MyGeoPoint(53.10631650700605, 8.85270995172055);
System.out.println(box.intersect(point1));
System.out.println(box.intersect(point2));
System.out.println(box.intersect(point3));
System.out.println(box.intersect(point4));
The result is true, false, false, true. But it should be 4x true.
Probably I'm making some kind of error in reasoning.
Maybe it's because the latitude values get bigger going upwards, but I don't know how to adapt the formula.
I need some help ...
EDIT:
I think my basic idea and formula are right. I also found similar solutions (e.g. link) and couldn't find any difference.
So I think the only possible source of error is that the axes are not proportional. The problem is how to take account of this.
I hope someone has got an idea.
The problem was indeed that the axes were not proportional.
The following method takes care of it.
public static MyGeoPoint rotatePoint(MyGeoPoint point, MyGeoPoint origion, double degree)
{
    double x = origion.longitude
            + (Math.cos(Math.toRadians(degree)) * (point.longitude - origion.longitude)
            - Math.sin(Math.toRadians(degree)) * (point.latitude - origion.latitude) / Math.abs(Math.cos(Math.toRadians(origion.latitude))));
    double y = origion.latitude
            + (Math.sin(Math.toRadians(degree)) * (point.longitude - origion.longitude) * Math.abs(Math.cos(Math.toRadians(origion.latitude)))
            + Math.cos(Math.toRadians(degree)) * (point.latitude - origion.latitude));
    return new MyGeoPoint(x, y);
}
If I understand correctly, you want to check whether these four points are inside the rotated rectangle.
Because your rectangle is rotated, I would recommend checking not against the corner points but like this:
If you have a rotated rectangle ABCD, calculate the lines |AB|, |BC|, |CD| and |DA|. Given two points you can use y = ax + b (you calculate a and b by plugging in the [x, y] of both points, which gives you two easy equations).
Finally, the intersect function will check:
if point <= line |CD|
   AND point >= line |AB|
   AND point <= line |BC|
   AND point >= line |DA|
then it is inside the rectangle.
You do this by plugging your point P[x, y] into a*x + b - y. If the result is zero, the point lies on the line; if it is positive, the point is below the line (y < ax + b); if it is negative, it is above. Hope I helped.
BTW, why are you using the -degree value (the rotation multiplied by -1)? Is that necessary?
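If you want to avoid the y = ax + b form (which breaks down for vertical edges), the same side-of-line test can be written with a cross product. A minimal sketch of that variant, with illustrative names:
// Hedged sketch: returns > 0, 0 or < 0 depending on which side of the line
// through A and B the point P lies (0 means P is exactly on the line).
// If P gives the same sign for all four edges AB, BC, CD, DA taken in order,
// it is inside the rectangle, with no special case for vertical edges.
public static double sideOfLine(double ax, double ay,
                                double bx, double by,
                                double px, double py) {
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}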
The problem appears to be that the data structure LatLonBox doesn't make any sense as a description for the boundary of a picture. A box in lat-lon coordinates is not a geometric rectangle. (Think about a box near or including the north pole.) You need to re-think your application to deal in a lat/lon coordinate for the center of the picture and then deal with the rotation as an angle with respect to lines of latitude (parallel to the equator). (Even then, a picture with center on the north or south pole will be a degenerate case that must be handled separately.) So a box should properly be something like:
<geobox>
<center_lat>41</center_lat>
<center_lon>-74</center_lon>
<rotation_degrees_ccw>-23</rotation_degrees_ccw>
<width>1000</width> <!-- in pixels or meters, but not in degrees! -->
<height>600</height> <!-- same as above -->
</geobox>
Having said all that, suppose you have a true geometric box centered at (x0,y0), width w, height h, rotated angle T about its center. Then you can test a point P(x,y) for membership in the box with the following. You need the transformation that takes the box to the origin and aligns it with the axes. This is Translate(-x0,-y0) then Rotate(-T). This transformation as a matrix is
[cos(-T) -sin(-T) 0] [1 0 -x0]   [ cos(T)  sin(T)  -x0*cos(T)-y0*sin(T)]
[sin(-T)  cos(-T) 0] [0 1 -y0] = [-sin(T)  cos(T)   x0*sin(T)-y0*cos(T)]
[  0        0     1] [0 0  1 ]   [   0       0             1           ]
You want to apply this transformation to the point to be tested and then see if it lies in the desired box:
// Transform the point to be tested.
ct = cos(T);
st = sin(T);
xp = ct * x + st * y - x0 * ct - y0 * st;
yp = -st * x + ct * y + x0 * st - y0 * ct;
// Test for membership in the box.
boolean inside = xp >= -w/2 && xp <= w/2 && yp >= -h/2 && yp <= h/2;
It's late and I haven't checked this arithmetic, but it's close. Say if it doesn't work.
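If it helps, here is the same test as a minimal Java sketch (parameter names are illustrative, and like the pseudocode above it assumes T is in radians):
// Hedged sketch of the membership test described above: translate the box
// centre to the origin, rotate by -T, then compare against the half-extents.
public static boolean insideRotatedBox(double x, double y,      // point to test
                                       double x0, double y0,    // box centre
                                       double w, double h,      // box width and height
                                       double T) {              // rotation in radians
    double ct = Math.cos(T);
    double st = Math.sin(T);
    double xp =  ct * x + st * y - x0 * ct - y0 * st;
    double yp = -st * x + ct * y + x0 * st - y0 * ct;
    return xp >= -w / 2 && xp <= w / 2 && yp >= -h / 2 && yp <= h / 2;
}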

Getting distance of a point in a 2d triangle without calculating perpendicular vectors?

Alright, so I'm trying to get the distance of a point in a 2d triangle without calculating perpendicular vectors.
float qd = Vector2f.dot(new Vector2f(pos.x, pos.z),
                        new Vector2f(normal.pos.x, normal.pos.z))
         - Vector2f.dot(new Vector2f(q.center.x, q.center.z),
                        new Vector2f(normal.pos.x, normal.pos.z));
That's the code I'm using. (Note: it converts the 3D vectors to 2D ones, but you don't have to worry about that.) I need the result of the calculation to be between 0 and 1, e.g. 0.5 or so.
If I'm still not explaining it right, maybe this will help.
My question is: how do I get the distance of a point in a 2D triangle without calculating a perpendicular vector's distance? I.e. if the triangle is facing up (y = -1) without any tilt, I would need the distance within the triangle without any X component.
Edit 1: Regarding what you're saying, Banthar: this is what I got out of it, and it doesn't work, but it seems like it's close to working.
float d = (float) Math.sqrt( 0 /*cause the two x's should be the same */ + Math.pow(pos.z - q.max.z, 2));
float h = (float) Math.sqrt( 0 /*cause the two x's should be the same */ + Math.pow(q.min.z - q.max.z, 2));
float myDist = d/h;
Let's say your triangle is ABC and the point is P.
The number you are looking for is the distance from P to AB divided by the distance from C to AB.
This is the same as the ratio of the corresponding areas. So you can compute the two areas:
Area(ABP) / Area(ABC)
The best way to compute the triangle area depends on what information you have about your triangle.
If you have the vertices only, then you can use:
Area(ABP) / Area(ABC) = ( Ax*By - Ax*Py + Ay*Px - Ay*Bx + Bx*Py - By*Px ) /
                        ( Ax*By - Ax*Cy + Ay*Cx - Ay*Bx + Bx*Cy - By*Cx )
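As a minimal Java sketch of that formula (parameter names are illustrative):
// Hedged sketch: signed-area ratio Area(ABP) / Area(ABC). For P inside the
// triangle the result is between 0 and 1; it is 0 on edge AB and 1 at C.
public static float areaRatio(float ax, float ay, float bx, float by,
                              float cx, float cy, float px, float py) {
    float areaABP = ax * by - ax * py + ay * px - ay * bx + bx * py - by * px;
    float areaABC = ax * by - ax * cy + ay * cx - ay * bx + bx * cy - by * cx;
    return areaABP / areaABC;
}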

Cartesian coordinates in Java

I'm trying to draw a function's curve, so I need a method to convert my curve points' coordinates to screen coordinates, but I can't get it to work.
Here's the method I use to convert:
public Point tradPoint(Point P) {
    Point Ptd = new Point();
    Ptd.x = getWidth() / 2 + P.x * getWidth() / 20;
    Ptd.y = getHeight() / 2 - P.y * getHeight() / 20;
    return Ptd;
}
but it doesn't work.
I should mention that I'm using a Cartesian coordinate system and a unit=20.
Any suggestions?
Thanks
Should be
Ptd.x = getWidth() / 2 + P.x * 20;
Ptd.y = getHeight() / 2 - P.y * 20;
where 20 is the unit width.
Also, Ptd should be pTd or even better pointTranslated and P should be p or point. Java identifiers should start with a lowercase letter and be descriptive.
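As a small usage sketch of the corrected conversion (the class and method names here are illustrative, assuming a Swing panel), plotting y = x^2:
import java.awt.Graphics;
import java.awt.Point;
import javax.swing.JFrame;
import javax.swing.JPanel;

// Hedged sketch: a panel that plots y = x^2 using the corrected conversion
// with a unit of 20 pixels.
public class CurvePanel extends JPanel {
    private static final int UNIT = 20; // pixels per Cartesian unit

    private Point toScreen(double x, double y) {
        return new Point((int) (getWidth() / 2 + x * UNIT),
                         (int) (getHeight() / 2 - y * UNIT));
    }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        Point prev = null;
        for (double x = -10; x <= 10; x += 0.1) {
            Point cur = toScreen(x, x * x); // the curve y = x^2
            if (prev != null) {
                g.drawLine(prev.x, prev.y, cur.x, cur.y);
            }
            prev = cur;
        }
    }

    public static void main(String[] args) {
        JFrame frame = new JFrame("Curve");
        frame.add(new CurvePanel());
        frame.setSize(400, 400);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}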
