I am currently developing an application and want to understand the core principles of computer graphics. I was wondering if anyone can provide a formula to check whether a point (x,y,z) lies on a line defined by two points (x1,y1,z1) and (x2,y2,z2).
Say you want to find out whether point0 = (x0,y0,z0) lies on the line that passes through point1 = (x1,y1,z1) and point2 = (x2,y2,z2).
The formula to check whether point0 lies on the line (point1 -> point2) is to test whether (x0,y0,z0) = (x1,y1,z1) + t(x2-x1, y2-y1, z2-z1)
for some value of t that is the same for the x, y, and z components.
so
x0 = x1 + t(x2-x1)
y0 = y1 + t(y2-y1)
z0 = z1 + t(z2-z1)
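As a rough illustration of this check (the method name isOnLine, the tolerance EPS, and the pick-the-dominant-axis trick are my additions, not from the question), a Java sketch might look like this:

static final double EPS = 1e-9;  // tolerance for floating-point comparison (my choice)

// Is point (x0, y0, z0) on the line through (x1, y1, z1) and (x2, y2, z2)?
static boolean isOnLine(double x0, double y0, double z0,
                        double x1, double y1, double z1,
                        double x2, double y2, double z2) {
    double dx = x2 - x1, dy = y2 - y1, dz = z2 - z1;
    // Solve for t on the axis with the largest direction component,
    // so we never divide by (a value close to) zero.
    double t;
    if (Math.abs(dx) >= Math.abs(dy) && Math.abs(dx) >= Math.abs(dz)) {
        t = (x0 - x1) / dx;
    } else if (Math.abs(dy) >= Math.abs(dz)) {
        t = (y0 - y1) / dy;
    } else {
        t = (z0 - z1) / dz;
    }
    // The same t must reproduce all three coordinates.
    // (A degenerate line with point1 == point2 simply returns false.)
    return Math.abs(x1 + t * dx - x0) < EPS
        && Math.abs(y1 + t * dy - y0) < EPS
        && Math.abs(z1 + t * dz - z0) < EPS;
}

For a finite segment rather than an infinite line, additionally require 0 <= t <= 1.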
If I have two points in 3D space, how can I determine whether a rectangular prism lies between them? The only information I have about the rectangular prism is its minimum and maximum x, y, and z values. I know I could step along the line between the two points and check whether each sample lies inside the prism, but that seems very resource-heavy. What I really need is a way to check whether a line segment intersects the prism, but I am not sure how to do that. Any ideas?
I found these two resources that seem similar to my question
https://math.stackexchange.com/questions/2134505/how-to-check-if-an-infinite-line-intersects-a-rectangular-prism-in-3d-space
How to check if an infinite line intersects a rectangular prism in 3d space?
Looking at the bottom link, it simply says to find the parameters t at which the line intersects the rectangular prism, which is obvious, but the problem is I don't know how to do that. Any thoughts?
Project the line's points and the prism's edges onto a 2D plane that is perpendicular to the line.
On that plane, the two line points collapse to a single point, and the prism's edges become a set of connected vertices forming a closed region. Check whether that single point lies inside the closed region; this is easy to do in 2D.
If the point is inside, the line intersects the prism in 3D; if not, it doesn't.
There is also the case of a line segment whose two ends don't reach the prism. In that case, check the distance from the endpoints to the prism's surface; there is a formula for that.
Let the line segment be defined by two points (X1, Y1, Z1) and (X2, Y2, Z2).
Let the box be defined by the ranges xmin..xmax, ymin..ymax, zmin..zmax.
We can write a parametric equation for the line segment, where t is in the range 0..1:
X = X1 + t * (X2 - X1)
Y = Y1 + t * (Y2 - Y1)
Z = Z1 + t * (Z2 - Z1)
The box has 6 faces. Let the first one be at x = xmin. Substituting xmin into the first equation gives the parameter t1 at which the line crosses that face's plane:
xmin = X1 + t1 * (X2 - X1)
t1 = (xmin - X1) / (X2 - X1)
Now check that t1 is in the range 0..1. If it is, substitute this t1 value into the second and third equations:
Y = Y1 + t1 * (Y2 - Y1)
Z = Z1 + t1 * (Z2 - Z1)
and check that the resulting Y and Z lie in the ranges ymin..ymax and zmin..zmax respectively.
If they do, the line segment intersects the box.
If not, repeat for the other faces: xmax, ymin, and so on.
P.S. Also consider the special case where the line segment lies entirely inside the box (just check whether (X1, Y1, Z1) is within the box ranges).
P.P.S. When the line is parallel to a coordinate plane, skip the intersection checks for the corresponding faces (for example, if X2 - X1 == 0 you cannot divide by zero, but you also don't need to check the xmin and xmax faces).
Quick Python code for reference:
def DoesSegmentIntersectBox(x1, y1, z1, x2, y2, z2, xmin, xmax, ymin, ymax, zmin, zmax):
    if xmin <= x1 <= xmax and ymin <= y1 <= ymax and zmin <= z1 <= zmax:
        return True  # first endpoint is inside the box
    if x2 - x1:  # not parallel to the x faces, safe to divide
        for xface in (xmin, xmax):
            t = (xface - x1) / (x2 - x1)
            if 0 <= t <= 1:  # line crosses the face plane within the segment
                y = y1 + t * (y2 - y1)
                if ymin <= y <= ymax:  # crossing point is within the face's y range
                    z = z1 + t * (z2 - z1)
                    if zmin <= z <= zmax:  # and within the face's z range
                        return True  # segment intersects this x face
    if y2 - y1:
        for yface in (ymin, ymax):
            t = (yface - y1) / (y2 - y1)
            if 0 <= t <= 1:
                x = x1 + t * (x2 - x1)
                if xmin <= x <= xmax:
                    z = z1 + t * (z2 - z1)
                    if zmin <= z <= zmax:
                        return True
    if z2 - z1:
        for zface in (zmin, zmax):
            t = (zface - z1) / (z2 - z1)
            if 0 <= t <= 1:
                x = x1 + t * (x2 - x1)
                if xmin <= x <= xmax:
                    y = y1 + t * (y2 - y1)
                    if ymin <= y <= ymax:
                        return True
    return False
I am making a 3D Java game, but I have run into problems when rotating a hitbox. Up to this point I have only used a method that detects whether a Vector3f is inside an axis-aligned box.
But in my game I want to rotate houses, for example, so that method won't work. I could use circular hitboxes, but those wouldn't work for every kind of object.
So far I have used this simple calculation to detect whether a location is in a hitbox:
public boolean isinbox(Vector3f pos) {
    Vector3f entPos = ent.getPosition();
    float x1 = entPos.x + xOffset;
    float z1 = entPos.z + zOffset;
    float y1 = entPos.y + yOffset;
    float x2 = entPos.x - xOffset;
    float z2 = entPos.z - zOffset;
    float y2 = entPos.y;
    return pos.x < x1 && pos.x > x2 && pos.z < z1 && pos.z > z2 && pos.y > y2 && pos.y < y1;
}
This works in many cases, but I can't figure out how to rotate the boxes and still detect points inside them. xOffset is the offset from side A to the center, and its negative is the offset from side B to the center.
How can I rotate a hitbox and still detect whether a vector is inside it?
There are a couple of ways of getting around this issue and at least one way of solving it properly:
Solving It
SAT Collision
SAT Stands for Separating Axis Theorem.
TutsPlus and MetaSoftware are great websites to learn how it works and how to implement it.
The Separating Axis Theorem (SAT for short) essentially states if you are able to draw a line to separate two polygons, then they do not collide. It's that simple. (gamedevelopment.tutsplus.com)
Here is the basic idea: project both shapes onto each candidate axis; if you find an axis on which the projections do not overlap, the shapes do not collide.
The same test also works for more complex convex shapes.
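As a hedged sketch (not taken from the linked tutorials; the method names are made up), a basic SAT test for two convex polygons projects both shapes onto each edge normal and reports a collision only if no axis separates them:

// True if two convex polygons overlap. Each polygon is given as ordered
// vertex arrays (x[i], y[i]).
static boolean satOverlap(double[] ax, double[] ay, double[] bx, double[] by) {
    return !hasSeparatingAxis(ax, ay, bx, by) && !hasSeparatingAxis(bx, by, ax, ay);
}

// Tests the edge normals of the first polygon as candidate separating axes.
static boolean hasSeparatingAxis(double[] ax, double[] ay, double[] bx, double[] by) {
    int n = ax.length;
    for (int i = 0; i < n; i++) {
        int j = (i + 1) % n;
        // Normal of edge i -> j (no need to normalize for an overlap test).
        double nx = -(ay[j] - ay[i]);
        double ny = ax[j] - ax[i];
        double[] a = project(ax, ay, nx, ny);
        double[] b = project(bx, by, nx, ny);
        if (a[1] < b[0] || b[1] < a[0]) {
            return true;  // gap on this axis -> the polygons do not collide
        }
    }
    return false;
}

// Projects a polygon onto an axis and returns {min, max}.
static double[] project(double[] px, double[] py, double nx, double ny) {
    double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
    for (int i = 0; i < px.length; i++) {
        double d = px[i] * nx + py[i] * ny;
        min = Math.min(min, d);
        max = Math.max(max, d);
    }
    return new double[] { min, max };
}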
Getting Around It
AABB Collision
This is done by assuming that collisions will only ever happen on the X/Y axis (never on an axis defined by an arbitrary line).
Here is an example of what this looks like:
Therefore, you must define this axis-aligned hitbox by using the minimum X-and-Y values and the maximum X-and-Y values of the original box.
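For instance, a minimal sketch (names are illustrative) that rebuilds an axis-aligned box around the rotated box's corner points and then reuses a simple min/max containment test:

// Compute the axis-aligned bounds of the rotated box's corner points (each
// corner is a {x, y} pair), then test the point against those bounds.
static boolean pointInAabbOfCorners(float px, float py, float[][] corners) {
    float minX = Float.POSITIVE_INFINITY, maxX = Float.NEGATIVE_INFINITY;
    float minY = Float.POSITIVE_INFINITY, maxY = Float.NEGATIVE_INFINITY;
    for (float[] c : corners) {
        minX = Math.min(minX, c[0]);
        maxX = Math.max(maxX, c[0]);
        minY = Math.min(minY, c[1]);
        maxY = Math.max(maxY, c[1]);
    }
    return px >= minX && px <= maxX && py >= minY && py <= maxY;
}

Note that this AABB is larger than the rotated box, so it can report collisions that the exact rotated box would not.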
Circle Collision
This is a much simpler collision check which only detects when two objects are within a certain distance of each other.
Here is an example of what this looks like:
The way this works is that if the distance between the two objects is less than the sum of the circles' radii, then the objects are colliding.
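A minimal sketch of that distance check (comparing squared distances to avoid a square root; the method name is mine):

// True if two circles overlap: centers (x1, y1) and (x2, y2), radii r1 and r2.
static boolean circlesCollide(float x1, float y1, float r1,
                              float x2, float y2, float r2) {
    float dx = x2 - x1;
    float dy = y2 - y1;
    float rSum = r1 + r2;
    // Compare squared distances to avoid taking a square root.
    return dx * dx + dy * dy <= rSum * rSum;
}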
Using an External Library
Bullet Physics
Box2D
So I'm trying to make my first game on Android. I have a small moving ball and I want it to bounce off a line that I drew. For that I need to find out whether the ball's (x, y) coordinates coincide with a point on the line.
I tried to implement these line equations:
x = a1 + t*u1
y = a2 + t*u2 => (x - a1)/u1 = (y - a2)/u2 (the two values of t must be equal if the point is on the line)
where x and y are the coordinates I'm testing, [a1, a2] is a point on the line, and
(u1, u2) is the direction vector of the line.
Here's the code:
public boolean Collided()
{
    float u1 = Math.abs(Math.round(begin_X) - Math.round(end_X));
    float u2 = Math.abs(Math.round(begin_Y) - Math.round(end_Y));
    float t_x = Math.round((elect_X - begin_X) / u1);
    float t_y = Math.round((elect_Y - begin_Y) / u2);
    if (t_x == t_y)
    {
        return true;
    }
    else
    {
        return false;
    }
}
[begin_X, begin_Y] and [end_X, end_Y] are the two endpoints of the line, and [elect_X, elect_Y] is the position of the ball.
In theory it should work, but in practice the ball usually goes straight through the line or bounces somewhere it shouldn't.
I'm not sure what your code is doing, but it looks off:
you do some operations on the x coordinates of your data, then on the y coordinates, and at the end you expect the results to be equal.
Instead, try Shortest distance between a point and a line segment; if the distance is 0 (or less than or equal to the radius of your ball) you have a collision.
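As a hedged sketch of that approach (method names are mine, not from the question's code): compute the shortest distance from the ball's center to the segment and compare it with the ball's radius.

// Shortest distance from point (px, py) to the segment (x1, y1)-(x2, y2).
static float distanceToSegment(float px, float py,
                               float x1, float y1, float x2, float y2) {
    float dx = x2 - x1, dy = y2 - y1;
    float lenSq = dx * dx + dy * dy;
    if (lenSq == 0) {                      // degenerate segment: a single point
        return (float) Math.hypot(px - x1, py - y1);
    }
    // Projection of the point onto the segment, clamped to the segment's ends.
    float t = ((px - x1) * dx + (py - y1) * dy) / lenSq;
    t = Math.max(0, Math.min(1, t));
    float cx = x1 + t * dx, cy = y1 + t * dy;  // closest point on the segment
    return (float) Math.hypot(px - cx, py - cy);
}

// Collision if the ball's center is within its radius of the segment.
static boolean ballHitsSegment(float ballX, float ballY, float radius,
                               float x1, float y1, float x2, float y2) {
    return distanceToSegment(ballX, ballY, x1, y1, x2, y2) <= radius;
}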
The issue is that you're testing whether the dot exactly 'hits' the line you want it to bounce off. I'm assuming you increment the position of your dot by a small amount every frame.
Say your dot is at [1,1], your line runs from [0,0] to [5,0], the dot's speed is 1 unit per second and its direction is [0,-1] (straight down toward the line). I'm assuming you calculate the per-frame increment from the frame time to get smooth animation.
What happens is the following:
Dot at [1,1]
Time per frame = 0.7
Dot is moved to [1,0.3]
Test intersection = false
--- next frame ---
Dot at [1,0.3]
Time per frame = 0.5 (this usually varies per frame)
Dot is moved to [1,-0.2]
Test intersection = false
So the tests report no intersection because you're testing discrete positions of the dot.
As Aki Suihkonen suggests, you instead want to test for intersection between the segment formed by the last position and the current position, and the line you want your dot to collide with.
java.awt.geom.Line2D.linesIntersect(double X1, double Y1, double X2, double Y2, double X3, double Y3, double X4, double Y4) allows you to check for these intersections easily.
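For example (a sketch with illustrative names, assuming you keep the ball's previous position around):

import java.awt.geom.Line2D;

// Did the ball cross the wall between the previous frame and this one?
// (prevX, prevY) -> (currX, currY) is the ball's movement this frame,
// (wallX1, wallY1) -> (wallX2, wallY2) is the wall segment to bounce off.
static boolean crossedWall(double prevX, double prevY, double currX, double currY,
                           double wallX1, double wallY1, double wallX2, double wallY2) {
    return Line2D.linesIntersect(prevX, prevY, currX, currY,
                                 wallX1, wallY1, wallX2, wallY2);
}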
Your math is OK, but the code isn't.
It's simpler to use the general line equation y = y1 + a(x - x1), where a = (y2 - y1) / (x2 - x1) and (x1, y1) and (x2, y2) are two points on the line.
To get the ball's distance from the line when the ball is at point (bx, by), use:
double a = (y2 - y1) / (x2 - x1);
double distance = (by - y1 - a * (bx - x1)) * Math.cos(Math.atan(a));
Now check whether Math.abs(distance) is below a specific threshold (e.g. the ball's diameter) to confirm a collision.
Note: this only works for non-vertical lines. If you have a vertical line, just use:
double distance = bx - x1;
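Putting both cases together, a minimal sketch (the method name and the limit parameter are mine) might look like:

// Distance-based collision test using y = y1 + a*(x - x1) as described above.
// Returns true if the ball at (bx, by) is within 'limit' of the line.
static boolean ballTouchesLine(double bx, double by,
                               double x1, double y1, double x2, double y2,
                               double limit) {
    double distance;
    if (x1 == x2) {                       // vertical line: horizontal distance
        distance = bx - x1;
    } else {
        double a = (y2 - y1) / (x2 - x1);
        distance = (by - y1 - a * (bx - x1)) * Math.cos(Math.atan(a));
    }
    return Math.abs(distance) <= limit;
}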
good luck.
From Google Earth I got a box with coordinates for a picture, like the following:
<LatLonBox>
<north>53.10685</north>
<south>53.10637222222223</south>
<east>8.853144444444444</east>
<west>8.851858333333333</west>
<rotation>-26.3448</rotation>
</LatLonBox>
Now I want to test whether a point intersects this LatLonBox.
My basic idea was to rotate the point back by the given angle and then test whether it lies inside a regular (non-rotated) rectangle.
I tried to calculate the rotation manually:
public static MyGeoPoint rotatePoint(MyGeoPoint point, MyGeoPoint origion, double degree)
{
    double x = origion.getLatitude() + (Math.cos(Math.toRadians(degree)) * (point.getLatitude() - origion.getLatitude()) - Math.sin(Math.toRadians(degree)) * (point.getLongitude() - origion.getLongitude()));
    double y = origion.getLongitude() + (Math.sin(Math.toRadians(degree)) * (point.getLatitude() - origion.getLatitude()) + Math.cos(Math.toRadians(degree)) * (point.getLongitude() - origion.getLongitude()));
    return new MyGeoPoint(x, y);
}
public boolean intersect(MyGeoPoint geoPoint)
{
    geoPoint = MyGeoPoint.rotatePoint(geoPoint, this.getCenter(), -this.getRotation());
    return (geoPoint.getLatitude() < getTopLeftLatitude()
            && geoPoint.getLatitude() > getBottomRightLatitude()
            && geoPoint.getLongitude() > getTopLeftLongitude()
            && geoPoint.getLongitude() < getBottomRightLongitude());
}
And it seems that the results are wrong.
LatLonBox box = new LatLonBox(53.10685, 8.851858333333333, 53.10637222222223, 8.853144444444444, -26.3448);
MyGeoPoint point1 = new MyGeoPoint(53.106872, 8.852311);
MyGeoPoint point2 = new MyGeoPoint(53.10670378322918, 8.852967186822669);
MyGeoPoint point3 = new MyGeoPoint(53.10652664993972, 8.851994565566875);
MyGeoPoint point4 = new MyGeoPoint(53.10631650700605, 8.85270995172055);
System.out.println(box.intersect(point1));
System.out.println(box.intersect(point2));
System.out.println(box.intersect(point3));
System.out.println(box.intersect(point4));
The result is true, false, false, true, but it should be true in all four cases.
I'm probably making some kind of error in reasoning.
Maybe it's because the latitude values increase upwards, but I don't know how to change the formula accordingly.
I need some help ...
EDIT:
I think my basic idea and formula are right. I also found similar solutions, e.g. link, and couldn't find any difference.
So I think the only possible source of error is that the axes are not proportional. The problem is how to take this into account.
I hope someone has an idea.
The problem was indeed that the axes were not proportional.
The following method takes care of it.
public static MyGeoPoint rotatePoint(MyGeoPoint point, MyGeoPoint origion, double degree)
{
    double x = origion.longitude + (Math.cos(Math.toRadians(degree)) * (point.longitude - origion.longitude) - Math.sin(Math.toRadians(degree)) * (point.latitude - origion.latitude) / Math.abs(Math.cos(Math.toRadians(origion.latitude))));
    double y = origion.latitude + (Math.sin(Math.toRadians(degree)) * (point.longitude - origion.longitude) * Math.abs(Math.cos(Math.toRadians(origion.latitude))) + Math.cos(Math.toRadians(degree)) * (point.latitude - origion.latitude));
    return new MyGeoPoint(x, y);
}
If I understand correctly, you want to check whether these four points lie inside the rotated rectangle.
Because your rectangle is rotated, I would recommend checking not against the corner coordinates directly but against its edges:
If you have a rotated rectangle ABCD, compute the lines |AB|, |BC|, |CD| and |DA|. For each pair of corners use y = ax + b (you can calculate a and b by plugging in the [x, y] of both corners, which gives you two simple equations).
Finally, the intersect function checks:
if point <= line |CD|
AND point >= line |AB|
AND point <= line |BC|
AND point >= line |DA|
then it is inside the rectangle.
This comparison is done by substituting your point P = [x, y] into the line equation rewritten as a*x - y + b (or its negation, depending on which side you call positive). If the result is zero, the point lies on the line; if it is negative, the point is below the line, or "on the left side". Hope I helped.
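As a rough sketch of the same side-of-edge idea (using the equivalent cross-product form, which also handles vertical edges; the method name is mine):

// Point-in-convex-quad test via edge cross products. The corner arrays qx, qy
// must list A, B, C, D in a consistent order (e.g. counter-clockwise).
static boolean pointInQuad(double px, double py, double[] qx, double[] qy) {
    boolean anyNeg = false, anyPos = false;
    for (int i = 0; i < 4; i++) {
        int j = (i + 1) % 4;
        // Cross product of edge (i -> j) with vector (corner i -> point):
        // its sign tells on which side of the edge the point lies.
        double cross = (qx[j] - qx[i]) * (py - qy[i]) - (qy[j] - qy[i]) * (px - qx[i]);
        if (cross < 0) anyNeg = true;
        if (cross > 0) anyPos = true;
    }
    // Inside (or on the boundary) iff the point is on the same side of every edge.
    return !(anyNeg && anyPos);
}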
BTW, why are you using -degree (i.e. multiplying the rotation by -1)? Is that necessary?
The problem appears to be that the data structure LatLonBox doesn't make any sense as a description for the boundary of a picture. A box in lat-lon coordinates is not a geometric rectangle. (Think about a box near or including the north pole.) You need to re-think your application to deal in a lat/lon coordinate for the center of the picture and then deal with the rotation as an angle with respect to lines of latitude (parallel to the equator). (Even then, a picture with center on the north or south pole will be a degenerate case that must be handled separately.) So a box should properly be something like:
<geobox>
<center_lat>41</center_lat>
<center_lon>-74</center_lon>
<rotation_degrees_ccw>-23</rotation_degrees_ccw>
<width>1000</width> <!-- in pixels or meters, but not in degrees! -->
<height>600</height> <!-- same as above -->
</geobox>
Having said all that, suppose you have a true geometric box centered at (x0,y0), width w, height h, rotated angle T about its center. Then you can test a point P(x,y) for membership in the box with the following. You need the transformation that takes the box to the origin and aligns it with the axes. This is Translate(-x0,-y0) then Rotate(-T). This transformation as a matrix is
[cos(-T)  -sin(-T)  0] [1  0  -x0]   [ cos(T)  sin(T)  -x0*cos(T) - y0*sin(T)]
[sin(-T)   cos(-T)  0] [0  1  -y0] = [-sin(T)  cos(T)   x0*sin(T) - y0*cos(T)]
[   0         0     1] [0  0    1]   [   0        0              1           ]
You want to apply this transformation to the point to be tested and then see if it lies in the desired box:
// Transform the point to be tested.
ct = cos(T);
st = sin(T);
xp = ct * x + st * y - x0 * ct - y0 * st;
yp = -st * x + ct * y + x0 * st - y0 * ct;
// Test for membership in the box.
boolean inside = xp >= -w/2 && xp <= w/2 && yp >= -h/2 && yp <= h/2;
It's late and I haven't checked this arithmetic, but it's close. Say if it doesn't work.
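For reference, a rough Java version of the same rotate-then-test idea (the method name is mine; treat it as an untested sketch):

// Rotate the test point into the box's local frame, then do an axis-aligned check.
// (x0, y0): box center; w, h: box width and height; T: box rotation in radians.
static boolean pointInRotatedBox(double x, double y,
                                 double x0, double y0, double w, double h, double T) {
    double ct = Math.cos(T);
    double st = Math.sin(T);
    // Apply Translate(-x0, -y0) followed by Rotate(-T) to the point.
    double xp =  ct * x + st * y - x0 * ct - y0 * st;
    double yp = -st * x + ct * y + x0 * st - y0 * ct;
    // In the local frame the box is axis-aligned and centered at the origin.
    return xp >= -w / 2 && xp <= w / 2 && yp >= -h / 2 && yp <= h / 2;
}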
Alright, so I'm trying to get the distance of a point in a 2d triangle without calculating perpendicular vectors.
float qd = Vector2f.dot(new Vector2f(pos.x, pos.z),
                        new Vector2f(normal.pos.x, normal.pos.z))
         - Vector2f.dot(new Vector2f(q.center.x, q.center.z),
                        new Vector2f(normal.pos.x, normal.pos.z));
That's the code I'm using. (Note: it converts 3D vectors to 2D ones, but you don't need to worry about that.) I need the result of the calculation to be between 0 and 1, e.g. 0.5.
If I'm still not explaining right maybe this will help?
My question is: how do I get the distance of a point within a 2D triangle without calculating a perpendicular vector's distance? I.e., if the triangle is facing up (y = -1) without any tilt,
I would need the distance within the triangle without any x component.
Edit 1: Banthar, this is what I got out of your answer; it doesn't work, but it seems close to working.
float d = (float) Math.sqrt( 0 /*cause the two x's should be the same */ + Math.pow(pos.z - q.max.z, 2));
float h = (float) Math.sqrt( 0 /*cause the two x's should be the same */ + Math.pow(q.min.z - q.max.z, 2));
float myDist = d/h;
Let's say your triangle is ABC and the point is P.
The number you are looking for is the distance from P to AB divided by the distance from C to AB.
This is the same as the ratio of the corresponding areas. So you can compute the two areas:
Area(ABP) / Area(ABC)
The best way to compute the triangle area depends on what information you have about your triangle.
If you have the vertices only, then you can use:
Area(ABP) / Area(ABC) = ( Ax*By - Ax*Py + Ay*Px - Ay*Bx + Bx*Py - By*Px ) /
( Ax*By - Ax*Cy + Ay*Cx - Ay*Bx + Bx*Cy - By*Cx )
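As a small sketch of that formula (2D coordinates; the method name is mine):

// Ratio of the area of triangle ABP to the area of triangle ABC, using the
// formula above. For P inside the triangle this is 0 on edge AB and 1 at C.
static float heightRatio(float ax, float ay, float bx, float by,
                         float cx, float cy, float px, float py) {
    float areaABP = ax * by - ax * py + ay * px - ay * bx + bx * py - by * px;
    float areaABC = ax * by - ax * cy + ay * cx - ay * bx + bx * cy - by * cx;
    return areaABP / areaABC;
}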