If I have two points in 3D space, how can I determine whether a rectangular prism lies between those two points? The only information I have about the rectangular prism is its minimum and maximum x, y, and z values. I know I could step along the line between the two points, checking whether each sample point lies inside the prism, but that seems very resource-heavy. What I really need is a way to check whether a line segment intersects the prism, but I am not sure how to do that. Any ideas?
I found these two resources that seem similar to my question:
https://math.stackexchange.com/questions/2134505/how-to-check-if-an-infinite-line-intersects-a-rectangular-prism-in-3d-space
How to check if an infinite line intersects a rectangular prism in 3d space?
Looking at the bottom link, it simply says to find the parameter values t at which the line intersects the rectangular prism, which is obvious, but the problem is that I don't know how to do that. Any thoughts?
Project your line's points and the prism's edges onto a 2D plane that is perpendicular to your line.
On that 2D plane, the two points of the line collapse to a single point, and the prism's edges become a bunch of connected vertices forming a closed region. Check whether that single point is within the closed region; this is easy to do in 2D.
If the point is within the region, the line intersects the prism in 3D; if not, it doesn't.
There is also the case of a line segment whose two ends don't reach the prism. In that case you additionally check the point-to-prism-surface distance; there is an equation for that.
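In case it helps, here is a rough Java sketch of that projection idea (the class and method names are my own, and it assumes the two line points are distinct). It projects the 8 box corners and one of the line points onto a plane perpendicular to the line, then does the "is the point inside the closed region" check in 2D; for the segment case you would still add the endpoint-distance check described above.
public class LineBoxProjection {

    static double[] cross(double[] a, double[] b) {
        return new double[]{a[1] * b[2] - a[2] * b[1],
                            a[2] * b[0] - a[0] * b[2],
                            a[0] * b[1] - a[1] * b[0]};
    }

    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    static double[] unit(double[] a) {
        double len = Math.sqrt(dot(a, a));
        return new double[]{a[0] / len, a[1] / len, a[2] / len};
    }

    // 2D cross product: which side of the line a->b does c lie on?
    static double side(double[] a, double[] b, double[] c) {
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
    }

    // Point-in-convex-hull test without building the hull explicitly: p is outside
    // exactly when some line through two of the points has every point on one side
    // and p strictly on the other.
    static boolean insideHull(double[] p, double[][] pts) {
        for (double[] a : pts) {
            for (double[] b : pts) {
                if (a == b || side(a, b, p) >= 0) continue;   // p not strictly "right" of a->b
                boolean separates = true;
                for (double[] q : pts) {
                    if (side(a, b, q) < 0) { separates = false; break; }
                }
                if (separates) return false;                  // found a separating line
            }
        }
        return true;
    }

    // True if the infinite line through p1 and p2 (two distinct 3D points) hits the box.
    static boolean lineHitsBox(double[] p1, double[] p2,
                               double xmin, double xmax,
                               double ymin, double ymax,
                               double zmin, double zmax) {
        double[] d = {p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2]};
        // Pick an axis that cannot be parallel to d, then build two perpendicular
        // unit vectors u, v spanning the projection plane.
        double[] axis = Math.abs(d[0]) <= Math.abs(d[2]) ? new double[]{1, 0, 0}
                                                         : new double[]{0, 0, 1};
        double[] u = unit(cross(d, axis));
        double[] v = unit(cross(d, u));

        // Project the 8 box corners onto the (u, v) plane.
        double[][] corners = new double[8][];
        int k = 0;
        for (double x : new double[]{xmin, xmax})
            for (double y : new double[]{ymin, ymax})
                for (double z : new double[]{zmin, zmax})
                    corners[k++] = new double[]{x * u[0] + y * u[1] + z * u[2],
                                                x * v[0] + y * v[1] + z * v[2]};

        // Both line points project to the same 2D point, so project p1 only.
        double[] p = {dot(p1, u), dot(p1, v)};
        return insideHull(p, corners);
    }
}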
Let the line segment be defined by two points (X1, Y1, Z1) and (X2, Y2, Z2).
Let the box be defined by the ranges xmin..xmax, ymin..ymax, zmin..zmax.
We can write the parametric equations of the line segment, where t is in the range 0..1:
X = X1 + t * (X2 - X1)
Y = Y1 + t * (Y2 - Y1)
Z = Z1 + t * (Z2 - Z1)
The box has 6 faces. Let the first one be at xmin. We can substitute xmin into the first equation and find the parameter t1 where the line intersects this face.
xmin = X1 + t1 * (X2 - X1)
t1 = (xmin - X1) / (X2 - X1)
Now check that t1 is in the range 0..1. If yes, substitute this t1 value into the second and third equations:
Y = Y1 + t1 * (Y2 - Y1)
Z = Z1 + t1 * (Z2 - Z1)
and check that the resulting Y and Z lie in the ranges ymin..ymax and zmin..zmax respectively.
If yes, the line segment does intersect the box.
If no, repeat for the other faces: xmax, ymin, and so on.
P.S. Also consider the special case where the line segment is fully inside the box (just check whether (X1, Y1, Z1) is within the box ranges).
P.P.S. When the line is parallel to some coordinate plane, don't check intersections with the corresponding faces (for example, if X2 - X1 == 0 you cannot divide by zero, but you simply don't need to check the xmin and xmax faces).
Quick-made Python-like pseudocode for reference:
def DoesSegmentIntersectBox(x1, y1, z1, x2, y2, z2, xmin, xmax, ymin, ymax, zmin, zmax):
    if xmin <= x1 <= xmax and ymin <= y1 <= ymax and zmin <= z1 <= zmax:
        return True                     # end of the segment is inside the box
    if x2 - x1:                         # skip the x faces if the segment is parallel to them
        for xface in (xmin, xmax):
            t = (xface - x1) / (x2 - x1)
            if 0 <= t <= 1:             # line crosses the face's plane within the segment range
                y = y1 + t * (y2 - y1)
                z = z1 + t * (z2 - z1)
                if ymin <= y <= ymax and zmin <= z <= zmax:
                    return True         # segment does intersect this x face
    if y2 - y1:
        for yface in (ymin, ymax):
            t = (yface - y1) / (y2 - y1)
            if 0 <= t <= 1:
                x = x1 + t * (x2 - x1)
                z = z1 + t * (z2 - z1)
                if xmin <= x <= xmax and zmin <= z <= zmax:
                    return True
    if z2 - z1:
        for zface in (zmin, zmax):
            t = (zface - z1) / (z2 - z1)
            if 0 <= t <= 1:
                x = x1 + t * (x2 - x1)
                y = y1 + t * (y2 - y1)
                if xmin <= x <= xmax and ymin <= y <= ymax:
                    return True
    return False
So I have made the following game:
The player moves around freely and shoots bullets, which bounce off the walls. I am trying to implement auto-aiming, and I have tried to do this using lines that go out around the player.
What I am having problems with is calculating where the lines intersect the walls; I am unsure how to do this.
I can't use getBounds(), as the lines are not 2D rectangles. If anyone has any idea how I can calculate where a line intersects a wall, and return that position, it would be very helpful.
You just need to do the math here.
Suppose your line has start at (startX, startY) and end at (endX, endY). Then using basic grade-school geometry, any point (x,y) on the line satisfies the equation
(y-startY) / (x-startX) = (endY - startY) / (endX - startX)
Of course,
(endY - startY) / (endX - startX)
is just the slope of the line, so set
slope = (endY - startY) / (endX - startX)
and then you have
(y-startY) / (x-startX) = slope
This might be more convenient if you know the starting point (startX and startY for the line) and the angle, as you can just do slope = Math.tan(angle).
For the example of the intersection with a horizontal wall, all points on the edge of the wall have the same y-coordinate, call it wallY. So if x is the x-coordinate of the intersection, you have
(wallY-startY) / (x-startX) = slope
which you can rearrange to
x = startX + (wallY-startY) / slope
so the intersection point is (x, wallY) with x as in the last equation.
If the wall is finite (i.e. it has start and end x points), then the check to see if the line actually intersects the wall is simply x >= wallStartX && x <= wallEndX, assuming wallStartX is the left end of the wall and wallEndX the right end.
If the wall is vertical, then the math is basically the same, except you know the x coordinate on the wall (say x = wallX), and you want to find the y coordinate. So just substitute wallX for x in the first (or fourth) equation and solve for y.
If the wall is not horizontal or vertical, then the math is a little more complicated, though not much. (Left as an exercise for the reader.)
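If it helps, here is a small Java sketch of the horizontal-wall case above (the method name and signature are my own invention). It returns the intersection point, or null when the aiming line misses the wall segment:
// Intersection of the aiming line (given by a start point and an angle) with a
// horizontal wall at y = wallY spanning wallStartX..wallEndX. Returns {x, wallY}
// or null if there is no hit. Assumes the line is not horizontal itself.
static double[] hitHorizontalWall(double startX, double startY, double angle,
                                  double wallY, double wallStartX, double wallEndX) {
    double slope = Math.tan(angle);
    if (slope == 0) return null;                     // parallel to the wall, no hit
    double x = startX + (wallY - startY) / slope;    // solve the line equation at y = wallY
    if (x >= wallStartX && x <= wallEndX) {
        return new double[]{x, wallY};
    }
    return null;                                     // crosses the wall's line outside the segment
}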
You can use the well-tested JTS library for doing such things.
I am currently developing an application and want to understand the core principles of computer graphics. I was wondering if anyone can provide a formula to check whether a point with three coordinates (x, y, z) intersects a line defined by two points (x1, y1, z1) and (x2, y2, z2).
Say you want to find whether point0 = (x0, y0, z0) lies on the line that passes through point1 = (x1, y1, z1) and point2 = (x2, y2, z2).
The check is whether (x0, y0, z0) = (x1, y1, z1) + t(x2 - x1, y2 - y1, z2 - z1)
for some value of t that is the same for the x, y, and z axes.
so
x0 = x1 + t(x2-x1)
y0 = y1 + t(y2-y1)
z0 = z1 + t(z2-z1)
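In code you also have to handle the axes where the denominator (x2 - x1, etc.) is zero and allow for floating-point error. A possible sketch in Java (my own helper, with a tolerance eps):
// Returns true if p0 lies on the (infinite) line through p1 and p2,
// i.e. p0 = p1 + t * (p2 - p1) for one common value of t on every axis.
static boolean pointOnLine(double[] p0, double[] p1, double[] p2, double eps) {
    Double t = null;
    for (int i = 0; i < 3; i++) {
        double d = p2[i] - p1[i];
        if (Math.abs(d) < eps) {
            // This axis says nothing about t, but p0 must still match p1 on it.
            if (Math.abs(p0[i] - p1[i]) > eps) return false;
        } else {
            double ti = (p0[i] - p1[i]) / d;
            if (t == null) t = ti;
            else if (Math.abs(ti - t) > eps) return false;   // the t values disagree
        }
    }
    return true;   // (t stays null only if p1 == p2; then p0 == p1 was required above)
}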
I need to draw parallel lines between two squares. The squares can be placed at an angle. I need to find 6 points (3 on square A and 3 on square B) so that the lines drawn between them are equally spaced. Thanks.
Best you get acquainted with a bit of vector math.
Ideally the lines would orient themselves to the vector between the centers of the two squares (x0, y0) - (x1, y1).
The direction of the three lines is:
x = (x1 - x0)
y = (y1 - y0)
A vector 90° to (x, y), and with size 1:
vn = (y, - x) / sqrt(x² + y²)
So a line 10 px from the center would be
(x0, y0) + 10.vn + µ.(x, y)
Use -10.vn, 0, +10.vn for the three lines.
Determine the intersection points with each square's edges (µ > 0 for the first square).
As it is rewarding for one's self-confidence, I leave the rest of the solution to you; spelled out in full it also would not be as readable.
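For what it's worth, here is a sketch of just the vector setup above in Java (my own names; the intersection with the squares' edges is still up to you). It returns the base point of each of the three parallel lines, using offsets of -10, 0 and +10 px:
// Base points of the three parallel lines between the square centers (x0, y0) and (x1, y1).
// Each line starts at base[i] and runs in the direction (dx, dy).
static double[][] lineBasePoints(double x0, double y0, double x1, double y1) {
    double dx = x1 - x0, dy = y1 - y0;            // direction between the centers
    double len = Math.sqrt(dx * dx + dy * dy);
    double nx = dy / len, ny = -dx / len;         // unit vector at 90 degrees, i.e. vn
    double[][] base = new double[3][2];
    double[] offsets = {-10, 0, 10};
    for (int i = 0; i < 3; i++) {
        base[i][0] = x0 + offsets[i] * nx;
        base[i][1] = y0 + offsets[i] * ny;
    }
    return base;
}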
So I'm trying to make my first game on Android. The thing is, I have a small moving ball and I want it to bounce off a line that I drew. For that I need to find out whether the ball's x, y coordinates are also the coordinates of a point on the line.
I tried to implement these line equations:
x = a1 + t*u1
y = a2 + t*u2  =>  (x - a1)/u1 = (y - a2)/u2  (the two t values must be equal if the point is on the line)
where x and y are the coordinates I'm testing, [a1, a2] is a point on the line, and
u = (u1, u2) is the direction vector of the line.
Here's the code:
public boolean Collided()
{
    float u1 = Math.abs(Math.round(begin_X) - Math.round(end_X));
    float u2 = Math.abs(Math.round(begin_Y) - Math.round(end_Y));
    float t_x = Math.round((elect_X - begin_X) / u1);
    float t_y = Math.round((elect_Y - begin_Y) / u2);
    if (t_x == t_y)
    {
        return true;
    }
    else
    {
        return false;
    }
}
The points [begin_X, begin_Y] and [end_X, end_Y] are the two points of the line, and [elect_X, elect_Y] are the coordinates of the ball.
Theoretically it should work, but in reality the ball most of the time just goes straight through the line, or bounces somewhere it shouldn't.
I'm not sure what your code is doing, but it looks weird:
you do some operations on the x coordinates of your data, then on the y coordinates, and at the end you expect the results to be equal.
Go and try this instead: Shortest distance between a point and a line segment. Then, if the distance == 0 (or is smaller than or equal to the radius of your ball), you have a collision.
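For reference, a point-to-segment distance can look roughly like this in Java (my own sketch, not tied to your variable names):
// Shortest distance from point (px, py) to the segment (ax, ay)-(bx, by).
static double distToSegment(double px, double py,
                            double ax, double ay, double bx, double by) {
    double dx = bx - ax, dy = by - ay;
    double lenSq = dx * dx + dy * dy;
    // Projection parameter of the point onto the line, clamped to the segment.
    double t = lenSq == 0 ? 0 : ((px - ax) * dx + (py - ay) * dy) / lenSq;
    t = Math.max(0, Math.min(1, t));
    double cx = ax + t * dx, cy = ay + t * dy;    // closest point on the segment
    return Math.hypot(px - cx, py - cy);
}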
The issue lies in the fact that you're testing whether the dot exactly 'hits' the line that you want it to bounce off. I'm assuming that you're incrementing the position of your dot by a small amount every frame.
Say your dot is placed at [1,1], your line runs from [0,0] to [5,0], the velocity of your dot is 1 unit per second and the direction is [0,-1] (straight down towards the line). I'm assuming you're calculating the increment based on the time per frame to allow for smoother animation.
What's happening is the following:
Dot at [1,1]
Time per frame = 0.7
Dot is moved to [1,0.3]
Test intersection = false
--- next frame ---
Dot at [1,0.3]
Time per frame = 0.5 (this usually varies per frame)
Dot is moved to [1,-0.2]
Test intersection = false
So the tests say there hasn't been an intersection, even though the dot crossed the line, because you're testing discrete positions of your dot.
As Aki Suihkonen suggests, you want to test for an intersection between the line segment formed by the last position and the current position and the line that you want your dot to collide with.
java.awt.geom.Line2D.linesIntersect(double x1, double y1, double x2, double y2, double x3, double y3, double x4, double y4) lets you check for these intersections easily.
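For example (a sketch with made-up names, assuming you keep the ball's previous position around each frame):
// Did the ball cross the wall line between the previous frame and this one?
static boolean crossedLine(double prevX, double prevY, double currX, double currY,
                           double beginX, double beginY, double endX, double endY) {
    return java.awt.geom.Line2D.linesIntersect(prevX, prevY, currX, currY,
                                               beginX, beginY, endX, endY);
}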
Your math is OK, but the code isn't.
It's simpler to use the general line equation y = y1 + a(x - x1), where a = (y2 - y1) / (x2 - x1) and (x1, y1) and (x2, y2) are two points on the line.
To get the ball's distance from the line when the ball is at point (bx, by), use:
double a = (y2 - y1) / (x2 - x1);
double distance = (by - y1 - a * (bx - x1)) * Math.cos(Math.atan(a));
Now you can check whether Math.abs(distance) is below a specific value (i.e. the ball diameter) to confirm a collision.
Note: this only works for non-vertical lines. If you have a vertical line, just use:
double distance = bx - x1;
good luck.
From Google Earth I got a box with coordinates for a picture, like the following:
<LatLonBox>
<north>53.10685</north>
<south>53.10637222222223</south>
<east>8.853144444444444</east>
<west>8.851858333333333</west>
<rotation>-26.3448</rotation>
</LatLonBox>
Now I want to test whether a point intersects this LatLonBox.
My basic idea for checking whether a point intersects the LatLonBox was to rotate the point back by the given angle, and then to test whether the point lies within a regular (not rotated) rectangle.
I tried to calculate the rotation manually:
public static MyGeoPoint rotatePoint(MyGeoPoint point, MyGeoPoint origion, double degree)
{
    double x = origion.getLatitude() + (Math.cos(Math.toRadians(degree)) * (point.getLatitude() - origion.getLatitude())
             - Math.sin(Math.toRadians(degree)) * (point.getLongitude() - origion.getLongitude()));
    double y = origion.getLongitude() + (Math.sin(Math.toRadians(degree)) * (point.getLatitude() - origion.getLatitude())
             + Math.cos(Math.toRadians(degree)) * (point.getLongitude() - origion.getLongitude()));
    return new MyGeoPoint(x, y);
}
public boolean intersect(MyGeoPoint geoPoint)
{
    geoPoint = MyGeoPoint.rotatePoint(geoPoint, this.getCenter(), -this.getRotation());
    return (geoPoint.getLatitude() < getTopLeftLatitude()
            && geoPoint.getLatitude() > getBottomRightLatitude()
            && geoPoint.getLongitude() > getTopLeftLongitude()
            && geoPoint.getLongitude() < getBottomRightLongitude());
}
And it seems that the results are wrong.
LatLonBox box = new LatLonBox(53.10685, 8.851858333333333, 53.10637222222223, 8.853144444444444, -26.3448);
MyGeoPoint point1 = new MyGeoPoint(53.106872, 8.852311);
MyGeoPoint point2 = new MyGeoPoint(53.10670378322918, 8.852967186822669);
MyGeoPoint point3 = new MyGeoPoint(53.10652664993972, 8.851994565566875);
MyGeoPoint point4 = new MyGeoPoint(53.10631650700605, 8.85270995172055);
System.out.println(box.intersect(point1));
System.out.println(box.intersect(point2));
System.out.println(box.intersect(point3));
System.out.println(box.intersect(point4));
The result is true, false, false, true. But it should be 4x true.
Probably I'm making some kind of error in my reasoning.
Maybe it's because the latitude values increase upwards, but I don't know how to adjust the formula.
I need some help ...
EDIT:
I think my basic idea and formula are right. I also found similar solutions (e.g. link) and couldn't find any difference.
So I think the only possible source of error is that the axes are not proportional, and the problem is how to take this into account.
I hope someone has got an idea.
The problem was indeed that the axes were not proportional.
The following method takes care of it.
public static MyGeoPoint rotatePoint(MyGeoPoint point, MyGeoPoint origion, double degree)
{
    double x = origion.longitude + (Math.cos(Math.toRadians(degree)) * (point.longitude - origion.longitude)
             - Math.sin(Math.toRadians(degree)) * (point.latitude - origion.latitude) / Math.abs(Math.cos(Math.toRadians(origion.latitude))));
    double y = origion.latitude + (Math.sin(Math.toRadians(degree)) * (point.longitude - origion.longitude) * Math.abs(Math.cos(Math.toRadians(origion.latitude)))
             + Math.cos(Math.toRadians(degree)) * (point.latitude - origion.latitude));
    return new MyGeoPoint(x, y);
}
If I understand correctly, you want to check whether these four points are inside the rotated rectangle.
Because your rectangle is rotated, I would recommend not checking against the corner coordinates directly, but instead:
if you have the rotated rectangle ABCD, calculate the lines |AB|, |BC|, |CD| and |DA|. For each pair of corners use y = ax + b (you can calculate a and b by plugging in the [x, y] of both corners, which gives you two easy equations).
Finally, the intersect function will check:
if point <= line |CD|
AND point >= line |AB|
AND point <= line |BC|
AND point >= line |DA|
then it is inside the rectangle.
To compare a point P = [x, y] against a line, substitute it into ax - y + b (or its negation, so the sign convention is the same for all four lines). If the result is zero, the point lies on the line; if it is negative, the point is on one side of the line, and if it is positive, it is on the other side. Hope I helped.
BTW, why are you using the -degree value, which you then multiply by -1? Is it necessary?
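Here is a sketch of that side test in Java, written with cross products so it works for any orientation of ABCD, as long as the corners are given in order (my own code, not taken from the question):
// True if p = {x, y} is inside (or on the edge of) the convex quadrilateral
// whose corners are given in order, e.g. {A, B, C, D}.
static boolean insideQuad(double[] p, double[][] corners) {
    int sign = 0;
    for (int i = 0; i < 4; i++) {
        double[] a = corners[i], b = corners[(i + 1) % 4];
        // Which side of the edge a->b is p on?
        double cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]);
        if (cross == 0) continue;                 // on the edge counts as inside
        int s = cross > 0 ? 1 : -1;
        if (sign == 0) sign = s;
        else if (s != sign) return false;         // p is on different sides of two edges
    }
    return true;
}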
The problem appears to be that the data structure LatLonBox doesn't make any sense as a description for the boundary of a picture. A box in lat-lon coordinates is not a geometric rectangle. (Think about a box near or including the north pole.) You need to re-think your application to deal in a lat/lon coordinate for the center of the picture and then deal with the rotation as an angle with respect to lines of latitude (parallel to the equator). (Even then, a picture with center on the north or south pole will be a degenerate case that must be handled separately.) So a box should properly be something like:
<geobox>
<center_lat>41</center_lat>
<center_lon>-74</center_lon>
<rotation_degrees_ccw>-23</rotation_degrees_ccw>
<width>1000</width> <!-- in pixels or meters, but not in degrees! -->
<height>600</height> <!-- same as above -->
</geobox>
Having said all that, suppose you have a true geometric box centered at (x0,y0), width w, height h, rotated angle T about its center. Then you can test a point P(x,y) for membership in the box with the following. You need the transformation that takes the box to the origin and aligns it with the axes. This is Translate(-x0,-y0) then Rotate(-T). This transformation as a matrix is
[cos(-T)  -sin(-T)  0] [1  0  -x0]   [ cos(T)  sin(T)  -x0*cos(T) - y0*sin(T)]
[sin(-T)   cos(-T)  0] [0  1  -y0] = [-sin(T)  cos(T)   x0*sin(T) - y0*cos(T)]
[  0          0     1] [0  0    1]   [   0       0               1           ]
You want to apply this transformation to the point to be tested and then see if it lies in the desired box:
// Transform the point to be tested.
ct = cos(T);
st = sin(T);
xp = ct * x + st * y - x0 * ct - y0 * st;
yp = -st * x + ct * y + x0 * st - y0 * ct;
// Test for membership in the box.
boolean inside = xp >= -w/2 && xp <= w/2 && yp >= -h/2 && yp <= h/2;
It's late and I haven't checked this arithmetic, but it's close. Say if it doesn't work.
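If it's useful, the same test as a compilable Java method (my own wrapper; T is the rotation in radians, and the box is centered at (x0, y0) with width w and height h):
static boolean insideRotatedBox(double x, double y,
                                double x0, double y0, double w, double h, double T) {
    double ct = Math.cos(T), st = Math.sin(T);
    // Translate the point to the box center, then rotate by -T so the box is axis-aligned.
    double xp =  ct * (x - x0) + st * (y - y0);
    double yp = -st * (x - x0) + ct * (y - y0);
    return xp >= -w / 2 && xp <= w / 2 && yp >= -h / 2 && yp <= h / 2;
}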