I use Hibernate Spatial to attach a geolocation to a car. My Car domain class looks like this:
import com.vividsolutions.jts.geom.Point
class Card {
String name
Point location
}
My program is in Grails, so the samples I give are in Groovy. I found a similar post here which does not really answer the most important question: how to specify the radius correctly so that it covers n kilometers.
Here is how I compute a circle Geometry:
private static Geometry createCircle(double x, double y, final double RADIUS) {
GeometricShapeFactory shapeFactory = new GeometricShapeFactory();
shapeFactory.setNumPoints(1000);
shapeFactory.setCentre(new Coordinate(x, y))
shapeFactory.setSize( (RADIUS * 2)/88.1)
return shapeFactory.createCircle().getBoundary()
}
The size of the circle is divided by 88.1, which is just a dirty fix to get an approximate dimension, but it is still wrong.
My query is done like this:
double radius = 40
Geometry filter = createCircle(car.location.x, car.location.y, radius)
Session session = sessionFactory.currentSession
Query q = session.createQuery("select c from Car c where within(c.location, ?) = true")
q.setParameter(0, filter, GeometryUserType.TYPE)
q.list()
This is not very accurate. Some of the cars which should be outside the circle are returned by this query.
Here is an example. My center is Hamburg and the radius is 40 km. I made a Google Maps visualization.
Here is when I set radius = 40:
You can see that at the top left one car whose location is outside of the circle is still drawn. This should not be the case. It appears to me that the circle I draw with Google Maps is not equal to the circle Geometry I build in code for my query.
Here is when I set radius = 30:
You see that the cars at the bottom right disappear, which is correct, but the car at the top left is still returned by the query.
When I draw the circle I created with createCircle I get the following (using getCoordinates() to get the coordinates of the circle):
How can I query all cars within a radius of 40km?
Try computing the distance between the geometries instead. You can use the dwithin function (see the Hibernate Spatial documentation):
select c from Car c where dwithin(c.location, :geom, :dist) = true
or just distance:
select c from Car c where distance(c.location, :geom) < :dist
Also, don't forget to transform the distance measure to degrees (for the WGS84 SRS).
edited:
Use the centre of your imaginary circle as geom, so you don't have to generate a circle geometry at all; just filter by distance, as in the sketch below.
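A minimal sketch of what that could look like with the code from the question (assuming the centre of the search is the car's own location; one degree of latitude is roughly 111.32 km, and for longitude this shrinks with cos(latitude), so the degree-based radius is only approximate):
double radiusKm = 40;
double radiusInDegrees = radiusKm / 111.32; // rough km-to-degree conversion
Query q = session.createQuery(
        "select c from Car c where dwithin(c.location, :center, :dist) = true");
q.setParameter("center", car.location, GeometryUserType.TYPE);
q.setParameter("dist", radiusInDegrees);
List cars = q.list();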
P.S.: In this post you can find a description of the problem of converting between degrees and meters.
Good luck!
Chema.
Related
I have two sets of longitude and latitude, and I am desperately trying to figure out how many meters point A is displaced from point B, horizontally and vertically.
My goal is to have +/-X and +/-Y values. I already have the shortest distance between the two points via Location.distanceBetween(), and I thought I could use this together with Location.bearingTo() to find the values I'm looking for via basic trigonometry.
My thinking was that I could use the bearing as angle A, 90 degrees as angle C, and the length of side C (distanceBetween) to calculate the lengths of side A (x axis) and side B (y axis), but the results were underwhelming, to say the least.
//CALCULATE ANGLES
double ANGLE_A;
ANGLE_A = current_Bearing; //Location.bearingTo()
ANGLE_A = ANGLE_A*Math.PI/180; //CONVERT DEGREES TO RADIANS
double ANGLE_C;
ANGLE_C = 90; // Always Right Angle
ANGLE_C = ANGLE_C*Math.PI/180; //CONVERT DEGREES TO RADIANS
double ANGLE_B;
ANGLE_B = 180 - ANGLE_A - ANGLE_C; // 3 sides of triangle must add up to 180, if 2 sides known 3rd can be calced
ANGLE_B = ANGLE_B*Math.PI/180; //CONVERT DEGREES TO RADIANS
//CALCULATE DISTANCES
double SIDE_C = calculatedDistance; //Location.distanceTo()
double SIDE_A = Math.sin(ANGLE_A) * SIDE_C /Math.sin(ANGLE_C);
double SIDE_B = Math.sin(ANGLE_B)*SIDE_C/Math.sin(ANGLE_C);
What I'm noticing is that my bearing changes very little between the two points regardless of how we move, though mind you I'm testing this at 10 - 100 m distances; it is always 64.xxxxxxx and only the last few decimals really change.
All the online references I can find only compute the shortest path, and although this awesome site references x and y positions, it always ends up combining them into the shortest distance again.
Would SUPER appreciate any pointers in the right direction!
Since the earth is not flat, your idea with 90 degree angles will not work properly.
What might be better, is this.
Lets say your 2 known points A and B have latitude and longitude latA, longA and latB, longB.
Now you could introduce two additional points C and D with latC = latA, longC = longB, and latD = latB, longD = longA, so the points A, B, C, D form a rectangle on the earth's surface.
Now you can simply use distanceBetween(A, C) and distanceBetween(A, D) to get the required distances.
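A minimal Android sketch of that idea, assuming latA/longA and latB/longB are already known (variable names are illustrative):
float[] eastWest = new float[1];
float[] northSouth = new float[1];
// A -> C: same latitude, different longitude => east-west component
Location.distanceBetween(latA, longA, latA, longB, eastWest);
// A -> D: same longitude, different latitude => north-south component
Location.distanceBetween(latA, longA, latB, longA, northSouth);
// attach signs so the result reads as +/-X (east positive) and +/-Y (north positive)
float xMeters = (longB >= longA ? 1 : -1) * eastWest[0];
float yMeters = (latB >= latA ? 1 : -1) * northSouth[0];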
It may be possible to utilize Location.distanceBetween(), if the following conditions are met:
the points are located far from the polar regions, and
the distance is short enough (compared to the radius of the Earth).
The approach is very simple: just fix either the longitude or the latitude and vary only the other, then calculate the distance.
Location location1 = new Location("");
Location location2 = new Location("");
location1.setLatitude(37.4184359437);
location1.setLongitude(-122.088038921);
location2.setLatitude(37.3800232707);
location2.setLongitude(-122.073230422);
float[] distance = new float[3];
Location.distanceBetween(
location1.getLatitude(), location1.getLongitude(),
location2.getLatitude(), location2.getLongitude(),
distance
);
double lat_mid = (location1.getLatitude() + location2.getLatitude()) * 0.5;
double long_mid = (location1.getLongitude() + location2.getLongitude()) * 0.5;
float[] distanceLat = new float[3];
Location.distanceBetween(
location1.getLatitude(), long_mid,
location2.getLatitude(), long_mid,
distanceLat
);
float[] distanceLong = new float[3];
Location.distanceBetween(
lat_mid, location1.getLongitude(),
lat_mid, location2.getLongitude(),
distanceLong
);
double distance_approx = Math.sqrt(
Math.pow(distanceLong[0], 2.0) + Math.pow(distanceLat[0], 2.0)
);
Compare distance[0] and distance_approx, and check whether the accuracy meets your requirement.
If your points are close enough, you can easily calculate x-y distances from latitude/longitude once you know that 1 degree of latitude is about 111 km, and one degree of longitude is about 111 km * cos(latitude):
y_dist = abs(a.lat - b.lat) * 111000;
x_dist = abs(a.lon - b.lon) * 111000 * cos(a.lat);
For short distances we can easily ignore that the earth is not exactly a sphere; the error is approximately 0.1-0.2%, depending on your exact location.
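In Java that back-of-the-envelope calculation could look roughly like this (note that Math.cos expects radians, so the latitude has to be converted first; the 111,000 m per degree figure is itself only an approximation, and aLat/aLon/bLat/bLon are illustrative names):
// approximate planar offsets for nearby points
double metersPerDegree = 111000.0; // rough length of one degree of latitude
double yDist = Math.abs(aLat - bLat) * metersPerDegree;
double xDist = Math.abs(aLon - bLon) * metersPerDegree * Math.cos(Math.toRadians(aLat));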
There is no valid answer to this question until you define which projection you are working in.
The azimuth of a "straight" line varies along the route unless you are travelling exactly due south or due north. You can only calculate the angles at each node, or azimuth at a specific point along the route. Angles at the nodes will not add up to 180° because you're referring to an ellipsoidal triangle, and calculating an ellipsoidal triangle is a multiple-step process that in all honesty, is better left to the libraries out there such as OSGEO.
If you want to fit the geometry to a plane Cartesian system, the Lambert projection is usually used for areas that extend mostly east-west, and the Transverse Mercator for areas that extend further north-south. The entire Earth is mapped in the UTM (Universal Transverse Mercator), which will give you Cartesian coordinates anywhere, but in no case will you get perfect Euclidean geometry when dealing with geodetics. For instance, if you go south 10 miles, turn left 90°, go east for 10 miles, and turn left 90° again, you can be anywhere from 10 miles from your starting point to exactly back where you started, if that point happened to be the North Pole. So you may have a mathematically beautiful bearing on the UTM coordinate plane, but on the ground you cannot turn the same angles as the UTM geometry indicates and follow that same path; you will either follow a straight line on the ground and a curved line on a Cartesian plane, or vice versa.
You could compute the distance between two points on the same northing and, separately, on the same easting, and derive a north distance and an east distance. However, in reality the angles of this triangle make sense only on paper, not on the globe. If a plane took off at the bearing calculated from such a triangle, it would arrive in the wrong continent.
EDIT: I found out that all the pixels were upside down because of the difference between screen and world coordinates, so that is no longer a problem.
EDIT: After following a suggestion from @TheVee (using absolute values), my image got much better, but I'm still seeing issues with color.
I'm having a little trouble with ray-tracing triangles. This is a follow-up to my previous question about the same topic. The answers to that question made me realize that I needed to take a different approach. The new approach I took worked much better, but I'm seeing a couple of issues with my raytracer now:
There is one triangle that never renders in color (it is always black, even though its color is supposed to be yellow).
Here is what I am expecting to see:
But here is what I am actually seeing:
To debug the first problem: even if I remove all other objects (including the blue triangle), the yellow triangle is always rendered black, so I don't believe it is an issue with the shadow rays I am sending out. I suspect it has to do with the angle of the triangle/plane relative to the camera.
Here is my process for ray-tracing triangles, which is based on the process described on this website.
Determine if the ray intersects the plane.
If it does, determine if the ray intersects inside of the triangle (using parametric coordinates).
Here is the code for determining if the ray hits the plane:
private Vector getPlaneIntersectionVector(Ray ray)
{
double epsilon = 0.00000001;
Vector w0 = ray.getOrigin().subtract(getB());
double numerator = -(getPlaneNormal().dotProduct(w0));
double denominator = getPlaneNormal().dotProduct(ray.getDirection());
//ray is parallel to triangle plane
if (Math.abs(denominator) < epsilon)
{
//ray lies in triangle plane
if (numerator == 0)
{
return null;
}
//ray is disjoint from plane
else
{
return null;
}
}
double intersectionDistance = numerator / denominator;
//intersectionDistance < 0 means the "intersection" is behind the ray (pointing away from plane), so not a real intersection
return (intersectionDistance >= 0) ? ray.getLocationWithMagnitude(intersectionDistance) : null;
}
And once I have determined that the ray intersects the plane, here is the code to determine if the ray is inside the triangle:
private boolean isIntersectionVectorInsideTriangle(Vector planeIntersectionVector)
{
//Get edges of triangle
Vector u = getU();
Vector v = getV();
//Pre-compute unique five dot-products
double uu = u.dotProduct(u);
double uv = u.dotProduct(v);
double vv = v.dotProduct(v);
Vector w = planeIntersectionVector.subtract(getB());
double wu = w.dotProduct(u);
double wv = w.dotProduct(v);
double denominator = (uv * uv) - (uu * vv);
//get and test parametric coordinates
double s = ((uv * wv) - (vv * wu)) / denominator;
if (s < 0 || s > 1)
{
return false;
}
double t = ((uv * wu) - (uu * wv)) / denominator;
if (t < 0 || (s + t) > 1)
{
return false;
}
return true;
}
I think that I am having some issue with my coloring, and I think it has to do with the normals of the various triangles. Here is the equation I am considering when building my lighting model for spheres and triangles (it also appears as a comment in the code below): c = cr*ca + cr*cl*max(0, n · l) + cl*cp*max(0, e · r)^p
Now, here is the code that does this:
public Color calculateIlluminationModel(Vector normal, boolean isInShadow, Scene scene, Ray ray, Vector intersectionPoint)
{
//c = cr * ca + cr * cl * max(0, n . l) + cl * cp * max(0, e . r)^p
Vector lightSourceColor = getColorVector(scene.getLightColor()); //cl
Vector diffuseReflectanceColor = getColorVector(getMaterialColor()); //cr
Vector ambientColor = getColorVector(scene.getAmbientLightColor()); //ca
Vector specularHighlightColor = getColorVector(getSpecularHighlight()); //cp
Vector directionToLight = scene.getDirectionToLight().normalize(); //l
double angleBetweenLightAndNormal = directionToLight.dotProduct(normal);
Vector reflectionVector = normal.multiply(2).multiply(angleBetweenLightAndNormal).subtract(directionToLight).normalize(); //r
double visibilityTerm = isInShadow ? 0 : 1;
Vector ambientTerm = diffuseReflectanceColor.multiply(ambientColor);
double lambertianComponent = Math.max(0, angleBetweenLightAndNormal);
Vector diffuseTerm = diffuseReflectanceColor.multiply(lightSourceColor).multiply(lambertianComponent).multiply(visibilityTerm);
double angleBetweenEyeAndReflection = scene.getLookFrom().dotProduct(reflectionVector);
angleBetweenEyeAndReflection = Math.max(0, angleBetweenEyeAndReflection);
double phongComponent = Math.pow(angleBetweenEyeAndReflection, getPhongConstant());
Vector phongTerm = lightSourceColor.multiply(specularHighlightColor).multiply(phongComponent).multiply(visibilityTerm);
return getVectorColor(ambientTerm.add(diffuseTerm).add(phongTerm));
}
I am seeing that the dot product between the normal and the light source is -1 for the yellow triangle, and about -0.707 for the blue triangle, so I'm not sure if the normal pointing the wrong way is the problem. Regardless, when I made sure the angle between the light and the normal was positive (Math.abs(directionToLight.dotProduct(normal));), it caused the opposite problem:
I suspect that it will be a small typo/bug, but I need another pair of eyes to spot what I couldn't.
Note: My triangles have vertices (a, b, c), and the edges (u, v) are computed using a-b and c-b respectively (those are also used for calculating the plane/triangle normal). A Vector is made up of an (x, y, z) point, and a Ray is made up of an origin Vector and a normalized direction Vector.
Here is how I am calculating normals for all triangles:
private Vector getPlaneNormal()
{
Vector v1 = getU();
Vector v2 = getV();
return v1.crossProduct(v2).normalize();
}
Please let me know if I left out anything that you think is important for solving these issues.
EDIT: After help from @TheVee, this is what I have at the end:
There are still problems with z-buffering and with Phong highlights on the triangles, but the problem I was trying to solve here is fixed.
It is a common problem in ray tracing of scenes that include planar objects that we hit them from the wrong side. The formulas containing the dot product are presented with an inherent assumption that light is incident on the object from the direction to which the outer-facing normal points. This can be true for only half the possible orientations of your triangle, and you've been unlucky enough to orient it with its normal facing away from the light.
Technically speaking, in the physical world your triangle would not have zero volume. It is composed of some layer of material that is just very thin. On either side it has a proper normal that points outwards. Assigning a single normal is a simplification that's fair to take because the two only differ in sign.
However, if we make a simplification we need to account for it. Having what is technically an inward-facing normal in our formulas gives negative dot products, a case they are not made for. It is as if light were coming from the inside of the object, or hitting a surface that could not possibly be in its way. That's why they give an erroneous result. The negative value will subtract light from other sources, and depending on the magnitude and the implementation it may result in darkening, full black, or numerical underflow.
But because we know the correct normal is either the one we're using or its negative, we can fix all these cases at once by taking a preventive absolute value wherever a positive dot product is implicitly assumed (in your code, that's angleBetweenLightAndNormal). Some libraries like OpenGL do that for you, and on top of it use the additional information (the sign) to choose between two different materials (front and back) that you may provide if desired. Alternatively, they can be set not to draw the back faces at all, because in solid objects they will be overdrawn by front faces anyway (known as face culling), saving about half of the numerical work.
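In the poster's calculateIlluminationModel that could look something like the sketch below, flipping the normal when the dot product is negative so that the reflection vector used by the specular term stays consistent with the corrected normal (a sketch only, relying on the Vector.multiply and dotProduct methods already shown in the question):
// Treat the triangle as two-sided: if the normal faces away from the light,
// flip it so n . l is non-negative and the back side is lit like the front.
double angleBetweenLightAndNormal = directionToLight.dotProduct(normal);
if (angleBetweenLightAndNormal < 0)
{
    normal = normal.multiply(-1);
    angleBetweenLightAndNormal = -angleBetweenLightAndNormal;
}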
I have set the minimum and maximum longitude and latitude values of a specific static map image. That map image is a cutout of some country.
/**
* Maximum longitude value of the map
*/
private float mapLongitudeMax;
/**
* Minimum longitude value of the map
*/
private float mapLongitudeMin;
/**
* Maximum latitude value of the map
*/
private float mapLatitudeMax;
/**
* Minimum latitude value of the map
*/
private float mapLatitudeMin;
And I have a BufferedImage called mapImage.
I have a method that I wrote with a friend that receives longitude and latitude and gives you an X and a Y position approximately on the map so you can draw something on the map.
Now if I want to move my mouse around the map, I want it to show longitude/latitude of my mouse position, that means I need to create a method which converts X and Y of the mouse position to longitude and latitude, which should do the opposite of my other method.
This is my method to convert globe coordinates to image X and Y:
protected Location getCoordinatesByGlobe(float latitude, float longitude) {
/**
* Work out minimum and maximums, clamp inside map bounds
*/
latitude = Math.max(mapLatitudeMin, Math.min(mapLatitudeMax, latitude));
longitude = Math.max(mapLongitudeMin, Math.min(mapLongitudeMax, longitude));
/**
* We need the distance from 0 or minimum long/lat
*/
float adjLon = longitude - mapLongitudeMin;
float adjLat = latitude - mapLatitudeMin;
float mapLongWidth = mapLongitudeMax - mapLongitudeMin;
float mapLatHeight = mapLatitudeMax - mapLatitudeMin;
float mapWidth = mapImage.getWidth();
float mapHeight = mapImage.getHeight();
float longPixelRatio = mapWidth / mapLongWidth;
float latPixelRatio = mapHeight / mapLatHeight;
int x = Math.round(adjLon * longPixelRatio) - 3; // small offset for the target icon that is drawn
int y = Math.round(adjLat * latPixelRatio) + 3;
// flip the y axis (the image origin is at the top left, while latitude grows upwards)
y = (int) (mapHeight - y);
return new Location(x, y);
}
Now I tried thinking about it, and the first thought that came into my head was just doing the same in reverse... so I started doing it and ran into problems: I can't get the value of adjLon or adjLat without having the longitude or latitude, so this can't be done by simply reversing each line. I am new to coordinate systems, so it's all a bit confusing for me, but I am starting to catch up.
Any tips for me?
EDIT (Is it not possible?)
According to this answer, you can't really get exact results because the earth is not flat; it can't simply be mapped to a flat image with longitude and latitude without implementing a real mathematical projection to account for the distortion.
There are a few reasons in my code why the answer cannot be exact:
Because of the reason above
Because my X,Y values are integers and not floats.
So my question now is: is it really impossible with my method?
Sadly, there's not an easy answer to this. While you can write the projection routines yourself, the easiest thing to do is probably to get a GIS library, but since I ended up doing this in C# and not Java, I don't know what's available.
The biggest piece of information you need is exactly which projection your map image uses. The Mercator projection is quite popular, but it's not the only one. You also need to make sure that your chosen projection works for the range of latitudes and longitudes you want. The Mercator projection breaks down above roughly ±70° latitude, so if you're dealing with a lot of positions near the poles it might not be the projection for you.
From what I read in your code, your image is in longitude/latitude coordinates, and you draw it on a canvas to be displayed on screen. Then you add your listener to this image, is that correct?
If this is correct, the response is trivial: you can retrieve the X/Y position of the mouse inside the canvas via the MouseListener methods on the Canvas (getX/getY from the MouseEvent) and, using the current dimensions of the canvas, translate this position into the longitude/latitude bounds:
longitude = minLongitude + (MouseEvent.getX / Canvas.width) * (maxLongitude - minLongitude)
latitude = minLatitude + (MouseEvent.getY / Canvas.height) * (maxLatitude - minLatitude)
If not, then you will have to know, as @ginkner says, the projection technique used to go from long/lat to X/Y and apply the inverse transformation, knowing that you will lose some information.
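For completeness, a sketch of an inverse of the poster's getCoordinatesByGlobe, assuming the same simple linear mapping, the same icon offsets, and the same y flip (GlobePosition is a hypothetical holder class for the result, not part of the original code):
protected GlobePosition getGlobeByCoordinates(int x, int y) {
    float mapWidth = mapImage.getWidth();
    float mapHeight = mapImage.getHeight();
    float mapLongWidth = mapLongitudeMax - mapLongitudeMin;
    float mapLatHeight = mapLatitudeMax - mapLatitudeMin;
    // undo the icon offsets and the y flip applied in getCoordinatesByGlobe
    float adjX = x + 3;
    float adjY = mapHeight - y - 3;
    float longitude = mapLongitudeMin + (adjX / mapWidth) * mapLongWidth;
    float latitude = mapLatitudeMin + (adjY / mapHeight) * mapLatHeight;
    return new GlobePosition(latitude, longitude); // hypothetical result holder
}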
There is a difference between geographic coordinates and geometry coordinates; it is rather like the difference between the 3D earth surface and a canvas to draw on. The Web Mercator projection and other popular projected coordinate systems are abstractions used to visualize the earth's surface, so the same pixel shift at different locations corresponds to different ground distances.
If you are looking for a basic GIS Java library to handle this type of problem, GeoTools could be one option.
Assume I have several bounding boxes, each defined by 4 coordinate pairs (long/lat only) representing the 4 corners of a square box. How can I check whether 2 of those boxes intersect?
I know I could use java.awt.Rectangle to check if 2 rectangles intersect, but the problem is that it is defined using X/Y/width/height instead of coordinates.
Can someone please give me some directions on how I can do these calculations?
Thanks.
EDIT
What I am trying to accomplish is the same as what this library does.
Basically it calculates a square bounding box around a given point and checks whether the (imaginary) squares intersect with each other, like in this image:
So far I've been able to calculate the corners for each marker, and now I need to somehow check whether they intersect with each other. How can I do this intersection calculation?
EDIT 2
This is how I am calculating the corners:
private static double getLatitude(double distance, double lat, double angle) {
return toDegrees(asin(sin(toRadians(lat)) * cos(distance / RADIUS) + cos(toRadians(lat)) * sin(distance / RADIUS) * cos(toRadians(angle))));
}
private static double getLongitude(double distance, double lat, double lng, double angle) {
double newLat = getLatitude(distance, lat, angle);
return toDegrees(toRadians(lng) + atan2(sin(toRadians(angle)) * sin(distance / RADIUS) * cos(toRadians(lat)), cos(distance / RADIUS) - sin(toRadians(lat)) * sin(toRadians(newLat))));
}
Where RADIUS = 6378.1 and angle = 45/135/225/315 (top right, bottom right, bottom left and top left).
Example output
I'm assuming that in your "lat/long bounding box" each side follows a line of constant longitude or latitude - in other words, the top side follows a line of constant latitude and the left side a line of constant longitude.
While this is not actually a rectangle in real life, it can be treated as one for our purposes. Mathematically you can think of this as transforming the bounding box into "lat/long" space, where the shape is in fact a rectangle. If that doesn't make sense you may have to take my word for it. In any case it is possible to show that the curved shapes in real space intersect if and only if the corresponding rectangles intersect in lat/long space.
The short version of this is: if you do a standard test for intersection of rectangles (using the Java Rectangle class code, and using latitude and longitude as the rectangle bounds) you will get the right result.
EXAMPLE
You have two areas, defined as:
The area between 50 and 52 degrees N and 75 and 77 degrees E
The area between 51 and 53 degrees N and 76 and 79 degrees E
You can correctly test for their intersection by doing:
Rectangle r1 = new Rectangle(75, 50, 2, 2); // x = min longitude, y = min latitude, then width and height in degrees
Rectangle r2 = new Rectangle(76, 51, 3, 2);
boolean intersects = r1.intersects(r2);
It doesn't matter that the rectangles are not rectangular in Euclidean space.
P.S. This will not work if one of your rectangles actually contains the north or south pole. In that case you will need to split each rectangle into two, one on each side of the pole. You should also normalize everything to ±90 latitude and ±180 longitude, and you will need to do something clever if one or more of the rectangles crosses the ±180 longitude line.
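Since real bounding boxes rarely fall on whole degrees, the same test also works with java.awt.geom.Rectangle2D.Double, which accepts fractional bounds (the numbers below are purely illustrative):
Rectangle2D.Double r1 = new Rectangle2D.Double(75.25, 50.10, 1.80, 2.05); // minLng, minLat, width, height in degrees
Rectangle2D.Double r2 = new Rectangle2D.Double(76.40, 51.00, 2.60, 1.90);
boolean intersects = r1.intersects(r2);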
I have a huge set of points already loaded in a plane. I need to draw a circle/ellipse starting from a given point and a radius distance in meters, and then check which points are inside the circle.
I've already done this with a polygon using the within() method, but I can't find a way to draw a circle/ellipse without having to specify every point around the polygon.
Is there a way to do this in JTS, or do I need another Java library?
If I understood correctly you have the radius and the center, so you can draw a circle with JTS like this:
public static Geometry createCircle(double x, double y, final double RADIUS) {
GeometricShapeFactory shapeFactory = new GeometricShapeFactory();
shapeFactory.setNumPoints(32);
shapeFactory.setCentre(new Coordinate(x, y));
shapeFactory.setSize(RADIUS * 2);
return shapeFactory.createCircle();
}
You can just verify that the distance from the point is less than the radius. No need to draw the circle to know which points are inside it. For faster run times, compare the square of the distance with the square of the radius; this saves unnecessary square root operations.
For ellipses, the problem is only slightly harder, involving a quadratic form x^2 + k y^2.
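A small sketch of that check against JTS points (centerX, centerY and radius are illustrative names; for an axis-aligned ellipse with semi-axes a and b the same idea becomes (dx/a)^2 + (dy/b)^2 <= 1):
// point-in-circle test without building any circle geometry
double dx = point.getX() - centerX;
double dy = point.getY() - centerY;
boolean inside = (dx * dx) + (dy * dy) <= radius * radius;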
You can simply buffer the circle center with a positive value like so:
Point centerPoint = ...;
Polygon circle = (Polygon) centerPoint.buffer(0.1);
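Note that the buffer distance is in the units of the geometry's coordinate system (degrees for plain WGS84 lat/long). The resulting polygon can then be used directly for the containment check, for example (points and radiusInMapUnits are illustrative names):
// keep only the points that fall inside the buffered circle
Polygon circle = (Polygon) centerPoint.buffer(radiusInMapUnits);
for (Point p : points) {
    if (circle.contains(p)) {
        // p lies inside the circle
    }
}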