Why is my headingValue so different from bearingValue while looking at targetLocation? - java

In my Android application I have to use my current heading (from the accelerometer and magnetometer) and the current bearing to targetLocation (location.bearingTo(targetLocation)).
I already know that the heading derived from the accelerometer and magnetometer is measured from magnetic north, while the bearing is measured from geographic north. So I figured out that I have to add a value called declination, which depends on my current location, to headingValue.
For example, I pick a certain GPS point from Google Maps and add it as the location point in the application. I start the application, move the device in the air in an infinity-sign (figure-eight) pattern before measuring, then hold the device in front of me, pointed in the target direction. I notice that heading != bearing. Can anyone explain the error to me? Assume that I tried different distances between 50 and 3 meters and that my device is calibrated correctly. Below are the important methods of the source code:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = event.values.clone();
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = event.values.clone();
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            double tempAzimuth = Math.toDegrees(orientation[0]); // orientation contains: azimuth, pitch and roll
            if (tempAzimuth < 0) {
                currentHeading = tempAzimuth + 360; // normalize to 0..360
            } else {
                currentHeading = tempAzimuth;
            }
            TVheadingMag.setText("" + String.format("%.2f", currentHeading) + "°");
        }
    }
}
@Override
public void onLocationChanged(Location location) {
    // Declination: http://stackoverflow.com/questions/4308262/calculate-compass-bearing-heading-to-location-in-android
    geoField = new GeomagneticField(
            Double.valueOf(location.getLatitude()).floatValue(),
            Double.valueOf(location.getLongitude()).floatValue(),
            Double.valueOf(location.getAltitude()).floatValue(),
            System.currentTimeMillis());
    if (location.bearingTo(this.target) < 0) {
        currentBearing = location.bearingTo(this.target) + 360; // normalize to 0..360
    } else {
        currentBearing = location.bearingTo(this.target);
    }
    headingWithDeclination = currentHeading;
    headingWithDeclination += geoField.getDeclination();
    currentDistance = location.distanceTo(this.target);
    TVheading.setText("" + String.format("%.2f", headingWithDeclination) + "°");
    TVheadingMag.setText("" + String.format("%.2f", currentHeading) + "°");
    TVbearing.setText("" + String.format("%.2f", currentBearing) + "°");
    TVgps.setText("" + String.format("%.6f", location.getLatitude()) + " " + String.format("%.6f", location.getLongitude()));
}
UPDATE
Picture: https://pl.vc/1r6ap
The orange marked position is targetLocation.
Both positions are heading toward targetLocation.
Can you agree that these results are displayed quite correctly?
While creating this picture, I noticed that the two white marks do not match the positions I was actually standing at. It seems that bad GPS data is the reason for the problem, isn't it?

Heading is the direction you are looking, e.g. the direction in which a tank would shoot, while bearing is the direction the vehicle moves. That should answer why bearing is not heading.
They have different names and meanings, and they are calculated differently, so they cannot be expected to deliver the same value.
More details
You can move north (bearing = north) but look NE (heading).
GPS delivers bearing (or course over ground), the direction the vehicle moves (although some APIs wrongly call it heading).
The compass (= magnetometer) delivers the direction in which you hold the device (heading).
When you calculate the bearing between two locations defined as lat/lon coordinates, as you do with location.bearingTo(targetLocation), then this is bearing! It is not heading!
And neither the compass nor the accelerometer will deliver a decent heading value.
Some Android devices have very inaccurate magnetometers (I have seen ±20 degrees compared to ±2 degrees on my iPhone; always use a traditional high-quality compass as a reference).
iOS devices show the heading well within ±2 degrees when well calibrated (you have to calibrate each time before reading the device value, not only when the operating system asks you to calibrate).
GPS, when moving faster than 10 km/h, delivers good bearing results, but not heading.
The magnetometer can be off by a few degrees even when calibrated.
And usually the declination is smaller than that error.
Declination is nearly nothing in Europe (about 3 degrees in the far north of Europe); only a few places have a high declination of more than 6-7° (northern Alaska).
Update regarding the further explanation in your graphic:
You have placed two points at a distance of only 15 m, while GPS is usually not much more accurate than 3-6 m.
So imagine a 6 m offset of the start or destination: a right triangle with a = 6 m and b = 15 m has an angle of atan2(6, 15) ≈ 22°. So you get an offset of roughly 22° purely from location inaccuracy. Still, keep in mind the difference between the heading from the compass and the bearing along the line of sight between two locations.
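To see how large the mismatch actually is, it can help to compare the declination-corrected heading directly against the bearing and normalize the signed difference. A minimal sketch, reusing the currentHeading, currentBearing and geoField fields from the question:

double trueHeading = (currentHeading + geoField.getDeclination() + 360.0) % 360.0;
double diff = trueHeading - currentBearing;   // raw difference, somewhere in -360..360
if (diff > 180.0)  diff -= 360.0;             // normalize to -180..180
if (diff < -180.0) diff += 360.0;
// A few degrees of |diff| is normal compass noise; tens of degrees at 3-50 m range
// usually means the GPS position error dominates, as described above.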

Related

X Y distance from longitude and latitude

I have two sets of longitude and latitude, and I am desperately trying to figure out how many meters point A is displaced from point B, horizontally and vertically.
My goal is to get +/-X and +/-Y values. I already have the shortest distance between the two points via Location.distanceBetween(); I thought I could use this together with Location.bearingTo() to find the values I'm looking for via basic trigonometry.
My thinking was that I could use the bearing as angle A, 90 degrees as angle C, and the length of side C (distanceBetween) to calculate the lengths of side A (x axis) and side B (y axis), but the results were underwhelming to say the least.
// CALCULATE ANGLES
double ANGLE_A;
ANGLE_A = current_Bearing;           // Location.bearingTo()
ANGLE_A = ANGLE_A * Math.PI / 180;   // CONVERT DEGREES TO RADIANS
double ANGLE_C;
ANGLE_C = 90;                        // Always a right angle
ANGLE_C = ANGLE_C * Math.PI / 180;   // CONVERT DEGREES TO RADIANS
double ANGLE_B;
ANGLE_B = 180 - ANGLE_A - ANGLE_C;   // the 3 angles of a triangle must add up to 180; if 2 are known the 3rd can be calculated
ANGLE_B = ANGLE_B * Math.PI / 180;   // CONVERT DEGREES TO RADIANS
// CALCULATE DISTANCES
double SIDE_C = calculatedDistance;  // Location.distanceTo()
double SIDE_A = Math.sin(ANGLE_A) * SIDE_C / Math.sin(ANGLE_C);
double SIDE_B = Math.sin(ANGLE_B) * SIDE_C / Math.sin(ANGLE_C);
What I'm noticing is that my bearing changes very little between the two points regardless of how we move (mind you, I'm testing this at 10-100 m distances); it's always 64.xxxxxxx and only the last few decimals really change.
All the online references I can find always compute the shortest path, and although this awesome site references x and y positions, it always ends up combining them into the shortest distance again.
I would SUPER appreciate any pointers in the right direction!
Since the earth is not flat, your idea with 90-degree angles will not work properly.
What might be better is this.
Let's say your two known points A and B have latitude and longitude latA, longA and latB, longB.
Now you can introduce two additional points C and D with latC = latA, longC = longB, and latD = latB, longD = longA, so that the points A, B, C, D form a rectangle on the earth's surface.
Now you can simply use distanceBetween(A, C) and distanceBetween(A, D) to get the required distances, as sketched below.
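A minimal Java sketch of that idea, assuming a and b are Location objects (the helper points C and D only exist implicitly through the fixed coordinates):

// East-west component (A -> C): keep the latitude of A, use the longitude of B.
float[] eastWest = new float[1];
Location.distanceBetween(a.getLatitude(), a.getLongitude(),
        a.getLatitude(), b.getLongitude(), eastWest);

// North-south component (A -> D): keep the longitude of A, use the latitude of B.
float[] northSouth = new float[1];
Location.distanceBetween(a.getLatitude(), a.getLongitude(),
        b.getLatitude(), a.getLongitude(), northSouth);

double xMeters = eastWest[0];   // horizontal displacement
double yMeters = northSouth[0]; // vertical displacement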
It may be possible to use Location.distanceBetween(), if the following conditions are met:
the points are located far away from the polar regions, and
the distance is short enough (compared to the radius of the Earth).
The approach is very simple: just fix either the longitude or the latitude and vary only the other, then calculate the distance.
Location location1 = new Location("");
Location location2 = new Location("");
location1.setLatitude(37.4184359437);
location1.setLongitude(-122.088038921);
location2.setLatitude(37.3800232707);
location2.setLongitude(-122.073230422);

// Direct distance between the two points.
float[] distance = new float[3];
Location.distanceBetween(
        location1.getLatitude(), location1.getLongitude(),
        location2.getLatitude(), location2.getLongitude(),
        distance
);

double lat_mid = (location1.getLatitude() + location2.getLatitude()) * 0.5;
double long_mid = (location1.getLongitude() + location2.getLongitude()) * 0.5;

// North-south component: vary latitude only, at the mid longitude.
float[] distanceLat = new float[3];
Location.distanceBetween(
        location1.getLatitude(), long_mid,
        location2.getLatitude(), long_mid,
        distanceLat
);

// East-west component: vary longitude only, at the mid latitude.
float[] distanceLong = new float[3];
Location.distanceBetween(
        lat_mid, location1.getLongitude(),
        lat_mid, location2.getLongitude(),
        distanceLong
);

// Pythagorean combination of the two components.
double distance_approx = Math.sqrt(
        Math.pow(distanceLong[0], 2.0) + Math.pow(distanceLat[0], 2.0)
);
Compare distance[0] and distance_approx, and check whether the accuracy meets your requirement.
If your points are close enough, you can easily calculate x-y distances from latitude/longitude once you know that 1 degree of latitude is about 111 km, and one degree of longitude is about 111 km * cos(latitude):
y_dist = abs(a.lat - b.lat) * 111000;
x_dist = abs(a.lon - b.lon) * 111000 * cos(a.lat); // a.lat in radians
For short distances we can safely ignore that the earth is not exactly a sphere; the error is approximately 0.1-0.2% depending on your exact location.
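A Java version of that approximation, assuming a and b are Location objects; note that Math.cos() expects radians, which the pseudo-code above glosses over:

// Equirectangular approximation: fine for short distances away from the poles.
double metersPerDegree = 111000.0; // roughly 111 km per degree of latitude
double yDist = Math.abs(a.getLatitude() - b.getLatitude()) * metersPerDegree;
double xDist = Math.abs(a.getLongitude() - b.getLongitude()) * metersPerDegree
        * Math.cos(Math.toRadians(a.getLatitude())); // longitude degrees shrink by cos(latitude)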
There is no valid answer to this question until you define which projection you are working in.
The azimuth of a "straight" line varies along the route unless you are travelling exactly due south or due north. You can only calculate the angles at each node, or the azimuth at a specific point along the route. The angles at the nodes will not add up to 180° because you are dealing with an ellipsoidal triangle, and solving an ellipsoidal triangle is a multi-step process that, in all honesty, is better left to the libraries out there such as OSGeo.
If you want to fit the geometry to a plane Cartesian system, the Lambert projection is usually used for areas that extend mostly east-west, and the Transverse Mercator for areas that extend further north-south. The entire Earth is mapped in UTM (Universal Transverse Mercator), which will give you Cartesian coordinates anywhere, but in no case will you get perfect Euclidean geometry when dealing with geodetics. For instance, if you go south 10 miles, turn left 90° and go east for 10 miles, then turn left 90° again, you can end up anywhere from 10 miles from your starting point to exactly back where you started, if that point happened to be the North Pole. So you may have a mathematically beautiful bearing on the UTM coordinate plane, but on the ground you cannot turn the same angles the UTM geometry indicates and follow the same path: you will either follow a straight line on the ground and a curved line on the Cartesian plane, or vice versa.
You could compute the distance between two points on the same northing and, separately, on the same easting, and derive a north distance and an east distance. However, in reality the angles of this triangle make sense only on paper, not on the globe. If a plane took off on a bearing calculated from such a triangle, it would arrive on the wrong continent.

Android: Calculate correct Azimuth while compensating for Pitch/Tilt changes (Device not flat)

I'm currently programming an Android camera application (on my Samsung S9 running Android 9) and I am noticing an issue with the output azimuth value when the device is tilted (not flat). The difference in degrees between when the device starts off flat and when it rises to near a 90-degree angle, for example, can be up to 10 or 20 degrees.
I have been scouring various StackOverflow threads, but none seem to have a solution that handles all possible device rotations in combination with the ROTATION_VECTOR sensor. The main reason I went with ROTATION_VECTOR is that, in one of the threads I read, users mentioned it takes potential changes in pitch/tilt into account for azimuth calculations, but this is clearly not the case for my issue here.
I also understand that this sensor requires the device to have a gyroscope, but for my purposes this will always be the case.
The output azimuth value appears to be correct when the device is parallel to the ground (laid flat on the floor), but as soon as it is tilted while kept facing the same direction, the azimuth begins to gradually increase/decrease. This happens for any rotation case, whether the phone is in portrait or landscape mode.
The method below is called inside the onSensorChanged event:
private float[] calculateOrientation(float[] values) {
    float[] rotationMatrix = new float[9];
    float[] remappedMatrix = new float[9];
    float[] orientation = new float[3];

    // Determine the rotation matrix
    SensorManager.getRotationMatrixFromVector(rotationMatrix, values);

    // Remap the coordinates based on the natural device orientation.
    int x_axis = SensorManager.AXIS_X;
    int y_axis = SensorManager.AXIS_Y;
    switch (screenRotation) {
        case (Surface.ROTATION_90):
            x_axis = SensorManager.AXIS_Y;
            y_axis = SensorManager.AXIS_MINUS_X;
            break;
        case (Surface.ROTATION_180):
            y_axis = SensorManager.AXIS_MINUS_Y;
            break;
        case (Surface.ROTATION_270):
            x_axis = SensorManager.AXIS_MINUS_Y;
            y_axis = SensorManager.AXIS_X;
            break;
        default:
            break;
    }
    SensorManager.remapCoordinateSystem(rotationMatrix, x_axis, y_axis, remappedMatrix);

    // Obtain the current, corrected orientation.
    SensorManager.getOrientation(remappedMatrix, orientation);
    angleLowPassFilter.add(orientation[0]);

    // Convert from Radians to Degrees.
    values[0] = (float) Math.toDegrees(orientation[0]);
    values[1] = (float) Math.toDegrees(orientation[1]);
    values[2] = (float) Math.toDegrees(orientation[2]);

    int correctedAzimuth = (int) ((values[0] + 360) % 360);
    TextView textView = findViewById(R.id.textView);
    textView.setText(String.format(Locale.US, "Azimuth: [%s]", correctedAzimuth));
    return values;
}
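One possible direction, not a confirmed fix: the SensorManager.remapCoordinateSystem() documentation describes a remapping for the camera/augmented-reality use case, where the device is held upright and the azimuth should follow the camera's view axis (AXIS_X, AXIS_Z) rather than the flat device Y axis. A sketch under that assumption, reusing rotationMatrix and orientation from the method above:

// Camera use case from the remapCoordinateSystem() docs: Y is mapped onto Z so the
// azimuth follows the back camera's view direction while the device is held upright.
float[] cameraMatrix = new float[9];
SensorManager.remapCoordinateSystem(rotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z, cameraMatrix);
SensorManager.getOrientation(cameraMatrix, orientation);
float cameraAzimuth = (float) ((Math.toDegrees(orientation[0]) + 360.0) % 360.0);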

Optimal Path-finding technique in a non-standard maze

So, my problem is that I'm currently working on a path-finding technique in an open world with tiles of various sizes. The object needs to find an optimal path to a destination inside an infinite world (it is generated on the fly), which is filled with tiles of various sizes (they are not located on a fixed grid; they can have any location and size, and neither has to be an integer). The object has access to the data of all the tiles via an ArrayList. Now, some factors make this problem more difficult:
The object itself has a size and cannot move through tiles. Therefore, it is possible for a path to exist that is too narrow for the object to move through.
The target destination may itself be a moving object.
It is possible for there to be dozens of such objects at the same time, so the algorithm either needs to be light on the system or the path needs to be calculated over a few separate ticks of the program.
I tried implementing maze-solving techniques, but the main problem is that in most mazes, the tiles can only have very specific coordinates (such as whole numbers) and are always the same size.
I also tried rendering the scene as a giant conventional maze where tiles are actually pixels of tiles (so a 20x40 tile becomes a 20x40 block of 1x1 tiles), but I ran into performance issues and this still didn't solve the issue of a path potentially being too narrow for the object to fit through.
EDIT:
Terribly sorry for my poor wording before; that happens when I'm rushing to a solution without fully understanding the question. What I'm using the algorithm for at the moment is for NPC enemies to find their way to the player around obstacles. Here is an example of a scene:
The black circle with an arrow is the player, and the black bush-like blobs are the NPC enemies. This is the current algorithm I'm using for the enemy AI:
void move() { // part of the Enemy class; this function is called once each tick for every enemy
    PVector velocity = new PVector(speed * game.dt, 0); // speed is a pre-set float denoting the enemy's speed, game.dt is the delta time
    velocity.rotate(atan2(game.player.location.y - location.y, game.player.location.x - location.x)); // game.player.location is a PVector of the player's position, location is a PVector of this enemy's position
    boolean init_collision = getTileCollision(); // getTileCollision() returns whether this enemy is colliding with any tile
    location.add(velocity);
    boolean collision = getTileCollision();
    if (!init_collision && collision) { // if the enemy happens to spawn inside a tile, let it move out of it before checking for collision
        location.sub(velocity);
        if (desired_heading != -1) { // desired_heading is the angle, in radians, of the 90-degree direction the enemy wants to move in; set to -1 by default (see my description of this algorithm below)
            velocity = new PVector(speed * game.dt, 0);
            velocity.rotate(desired_heading);
            location.add(velocity);
            if (getTileCollision()) {
                location.sub(velocity);
                velocity = new PVector(speed * game.dt, 0);
                velocity.rotate(current_heading); // current_heading is the angle, in radians, of the 90-degree direction the enemy is currently moving in; set to -1 by default, but it cannot equal -1 if desired_heading is not -1
                location.add(velocity);
                if (getTileCollision()) {
                    location.sub(velocity);
                    desired_heading = -1;
                    current_heading = -1;
                }
            } else {
                desired_heading = -1;
                current_heading = -1;
            }
        } else {
            float original_heading = velocity.heading();
            desired_heading = radians(round(degrees(velocity.heading()) / 90.0) * 90.0); // round to the nearest 90 degrees
            velocity = new PVector(speed * game.dt, 0);
            velocity.rotate(desired_heading);
            location.add(velocity);
            if (getTileCollision()) {
                location.sub(velocity);
            }
            float turn = radians(90);
            while (true) { // if it can't move, try rotating 90 degrees and moving
                velocity.rotate(turn);
                location.add(velocity);
                if (!getTileCollision() && abs(round(degrees(current_heading)) - round(degrees(velocity.heading()))) != 180) {
                    current_heading = velocity.heading();
                    break;
                } else {
                    location.sub(velocity);
                }
            }
        }
    } else {
        desired_heading = -1;
        current_heading = -1;
    }
}
What my terrible code hopes to accomplish is that the enemy first tries to move directly at the player. If it encounters an obstacle, it rounds its angle to the nearest 90 degrees, sets desired_heading to this and tries to move through. If it can't, it rotates another 90 degrees and so forth, always keeping the original rounded angle in mind.
This doesn't work remotely well, since rotating 90 degrees has a 50% chance of going in exactly the wrong direction, so I tried adding
if (abs(original_heading - velocity.heading() + turn) < abs(original_heading - velocity.heading() - turn)) {
    turn = radians(-90);
}
right before the while (true), but that broke the algorithm completely (sometimes the enemy will freeze in deep thought and never move again).
What am I doing terribly wrong? Should I try a different algorithm, or does this one have potential?
I hope this is a better question now...

Animating translation between two fixed points (Libgdx)

I'm making a 2D game in libGDX and I would like to know what the standard way is of moving (translating) between two known points on the screen.
On a button press, I am trying to animate a diagonal movement of a sprite between two points. I know the x and y coordinates of the start and end points. However, I can't figure out the maths that determines where the texture should be, in between, on each call to render. At the moment my algorithm is sort of like:
textureProperty = new TextureProperty();
firstPtX = textureProperty.currentLocationX
firstPtY = textureProperty.currentLocationY
nextPtX = textureProperty.getNextLocationX()
nextPtY = textureProperty.getNextLocationY()

diffX = nextPtX - firstPtX
diffY = nextPtY - firstPtY
deltaX = diffX / speedFactor // Arbitrary, controls the speed of the translation
deltaY = diffY / speedFactor

renderLocX = textureProperty.renderLocX()
renderLocY = textureProperty.renderLocY()

if (textureProperty.getFirstPoint() != textureProperty.getNextPoint()) {
    animating = true
}

if (animating) {
    newLocationX = renderLocX + deltaX
    newLocationY = renderLocY + deltaY
    textureProperty.setRenderPoint(renderLocX, renderLocY)
}

if (textureProperty.getRenderPoint() == textureProperty.getNextPoint()) {
    animating = false
    textureProperty.setFirstPoint(textureProperty.getNextPoint())
}

batch.draw(texture, textureProperty.renderLocX(), textureProperty.renderLocY())
However, I can foresee a few issues with this code.
1) Since pixels are integers, if I divide that number by something that doesn't divide evenly, it will round. 2) As a result of 1), it will miss the target.
Also, when I test the animation, the object moving from point1 misses by a long shot, which suggests something may be wrong with my maths.
Here is what I mean graphically (images not included): desired outcome vs. actual outcome.
Surely this is a standard problem. I welcome any suggestions.
Let's say you have start coordinates X1,Y1 and end coordinates X2,Y2. And let's say you have some variable p which holds the percentage of the path passed so far. So if p == 0 you are at X1,Y1, if p == 100 you are at X2,Y2, and if 0 < p < 100 you are somewhere in between. In that case you can calculate the current coordinates depending on p like this:
X = X1 + ((X2 - X1)*p)/100;
Y = Y1 + ((Y2 - Y1)*p)/100;
So you are not basing the current coordinates on the previous ones; you always calculate them from the start point, the end point, and the percentage of the path passed.
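For what it's worth, libGDX already ships a linear-interpolation helper that does exactly this, with alpha in 0..1 playing the role of p/100. A small sketch; alpha, durationSeconds and the coordinates are illustrative names, not from the original code:

// Advance alpha from 0 (start) to 1 (end) and clamp so the sprite stops exactly on target.
alpha = Math.min(1f, alpha + Gdx.graphics.getDeltaTime() / durationSeconds);
Vector2 current = new Vector2(x1, y1).lerp(new Vector2(x2, y2), alpha);
batch.draw(texture, current.x, current.y);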
First of all you need a Vector2 direction, giving the direction between the 2 points.
This vector should be normalized, so that its length is 1:
Vector2 dir = new Vector2(x2-x1,y2-y1).nor();
Then in the render method you need to move the object, which means you need to change its position. You have the speed (given in distance/second), a normalized vector giving the direction, and the time since the last update.
So the new position can be calculated like this:
position.x += speed * delta * dir.x;
position.y += speed * delta * dir.y;
Now you only need to limit the position to the target position, so that you don't go too far:
boolean stop = false;
if (position.x >= target.x) {
    position.x = target.x;
    stop = true;
}
if (position.y >= target.y) {
    position.y = target.y;
    stop = true;
}
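Note that the >= checks above only stop the movement when travelling in the positive x/y direction; a direction-independent variant (just a sketch, reusing the position, target, dir, speed and delta names from this answer, with position and target as Vector2) compares the remaining distance to this frame's step:

float step = speed * delta;
if (position.dst(target) <= step) { // Vector2.dst() is the straight-line distance
    position.set(target);           // snap exactly onto the target
    stop = true;
} else {
    position.mulAdd(dir, step);     // position += dir * step
}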
Now to the pixel problem:
Do not use pixels! Using pixels will make your game resolution-dependent.
Use the libGDX Viewport and Camera instead.
This allows you to calculate everything in your own world units (for example meters), and libGDX will convert it for you.
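A minimal sketch of such a setup (the 16 x 9 world size is only an example):

// com.badlogic.gdx.graphics.OrthographicCamera, com.badlogic.gdx.utils.viewport.FitViewport
OrthographicCamera camera = new OrthographicCamera();
FitViewport viewport = new FitViewport(16f, 9f, camera); // world measured in world units, not pixels

// in resize(int width, int height):
viewport.update(width, height, true); // true = center the camera on the world

// in render(), before drawing:
batch.setProjectionMatrix(camera.combined);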
I didn't see any big errors, though I did notice a few things. You are comparing two objects using == and !=, but I suggest you use a.equals(b) and !a.equals(b) instead. Secondly, I found that your render coordinates are always set to the same values in textureProperty.setRenderPoint(renderLocX, renderLocY); you are assigning the old values back. Maybe you were supposed to use the newLocation coordinates.
BTW, thanks for your code; I was searching for something like this and found it here.

How to check if user is inside a Circle Google maps v2

I have a circle on my map. Now I want to detect whether the user (or I) is inside the circle.
Circle circle = map.addCircle(new CircleOptions()
        .center(new LatLng(14.635594, 121.032962))
        .radius(55)
        .strokeColor(Color.RED)
);
I have this code:
LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
LocationListener ll = new myLocationListener();
lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, ll);

Location.distanceBetween(pLat, pLong,
        circle.getCenter().latitude, circle.getCenter().longitude, distance);

if (distance[0] > circle.getRadius()) {
    Toast.makeText(getBaseContext(), "Outside", Toast.LENGTH_LONG).show();
} else {
    Toast.makeText(getBaseContext(), "Inside", Toast.LENGTH_LONG).show();
}
And on myLocationListener I have this:
public void onLocationChanged(Location location) {
    // TODO Auto-generated method stub
    pLong = location.getLongitude();
    pLat = location.getLatitude();
}
It works correctly if the parameters inside distanceBetween are the coordinates of a marker; however, the toast displays "Outside" even though my location is inside the radius.
Any ideas how to do this correctly? Please help. Thanks!
EDIT
I discovered something odd.
In the picture, you can see I have a TextView above that shows 5 numbers (circle latitude, circle longitude, distance at index 0, distance at index 1, distance at index 2). distance is a float array that stores the distance between the center of the circle and the user's location. I set the radius to 100, and I think the unit is meters; however, as you can see, the values in the distance array are 1.334880E7, -81.25308990478516, -10696092987060547. What is the formula for the computation of the distance? Also, 1.something times 10 to the 7th is about 13 million, which is much greater than 100. Please help, it's really confusing right now. According to the documentation of Circle ("The radius of the circle, specified in meters. It should be zero or greater.") and distanceBetween ("Computes the approximate distance in meters between two locations"), I don't know why this is the result.
tl;dr? jsFiddle here - look at your console output.
Basically there are two ways to do this:
Check whether the (marker of the) user is inside the circle's bounds.
Compute the distance between the user and the center of the circle, then check whether it is equal to or smaller than the circle's radius. This solution needs the spherical library to work.
Circle Bounds
Just add a circle:
circle = new google.maps.Circle( {
    map : map,
    center : new google.maps.LatLng( 100, 20 ),
    radius : 2000,
    strokeColor : '#FF0099',
    strokeOpacity : 1,
    strokeWeight : 2,
    fillColor : '#009ee0',
    fillOpacity : 0.2
} )
and then check if the marker is inside:
circle.getBounds().contains( new google.maps.LatLng( 101, 21 ) );
At first glance you might think this works, but it doesn't. In the background Google (still) uses a rectangle, so everything inside the rectangular bounding box but outside the circle will be recognized as inside the LatLng bounds. It's wrong and a known problem, but it seems Google doesn't care.
If you now think it would work with rectangular bounds, you're wrong. Those don't work either.
Spherical Distance
The easiest and best way is to measure the distance. Include the spherical geometry library by appending &libraries=geometry to your Google Maps script call. Then go with
google.maps.geometry.spherical.computeDistanceBetween(
    new google.maps.LatLng( 100, 20 ),
    new google.maps.LatLng( 101, 21 )
) <= 2000;
I know this question was asked more than a year ago, but I had the same problem and fixed it using the distanceBetween static function of Location.
float[] distance = new float[2];
Location.distanceBetween(latLng.latitude, latLng.longitude,
        circle.getCenter().latitude, circle.getCenter().longitude, distance);

if (distance[0] <= circle.getRadius()) {
    // Inside the circle
} else {
    // Outside the circle
}
Use GoogleMap.setOnMyLocationChangeListener(OnMyLocationChangeListener) instead of the LocationManager. This way you will get Location objects that match the blue-dot location on the map.
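A sketch of how that listener can be combined with the distance check from the answer above, reusing the map and circle objects from the question (note that this listener has since been deprecated in newer Play Services versions):

map.setMyLocationEnabled(true); // requires the location permission
map.setOnMyLocationChangeListener(new GoogleMap.OnMyLocationChangeListener() {
    @Override
    public void onMyLocationChange(Location location) {
        float[] distance = new float[1];
        Location.distanceBetween(location.getLatitude(), location.getLongitude(),
                circle.getCenter().latitude, circle.getCenter().longitude, distance);
        boolean inside = distance[0] <= circle.getRadius();
        Toast.makeText(getBaseContext(), inside ? "Inside" : "Outside", Toast.LENGTH_SHORT).show();
    }
});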
