I'm working in Eclipse, using the Libgdx game framework to develop an Android application, where I'm attempting to move a 2D cursor based on the general movements of the Android device. As I've read from countless sources already, this is a much harder task than it sounds, due to the wide margin of error that appears during calculation.
What I've done so far is pretty primitive: read Android's composite "linear acceleration" sensor, use the acceleration values to calculate velocity, use the velocity to calculate displacement, add that onto my cursor's x,y coordinates, and lastly make sure the cursor does not exceed the screen's dimensions. It probably doesn't need a code example, but here's a watered-down version of the core code:
// class attributes
Vector2 accel, prevAccel;
Vector2 velocity;
Vector2 cursorPos;
IMyLinearAccelerationSensor las;

// update code
accel = new Vector2(las.getAccelXY()); // returns a Vector2 with the x and y acceleration values
// integrate acceleration into velocity (trapezoidal rule)
velocity.x += (accel.x + prevAccel.x) / 2 * deltaTime;
velocity.y += (accel.y + prevAccel.y) / 2 * deltaTime;
// integrate velocity into position
cursorPos.x += velocity.x * deltaTime;
cursorPos.y += velocity.y * deltaTime;
prevAccel = accel; // update the "previous acceleration"
// clamp the cursor to the screen
if (cursorPos.x < 0)
    cursorPos.x = 0;
if (cursorPos.y < 0)
    cursorPos.y = 0; // etc.
The main issues I'm encountering are drift, and the fact that the sensor appears to be more sensitive to movements in the positive x and y directions (I think). In other words, the cursor accelerates/drifts off, gets stuck in the top-left corner of the screen, and the respective x and y velocities never really fall back to 0. Even when moving the device in the opposite direction, the velocity only decreases slightly.
Given all of this, I just wanted to ask some questions and ask for some advice. Firstly, is this normal behavior, or are there glaring problems with my code?
If not, are there any steps that can be taken with my current setup, to resolve these problems with drift, and remove some margin of error? e.g. applying different filters to remove noise, applying a resistive force to the velocity, etc.
I don't need the movement of the cursor to follow the device's movements too accurately; so long as it moves when the device moves, and stops generally when the device stops, I don't mind if there's some small drifting, or lag.
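To make the second question concrete, something along these lines is roughly what I had in mind for those two ideas (this is just a sketch; DEAD_ZONE and DAMPING are placeholder values I'd still have to tune):

// Rough idea only: ignore tiny readings (dead zone) and bleed off velocity
// every frame (damping) so it decays back towards 0 instead of drifting.
static final float DEAD_ZONE = 0.15f; // m/s^2, readings below this count as noise
static final float DAMPING = 0.9f;    // per-frame velocity retention factor

if (Math.abs(accel.x) < DEAD_ZONE) accel.x = 0;
if (Math.abs(accel.y) < DEAD_ZONE) accel.y = 0;

velocity.x += (accel.x + prevAccel.x) / 2 * deltaTime;
velocity.y += (accel.y + prevAccel.y) / 2 * deltaTime;

// resistive force: pull the velocity back towards zero each update
velocity.scl(DAMPING);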
At some point I was considering implementing more interfaces to get the raw data from the Android hardware sensors (compass, gyroscope, and accelerometer) to generate more accurate results, but even if I do go ahead with this, would it improve things much? Knowing that this is a complex problem, I'd rather not spend too much time on it without good reason. (Sensor fusion from scratch definitely seems hard...)
Lastly, I'm sorry about the lengthy post, and also sorry if this is considered a repeat question. I've done research and have seen plenty of other people with similar issues, but I'm not quite sure how their solutions might be applied to my own problem. Given my limited knowledge/experience, I'd appreciate any helpful insight that can be offered.
Related
In Java, I'm writing a mobile app for Android to interact with some dynamic balls using some classes I wrote myself. Gravity is determined by the tilt of the phone.
I noticed when I have a bunch of balls bunched up in a corner that some of them will begin to jitter, or sometimes slide while colliding with other balls. Could this be because I'm executing steps in the wrong order?
Right now I have a single loop going through each ball to:
Sim an iteration
Check collisions with other balls
Check collisions against scene bounds
I should add that I apply friction against the bounds and whenever a ball-to-ball collision occurs, just so the balls lose energy.
Here's a portion of code of how collision is being handled:
// Sim an iteration
for (Ball ball : balls) {
    ball.gravity.set(gravity.x, gravity.y);
    if (ball.active) {
        ball.sim();
        // Collide against other balls
        for (Ball otherBall : balls) {
            if (ball != otherBall) {
                double dist = ball.pos.distance(otherBall.pos);
                boolean isColliding = dist < ball.radius + otherBall.radius;
                if (isColliding) {
                    // Offset so they aren't touching anymore
                    MVector dif = otherBall.pos.copy();
                    dif.sub(ball.pos);
                    dif.normalize();
                    double difValue = dist - (ball.radius + otherBall.radius);
                    dif.mult(difValue);
                    ball.pos.add(dif);
                    // Change this velocity
                    double mag = ball.vel.mag();
                    MVector newVel = ball.pos.copy();
                    newVel.sub(otherBall.pos);
                    newVel.normalize();
                    newVel.mult(mag * 0.9);
                    ball.vel = newVel;
                    // Change other velocity
                    double otherMag = otherBall.vel.mag();
                    MVector newOtherVel = otherBall.pos.copy();
                    newOtherVel.sub(ball.pos);
                    newOtherVel.normalize();
                    newOtherVel.mult(otherMag * 0.9);
                    otherBall.vel = newOtherVel;
                }
            }
        }
    }
}
If this is the only code that checks for interactions between balls, then the problem seems pretty clear. There is no way for a ball to rest atop another ball, in equilibrium.
Let's say that you have one ball directly on top of another. When you compute the acceleration of the top ball due to gravity, you should also be doing a collision check like the one you posted, except this time checking for dist <= ball.radius + otherBall.radius. If this is the case, then you should assume a normal force between the balls equal to that of gravity, and negate the component of gravity in line with the vector connecting the two balls' centers. If you fail to do this, then the top ball will accelerate into the bottom one, triggering the collision code you posted, and you'll get the jitters.
Similar logic must be used when a ball is in contact with a scene bound.
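As a sketch of that idea, reusing the MVector operations from your code (I'm assuming the x/y fields are accessible, and CONTACT_EPSILON is an arbitrary tolerance):

// Sketch: cancel the component of gravity that pushes ball into otherBall
// while the two are in (near) contact, so a resting ball stops accelerating
// into the one below it.
static final double CONTACT_EPSILON = 0.001;

double dist = ball.pos.distance(otherBall.pos);
if (dist <= ball.radius + otherBall.radius + CONTACT_EPSILON) {
    // unit normal pointing from otherBall towards ball
    MVector normal = ball.pos.copy();
    normal.sub(otherBall.pos);
    normal.normalize();
    // component of gravity along the normal (negative = pushing ball inwards)
    double along = ball.gravity.x * normal.x + ball.gravity.y * normal.y;
    if (along < 0) {
        // negate that component (the "normal force")
        ball.gravity.x -= along * normal.x;
        ball.gravity.y -= along * normal.y;
    }
}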
Since I've been experimenting with my own Phys2D engine (just for fun), I know what you're talking about. (Just in case, you can check my demo here: http://gwt-dynamic-host.appspot.com/ - select "Circle Collisions Demo" there - and the corresponding code here: https://github.com/domax/gwt-dynamic-plugins/tree/master/gwt-dynamic-main/gwt-dynamic-module-bar.)
The problem lies in the nature of the iterations and the endless chain of collision consequences. When, e.g., a ball reaches a corner of the scene, it experiences at least 3 force vectors: an impulse bouncing it off the wall, an impulse bouncing it off the floor, and the impulse of gravity. After you sum all 3 impulses and reduce them according to your energy-loss algorithm, you get a new vector telling you where the ball should be. But this impulse may direct it into the wall again - then you have to recompute the whole set of vectors according to all the same factors: bounce energy, impulses, gravity, etc. Even when all these impulses are small, they never all become exactly 0, because of double precision and your tolerance constants - that is why you get the "jitter" and "sliding" effects.
Actually, most existing 2D engines show these effects in one form or another: you can see them here: http://brm.io/matter-js/demo/#wreckingBall or here: http://box2d-js.sourceforge.net/index2.html - they just make the small impulses get absorbed faster and stop iterating once the whole system becomes more or less stable, but that is not always possible.
Anyway, I'd recommend not reinventing the wheel unless it is just for fun - or for a better understanding of this stuff.
For the latter (just for fun) - here is a good tutorial: http://gamedevelopment.tutsplus.com/tutorials/how-to-create-a-custom-2d-physics-engine-the-basics-and-impulse-resolution--gamedev-6331
For real projects, I'd recommend using an existing engine, e.g. Unity (https://unity3d.com/learn/tutorials/modules/beginner/2d/physics2d) or Box2D (http://box2d.org/).
Hope this helps.
Iterating through all the balls and changing each ball's position during the iteration is probably the cause of the instability: you move a ball left to avoid a collision on the right, but in doing so you push it into another ball on its left, and then that ball tries to push it back again.
Off the top of my head, I'd recommend summing up all the forces on each ball before doing anything about positioning. And if you iterate from the ball "on top" (furthest away from the gravity source/direction) you can probably achieve a stable situation.
Basically, the top ball first calculates the forces between itself and the ball(s) under it, plus gravity; then the ball under it knows how much force is coming from the top ball and, added to gravity, how much force it in turn pushes on the balls below it with. Once all balls know the forces acting on them, you can transform those forces into motion.
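Something along these lines is what I mean (just a sketch reusing the MVector operations from your code; the per-ball force accumulator and the resolveContactForce() helper are made up for illustration):

// Pass 1: accumulate all forces on every ball, without moving anything yet.
for (Ball ball : balls) {
    ball.force.set(gravity.x, gravity.y); // start with gravity (mass treated as 1)
}
for (Ball ball : balls) {
    for (Ball otherBall : balls) {
        if (ball != otherBall
                && ball.pos.distance(otherBall.pos) <= ball.radius + otherBall.radius) {
            // resolveContactForce() is a hypothetical helper that returns the
            // push this contact exerts on "ball" along the line between centers
            ball.force.add(resolveContactForce(ball, otherBall));
        }
    }
}

// Pass 2: only now turn the accumulated forces into motion.
for (Ball ball : balls) {
    MVector dv = ball.force.copy();
    dv.mult(deltaTime);
    ball.vel.add(dv);
    MVector dp = ball.vel.copy();
    dp.mult(deltaTime);
    ball.pos.add(dp);
}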
The way you are simulating the physics of the balls is bound to cause instabilities. Your collision resolution tries to separate the balls by projecting one of them away by the penetration depth. This may fix the overlap for those two balls, but chances are (especially when the balls are stacked) that the ball now overlaps with another ball.
There are many ways to fix penetration. One of the simplest is to add a "bias", a bit of a push, to both bodies to force them to separate over the next couple of frames. This allows the energy to propagate and forces all of the bodies apart. The problem is that the bias will often overestimate and cause a bit of a bounce. To fix that, I'd recommend reading up on sequential impulses.
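For illustration only (this is a sketch, not drop-in code; BIAS_FACTOR is an arbitrary value and the x/y fields are assumed to be accessible):

// Positional "bias": instead of removing the whole overlap in one step, push
// both balls apart by a fraction of the penetration each frame so the
// correction propagates smoothly through a stack.
static final double BIAS_FACTOR = 0.2;

double dx = otherBall.pos.x - ball.pos.x;
double dy = otherBall.pos.y - ball.pos.y;
double dist = Math.sqrt(dx * dx + dy * dy);
double penetration = (ball.radius + otherBall.radius) - dist;
if (penetration > 0 && dist > 0) {
    double nx = dx / dist, ny = dy / dist;         // contact normal
    double push = penetration * BIAS_FACTOR * 0.5; // half to each body
    ball.pos.x      -= nx * push;
    ball.pos.y      -= ny * push;
    otherBall.pos.x += nx * push;
    otherBall.pos.y += ny * push;
}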
Making physics look realistic is not as easy as it may seem. Unless you don't mind instabilities I'd recommend spending some time reading up on different techniques or using an engine such as Box2D.
I'm in the process of creating a sidescrolling action game where you play a wizard that can cast spells. The spells are Box2D bodies that start in the middle of your player (who is also a Box2D body) and go outward in the direction you clicked on the screen.
I have all the collision detection working to where the spells you cast don't collide with each other and they don't collide with you. This works more-or-less flawlessly -- except for one instance. The first spell I cast creates a collision with the player that causes a brief period of physical knockback. The spell's path is not affected otherwise, and I don't notice a major change in the player's position. The only reason I know there is knockback at all is because the camera I have following the player suddenly shakes in that instant (and I've gone through the contact listener to verify that these particular bodies are forming a contact).
This is not the case for subsequent spells spawned from the same position, and it doesn't seem to be related to the player's movement speed or the projectile's cast angle. It may have something to do with how Box2D initializes bodies, but I can't say for sure.
Does anyone know how I could fix this?
As weston mentioned, some more code may be needed, but it does sound odd that "the first spell" does this, but subsequent spawnings don't. Do you reuse bodies for subsequent spawns?
In box2d if you create 2 bodies that occupy the same space they will "knockback" from each other so that they DON'T occupy the same space. This is by design, and only happens on creation. I'm sure this is what you're seeing. Why it doesn't happen "afterwards" may be dependent on your code.
Ways around this might be to set some of your fixtures as sensors for a bit (collisions are ignored), or when you create a body, set active=false (again, collisions ignored). By "for a bit" I mean you may have to write some code to move your spell, ignoring collisions, until your spell has "cleared" your wizard. Then remove the sensor or make it active so collisions are back in play.
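A rough sketch of the sensor variant, in libgdx Box2D terms (spellFixture, spellBody, wizardBody, and the clearance distance are placeholders for whatever you have in your code):

// When spawning the spell, ignore collisions by marking its fixture as a sensor.
spellFixture.setSensor(true);

// Later, each update: once the spell has cleared the wizard, re-enable collisions.
float clearance = 1.5f; // placeholder distance in world units
if (spellBody.getPosition().dst(wizardBody.getPosition()) > clearance) {
    spellFixture.setSensor(false);
}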
But that may be overkill; figuring out the subtlety of subsequent "spawns" vs. your first one may be a better use of effort.
I managed to figure out where the problem was and how to solve it, but I wasn't able to figure out why it was happening.
I was using an OrthographicCamera to lerp to my player's position.
position.x += (universe.entity_handler.player.parts[0].body.getPosition().x - position.x) * Gdx.graphics.getDeltaTime() *
position.y += (universe.entity_handler.player.parts[0].body.getPosition().y - position.y) * Gdx.graphics.getDeltaTime() * LERP + zoom * viewportHeight / WORLD_PLAYER_Y_SKEW;
And when I fixed this by removing all lerping, the jitter was gone.
position.x = universe.entity_handler.player.parts[0].body.getPosition().x;
position.y = universe.entity_handler.player.parts[0].body.getPosition().y;
I have zero idea why linear interpolation was causing spastic movements like that, but hey - I got it to work. Maybe I'll go back and revisit the problem later.
I am developing a simple game which involves a character moving up and down only along the Y axis.
Currently I am using the accelerometer readings to change the Y velocity of the character. The game works fine but the biggest problem is that you have to keep the device horizontal in order to play the game properly.
What I really want is to change the character's Y velocity only when there is a change in the rate of rotation along the Y axis. I need to be able to translate this rate of change into the character's Y velocity. In this way it will not matter how much the device is tilted, and the user can play the game while holding the device normally.
Since the accelerometer is mandatory in every device (so that even older devices can run my game), I want to be able to calculate this rate of change using the data retrieved from the accelerometer.
I found this link which explains how to get pitch and roll from accelerometer data.
I used the exact code and came up with this,
// low-pass filter state; these need to persist between updates
// (e.g. as class fields), otherwise the filter has no effect
final double alpha = 0.5;
double fXg = 0;
double fYg = 0;
double fZg = 0;

// smooth the raw accelerometer readings
fXg = game.getInput().getAccelX() * alpha + (fXg * (1.0 - alpha));
fYg = game.getInput().getAccelY() * alpha + (fYg * (1.0 - alpha));
fZg = game.getInput().getAccelZ() * alpha + (fZg * (1.0 - alpha));

// derive pitch and roll (in degrees) from the filtered gravity vector
double pitch = (Math.atan2(-fYg, fZg) * 180.0) / Math.PI;
double roll = (Math.atan2(fXg, Math.sqrt(fYg * fYg + fZg * fZg)) * 180.0) / Math.PI;
pitch = (pitch >= 0) ? (180 - pitch) : (-pitch - 180);
With this piece of code I am unable to grasp how to calculate the RATE of change.
Am I going in the right direction with this or is this completely different from what I want?
Also is it better if I just use the gyroscope instead of relying on the accelerometer?
Thanks in advance.
There are two parts to this answer. (1) What sensor data to read. (2) How to get the motion change info that you want.
(1) What sensor data to read
You'll get much better data from Android's software-derived Motion Sensors (than from just the accelerometers), since they combine signals from the accelerometers, magnetometers, and (if available) gyroscopes. The motion sensors might also correct for known biases and use a Kalman filter to remove noise. Gyros are more accurate and faster at detecting rotational changes than the other two sensors, but they only detect changes faster than some threshold. They don't detect absolute position or orientation.
The "rotation sensor" is a software-derived indication of Android's best information about the device's rotation. My recollection is that it includes the gyroscope signals if available, else it just uses the accelerometers (to measure linear forces including gravity) and magnetometers (to measure the orientation within the earth's magnetic field).
Caveat: This paragraph in the Motion Sensors Guide says something tricky:
The Android Open Source Project (AOSP) provides three software-based motion sensors: a gravity sensor, a linear acceleration sensor, and a rotation vector sensor. These sensors were updated in Android 4.0 and now use a device's gyroscope (in addition to other sensors) to improve stability and performance. ... All three of these sensors rely on a gyroscope: if a device does not have a gyroscope, these sensors do not show up and are not available for use.
I think that means that on Android 4.0 and later, these three software-based sensors are only available on devices that have gyros, while they're always available on earlier versions of Android. Do some quick tests on various devices to find out, and please report back here.
Note: It helps to calibrate the magnetometers. Do this by moving the device in a large figure-8 shape for several cycles. The magnetometer hardware will detect this, track what range of values that it measures, then adjust its calibration parameters.
(2) How to get the motion change info that you want
To control your game character, consider reading the Rotation Vector Sensor. The rotation about the X-axis minus the initial X-axis rotation should indicate how much the device has been tilted up or down relative to when you captured the initial rotation value.
Or you can use Acceleration Sensor to gauge how much the user has moved the device up and down along the device's Y-axis. The Y-axis points up to the top of the screen. The code sample on this Guide page shows how to use a high-pass filter to subtract out the force of gravity.
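A minimal sketch of the rotation vector approach, inside a SensorEventListener (the reference-pitch handling here is just one way to capture the initial value; feed the resulting tilt into your character however you like):

// Sketch: read TYPE_ROTATION_VECTOR, convert to orientation angles, and
// compare the pitch (rotation about the X-axis) to a captured reference.
private final float[] rotationMatrix = new float[9];
private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)
private Float referencePitch = null;               // captured on the first reading

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    SensorManager.getOrientation(rotationMatrix, orientation);
    float pitch = orientation[1];
    if (referencePitch == null) referencePitch = pitch;
    float tilt = pitch - referencePitch; // how far the device has tilted up/down
    // use "tilt" to drive the character's Y velocity here
}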
What I really want is to change the character's Y velocity only when there is a change in the rate of rotation along the Y axis.
That would measure how fast (not how much) one tilts the phone left and right (not up and down). If you need some background on the mathematics of change in a value, see the Khan Academy for short lectures on derivatives [in calculus, not finance] and integrals. Wikipedia should also be helpful. In short: compute a value's change by subtracting its reference value; compute its rate of change (e.g. acceleration from velocity) by subtracting the previous value sampled at a regular time interval and dividing by that interval; and compute the integral (e.g. velocity from acceleration) by summing the sampled values multiplied by the interval.
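In code, with deltaTime as the fixed sample interval, those two operations look roughly like:

// rate of change (derivative): how fast the pitch is changing right now
double pitchRate = (pitch - previousPitch) / deltaTime;
previousPitch = pitch;

// accumulated change (integral): e.g. velocity from acceleration
velocity += acceleration * deltaTime;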
See also the Sensor API reference and the Sensor Event reference.
I need to make a wheel settle on one of five angles, and I want it to teeter when it gets to that angle. After the user spins the wheel, I have it slow down by multiplying the rotational velocity by 0.98 per tick. I sort of have it working by finding the closest of the angles and adding a small value in its direction to the velocity. However, this looks unrealistic and can be glitchy.
I was thinking of implementing a damped sine wave, but I'm not sure how I would do this.
Current Pseudocode:
var rotation, rotationVelocity, stoppingPoints[5];

update(deltaT) {
    rotationVelocity -= rotationVelocity * 0.5 * deltaT;         // friction
    closestAngle = findClosestAngle().angle;                     // nearest of the 5 stopping points
    rotationVelocity += (closestAngle - rotation) / 36 * deltaT; // nudge towards it
    rotation += rotationVelocity;
}
Edit:
Teeter: move or balance unsteadily; sway back and forth:
Subtract a constant amount from its velocity every iteration until it reaches zero;
not only does this better represent how friction works in real life, it's easier too.
If you want it to move as though it were connected to a spring:
Hooke's law for springs is F = -kx, where k is a constant and x is the distance from the origin. If you want it to sway back and forth as though it were on a spring, keep track of its rotation from an origin and add -kx to its velocity, where x is its rotational distance (or angle) from the origin.
Now, if you apply both friction and Hooke's law to the wheel, it should look realistic.
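A small sketch of both effects combined (SPRING_K and FRICTION are arbitrary tuning constants, and closestStoppingAngle() is a stand-in for your findClosestAngle logic):

// Damped-spring style teeter towards the closest stopping angle.
static final double SPRING_K = 4.0;
static final double FRICTION = 1.5;

void update(double deltaT) {
    double x = rotation - closestStoppingAngle(rotation); // angular offset from the nearest stop
    rotationVelocity += -SPRING_K * x * deltaT;           // Hooke's law pull towards the stop

    // constant friction, clamped so it never reverses the spin by itself
    double frictionStep = FRICTION * deltaT;
    if (Math.abs(rotationVelocity) <= frictionStep) {
        rotationVelocity = 0;
    } else {
        rotationVelocity -= Math.signum(rotationVelocity) * frictionStep;
    }

    rotation += rotationVelocity * deltaT;
}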
I think the closest angle that you want is the one closest to where it will stop. You can simulate where it will end up and how long that takes, and use that to determine how much extra (or less) velocity you'll need.
Not sure what you mean by teetering exactly.
It sounds like you want to model a wheel with weights attached at the stoppingPoints. I mean, from a physics viewpoint. There is the existing rotational velocity, then deceleration from friction, and an unknown acceleration/deceleration caused by the effects of gravity on those points (as translated to a rotational velocity based on the position of the weights). Anyway, that's my interpretation and would probably be my approach (to model the gravity).
I think the teetering you speak of will be achieved when the acceleration caused by the weights exceeds the existing rotational velocity.
I'm trying to implement a 'lever' in my Android game. Here are two images showing what I want and how it should work:
1)
2)
I managed to do the basics of it by using a joint:
final RevoluteJointDef revoluteJointDef = new RevoluteJointDef();
revoluteJointDef.initialize(anchorBody, movingBody, anchorBody.getWorldCenter());
revoluteJointDef.enableLimit = true;
revoluteJointDef.upperAngle = MathUtils.degToRad(40);
revoluteJointDef.lowerAngle = MathUtils.degToRad(-40);
physicsWorld.createJoint(revoluteJointDef);
And it works: I can move the lever stick left/right and, as it should, it cannot exceed the proper angle, so this part is done. But now I'm looking for a way to execute actions after moving this lever (for example, opening some doors/a gate).
Here's my basic idea for how to check which part of the stick has been touched by the player (left or right), by creating the stick's body this way:
To explain: by adding 2 sensors, one on the left side and one on the right side, I could check in the contact listener which side has been touched.
But I still have no idea how to check whether the action should be performed. I know I could check on every update whether the stick's angle is 40 or -40, but is that an efficient way? Or maybe there is a better one? I would be greatly thankful for any tips! Thanks
You don't need to worry about efficiency here; the performance penalty for checking the angle is absolutely negligible. I measured the times needed to get the angle of a Sprite and a Body using the following code snippet:
double testStart;
double testEnd;
testStart = System.nanoTime();
for (int i = 0; i < 1000000; i++) { doStuff(); }
testEnd = System.nanoTime();
// dividing the total by the iteration count gives the average time per call
Log.v("speed test", "it takes " + (testEnd - testStart) / 1000000 + " nanoseconds");
So, on a Desire Z it takes only 157 ns to find the angle of a Sprite and 393 ns to do the same with a Body. It is also much simpler than using contact listeners. As a side note, the angles of Sprites can be outside of (-360, +360) degrees if you rotate the Sprite.
Both of the methods you mention would work; it's just a matter of which you find more convenient. It also seems like you already understand the implementation, so I don't think anyone could answer this question any better than to say, 'whichever suits you better'.
Putting a sensor at each end of the lever's range and using a contact listener to detect when the lever touches them is the most 'correct' way. It conveniently gives you events both when the lever touches the sensors and when they stop touching, but it's a little more work to set up.
Checking the joint angle after every time step will work fine too, but you'll need to compare the angle with the value in the previous step if you want to detect a start/finish touching 'event'. If you want a lever that can give continuous values (eg. speed control for some other object) instead of a simple on/off switch then you would be better off with this method.
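For the polling method, a minimal sketch (LIMIT mirrors the joint's limit using the same MathUtils as in your snippet, joint is the RevoluteJoint created above, and the tolerance is arbitrary):

// Poll the joint angle each step and fire an event only on the transition.
static final float LIMIT = MathUtils.degToRad(40); // same limit as the joint
boolean previousOn = false;

void update() {
    float angle = joint.getJointAngle();
    boolean on = Math.abs(angle) >= LIMIT - 0.01f; // small tolerance
    if (on && !previousOn) {
        // lever just reached a limit: open the door/gate, etc.
    } else if (!on && previousOn) {
        // lever just left the limit: close it again, etc.
    }
    previousOn = on;
}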