I have run into a problem making a first-person camera in LWJGL 2. I am using the following code to rotate the camera (up, down, left and right) based on how the mouse moves. This is basically what every other tutorial has; however, the movement is flawed and ends up spiraling out of control.
float mouseDX = Mouse.getDX();
float mouseDY = Mouse.getDY();
rotation.x = mouseDX;
rotation.y = mouseDY;
glRotatef(rotation.y, 1, 0, 0);
glRotatef(rotation.x, 0, 1, 0);
rotation is a Vector3f.
I am aware that rotation.y is rotating the x axis and rotation.x is rotating the y axis. I am not totally sure why, but it doesn't work for me unless it's this way. The problem may be related to this.
Here is a video I made showing what I mean:
https://www.youtube.com/watch?v=V6Iu5oQuWo4&feature=youtu.be
In the video I attempt to show that both the x and y rotation work fine separately, but when used together they don't work at all.
I know this is only a small section of my code, but it is the only part dealing with rotation so the problem must be there somewhere.
The flaw that stands out to me is the value by which you rotate.
Mouse.getDY() returns the change in y in pixels, so if you move your mouse halfway down the screen you will typically move around 300 pixels (at 800x600).
glRotatef, meanwhile, interprets its angle argument in degrees, and you are passing it those raw pixel deltas unchanged.
Take a 300-pixel move, use it as the number of degrees to rotate by, and you get most of a full revolution in one go - and if the rotations are applied on top of the previous frame's matrix, the error keeps accumulating.
That's the cause of your spiralling.
What you need to do is divide your dy and dx by a good couple of hundred; treat that divisor as a mouse-sensitivity factor.
Since glRotatef already works in degrees, there is no need for any Math.toRadians conversion here.
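For what it's worth, here is a minimal sketch of what scaled, accumulated mouse-look could look like in LWJGL 2. The SENSITIVITY constant, the pitch clamp and the glLoadIdentity call are assumptions about the surrounding code rather than anything from the question:
import org.lwjgl.input.Mouse;
import static org.lwjgl.opengl.GL11.*;

public class MouseLook {
    // Accumulated camera angles, in degrees.
    private float yaw, pitch;
    // Degrees of rotation per pixel of mouse travel - tune to taste.
    private static final float SENSITIVITY = 0.2f;

    // Call once per frame, before drawing the scene.
    public void apply() {
        yaw   += Mouse.getDX() * SENSITIVITY;
        pitch -= Mouse.getDY() * SENSITIVITY;
        pitch = Math.max(-89f, Math.min(89f, pitch)); // keep the camera from flipping over

        glLoadIdentity();             // start each frame from a clean modelview matrix
        glRotatef(pitch, 1f, 0f, 0f); // look up/down
        glRotatef(yaw,   0f, 1f, 0f); // look left/right
    }
}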
I have had this problem for over 2 days now, constantly tweaking, and I just can't get it done.
I have a player texture (the player is facing left on it) which I want to rotate using the touchpad, so the player will face his running direction.
So far I have this:
double facerotation = Math.atan2(touchpad.getKnobPercentY(), touchpad.getKnobPercentX());
spriteBatch.draw(runningFrame, player.getPosition().x, player.getPosition().y, Player.getSize() / 2, Player.getSize() / 2, Player.getSize(), Player.getSize(), 1, 1, facerotation * 100, false);
But with "roation*100" he spins like 2 times around and without he barely rotates. I even tried switching the X and Y values for the atan2 function above. But i never got him rotate only in the direction i am Moving. I also tried the atan function, also with swapping the X and Y values.
Please help me. I tried thousands of ways, Different calculations and things i saw on google. Nothing brought me the desired effect.
Just use a Vector2 to store your knob percent x and y. Then you can get the rotation in degrees with Vector2.angle().
Vector2 v = new Vector2(touchpad.getKnobPercentX(), touchpad.getKnobPercentY());
float angle = v.angle(); // angle in degrees, counter-clockwise from the positive x axis
runningFrame.setRotation(angle);
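If runningFrame is a TextureRegion rather than a Sprite, you could instead feed the angle straight into the rotation parameter of the draw call from the question (that parameter is in degrees). A rough sketch reusing the question's fields; the 180-degree offset is an assumption because the texture art faces left:
Vector2 knob = new Vector2(touchpad.getKnobPercentX(), touchpad.getKnobPercentY());
float angleDeg = knob.angle() - 180f; // texture faces left, so offset by half a turn
spriteBatch.draw(runningFrame,
        player.getPosition().x, player.getPosition().y, // position
        Player.getSize() / 2, Player.getSize() / 2,     // rotation origin (sprite center)
        Player.getSize(), Player.getSize(),             // width, height
        1, 1,                                           // scale
        angleDeg);                                      // rotation in degrees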
I've created a simple planetary simulation where a planet orbits a star.
The code for the orbit is this:
a = a + vel * delta; // advance the orbit angle by angular velocity * frame time
planetX = Math.cos(a) * orbitRadius + parentStar.getX();
planetY = Math.sin(a) * orbitRadius + parentStar.getY();
Now that works just fine, but my problem is that the orbit is not from the center of the planet around the center of the star.
This is what happens:
As you can see, the first red dot on the small circle is the position of the planet, which orbits around the second small red dot. This happens because the circle is drawn from (0,0), so the planet's (0,0) circles around the star's (0,0).
I need the center of the planet to circle the star's center, not their origin points.
Is there a good fix for this?
Your calculation of the orbit is fine. The only problem seems to be that you treat "position" differently when calculating orbits and when drawing the planets: when you draw them, you treat x and y as one of the corner points, but when you calculate the orbit, you treat them as the centre of the body. The simplest way would be to change the visualisation, not the calculation.
Since you did not post the code you use to draw the shapes, I can only guess, but I assume it looks somewhat like this (obviously pseudocode):
for (Planet p : starsAndPlanets) {
    drawCircle(p.x, p.y, p.radius * 2, p.radius * 2);
}
Change this to something like this:
for (Planet p : starsAndPlanets) {
    drawCircle(p.x - p.radius, p.y - p.radius, p.radius * 2, p.radius * 2);
}
This way, x and y are the position of the centre of the planet, and with p.x - p.radius and p.y - p.radius you get the corner point. Of course, you could in a similar way change all your orbital mechanic formulas to calculate the centre from the corner point, but IMHO it is much simpler and more natural to treat x and y as the centre.
For now, the most suitable way I can think of is getting the star's world coordinates and passing them to the child every frame. As you do so, the child would have the same coordinates every frame.
The next part is translating and rotating it around the star - you can achieve that by offsetting the planet's position from the star's position using the sin and cos of an angle.
Let me show you an example:
planet[0] = star[0] + sin(angle)*scale
planet[1] = star[1] + cos(angle)*scale
Here the angle changes incrementally, and the scale just shifts the child object further from its parent; keep it constant (or modify it if you wish) to control the orbit radius around its 'new' center.
I know some people may mention matrices or other types of transformations, but for this situation I think the above solution is the most relevant and cleanest, in my opinion.
The way it works is that you take the parent's 'WORLD coordinates' and set them to be the child's. By modifying the scale value you increase the distance of the object from the center (so they won't overlap), and this is multiplied by the sin and cos of the angle you specified to make it rotate.
P.S. Keep in mind that if you're rendering with an FPS-dependent loop, the more FPS, the faster the simulation will be, and vice versa: rendering at 1000 fps means your code executes 1000 times per second compared to, say, 100, so you will increment the angle 1000 or 100 times respectively. If you have this issue, try setting a constant framerate if you can - it's the simplest workaround for lightweight simulations.
Edit: I forgot to mention that the concept works for all objects in your case. You just have to work out the relationships and use the function for each object separately, where each object has a position and an angle of orbit (if it orbits around a different object).
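Tying the two answers together, here is a rough sketch of a frame-rate-independent orbit update plus center-based drawing. The Planet fields and the drawCircle call are assumptions modelled on the pseudocode above, not the asker's real code:
// Called once per frame with the elapsed time in seconds.
void updateOrbit(Planet p, Planet star, float delta) {
    p.angle += p.angularVelocity * delta;                      // radians per second, not per frame
    p.x = star.x + (float) Math.cos(p.angle) * p.orbitRadius;  // x and y are the planet's center...
    p.y = star.y + (float) Math.sin(p.angle) * p.orbitRadius;
}

void drawBody(Planet p) {
    // ...so shift by the radius to get the corner point expected by drawCircle.
    drawCircle(p.x - p.radius, p.y - p.radius, p.radius * 2, p.radius * 2);
}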
Recently, I sought help regarding 3d camera rotations in OpenGL. This answer and the comments that followed helped me greatly, but there is still one major issue: when moving the camera, the motion is often, but not always, in exactly the opposite direction it should be. For instance, when the camera's orientation matrix is the identity, the camera moves perfectly. However, if it is rotated in any direction, its motion on the axis perpendicular to the axis of rotation will have the opposite sign of the intended motion.
With this said, I think I have an idea why this inconsistent behavior is happening:
As we all know, OpenGL uses a Right-Handed coordinate system:
If I understand this diagram correctly, when the camera is oriented at the identity the z axis should point INTO the camera, and z-values should decrease as one moves away from the camera (apparently affirmed here). (coordinates measured in world space).
However, in my program, the Z axis points AWAY from the camera and z values increase as one moves away from the camera. Here is an example:
The camera has moved forward, along what should be the negative z axis but appears to be the positive z axis.
If I am correct in interpreting this behavior as abnormal, it would explain all of my problems with the sign of my camera motion: the motion that currently appears "correct" would in fact be erroneous, and I would have consistent signs that I could simply invert to get correct motion.
So the question is:
Is my Z axis backwards, or is it supposed to be this way?
If it is backwards, why? Judging by multiple discussions on the topic (1, 2, 3), the error is likely to lie where I define my perspective frustum, so I'll put that here:
public static final int P_ZNEAR = 1, P_ZFAR = 500;
public static void perspective()
{
    int i = GL11.glGetInteger(GL11.GL_MATRIX_MODE);
    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glLoadIdentity();
    double ymax, xmax;
    ymax = P_ZNEAR * Math.tan(FOV / 2);
    xmax = ymax * ASPECT_RATIO;
    GL11.glFrustum(xmax, -xmax, -ymax, ymax, P_ZNEAR, P_ZFAR);
    GL11.glMatrixMode(i);
}
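For reference only (not necessarily the diagnosis of the issue above), a symmetric frustum is conventionally specified with left = -xmax and right = +xmax, and Math.tan expects the half field of view in radians. A sketch, assuming FOV is stored in degrees:
double ymax = P_ZNEAR * Math.tan(Math.toRadians(FOV) / 2.0); // half-angle in radians
double xmax = ymax * ASPECT_RATIO;
GL11.glFrustum(-xmax, xmax, -ymax, ymax, P_ZNEAR, P_ZFAR);   // left, right, bottom, top, near, far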
I am developing an augmented reality application for Android and trying to use OpenGL to place cubes at locations in the world. My current method can be seen in the code below:
for (Marker ma : ARData.getMarkerlist().values()) {
    Log.d("populating", "");
    gl.glPushMatrix();
    Location maLoc = new Location("loc");
    maLoc.setLatitude(ma.lat);
    maLoc.setLongitude(ma.lng);
    maLoc.setAltitude(ma.alt);
    float distance = currentLoc.distanceTo(maLoc);
    float bearing = currentLoc.bearingTo(maLoc);
    Log.d("distance", String.valueOf(distance));
    Log.d("bearing", String.valueOf(bearing));
    gl.glRotatef(bearing, 0, 0, 1);
    gl.glTranslatef(0, 0, -distance);
    ma.cube.draw(gl);
    gl.glPopMatrix();
}
gl.glRotatef(y, 0, 1, 0);
gl.glRotatef(x, 1, 0, 0);
Here y is yaw and x is pitch. Currently I am getting a single cube on the screen at a 45-degree angle, some way off in the distance. It looks like I am getting sensible bearing and distance values. Could it have something to do with the phone's orientation? If you need more code, let me know.
EDIT: I updated the bearing rotation to gl.glRotatef(bearing, 0, 1, 0); and I am now getting my cubes mapped horizontally across the screen at different depths. There is still no movement using heading and pitch, but #Mirkules has identified some reasons why that might be.
EDIT 2: I am now attempting to place the cubes by rotating the matrix by the difference in angle between the heading and the bearing to a marker. However, all I get is a sort of jittering where the cubes appear to be rendered in a new position and then jump back to their old position. Code as above, except for the following:
float angleDiff = bearing - y;
gl.glRotatef((angleDiff),0,1,0);
gl.glTranslatef(0,0,-distance);
bearing and y are both normalised to a 0-360 scale. Also, I moved my "camera rotation" above the code where I set the markers.
EDIT 3: I have heading working now using float angleDiff = (bearing + y) / 2;. However, I can't seem to get pitch working. I have attempted to use gl.glRotatef(-x, 1, 0, 0); but that doesn't seem to work.
It's tricky to tell exactly what you're trying to do here, but there are a few things that stick out as potential problems.
Firstly, your final two rotations don't seem to actually apply to anything. If these are supposed to represent a movement of the world or camera (which mostly amounts to much the same thing) then they need to happen before drawing anything.
Then your rotations themselves perhaps won't entirely do what you intend.
Your cube is rotated around the Z axis. The usual convention in GL is for the camera to look down the Z axis, with the Y axis being considered 'up'. You can naturally interpret axes however you like, but a rotation around 'Z' would not typically be 'bearing', but 'roll'. 'Bearing' to me would be analogous to 'yaw'.
As you translate along the Z axis, I assume you are trying to position the object by rotating and translating, but obviously if the rotation is around the same axis as you translate along, it won't actually alter the position of the cube - it will always just be directly in front of the camera, spinning on its axis.
I'm not really clear on why you're trying to position the cube like that when it seems like you start off with a more specific location. You could probably directly construct a more appropriate matrix.
Finally, your camera/world rotation is two concatenated rotations around Y and X. You call these yaw and pitch, but typically using Euler angles for a camera rotation does not result in an intuitive result where terms like yaw and pitch make complete sense. It is common to maintain an orientation and apply individual rotations to that in order to update it, rather than attempting to update several dependent rotations.
So yes, I would expect that this code, in the absence of other matrix operations, would likely result in drawing one or more cubes straight ahead which are simply rotated by some angle around the view direction.
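To illustrate the point about constructing the placement directly, here is one possible way to position each cube from its bearing and distance instead of rotate-then-translate. It assumes the usual GL convention that the camera looks down -Z with +X to the right, and that bearing is measured clockwise from north; the variable names mirror the question's code:
float bearingRad = (float) Math.toRadians(bearing);
float dx = (float) Math.sin(bearingRad) * distance;   // east/west offset
float dz = -(float) Math.cos(bearingRad) * distance;  // north maps to -Z (into the screen)

gl.glPushMatrix();
gl.glTranslatef(dx, 0, dz);  // altitude difference ignored in this sketch
ma.cube.draw(gl);
gl.glPopMatrix();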
I'm writing a game in Java using OpenGL (the LWJGL binding, to be specific). Each entity, including the camera, has a quaternion that represents its rotation. I've figured out how to apply the quaternion to the current OpenGL matrix and everything rotates just fine. The issue I'm having is getting the camera to rotate with the mouse.
Right now, every frame, the game grabs the amount that the mouse has moved on one axis, then applies that amount to the quaternion for the camera's rotation. Here is the code that rotates the quaternion; I'll post it since I think it's where the problem lies (although I'm always wrong about this sort of stuff):
public void rotateX(float amount){
    Quaternion rot = new Quaternion(1.0f, 0.0f, 0.0f, (float)Math.toRadians(amount));
    Quaternion.mul(rot, rotation, rotation);
    rotation.normalise();
}
This method is supposed to rotate the quaternion around the X axis. 'rotation' is the quaternion representing the entity's rotation. 'amount' is the amount that I want to rotate the quaternion by (i.e. the amount that the mouse was moved). 'rot' is a normalized vector along the X axis with a w value of the amount converted to radians (I guess the goal here is to give it an angle - say, 10 degrees - and have it rotate the quaternion around the given axis by that angle). Quaternion.mul takes the new quaternion, multiplies it by the rotation quaternion, and then stores the result as the rotation quaternion. I don't know if the normalization is necessary, since 'rot' is normal and 'rotation' should already be normalized.
The rotateY and rotateZ methods do the same thing, except for changing the vector for 'rot' (0.0, 1.0, 0.0 for y and 0.0, 0.0, 1.0 for z).
The code appears to work fine when the game starts and the camera is looking down the negative Z axis. You can spin all the way around on the Y axis OR all the way around the X axis. But as soon as you try to rotate the camera while not looking down the Z axis, everything gets really screwy (I can't even describe it, it rotates very oddly).
My end goal here is to have something to use for controlling a ship in a space with no up vector. So when you move the mouse on the Y axis, no matter what angle the ship is at, it changes the pitch of the ship (rotation along the X axis). Similarly, when you move the mouse on the X axis, it changes the yaw (rotation along the Y axis). I might be going about this the wrong way and I probably just need a push (or shove) in the right direction.
If you need more details on anything (how my rendering is done, any other maths that I'm trying to do), just ask and I'll put it up. I understood everything when I was using Euler angles (which apparently are a big no-no for 3D application development... wish somebody had told me that before I sunk a lot of time into getting them to work), but as soon as I switched over to quaternions I got in over my head really fast. I've spent the past few months just playing with this code and reading about quaternions trying to get it to work, but I haven't really gotten anywhere at all :'(
Very, very frustrating... starting to regret trying to make something in 3D >_<
Quaternion rot = new Quaternion(1.0f, 0.0f, 0.0f, (float)Math.toRadians(amount));
OK, this is flat-out wrong.
The constructor that takes four floats assumes that they represent an actual quaternion. What you give that constructor is not a quaternion; it's a vec3 axis and an angle that you expect to rotate around.
You can't shove those into a quaternion class and expect to get a legitimate quaternion out of it.
Your quaternion class should have a constructor or some other means of creating a quaternion from an angle and an axis of rotation. But according to the documentation you linked to, it does not. So you have to do it yourself.
A quaternion is not a vec3 axis with a fourth value that is an angle. A unit quaternion representing a change in orientation has a vec3 part that is the axis of rotation multiplied by the sine of half the angle of rotation, and a scalar component that is the cosine of half the angle of rotation. This assumes that the angle of rotation is clamped to the range [-pi/2, pi/2].
Therefore, what you want is this:
float radHalfAngle = ... / 2.0f; // See below
float sinVal = (float) Math.sin(radHalfAngle);
float cosVal = (float) Math.cos(radHalfAngle);
float xVal = 1.0f * sinVal;
float yVal = 0.0f * sinVal; //Here for completeness.
float zVal = 0.0f * sinVal; //Here for completeness.
Quaternion rot = new Quaternion(xVal, yVal, zVal, cosVal);
Also, converting amount to radians directly doesn't make sense, particularly so if amount is just a pixel-coordinate delta that the mouse moved. You need some kind of conversion scale between the distance the mouse moves and how much you want to rotate. And toRadians is not the kind of scale you want.
One more thing. Left-multiplying rot, as you do here, will perform a rotation about the camera space X axis. If you want a rotation about the world-space X axis, you need to right-multiply it.
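Putting those points together, here is a sketch of what rotateX might look like with a mouse-sensitivity scale and a proper axis-angle quaternion. The SENSITIVITY constant is an assumption; rotation, Quaternion.mul and normalise are the same LWJGL utilities used in the question:
// Degrees of rotation per pixel of mouse movement - tune to taste.
private static final float SENSITIVITY = 0.2f;

public void rotateX(float mouseDelta) {
    float halfAngle = (float) Math.toRadians(mouseDelta * SENSITIVITY) / 2.0f;
    // Axis-angle to quaternion: (axis * sin(angle/2), cos(angle/2)) with axis = +X.
    Quaternion rot = new Quaternion((float) Math.sin(halfAngle), 0.0f, 0.0f,
                                    (float) Math.cos(halfAngle));
    Quaternion.mul(rot, rotation, rotation); // left-multiply: rotation about the camera-space X axis
    rotation.normalise();
}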