Local rotation in a specific OpenGL and LWJGL implementation - Java

I am also fiddling with the global/local rotation problem and I cannot put my finger on it. I used an LWJGL book for the implementation of my game, using OpenGL and LWJGL. I am using the JOML library for vectors and matrices.
The modelview matrix construction is below. In the book it originally has no local rotations; I added those myself. The idea is that each object has a global and a local rotation. Those rotations are calculated individually and then multiplied onto the left/right side of the modelview matrix.
public Matrix4f getModelViewMatrix(Object obj, Matrix4f viewMatrix) {
    Vector3f rotation = obj.getRot();
    Vector3f localRot = obj.getLocalRot();
    Matrix4f localRotMat = new Matrix4f().identity();
    Matrix4f worldRotMat = new Matrix4f().identity();
    localRotMat.rotateLocalX((float) Math.toRadians(localRot.x)).
            rotateLocalY((float) Math.toRadians(localRot.y)).
            rotateLocalZ((float) Math.toRadians(localRot.z));
    worldRotMat.rotateX((float) Math.toRadians(-rotation.x)).
            rotateY((float) Math.toRadians(-rotation.y)).
            rotateZ((float) Math.toRadians(-rotation.z));
    modelViewMatrix.identity().translate(obj.getPos());
    modelViewMatrix.mulLocal(localRotMat); // pre-multiply: modelViewMatrix = localRotMat * modelViewMatrix
    modelViewMatrix.mul(worldRotMat);      // post-multiply: modelViewMatrix = modelViewMatrix * worldRotMat
    modelViewMatrix.scale(obj.getScale());
    Matrix4f viewCurr = new Matrix4f(viewMatrix);
    return viewCurr.mul(modelViewMatrix);
}
This still results in local rotations around the 'wrong' axes. I've seen implementations using quaternions and read about gimbal lock and the like, but the answers are either very specific or too general for me. Furthermore, it would be great if I didn't need to switch to a quaternion implementation, as that would possibly mean refactoring a lot of code.
Relevant code for the object class:
// Object class
private final Vector3f rot;
private final Vector3f localRot;

public Object() {
    pos = new Vector3f(0, 0, 0);
    scale = 1;
    rot = new Vector3f(0, 0, 0);
    localRot = new Vector3f(0, 0, 0);
}
// getters and setters for above
Can somebody explain what is wrong with the calculation of the rotations for the modelview matrix?
EDIT:
I can rewrite the code as below, which is a bit more in line with the hints from @GeestWagen. However, the 'local rotation' of my object is still displayed as global, so it indeed seems like the same rotation is applied twice. Now I am stuck, though, because I can't find more documentation on these functions (rotateLocal/rotate).
modelViewMatrix.identity().translate(obj.getPos()).
        rotateLocalX((float) Math.toRadians(-localRot.x)).
        rotateLocalY((float) Math.toRadians(-localRot.y)).
        rotateLocalZ((float) Math.toRadians(-localRot.z)).
        rotateX((float) Math.toRadians(-rotation.x)).
        rotateY((float) Math.toRadians(-rotation.y)).
        rotateZ((float) Math.toRadians(-rotation.z)).
        scale(obj.getScale());
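From what I can tell from the JOML docs (so take this as my assumption rather than a definitive reading), the rotate* methods post-multiply the current matrix while the rotateLocal* methods pre-multiply it. A minimal sketch with placeholder pos and angle values (not taken from the code above):

// Assumed JOML semantics; pos and angle are placeholders:
Matrix4f a = new Matrix4f()
        .translate(pos)        // M = T
        .rotateX(angle);       // rotateX post-multiplies:     M = T * Rx (rotation in the object's own frame)
Matrix4f b = new Matrix4f()
        .translate(pos)        // M = T
        .rotateLocalX(angle);  // rotateLocalX pre-multiplies: M = Rx * T (rotation about the world origin)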

Okay, I finally fixed it. It took a bunch more research, but what I came up with is the following:
Vector3f rotation = obj.getRot();
Vector3f localRot = obj.getLocalRot();
Quaternionf rotationQ = new Quaternionf().
        rotateAxis((float) Math.toRadians(-localRot.z), new Vector3f(0f, 0f, 1f)).
        rotateAxis((float) Math.toRadians(-localRot.y), new Vector3f(0f, 1f, 0f)).
        rotateAxis((float) Math.toRadians(-localRot.x), new Vector3f(1f, 0f, 0f)).
        premul(new Quaternionf().
                rotateX((float) Math.toRadians(-rotation.x)).
                rotateY((float) Math.toRadians(-rotation.y)).
                rotateZ((float) Math.toRadians(-rotation.z))
        );
modelViewMatrix.identity().
        translate(obj.getPos()).
        rotate(rotationQ).
        scale(obj.getScale());
This is inspired by, among others, this and this. What confused me a lot was the lack of hits on combining local and global rotations; most of what I was able to find dealt with one or the other. The code above creates a quaternion from the object's local rotation about the z, y, and x axes, then pre-multiplies it by a quaternion built from the object's global rotation. The resulting quaternion is then used for the modelview matrix.
Thus, for combining local and global rotations, quaternions seem to be necessary. I thought they were only used to keep the axes from changing during a local or global rotation, but they should also be used when combining the two.
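To make the order of composition explicit, here is a minimal sketch of the same idea with placeholder angle variables (lx/ly/lz for the local and gx/gy/gz for the global rotation, in radians); it only illustrates the ordering and is not a drop-in replacement for the code above:

// localQ is built by post-multiplying: localQ = Rz * Ry * Rx
Quaternionf localQ = new Quaternionf()
        .rotateAxis(lz, 0f, 0f, 1f)
        .rotateAxis(ly, 0f, 1f, 0f)
        .rotateAxis(lx, 1f, 0f, 0f);
// globalQ = Rx * Ry * Rz (same as chaining rotateX().rotateY().rotateZ())
Quaternionf globalQ = new Quaternionf().rotateXYZ(gx, gy, gz);
// premul puts the global rotation on the left: finalQ = globalQ * localQ,
// so the local rotation is applied to the mesh first, then the global one
Quaternionf finalQ = new Quaternionf(localQ).premul(globalQ);
// the model part of the matrix then becomes T * R(finalQ) * S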

Related

How to optimize 3D AABB Rotate (Java)

Recently I have been implementing 3D AABBs in my game engine. To accomplish rotations I use a simple method of rotating all 8 calculated corners around the center of the box using my Vector3f.rotate() method. But as you may notice below, it is very inefficient. If you want to sort through the whole class, here is the GitHub (https://github.com/EquilibriumGames/Flounder-Engine/blob/master/src/flounder/physics/AABB.java); otherwise, here is the snippet I need help with. I believe there could be simpler methods out there, but I want to know what you think. Thank you!
// Creates the 8 AABB corners and rotates them.
Vector3f FLL = new Vector3f(destination.minExtents.x, destination.minExtents.y, destination.minExtents.z);
Vector3f.rotate(FLL, rotation, FLL);
Vector3f FLR = new Vector3f(destination.maxExtents.x, destination.minExtents.y, destination.minExtents.z);
Vector3f.rotate(FLR, rotation, FLR);
Vector3f FUL = new Vector3f(destination.minExtents.x, destination.maxExtents.y, destination.minExtents.z);
Vector3f.rotate(FUL, rotation, FUL);
Vector3f FUR = new Vector3f(destination.maxExtents.x, destination.maxExtents.y, destination.minExtents.z);
Vector3f.rotate(FUR, rotation, FUR);
Vector3f BUR = new Vector3f(destination.maxExtents.x, destination.maxExtents.y, destination.maxExtents.z);
Vector3f.rotate(BUR, rotation, BUR);
Vector3f BUL = new Vector3f(destination.minExtents.x, destination.maxExtents.y, destination.maxExtents.z);
Vector3f.rotate(BUL, rotation, BUL);
Vector3f BLR = new Vector3f(destination.maxExtents.x, destination.minExtents.y, destination.maxExtents.z);
Vector3f.rotate(BLR, rotation, BLR);
Vector3f BLL = new Vector3f(destination.minExtents.x, destination.minExtents.y, destination.maxExtents.z);
Vector3f.rotate(BLL, rotation, BLL);
destination.minExtents = Maths.min(FLL, Maths.min(FLR, Maths.min(FUL, Maths.min(FUR, Maths.min(BUR, Maths.min(BUL, Maths.min(BLR, BLL)))))));
destination.maxExtents = Maths.max(FLL, Maths.max(FLR, Maths.max(FUL, Maths.max(FUR, Maths.max(BUR, Maths.max(BUL, Maths.max(BLR, BLL)))))));
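For what it's worth, a minimal loop-based sketch of the same corner rotation, assuming the same Vector3f.rotate and component-wise Maths.min/Maths.max helpers used above (untested against the engine):

// Hypothetical loop variant: enumerate the 8 corners via the bits of i.
Vector3f min = new Vector3f(Float.MAX_VALUE, Float.MAX_VALUE, Float.MAX_VALUE);
Vector3f max = new Vector3f(-Float.MAX_VALUE, -Float.MAX_VALUE, -Float.MAX_VALUE);
for (int i = 0; i < 8; i++) {
    Vector3f corner = new Vector3f(
            (i & 1) == 0 ? destination.minExtents.x : destination.maxExtents.x,
            (i & 2) == 0 ? destination.minExtents.y : destination.maxExtents.y,
            (i & 4) == 0 ? destination.minExtents.z : destination.maxExtents.z);
    Vector3f.rotate(corner, rotation, corner);
    min = Maths.min(min, corner); // assumes Maths.min/max return the component-wise result
    max = Maths.max(max, corner);
}
destination.minExtents = min;
destination.maxExtents = max;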

libgdx Fixed point after camera.rotateAround

Good evening, friends.
I'm having trouble drawing a fixed point on the screen when the screen is rotated. I used the rotateAround method around the position of the player.
It seems to me that I also have to rotate this fixed point around the position of the player. I use this snippet, learned here on Stack Overflow.
public void rotate(Vector3 position, Vector3 centerPoint) {
    this.cosTemp = MathUtils.cosDeg(this.anguloAtual);
    this.senTemp = MathUtils.sinDeg(this.anguloAtual);
    this.xTemp = centerPoint.x + ((position.x - centerPoint.x) * this.cosTemp) - ((position.y - centerPoint.y) * this.senTemp);
    this.yTemp = centerPoint.y + ((position.y - centerPoint.y) * this.cosTemp) + ((position.x - centerPoint.x) * this.senTemp);
    position.set(this.xTemp, this.yTemp, 0);
}
When drawing the player on the screen, I used the position of the player, then called camera.project and then the rotate method. The fixed point appears; however, it is not exactly fixed.
As an example, I used a fixed point slightly ahead of the player.
public void meDesenhar(SpriteBatch spriteBatch) {
    spriteBatch.begin();
    this.spritePlayer.setPosition(this.positionPlayer.x - (this.spritePlayer.getWidth() / 2),
            this.positionPlayer.y - this.spritePlayer.getHeight() / 2);
    this.spritePlayer.draw(spriteBatch);
    spriteBatch.end();

    originPosition.set(positionPlayer, 0);
    fixedPosition.set(positionPlayer.x, positionPlayer.y + 10, 0);
    cameraTemp.project(fixedPosition);
    cameraTemp.project(originPosition);
    cameraManagerTemp.rotate(fixedPosition, originPosition);
    Debugagem.drawPointInScreen(Color.BLUE, fixedPosition);
}
My questions:
1 - Am I doing something wrong, or is it just a result of rounding? I noticed while debugging that the position of the player changed a little on every rotation after camera.project; for example, the position (540, 320) turned into (539.99, 320.013).
2 - I tried using the SpriteBatch draw method to perform the rotation; however, I could not make the rotation happen around the player. I would arrive at the same result.
3 - Can I use two cameras? Each camera would be a layer: one camera for the map and the player, the other for the fixed point. Is that viable? I could not find any example that works with more than one camera at the same time. Does anyone know of any examples, please? I'm not talking about HUDs or stage cameras.
Video follows.
https://www.youtube.com/watch?v=1Vg8haN5ULE
Thank you.
It can be a result of rounding, since it's only moving by a pixel.
You can calculate the rotation from the player, but it's not necessary.
Of course you can use multiple cameras in your game, and you should in this case.
Here are a few screenshots from my old projects where I used multiple cameras.
As you can see, you can even use different types of cameras, like orthographic and perspective, both 2D and 3D.
Just create a new camera like the first one and change the projection matrix:
camrotate = new OrthographicCamera(540, 960);
//...
camfixed = new OrthographicCamera(540, 960);
//...
And in the render method:
batch.setProjectionMatrix(camrotate.combined);
batch.begin();
//draw in camrotate now
//...
//...
batch.end();
batch.setProjectionMatrix(camfixed.combined);
batch.begin();
//draw fixed elements now
//...
//...
batch.end();
//add one more camera if you need
Edit:
Change the projection matrix outside of batch.begin()/end(), otherwise the current batch will be flushed.

Why doesn't gluUnProject() seem to behave?

I am trying to find the coordinates of my mouse on a flat 3D surface. After googling a bit, I found out that you use gluUnProject for this, so I implemented it. Here is my code (with the uninteresting parts taken out):
public class Input {
    private FloatBuffer modelView = BufferUtils.createFloatBuffer(16);
    private FloatBuffer projection = BufferUtils.createFloatBuffer(16);
    private IntBuffer viewport = BufferUtils.createIntBuffer(16);
    private FloatBuffer location = BufferUtils.createFloatBuffer(3);
    private FloatBuffer winZ = BufferUtils.createFloatBuffer(1);

    public float[] getMapCoords(int x, int y) {
        modelView.clear().rewind();
        projection.clear().rewind();
        viewport.clear().rewind();
        location.clear().rewind();
        winZ.clear().rewind();

        glGetFloat(GL_MODELVIEW_MATRIX, modelView);
        glGetFloat(GL_PROJECTION_MATRIX, projection);
        glGetInteger(GL_VIEWPORT, viewport);

        float winX = (float) x;
        float winY = (float) viewport.get(3) - (float) y;
        glReadPixels(x, (int) winY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, winZ);
        gluUnProject(winX, winY, winZ.get(0), modelView, projection, viewport, location);
        return new float[] {location.get(0), location.get(1), location.get(2)};
    }
}
When I call this function, passing in the X and Y coordinates of the mouse, I get some numbers that increase by 100 for every pixel I move my mouse (it should be around 1). After printing various variables, I found out that the winZ buffer contains the value 1.0. My gluPerspective is set up so that its near clipping plane is at 0.1 and its far plane at 10000, which would explain why the number increases that rapidly. Yet I don't know how to force OpenGL to use my flat plane instead for finding this distance.
So now I am wondering: if this is the correct/best/easiest method for finding the mouse coordinates on a surface in the 3D world, what could I be doing wrong? If it is not, what is a better way of doing it?
Yes, this is the correct method to use. You are probably just calling it at the point where the matrices do not contain "good values" (might be caused by glPush/PopMatrix() functions in your code).
To confirm you are getting good coordinates, draw a single GL_POINT at the coordinates you get, after you get them (disable depth test and increase point size to, say, 10 pixels). If the point is moving under your mouse, then the calculated coords are correct but the matrices are not.
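A minimal sketch of that debug draw in immediate-mode OpenGL (the getMapCoords call site and the mouseX/mouseY variables are placeholders, not from the code above):

// Hypothetical call site; draws the unprojected coords as a large point with depth testing off.
float[] coords = input.getMapCoords(mouseX, mouseY);
glDisable(GL_DEPTH_TEST);
glPointSize(10f);
glBegin(GL_POINTS);
glVertex3f(coords[0], coords[1], coords[2]);
glEnd();
glEnable(GL_DEPTH_TEST);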

Getting object coordinates from camera

I've implemented a camera in Java using a position vector and three direction vectors so I can use gluLookAt(); moving around in 'ghost mode' works well enough, but I want to add collision detection. I can't seem to figure out how to transform my position vector into the coordinates in which OpenGL draws my objects.
A rough sketch of my drawing loop is this:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
camera.setView();
drawer.drawTheScene();
I'm at a loss as to how to proceed; looking at the ModelView matrix between calls and at my position vector, I haven't found any kind of correlation.
I finally figured it out by reviewing http://fly.cc.fer.hr/~unreal/theredbook/chapter03.html again. To get from eye space (camera) to object space, you have to multiply that vector by the inverse of the ModelView matrix, or in code:
Vector4f vpos = new Vector4f(0, 0, 0, 1);
// (0,0,0,1) because it's relative to the cam
float mv[]=new float[16];
ByteBuffer temp = ByteBuffer.allocateDirect(64);
temp.order(ByteOrder.nativeOrder());
GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, (FloatBuffer)temp.asFloatBuffer());
temp.asFloatBuffer().get(mv);
Matrix4f m4 = new Matrix4f();
m4.load((FloatBuffer)temp.asFloatBuffer());
m4.invert();
vpos = Matrix4f.transform(m4, vpos, vpos);
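For comparison, a sketch of the same eye-space to world-space step using JOML instead of the LWJGL utility classes (assuming the modelview matrix has already been read into a FloatBuffer named fb; all names here are placeholders):

// Hypothetical JOML equivalent: invert the modelview matrix and transform the eye-space origin.
Matrix4f mv = new Matrix4f(fb);             // reads the 16 floats of GL_MODELVIEW_MATRIX
Vector4f camPos = new Vector4f(0, 0, 0, 1); // eye-space origin, i.e. the camera position
mv.invert().transform(camPos);              // camPos now holds the camera position in world space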

Rotation matrix for direction vector

I've been playing with some algorithms from the internet for a while and I can't seem to get them to work, so I'm tossing the question out here.
I am attempting to render a velocity vector line from a point. Drawing the line isn't difficult: just insert a line with length velocity.length into the scene graph. This puts the line centered at the point, pointing in the y-axis direction. Now we need to get it into the proper rotation and translation.
The translation vector is not difficult to calculate: it is half the velocity vector. The rotation matrix, however, has been exceedingly elusive to me. Given a direction vector <x, y, z>, what is the matrix I need?
Edit 1: Look; if you don't understand the question, you probably won't be able to give me an answer.
Here is what I currently have:
Vector3f translation = new Vector3f();
translation.scale(1f / 2f, body.velocity);

Vector3f vec_z = (Vector3f) body.velocity.clone();
vec_z.normalize();

Vector3f vec_y; // reference vector, will correct later
if (vec_z.x == 0 && vec_z.z == 0) {
    vec_y = new Vector3f(-vec_z.y, 0f, 0f); // could be optimized
} else {
    vec_y = new Vector3f(0f, 1f, 0f);
}

Vector3f vec_x = new Vector3f();
vec_x.cross(vec_y, vec_z);
vec_z.normalize();
vec_y.cross(vec_x, vec_z);
vec_y.normalize();
vec_y.negate();

Matrix3f rotation = new Matrix3f(
        vec_z.z, vec_z.x, vec_z.y,
        vec_x.z, vec_x.x, vec_x.y,
        vec_y.z, vec_y.x, vec_y.y
);
arrowTransform3D.set(rotation, translation, 1f);
This is based off of this article. And yes, I've tried the standard rotation matrix (vec_x.x, vec_y.x, etc.) and it didn't work. I've been rotating the columns and rows to see if there is any effect.
Edit 2:
Apologies about the rude wording of my comments.
So it looks like there was a combination of two errors: one which House MD pointed out (really bad naming of variables: vec_z was actually vec_y, and so on), and the other was that I needed to invert the matrix before passing it off to the rendering engine (transposing was close!). So the modified code is:
Vector3f vec_y = (Vector3f) body.velocity.clone();
vec_y.normalize();

Vector3f vec_x; // reference vector, will correct later
if (vec_y.x == 0 && vec_y.z == 0) {
    vec_x = new Vector3f(-vec_y.y, 0f, 0f); // could be optimized
} else {
    vec_x = new Vector3f(0f, 1f, 0f);
}

Vector3f vec_z = new Vector3f();
vec_z.cross(vec_x, vec_y);
vec_z.normalize();
vec_x.cross(vec_z, vec_y);
vec_x.normalize();
vec_x.negate();

Matrix3f rotation = new Matrix3f(
        vec_x.x, vec_x.y, vec_x.z,
        vec_y.x, vec_y.y, vec_y.z,
        vec_z.x, vec_z.y, vec_z.z
);
rotation.invert();
This should do you
Dupe.
The question there involves getting a rotation to a certain axis, whereas I'm concerned with getting a rotation matrix.
Gee, I wonder if you could convert one to the other?
BTW, your current solution of picking an arbitrary y axis and then reorthogonalising should work fine; it looks bugged though, or at least badly written. 'z_vec' is not a good variable-name for the y-axis. What's with the 'z,x,y' ordering, anyway?
If it still doesn't work, try making random changes until it does - transpose the matrix, negate vectors until you have an even number of sign errors, that kind of thing.
Also your tone of voice comes across as sort-of rude, given that you're asking strangers to spend their time helping you.
