I have a problem with an orthographic projection (isometric camera view): objects I draw far away from the scene's center at 0, 0, 0 are clipped, even though they are placed within the view.
In the onSurfaceChanged method I have:
GLES20.glViewport(0, 0, width, height);
final float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.0f;
int useForOrtho = Math.min(width, height);
mProjectionMatrix = createOrthoProjectionMatrix(left, right, top, bottom, near, far);
Matrix.orthoM(mViewMatrix, 0,
        -useForOrtho / 2,
        useForOrtho / 2,
        -useForOrtho / 2,
        useForOrtho / 2,
        0.01f, 100.0f);
Matrix.rotateM(mViewMatrix, 0, 45, 0.0f, 0.0f, 1.0f);
Matrix.rotateM(mViewMatrix, 0, 35, 1.0f, -1.0f, 0.0f);
Here is the function for the projection matrix:
public float[] createOrthoProjectionMatrix(float left, float right, float top, float bottom, float near, float far)
{
    float[] m = new float[16];

    m[0] = 2.0f / (right - left);
    m[1] = 0.0f;
    m[2] = 0.0f;
    m[3] = 0.0f;

    m[4] = 0.0f;
    m[5] = 2.0f / (top - bottom);
    m[6] = 0.0f;
    m[7] = 0.0f;

    m[8] = 0.0f;
    m[9] = 0.0f;
    m[10] = -2.0f / (far - near);
    m[11] = 0.0f;

    m[12] = -(right + left) / (right - left);
    m[13] = -(top + bottom) / (top - bottom);
    m[14] = -(far + near) / (far - near);
    m[15] = 1.0f;

    return m;
}
Where can I control how far out on x, y, and z away from 0, 0, 0 objects can be drawn without being clipped? The problem isn't that the objects are placed "out of view"; rather, they fall outside the current render volume (out of distance), so their triangles get clipped.
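For reference, here is a minimal sketch of how the bounds handed to an orthographic projection define the render volume, assuming the standard android.opengl.Matrix.orthoM API (the method and parameter names here are placeholders, not the project's). Everything that ends up outside the box after the view transform is clipped, so the x/y extent is controlled by left/right/bottom/top and the depth extent by near/far:

import android.opengl.Matrix;

// Sketch: the six orthoM arguments are the faces of the visible box in eye space.
// A vertex survives clipping only if, after the view (camera) transform, it satisfies
// left <= x <= right, bottom <= y <= top and near <= -z <= far.
public static float[] buildOrthoProjection(float halfWidth, float halfHeight,
                                           float near, float far) {
    float[] projection = new float[16];
    Matrix.orthoM(projection, 0,
            -halfWidth, halfWidth,    // enlarge to see further out along x
            -halfHeight, halfHeight,  // enlarge to see further out along y
            near, far);               // push far out to extend the usable depth range
    return projection;
}

The camera's own position and rotation belong in a separate view matrix (for example Matrix.setLookAtM plus the isometric rotations), which is then combined with the projection via Matrix.multiplyMM before it reaches the shader.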
I am working on an Android application using OpenGL.
In a database, I store each object's rotation as local Euler angles (applied x, then y, then z), but in the editor I would like to apply a rotation about the global x, y, or z axis. I took two approaches, outlined below.
I've simplified these methods to remove irrelevant Android code.
I've tried the matrix approach, but the object appears to rotate about an axis that isn't aligned with the global x, y, or z axis after calling the method a second time. I've read that floating-point error builds up over time, making the rotation matrix "numerically unstable", which I assume is what's happening in the first method.
// rotAxis = 0 means rotation around the X global axis
// rotAxis = 1 means rotation around the Y global axis
// rotAxis = 2 means rotation around the Z global axis
public void executeRotationWithMatrix(float rotAngle, int rotAxis) {
    float[] rotationMatrix = new float[16];

    // Matrix class is in android.opengl
    Matrix.setIdentityM(rotationMatrix, 0);

    switch (rotAxis) {
        case 0:
            Matrix.rotateM(rotationMatrix, 0, rotAngle, 1.f, 0.f, 0.f);
            break;
        case 1:
            Matrix.rotateM(rotationMatrix, 0, rotAngle, 0.f, 1.f, 0.f);
            break;
        case 2:
            Matrix.rotateM(rotationMatrix, 0, rotAngle, 0.f, 0.f, 1.f);
            break;
    }

    float rotx = getLocalRotationOfObjectOnX(); // Pseudocode
    float roty = getLocalRotationOfObjectOnY(); // Pseudocode
    float rotz = getLocalRotationOfObjectOnZ(); // Pseudocode

    Matrix.rotateM(rotationMatrix, 0, rotx, 1.f, 0.f, 0.f);
    Matrix.rotateM(rotationMatrix, 0, roty, 0.f, 1.f, 0.f);
    Matrix.rotateM(rotationMatrix, 0, rotz, 0.f, 0.f, 1.f);

    Vector3f rotationVector = rotationMatrixToEulerAngles(rotationMatrix);

    saveLocalRotationOfObjectOnX(rotationVector.x); // Pseudocode
    saveLocalRotationOfObjectOnY(rotationVector.y); // Pseudocode
    saveLocalRotationOfObjectOnZ(rotationVector.z); // Pseudocode
}
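As a point of reference, here is a hedged sketch of the usual way a global-axis rotation is composed with an existing orientation when everything is kept as matrices (the method name is a placeholder, not part of the code above). android.opengl.Matrix.rotateM post-multiplies (m = m * R, i.e. the rotation is applied in the object's local frame), so a rotation about a fixed world axis is normally pre-multiplied onto the current orientation with multiplyMM:

import android.opengl.Matrix;

// Sketch, assuming the current orientation is available as a 4x4 column-major matrix.
// rotateM appends a rotation in local space (m = m * R); to rotate about a fixed
// world axis instead, pre-multiply: result = R_world * current.
public static float[] rotateAroundWorldAxis(float[] currentOrientation,
                                            float angleDeg,
                                            float axisX, float axisY, float axisZ) {
    float[] worldRotation = new float[16];
    Matrix.setIdentityM(worldRotation, 0);
    Matrix.rotateM(worldRotation, 0, angleDeg, axisX, axisY, axisZ);

    float[] result = new float[16];
    // result = worldRotation * currentOrientation
    Matrix.multiplyMM(result, 0, worldRotation, 0, currentOrientation, 0);
    return result;
}

Whether the Euler angles extracted afterwards look right depends on rotationMatrixToEulerAngles using the same axis order and matrix layout (column-major, in android.opengl's case) as the matrix it is fed.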
In the second method, I tried a rotation-quaternion approach, but I get even weirder results whenever I use it.
// rotAxis = 0 means rotation around the X global axis
// rotAxis = 1 means rotation around the Y global axis
// rotAxis = 2 means rotation around the Z global axis
public void executeRotationWithQuat(float rotAngle, int rotAxisInd) {
    Quat4f rotationQuat = new Quat4f(0, 0, 0, 1);
    Quat4f tempQuat = new Quat4f(0, 0, 0, 1);

    switch (rotAxisInd) {
        case 0:
            QuaternionUtil.setRotation(tempQuat, new Vector3f(1, 0, 0), rotAngle);
            break;
        case 1:
            QuaternionUtil.setRotation(tempQuat, new Vector3f(0, 1, 0), rotAngle);
            break;
        case 2:
            QuaternionUtil.setRotation(tempQuat, new Vector3f(0, 0, 1), rotAngle);
            break;
    }
    tempQuat.normalize();
    rotationQuat.mul(tempQuat);
    rotationQuat.normalize();

    float rotx = getLocalRotationOfObjectOnX(); // Pseudocode
    float roty = getLocalRotationOfObjectOnY(); // Pseudocode
    float rotz = getLocalRotationOfObjectOnZ(); // Pseudocode

    QuaternionUtil.setRotation(tempQuat, new Vector3f(1, 0, 0), rotx);
    tempQuat.normalize();
    rotationQuat.mul(tempQuat);
    rotationQuat.normalize();

    QuaternionUtil.setRotation(tempQuat, new Vector3f(0, 1, 0), roty);
    tempQuat.normalize();
    rotationQuat.mul(tempQuat);
    rotationQuat.normalize();

    QuaternionUtil.setRotation(tempQuat, new Vector3f(0, 0, 1), rotz);
    tempQuat.normalize();
    rotationQuat.mul(tempQuat);
    rotationQuat.normalize();

    float qw = rotationQuat.w;
    float qx = rotationQuat.x;
    float qy = rotationQuat.y;
    float qz = rotationQuat.z;

    float[] rotationMatrix = new float[]{
            1.0f - 2.0f*qy*qy - 2.0f*qz*qz, 2.0f*qx*qy - 2.0f*qz*qw,        2.0f*qx*qz + 2.0f*qy*qw,        0.0f,
            2.0f*qx*qy + 2.0f*qz*qw,        1.0f - 2.0f*qx*qx - 2.0f*qz*qz, 2.0f*qy*qz - 2.0f*qx*qw,        0.0f,
            2.0f*qx*qz - 2.0f*qy*qw,        2.0f*qy*qz + 2.0f*qx*qw,        1.0f - 2.0f*qx*qx - 2.0f*qy*qy, 0.0f,
            0.0f,                           0.0f,                           0.0f,                           1.0f
    };

    Vector3f rotationVector = rotationMatrixToEulerAngles(rotationMatrix);

    saveLocalRotationOfObjectOnX(rotationVector.x); // Pseudocode
    saveLocalRotationOfObjectOnY(rotationVector.y); // Pseudocode
    saveLocalRotationOfObjectOnZ(rotationVector.z); // Pseudocode
}
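For completeness, a hedged sketch of the same world-axis composition written with quaternions, assuming the Quat4f/Vector3f/QuaternionUtil classes used in this question (where q.mul(p) means q = q * p and mul(q1, q2) means q = q1 * q2). One detail worth noting: the QuaternionUtil.setRotation helper shown further down feeds the angle straight into Math.sin/Math.cos, so it expects radians; if the stored angles are degrees they would need converting first.

// Sketch: apply a rotation about a fixed world axis to an existing orientation.
// With the convention q.mul(p) == q * p, "world rotation applied on top of the
// current orientation" means the world rotation goes on the left.
public static Quat4f rotateAroundWorldAxis(Quat4f currentOrientation,
                                           Vector3f worldAxis,
                                           float angleRadians) {
    Quat4f worldRot = new Quat4f(0, 0, 0, 1);
    QuaternionUtil.setRotation(worldRot, worldAxis, angleRadians); // radians, not degrees

    Quat4f result = new Quat4f(0, 0, 0, 1);
    result.mul(worldRot, currentOrientation); // result = worldRot * currentOrientation
    result.normalize();
    return result;
}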
The following are helper methods used in the above two methods.
public Vector3f rotationMatrixToEulerAngles(float[] m) {
    float sy = (float) Math.sqrt(m[6]*m[6] + m[10]*m[10]);

    float x, y, z;
    x = (float) Math.atan2(m[6], m[10]);
    y = (float) Math.atan2(-m[2], sy);
    z = (float) Math.atan2(m[1], m[0]);

    // convert angles from radians to degrees
    float conFactor = (float) (180 / Math.PI);
    x *= conFactor;
    y *= conFactor;
    z *= conFactor;

    return new Vector3f(x, y, z);
}
public class QuaternionUtil {
    public static void setRotation(Quat4f q, Vector3f axis, float angle) {
        float d = axis.length();
        assert (d != 0f);
        float s = (float) Math.sin(angle * 0.5f) / d;
        q.set(axis.x * s, axis.y * s, axis.z * s, (float) Math.cos(angle * 0.5f));
    }
}

public class Vector3f {
    // Fields and constructor omitted from the original excerpt; only length() was shown.
    public float x, y, z;

    public Vector3f(float x, float y, float z) {
        this.x = x;
        this.y = y;
        this.z = z;
    }

    public final float length() {
        return (float) Math.sqrt((double) (this.x * this.x + this.y * this.y + this.z * this.z));
    }
}
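A side note on the extraction above: android.opengl.Matrix stores matrices in column-major order, so element (row, col) lives at index 4 * col + row. A small sketch of that layout, in case the Euler extraction and the hand-written quaternion-to-matrix block need to be checked against the same convention (the helper below is only illustrative):

// Column-major layout used by android.opengl.Matrix (and OpenGL):
//
// | m[0]  m[4]  m[8]   m[12] |
// | m[1]  m[5]  m[9]   m[13] |
// | m[2]  m[6]  m[10]  m[14] |
// | m[3]  m[7]  m[11]  m[15] |
//
// So m[1] is row 1 / column 0, m[6] is row 2 / column 1, and m[12..14] hold the translation.
static float element(float[] m, int row, int col) {
    return m[4 * col + row];
}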
Any help would be greatly appreciated!
I'm trying to implement a way to sample a texture in Java. I want it to work the same way GLSL's "texture" function does: the neighboring pixels should be taken into account when the color is calculated.
For example, say I use a float to get the image color at a certain pixel, and the number falls between two pixels. How can I calculate a mix of the neighboring pixels? Here's an image showing my goal.
Is there an easy way to do this using java's BufferedImage class?
Here's the code I have so far; it only works for the x position at the moment.
public static final java.awt.Color getColor(final BufferedImage image, final float x, final int y) {
    final float imageX = x * image.getWidth();
    final float decimal = imageX % 1f;

    final java.awt.Color left = new java.awt.Color(image.getRGB(Maths.clamp((int) imageX - 1, 0, image.getWidth() - 1), y));
    final java.awt.Color center = new java.awt.Color(image.getRGB(Maths.clamp((int) imageX, 0, image.getWidth() - 1), y));
    final java.awt.Color right = new java.awt.Color(image.getRGB(Maths.clamp((int) imageX + 1, 0, image.getWidth() - 1), y));

    if (decimal == 0.5f) return center;

    if (decimal < 0.5f) {
        final float distanceFromCenter = 0.5f - decimal;
        final float distanceFromLeft = decimal + 0.5f;
        final Vector3f leftColor = new Vector3f(left.getRed() / 255f, left.getGreen() / 255f, left.getBlue() / 255f);
        final Vector3f centerColor = new Vector3f(center.getRed() / 255f, center.getGreen() / 255f, center.getBlue() / 255f);
        leftColor.scale(1f - distanceFromLeft);
        centerColor.scale(1f - distanceFromCenter);
        final Vector3f color = Vector3f.add(leftColor, centerColor, null);
        return new java.awt.Color(color.getX(), color.getY(), color.getZ());
    } else {
        final float distanceFromCenter = decimal - 0.5f;
        final float distanceFromRight = 1f - decimal + 0.5f;
        final Vector3f rightColor = new Vector3f(right.getRed() / 255f, right.getGreen() / 255f, right.getBlue() / 255f);
        final Vector3f centerColor = new Vector3f(center.getRed() / 255f, center.getGreen() / 255f, center.getBlue() / 255f);
        rightColor.scale(1f - distanceFromRight);
        centerColor.scale(1f - distanceFromCenter);
        final Vector3f color = Vector3f.add(rightColor, centerColor, null);
        return new java.awt.Color(color.getX(), color.getY(), color.getZ());
    }
}
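Here is a hedged sketch of how this is usually extended to both axes with plain bilinear filtering, independent of the Vector3f helper used above. It assumes normalized u/v coordinates in [0, 1], samples the four surrounding texels, and blends them by the fractional distances, which is roughly what GLSL's texture() does with GL_LINEAR filtering and no mipmaps. The class and method names are placeholders:

import java.awt.Color;
import java.awt.image.BufferedImage;

public final class BilinearSample {

    /** Samples image at normalized coordinates (u, v) in [0, 1] with bilinear filtering. */
    public static Color sampleBilinear(BufferedImage image, float u, float v) {
        // Map to pixel space, centering so that the middle of texel 0 sits at u = 0.5 / width.
        float px = u * image.getWidth() - 0.5f;
        float py = v * image.getHeight() - 0.5f;

        int x0 = clamp((int) Math.floor(px), 0, image.getWidth() - 1);
        int y0 = clamp((int) Math.floor(py), 0, image.getHeight() - 1);
        int x1 = clamp(x0 + 1, 0, image.getWidth() - 1);
        int y1 = clamp(y0 + 1, 0, image.getHeight() - 1);

        float fx = px - (float) Math.floor(px); // horizontal blend factor
        float fy = py - (float) Math.floor(py); // vertical blend factor

        Color c00 = new Color(image.getRGB(x0, y0));
        Color c10 = new Color(image.getRGB(x1, y0));
        Color c01 = new Color(image.getRGB(x0, y1));
        Color c11 = new Color(image.getRGB(x1, y1));

        // Blend horizontally on both rows, then blend the two rows vertically.
        return lerp(lerp(c00, c10, fx), lerp(c01, c11, fx), fy);
    }

    private static Color lerp(Color a, Color b, float t) {
        return new Color(
                Math.round(a.getRed()   + (b.getRed()   - a.getRed())   * t),
                Math.round(a.getGreen() + (b.getGreen() - a.getGreen()) * t),
                Math.round(a.getBlue()  + (b.getBlue()  - a.getBlue())  * t));
    }

    private static int clamp(int value, int min, int max) {
        return Math.max(min, Math.min(max, value));
    }
}

Clamping the indices at the borders stands in for GL_CLAMP_TO_EDGE; a GL_REPEAT-style wrap would take the indices modulo the image size instead.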
I hope this code helps if you use it appropriately. This function copies the pixels of a BufferedImage and replaces every pixel matching a source color with a replacement color:
private BufferedImage changeColor(BufferedImage image, int srcColor, int replaceColor)
{
    BufferedImage destImage = new BufferedImage(image.getWidth(), image.getHeight(), BufferedImage.TYPE_INT_ARGB);

    Graphics2D g = destImage.createGraphics();
    g.drawImage(image, null, 0, 0);
    g.dispose();

    for (int width = 0; width < image.getWidth(); width++)
    {
        for (int height = 0; height < image.getHeight(); height++)
        {
            if (destImage.getRGB(width, height) == srcColor)
            {
                destImage.setRGB(width, height, replaceColor);
            }
        }
    }
    return destImage;
}
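For context, a small usage sketch (the file paths are placeholders, and it assumes changeColor is reachable from the calling code). getRGB()/setRGB() compare packed ARGB ints, so the source color has to match exactly, including the alpha channel:

import java.awt.Color;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Hypothetical usage: every pure-white pixel in sprite.png becomes pure red,
// all other pixels are copied unchanged.
BufferedImage input = ImageIO.read(new File("sprite.png"));        // placeholder path
BufferedImage recolored = changeColor(input, Color.WHITE.getRGB(), // 0xFFFFFFFF
                                      Color.RED.getRGB());         // 0xFFFF0000
ImageIO.write(recolored, "png", new File("sprite-recolored.png")); // placeholder path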
When I create a Body and draw a Texture where that body is, and the angle is set to 0.0f, the texture appears exactly where expected: upright and at the centre x and y of the Player (the black circle). When the angle changes, the x and y of the texture appear to be completely off; in fact they tend to be off the screen, and I can only see them occasionally as they fall with the gravity.
Here's the method where I create a new bullet:
private void shoot(final float delta) {
    gameTime += delta;
    final float timeSinceLastShot = gameTime - lastBulletShotTime;

    // 5 bullets a second, kerpow
    if (timeSinceLastShot > 0.2f) {
        lastBulletShotTime = gameTime;
        shot.play();

        final float shotX = player.getX() + ((player.getWidth() / 2) - (bulletTexture.getWidth() / 2));
        final float shotY = player.getY() + ((player.getHeight() / 2) - (bulletTexture.getHeight() / 2));

        BodyDef bodyDef = new BodyDef();
        bodyDef.type = BodyDef.BodyType.DynamicBody;
        bodyDef.position.set(shotX, shotY);

        Body body = world.createBody(bodyDef);

        float angle = player.getRotation() * MathUtils.degreesToRadians;
        body.setTransform(body.getPosition().x, body.getPosition().y, angle);
        System.out.println("angle rad: " + angle);

        bullets.add(body);

        PolygonShape shape = new PolygonShape();
        shape.setAsBox(bulletTexture.getWidth() / 2, bulletTexture.getHeight() / 2);

        FixtureDef fixtureDef = new FixtureDef();
        fixtureDef.shape = shape;
        fixtureDef.density = 1f;

        Fixture fixture = body.createFixture(fixtureDef);
        shape.dispose();
    }
}
And here's the part of the render method where I loop and draw the bullets:
for (Body bullet : bullets) {
    final float x = bullet.getPosition().x;
    final float y = bullet.getPosition().y;
    final float originX = x + (bulletTexture.getWidth() / 2);
    final float originY = y + (bulletTexture.getHeight() / 2);
    final float width = bulletTexture.getWidth();
    final float height = bulletTexture.getHeight();
    final float scaleX = 1.0f;
    final float scaleY = 1.0f;
    final float rotation = bullet.getAngle() * MathUtils.radiansToDegrees;
    final int sourceX = 0;
    final int sourceY = 0;
    final int sourceWidth = bulletTexture.getTextureData().getWidth();
    final int sourceHeight = bulletTexture.getTextureData().getHeight();
    final boolean flipX = false;
    final boolean flipY = false;

    batch.draw(bulletTexture,
            x, y,
            originX, originY,
            width, height,
            scaleX, scaleY,
            rotation,
            sourceX, sourceY,
            sourceWidth, sourceHeight,
            flipX, flipY);
}
Any tips for what I'm doing wrong? I want the bullets to always start at the centre of the Player Circle, but also at the correct angle. I then intend to add some velocity so they 'shoot'.
The full code can be found here on Github https://github.com/SamRuffleColes/PlayerCircle/blob/master/core/src/es/rufflecol/sam/playercircle/PlayerCircle.java
I've also attached a screenshot. Everything has fallen a little with the gravity between shooting and taking it, but the bottom-left bullets appeared in the right place initially, while all the others have fallen in from off the screen (some others which I 'shot' were never visible at all).
My mistake was with the originX and originY. Corrected to the below, it now works:
final float originX = bulletTexture.getWidth() / 2;
final float originY = bulletTexture.getHeight() / 2;
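That matches how SpriteBatch.draw interprets its parameters: originX/originY are offsets from the x/y position (the point the sprite rotates and scales around), not absolute world coordinates, so passing x + width / 2 puts the pivot far away from the sprite and any non-zero rotation swings the texture off screen. A sketch of the corrected draw call in context, reusing the variable names from the render loop above, plus a hedged hint for the velocity step mentioned in the question:

// Sketch of the corrected draw call: the origin is measured from (x, y),
// so half the texture size pivots the sprite around its own centre.
final float originX = bulletTexture.getWidth() / 2f;
final float originY = bulletTexture.getHeight() / 2f;

batch.draw(bulletTexture,
        x, y,                    // world position (bottom-left of the sprite)
        originX, originY,        // pivot, relative to x and y
        width, height,
        scaleX, scaleY,
        rotation,                // degrees, around the pivot
        sourceX, sourceY,
        sourceWidth, sourceHeight,
        flipX, flipY);

// To make a bullet travel along the body's angle, a velocity could be set when it is
// created, e.g. (speed is a placeholder value):
// body.setLinearVelocity(speed * MathUtils.cos(angle), speed * MathUtils.sin(angle));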
I'm using vertex buffers in JOGL. I have a few hundred thousand triangles. Each triangle contains:
9 floats for the vertices - 3 per vertex
3 floats for the surface normal
3 floats for the colors.
I can't seem to display the triangles or the colors. I know the normals are being calculated correctly.
This doesn't work:
gl.glDrawArrays(GL2.GL_TRIANGLES, 0, vertxcnt);
But the snippet below works, although I don't see the colors. So I know the points that make up the triangles are correct.
gl.glDrawArrays(GL2.GL_POINTS, 0, vertxcnt);
So, since the points and the normals are being calculated correctly, I think I'm going wrong in the render(gl) function. The code for that is below. What am I doing wrong? I can't post an SSCCE right now due to the complexity, but I would like to know if anything is glaringly wrong.
private void render(GL2 gl) {
    // VBO
    // Enable Pointers
    gl.glBindBuffer(GL2.GL_ARRAY_BUFFER, VBOVertices[0]); // Set Pointers To Our Data

    gl.glEnableClientState(GL2.GL_VERTEX_ARRAY); // Enable Vertex Arrays
    gl.glVertexPointer(3, GL.GL_FLOAT, BufferUtil.SIZEOF_FLOAT * 15, 0); // 15 = 9 vertices of triangles + 3 normal + 3 colors

    gl.glEnableClientState(GL2.GL_NORMAL_ARRAY);
    gl.glNormalPointer(GL.GL_FLOAT, BufferUtil.SIZEOF_FLOAT * 15, BufferUtil.SIZEOF_FLOAT * 9);

    gl.glEnableClientState(GL2.GL_COLOR_ARRAY);
    gl.glColorPointer(3, GL.GL_FLOAT, BufferUtil.SIZEOF_FLOAT * 15, BufferUtil.SIZEOF_FLOAT * 12);

    // Render
    // Draw All Of The Triangles At Once
    gl.glPointSize(4);
    gl.glDrawArrays(GL2.GL_POINTS, 0, vertxcnt);

    // Disable Pointers
    // Disable Vertex, Normals and Color Arrays
    gl.glDisableClientState(GL2.GL_VERTEX_ARRAY);
    gl.glDisableClientState(GL2.GL_NORMAL_ARRAY);
    gl.glDisableClientState(GL2.GL_COLOR_ARRAY);
}
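For comparison, here is a hedged sketch of how client-state pointers are usually set up when the buffer is interleaved per vertex rather than per triangle, i.e. each vertex carries 3 position + 3 normal + 3 color floats, so the stride is 9 floats and glDrawArrays(GL_TRIANGLES, ...) can read one complete attribute set per vertex. It reuses the gl, VBOVertices, and BufferUtil names from the code above; vertexCount is a placeholder for the number of vertices (3 per triangle), and this is only an illustration of the layout convention, not a drop-in replacement:

// Per-vertex interleaved layout sketch: [px py pz nx ny nz r g b] repeated for every vertex.
// Stride is measured from the start of one vertex record to the start of the next.
final int floatsPerVertex = 3 + 3 + 3;
final int strideBytes = floatsPerVertex * BufferUtil.SIZEOF_FLOAT;

gl.glBindBuffer(GL2.GL_ARRAY_BUFFER, VBOVertices[0]);

gl.glEnableClientState(GL2.GL_VERTEX_ARRAY);
gl.glVertexPointer(3, GL.GL_FLOAT, strideBytes, 0);                          // position at offset 0

gl.glEnableClientState(GL2.GL_NORMAL_ARRAY);
gl.glNormalPointer(GL.GL_FLOAT, strideBytes, 3 * BufferUtil.SIZEOF_FLOAT);   // normal after position

gl.glEnableClientState(GL2.GL_COLOR_ARRAY);
gl.glColorPointer(3, GL.GL_FLOAT, strideBytes, 6 * BufferUtil.SIZEOF_FLOAT); // color after normal

// vertexCount is the number of vertices, not the number of floats.
gl.glDrawArrays(GL2.GL_TRIANGLES, 0, vertexCount);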
Here are the init and display functions.
@Override
public void init(GLAutoDrawable drawable) {
    GL2 gl = drawable.getGL().getGL2();

    gl.glClearColor(0.0f, 0.0f, 6.0f, 0.5f);
    gl.glClearDepth(1.0f);                  // Depth Buffer Setup
    gl.glDepthFunc(GL.GL_LEQUAL);           // The Type Of Depth Testing (Less Or Equal)
    gl.glEnable(GL.GL_DEPTH_TEST);          // Enable Depth Testing
    gl.glDepthFunc(GL2.GL_LESS);

    gl.glEnable(GL2.GL_LIGHTING);
    gl.glEnable(GL2.GL_LIGHT0);
    gl.glEnable(GL2.GL_AUTO_NORMAL);
    gl.glEnable(GL2.GL_NORMALIZE);

    gl.glEnable(GL2.GL_CULL_FACE);
    gl.glFrontFace(GL2.GL_CCW);
    gl.glCullFace(GL2.GL_BACK);

    gl.glHint(GL2.GL_PERSPECTIVE_CORRECTION_HINT, GL2.GL_NICEST);
    gl.glShadeModel(GL2.GL_SMOOTH);

    buildVBOs(gl);
}
@Override
public void display(GLAutoDrawable drawable) {
    GL2 gl = drawable.getGL().getGL2();

    gl.glClearColor(.0f, .0f, .2f, 0.9f);
    gl.glClear(GL2.GL_COLOR_BUFFER_BIT | GL2.GL_DEPTH_BUFFER_BIT);
    gl.glLoadIdentity();

    glu.gluLookAt(45, 0, 0, 0, 0, 0, 0.0, 1.0, 0.0);

    float ma_x = (float) getMax(fx0);
    float mi_x = (float) getMin(fx0);
    float tr_x = (ma_x + mi_x) / 2;

    float ma_y = (float) getMax(fy0);
    float mi_y = (float) getMin(fy0);
    float tr_y = (ma_y + mi_y) / 2;

    float ma_z = (float) getMax(fz0);
    float mi_z = (float) getMin(fz0);
    float tr_z = (ma_z + mi_z) / 2;

    gl.glScalef(scaleFac, scaleFac, scaleFac);
    gl.glRotatef(rotFac, 0, 1, 0);
    gl.glTranslatef(-tr_x, -tr_y, -tr_z);

    for (int i = 0; i < 30; i++) {
        render(gl);
        gl.glRotatef(12, 0, 0, 1);
    }
}
private void createVects(double ang) {
    int cnt = fx0.size();

    for (int i = 0; i < cnt - 1; i++) {
        // Triangle 1 and 2 [Top]
        float x0 = (float) (fx0.get(i) * Math.cos(ang) - fy0.get(i) * Math.sin(ang));
        float y0 = (float) (fx0.get(i) * Math.sin(ang) + fy0.get(i) * Math.cos(ang));
        float z0 = fz0.get(i).floatValue();
        Vect3D v0 = new Vect3D(x0, y0, z0);
        fvert.add(v0); // 0

        float x1 = (float) (fx0.get(i + 1) * Math.cos(ang) - fy0.get(i + 1) * Math.sin(ang));
        float y1 = (float) (fx0.get(i + 1) * Math.sin(ang) + fy0.get(i + 1) * Math.cos(ang));
        float z1 = fz0.get(i + 1).floatValue();
        Vect3D v1 = new Vect3D(x1, y1, z1);
        fvert.add(v1); // 1

        float x2 = (float) (fx1.get(i + 1) * Math.cos(ang) - fy1.get(i + 1) * Math.sin(ang));
        float y2 = (float) (fx1.get(i + 1) * Math.sin(ang) + fy1.get(i + 1) * Math.cos(ang));
        float z2 = fz1.get(i + 1).floatValue();
        Vect3D v2 = new Vect3D(x2, y2, z2);
        fvert.add(v2); // 2

        Vect3D n0 = calcNormal(v0, v1, v2);
        fnorm.add(n0);

        // VBO
        vertices.put(x0); // vertices of the triangle
        vertices.put(y0);
        vertices.put(z0);
        vertices.put(x1);
        vertices.put(y1);
        vertices.put(z1);
        vertices.put(x2);
        vertices.put(y2);
        vertices.put(z2);

        vertices.put(n0.x); // normals
        vertices.put(n0.y);
        vertices.put(n0.z);

        vertices.put(0.5f); // colors // for now
        vertices.put(0.0f);
        vertices.put(0.0f);
    }
}
I am using a camera that has a yaw, a pitch, and a roll. When yaw == 0 the camera is looking down the -z axis (yaw == 90 looks down positive x), when pitch == 270 the camera is looking up (pitch == 0 is looking straight ahead), and when roll == 180 the camera is upside down.
The camera's yaw, pitch, and roll values are always kept between 0 and 360 (when a value passes either end it wraps around to the other side).
I have implemented 3DoF and it works quite nicely; however, when I implemented 6DoF, everything appears to work until the roll is around 90 or 270, at which point strange things happen to the up and right vectors (forward always seems to work, presumably because roll rotates around that axis).
The scene I am rendering is just a bunch of blocks (in Minecraft-style chunks), and I am always able to move forward/backward and use the forward vector to target a block, so I know that the forward vector is correct.
Here is my initGL:
public void initGL() {
    GL11.glEnable(GL11.GL_TEXTURE_2D);
    GL11.glShadeModel(GL11.GL_SMOOTH);
    GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    GL11.glClearDepth(1.0);
    GL11.glEnable(GL11.GL_DEPTH_TEST);
    GL11.glDepthFunc(GL11.GL_LEQUAL);

    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glLoadIdentity();
    GLU.gluPerspective(fov, ((float) Display.getWidth()) / ((float) Display.getHeight() != 0 ? Display.getHeight() : 1), 0.1f, 100.0f); // fov is 45.0f
    GL11.glMatrixMode(GL11.GL_MODELVIEW);
    GL11.glHint(GL11.GL_PERSPECTIVE_CORRECTION_HINT, GL11.GL_NICEST);
}
Here is where I rotate and translate to my camera's view:
public final void lookThrough() {
    GL11.glRotatef(this.roll, 0.0f, 0.0f, 1.0f);
    GL11.glRotatef(this.pitch, 1.0f, 0.0f, 0.0f);
    GL11.glRotatef(this.yaw, 0.0f, 1.0f, 0.0f);
    GL11.glTranslatef(-this.position.x, -this.position.y, -this.position.z);
}
And here are my six degrees of freedom calculations:
public static final double zeroRad = Math.toRadians(0);
public static final double ninetyRad = Math.toRadians(90);
public static final double oneEightyRad = Math.toRadians(180);
public static final double twoSeventyRad = Math.toRadians(270);

public static final strictfp void updateLookVectorsIn6DoF(Vector3f yawPitchAndRoll, Vector3f forward, Vector3f up, Vector3f right) {
    final double yaw = Math.toRadians(yawPitchAndRoll.getX());
    final double pitch = Math.toRadians(yawPitchAndRoll.getY());
    final double roll = Math.toRadians(yawPitchAndRoll.getZ());

    final float sinYaw = ((float) Math.sin(yaw));
    final float cosYaw = ((float) Math.cos(yaw));
    final float sinYaw90 = ((float) Math.sin(yaw + ninetyRad));
    //final float sinYaw180 = ((float) Math.sin(yaw + oneEightyRad));
    final float cosYaw270 = ((float) Math.cos(yaw - ninetyRad));

    final float sinRoll = ((float) Math.sin(roll));
    final float cosRoll = ((float) Math.cos(roll));
    //final float sinRoll180 = ((float) Math.sin(roll + oneEightyRad));

    final float cosPitch90 = ((float) Math.cos(pitch + ninetyRad));
    //final float cosPitch270 = ((float) Math.cos(pitch + twoSeventyRad));
    final float sinPitch90 = ((float) Math.sin(pitch + ninetyRad));
    final float sinPitch270 = ((float) Math.sin(pitch - ninetyRad));

    //Forward: (No roll because roll goes around the Z axis and forward movement is in that axis.)
    float x = sinYaw * ((float) Math.cos(pitch));
    float y = -((float) Math.sin(pitch));
    float z = cosYaw * ((float) Math.cos(pitch - oneEightyRad));
    forward.set(x, y, z);

    //cos(90) = 0, cos(180) = -1, cos(270) = 0, cos(0) = 1
    //sin(90) = 1, sin(180) = 0, sin(270) = -1, sin(0) = 0

    //Up: Strange things occur when roll is near 90 or 270 and yaw is near 0 or 180
    x = -(sinYaw * cosPitch90) * cosRoll - (sinRoll * sinYaw90);
    y = -sinPitch270 * cosRoll;
    z = (cosYaw * cosPitch90) * cosRoll + (sinRoll * cosYaw270);
    up.set(x, y, z);

    //Right: Strange things occur when roll is near 90 or 270 and pitch is near 90 or 270
    x = (cosRoll * sinYaw90) - (sinRoll * (sinYaw * cosPitch90));
    y = 0 - (sinRoll * sinPitch90); //This axis works fine
    z = (cosRoll * cosYaw270) + (sinRoll * (sinYaw * cosPitch90));
    right.set(x, y, z);
}
I did find a very similar question here, but it uses matrices and quaternions and I don't want to go that route unless I absolutely have to (and I was careful to multiply roll, pitch, and yaw in the correct order): LWJGL - Problems implementing 'roll' in a 6DOF Camera using quaternions and a translation matrix
So I finally got the hang of the meaning of cos and sin (but don't ask me to teach it) and was able to get this working!
Here is the new and improved code:
public static final double zeroRad = Math.toRadians(0);
public static final double ninetyRad = Math.toRadians(90);
public static final double oneEightyRad = Math.toRadians(180);
public static final double twoSeventyRad = Math.toRadians(270);

public static final strictfp void updateLookVectorsIn6DoF(Vector3f yawPitchAndRoll, Vector3f forward, Vector3f up, Vector3f right) {
    final double yaw = Math.toRadians(yawPitchAndRoll.getX());
    final double pitch = Math.toRadians(yawPitchAndRoll.getY());
    final double roll = Math.toRadians(yawPitchAndRoll.getZ());

    final float sinYaw = ((float) Math.sin(yaw));
    final float cosYaw = ((float) Math.cos(yaw));
    final float sinYaw90 = ((float) Math.sin(yaw + ninetyRad));
    final float sinYaw270 = ((float) Math.sin(yaw - ninetyRad)); //+ twoSeventyRad));
    final float cosYaw90 = ((float) Math.cos(yaw + ninetyRad));
    final float cosYaw180 = ((float) Math.cos(yaw + oneEightyRad));
    final float cosYaw270 = ((float) Math.cos(yaw - ninetyRad)); //+ twoSeventyRad));

    final float sinRoll = ((float) Math.sin(roll));
    final float cosRoll = ((float) Math.cos(roll));
    final float cosRoll180 = ((float) Math.cos(roll + oneEightyRad));

    final float cosPitch90 = ((float) Math.cos(pitch + ninetyRad));
    final float sinPitch90 = ((float) Math.sin(pitch + ninetyRad));
    final float sinPitch270 = ((float) Math.sin(pitch - ninetyRad));

    //Forward: (No roll because roll goes around the Z axis and forward movement is in that axis.)
    float x = sinYaw * ((float) Math.cos(pitch));
    float y = -((float) Math.sin(pitch));
    float z = cosYaw * ((float) Math.cos(pitch - oneEightyRad));
    forward.set(x, y, z);

    //Multiply in this order: roll, pitch, yaw
    //cos(90) = 0, cos(180) = -1, cos(270) = 0, cos(0) = 1
    //sin(90) = 1, sin(180) = 0, sin(270) = -1, sin(0) = 0
    //hmm... gimbal lock, eh? No!

    //Up:
    x = (cosRoll180 * cosPitch90 * sinYaw) - (sinRoll * cosYaw180);
    y = -sinPitch270 * cosRoll;
    z = (cosRoll * cosPitch90 * cosYaw) + (sinRoll * sinYaw);
    up.set(x, y, z);

    //Right:
    x = (cosRoll * sinYaw90) - (sinRoll * cosPitch90 * cosYaw90);
    y = 0 - (sinRoll * sinPitch90); //This axis works fine
    z = (cosRoll * cosYaw270) + (sinRoll * cosPitch90 * sinYaw270);
    right.set(x, y, z);
}
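For anyone wanting to sanity-check trig like this without switching to quaternions, here is a hedged, plain-Java sketch that derives the same basis vectors from small rotation matrices (the helper names are my own, not part of the code above). Since lookThrough() applies roll, then pitch, then yaw, the modelview rotation is R = Rz(roll) * Rx(pitch) * Ry(yaw), which maps world directions into eye space; the camera's world-space right, up, and forward are then the rows of R, with forward negated because the camera looks down -Z. Plugging in yaw = pitch = roll = 0 gives forward = (0, 0, -1), matching the convention described at the top of this question.

/**
 * Cross-check: build R = Rz(roll) * Rx(pitch) * Ry(yaw) and read the camera's
 * world-space basis vectors from its rows (forward is the negated third row).
 * The float[3] out-parameters are placeholders for whatever vector type is in use.
 */
public static void basisFromYawPitchRoll(double yawDeg, double pitchDeg, double rollDeg,
                                         float[] forward, float[] up, float[] right) {
    double yaw = Math.toRadians(yawDeg);
    double pitch = Math.toRadians(pitchDeg);
    double roll = Math.toRadians(rollDeg);

    double[][] r = multiply(multiply(rotZ(roll), rotX(pitch)), rotY(yaw));

    right[0]   = (float)  r[0][0]; right[1]   = (float)  r[0][1]; right[2]   = (float)  r[0][2];
    up[0]      = (float)  r[1][0]; up[1]      = (float)  r[1][1]; up[2]      = (float)  r[1][2];
    forward[0] = (float) -r[2][0]; forward[1] = (float) -r[2][1]; forward[2] = (float) -r[2][2];
}

private static double[][] rotX(double a) {
    return new double[][]{{1, 0, 0}, {0, Math.cos(a), -Math.sin(a)}, {0, Math.sin(a), Math.cos(a)}};
}

private static double[][] rotY(double a) {
    return new double[][]{{Math.cos(a), 0, Math.sin(a)}, {0, 1, 0}, {-Math.sin(a), 0, Math.cos(a)}};
}

private static double[][] rotZ(double a) {
    return new double[][]{{Math.cos(a), -Math.sin(a), 0}, {Math.sin(a), Math.cos(a), 0}, {0, 0, 1}};
}

private static double[][] multiply(double[][] a, double[][] b) {
    double[][] out = new double[3][3];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            for (int k = 0; k < 3; k++)
                out[i][j] += a[i][k] * b[k][j];
    return out;
}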