Applying a map-of-the-earth texture to a sphere - Java

I have been trying to implement a 3D animation of a solar system in OpenGL (using JOGL). So far I have five planets of different sizes, but the problem I seem to be having is that I can't apply a map-of-the-earth texture to a sphere. Can anybody help me with how it's done?
This is the code I have so far in my display method:
@Override
public void display(GLAutoDrawable drawable) {
GL2 gl = drawable.getGL().getGL2();
GLU glu = new GLU();
gl.glClear(GL.GL_COLOR_BUFFER_BIT);
//make sure we are in model_view mode
gl.glMatrixMode(GL2.GL_MODELVIEW);
gl.glLoadIdentity();
glu.gluLookAt(10,20,20,0,3,0,0, 20, 0);
//gl.glMatrixMode(GL2.GL_PROJECTION);
//glu.gluPerspective(45,1,1,25);
//render first sphere
gl.glPushMatrix();
gl.glTranslatef(-10.75f, 3.0f, -1.0f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth, GLU.GLU_FILL);
glu.gluQuadricNormals(earth, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth, GLU.GLU_OUTSIDE);
final float radius = 3.378f;
final int slices = 89;
final int stacks = 16;
glu.gluSphere(earth, radius, slices, stacks);
glu.gluDeleteQuadric(earth);
Texture earths;
try {
earths = TextureIO.newTexture(new File("earth.png"), true);
}
catch (IOException e) {
javax.swing.JOptionPane.showMessageDialog(null, e);
}
gl.glPopMatrix();
//gl.glEnd();
gl.glPushMatrix();
gl.glTranslatef(2.75f, 3.0f, -0.0f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth1 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth1, GLU.GLU_FILL);
glu.gluQuadricNormals(earth1, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth1, GLU.GLU_OUTSIDE);
final float radius1 = 3.378f;
final int slices1 = 90;
final int stacks1 = 63;
glu.gluSphere(earth1, radius1, slices1, stacks1);
glu.gluDeleteQuadric(earth1);
gl.glPopMatrix();
gl.glPushMatrix();
gl.glTranslatef(3.75f, 6.0f, -7.20f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth3 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth3, GLU.GLU_FILL);
glu.gluQuadricNormals(earth3, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth3, GLU.GLU_OUTSIDE);
final float radius3 = 1.878f;
final int slices3 = 89;
final int stacks3 = 16;
glu.gluSphere(earth3, radius3, slices3, stacks3);
glu.gluDeleteQuadric(earth3);
gl.glPopMatrix();
gl.glPushMatrix();
gl.glTranslatef(12.75f, 2.0f, -7.20f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth4 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth4, GLU.GLU_FILL);
glu.gluQuadricNormals(earth4, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth4, GLU.GLU_OUTSIDE);
final float radius4 = 1.078f;
final int slices4 = 89;
final int stacks4 = 16;
glu.gluSphere(earth4, radius4, slices4, stacks4);
glu.gluDeleteQuadric(earth4);
gl.glPopMatrix();
gl.glPushMatrix();
gl.glTranslatef(2.75f, -6.0f, -0.0f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth5 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth5, GLU.GLU_FILL);
glu.gluQuadricNormals(earth5, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth5, GLU.GLU_OUTSIDE);
final float radius5 = 3.778f;
final int slices5 = 90;
final int stacks5 = 63;
glu.gluSphere(earth5, radius5, slices5, stacks5);
glu.gluDeleteQuadric(earth5);
gl.glPopMatrix();
}

create your own sphere mesh
A simple 2D loop through two angles (spherical coordinates to Cartesian). You can easily add ellipsoid properties (the Earth is not a perfect sphere) if you want more precision. If not, you can use a single sphere mesh for all planets and just scale it before use ...
Let a be the longitude and b the latitude, so loop a from 0 to 2*PI [rad] and b from -0.5*PI to +0.5*PI [rad], where PI=3.1415... is Pi (in C++ math.h it is called M_PI). If your math API uses degrees, then convert: PI [rad] = 180.0 [deg].
add necessary info per vertex
normals for lighting
// just unit sphere
nx=cos(b)*cos(a);
ny=cos(b)*sin(a);
nz=sin(b);
texture coordinates (assuming a rectangular, non-distorted image)
// just convert a,b to the <0,1> range
tx=a/(2.0*PI);
ty=(b/PI)+0.5;
vertex position
// just sphere(rx=ry=rz=r) or ellipsoid (rx=ry=equatorial and rz=polar radius)
// can also use rx*nx,ry*ny,rz*nz instead ...
x=rx*cos(b)*cos(a);
y=ry*cos(b)*sin(a);
z=rz*sin(b);
send all of this to OpenGL
Store all of the above in some memory space (CPU or GPU) and then send it to rendering. You can use the legacy glBegin(GL_QUAD_STRIP); ... glEnd(); approach, or display lists/VBOs/VAOs. Bind the right texture before each planet/body, and do not forget to update the ModelView matrix too.
Also have a look at these related Q/As:
realistic n-body solar system
sphere mesh by subdivision
[edit1] C++ example
//---------------------------------------------------------------------------
const int nb=15; // slices
const int na=nb<<1; // points per equator
class planet
{
public:
bool _init; // has been initiated ?
GLfloat x0,y0,z0; // center of planet [GCS]
GLfloat pos[na][nb][3]; // vertex
GLfloat nor[na][nb][3]; // normal
GLfloat txr[na][nb][2]; // texcoord
GLuint txrid; // texture id
GLfloat t; // daily rotation angle [deg]
planet() { _init=false; txrid=0; x0=0.0; y0=0.0; z0=0.0; t=0.0; }
~planet() { if (_init) glDeleteTextures(1,&txrid); }
void init(GLfloat r,AnsiString texture); // call after OpenGL is already working !!!
void draw();
};
void planet::init(GLfloat r,AnsiString texture)
{
if (!_init) { _init=true; glGenTextures(1,&txrid); }
GLfloat x,y,z,a,b,da,db;
GLfloat tx0,tdx,ty0,tdy;// just correction if CLAMP_TO_EDGE is not available
int ia,ib;
// a,b to texture coordinate system
tx0=0.0;
ty0=0.5;
tdx=0.5/M_PI;
tdy=1.0/M_PI;
// load texture to GPU memory
if (texture!="")
{
Byte q;
unsigned int *pp;
int xs,ys,x,y,adr,*txr;
union { unsigned int c32; Byte db[4]; } c;
Graphics::TBitmap *bmp=new Graphics::TBitmap; // new bmp
bmp->LoadFromFile(texture); // load from file
bmp->HandleType=bmDIB; // allow direct access to pixels
bmp->PixelFormat=pf32bit; // set pixel to 32bit so int is the same size as pixel
xs=bmp->Width; // resolution should be power of 2
ys=bmp->Height;
txr=new int[xs*ys];
for(adr=0,y=0;y<ys;y++)
{
pp=(unsigned int*)bmp->ScanLine[y];
for(x=0;x<xs;x++,adr++)
{
// rgb2bgr and copy bmp -> txr[]
c.c32=pp[x];
q =c.db[2];
c.db[2]=c.db[0];
c.db[0]=q;
txr[adr]=c.c32;
}
}
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,txrid);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE,GL_MODULATE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, xs, ys, 0, GL_RGBA, GL_UNSIGNED_BYTE, txr);
glDisable(GL_TEXTURE_2D);
delete bmp;
delete[] txr;
// texture coordinates by 1 pixel from each edge (GL_CLAMP_TO_EDGE)
tx0+=1.0/GLfloat(xs);
ty0+=1.0/GLfloat(ys);
tdx*=GLfloat(xs-2)/GLfloat(xs);
tdy*=GLfloat(ys-2)/GLfloat(ys);
}
// correct texture coordinate system (invert x)
tx0=1.0-tx0; tdx=-tdx;
da=(2.0*M_PI)/GLfloat(na-1);
db= M_PI /GLfloat(nb-1);
for (ib=0,b=-0.5*M_PI;ib<nb;ib++,b+=db)
for (ia=0,a= 0.0 ;ia<na;ia++,a+=da)
{
x=cos(b)*cos(a);
y=cos(b)*sin(a);
z=sin(b);
nor[ia][ib][0]=x;
nor[ia][ib][1]=y;
nor[ia][ib][2]=z;
pos[ia][ib][0]=r*x;
pos[ia][ib][1]=r*y;
pos[ia][ib][2]=r*z;
txr[ia][ib][0]=tx0+(a*tdx);
txr[ia][ib][1]=ty0+(b*tdy);
}
}
void planet::draw()
{
if (!_init) return;
int ia,ib0,ib1;
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glTranslatef(x0,y0,z0);
glRotatef(90.0,1.0,0.0,0.0); // rotate planets z axis (North) to OpenGL y axis (Up)
glRotatef(-t,0.0,0.0,1.0); // daily rotation around the planet's own axis
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,txrid);
glColor3f(1.0,1.0,1.0);
for (ib0=0,ib1=1;ib1<nb;ib0=ib1,ib1++)
{
glBegin(GL_QUAD_STRIP);
for (ia=0;ia<na;ia++)
{
glNormal3fv (nor[ia][ib0]);
glTexCoord2fv(txr[ia][ib0]);
glVertex3fv (pos[ia][ib0]);
glNormal3fv (nor[ia][ib1]);
glTexCoord2fv(txr[ia][ib1]);
glVertex3fv (pos[ia][ib1]);
}
glEnd();
}
glDisable(GL_TEXTURE_2D);
glMatrixMode(GL_MODELVIEW);
glPopMatrix();
}
//---------------------------------------------------------------------------
usage:
// variable to store planet (global)
planet earth;
// init after OpenGL initialisation
earth.init(1.0,"earth.bmp");
// position update
earth.x0= 0.0;
earth.y0= 0.0;
earth.z0=-20.0;
// add this to render loop
earth.draw(); // draws the planet
earth.t+=2.5; // just rotate planet by 2.5 deg each frame...
I know it's ugly, but it does not use any funny stuff, just legacy OpenGL, math.h (cos(), sin(), M_PI) and VCL for bitmap loading. So rewrite it to your environment and you will be fine. Do not forget that each planet has its own texture, so you need one txrid per planet: either keep each planet in a separate planet variable or rewrite accordingly ...

Related

How do I change my OpenGL object color only when I touch it?

I am trying to change my OpenGL square's color only when it is touched. I looked at some good sources online to see how I could find the coordinates needed to change its color, e.g. Converting pixel co-ordinates to normalized co-ordinates at draw time in OpenGL 3.0. However, I am still confused about how to translate my square's position or the onTouchEvent input coordinates into the OpenGL code (vertexShaderCode). I have tried to track my square's coordinates directly in onTouchEvent, but it tracks the position incorrectly, since I am working with two different coordinate systems (OpenGL and Android Studio).
//THIS IS NOT MY FULL CODE
public boolean onTouchEvent(MotionEvent e) {
// MotionEvent reports input details from the touch screen
// and other input controls. In this case, you are only
// interested in events where the touch position changed.
float x = e.getX();
float y = e.getY();
colorHolder = renderer.getmSquare().getColor();
switch (e.getAction()) {
case MotionEvent.ACTION_DOWN:
//THIS IS MY PROBLEM. I DON'T KNOW A GOOD WAY OF TRACKING THE SQUARE'S POSITION BESIDES
//ADDING VARIABLES TO ITS MAIN CLASS AND THEN REFERENCING THEM HERE
if(renderer.mSquareY > (y / getHeight()) && renderer.mSquareX > (x / getWidth()))
renderer.getmSquare().color = tempColor;
case MotionEvent.ACTION_MOVE:
float dx = x - previousX;
float dy = y - previousY;
float tempHeight = y / getHeight();
float tempWidth = x / getWidth();
//THIS IS MY PROBLEM. I DON'T KNOW A GOOD WAY OF TRACKING THE SQUARE'S POSITION BESIDES ADDING VARIABLES TO ITS MAIN CLASS AND THEN REFERENCING THEM HERE
if(renderer.mSquareY < (y / getHeight()) && renderer.mSquareX < (x / getWidth()))
renderer.getmSquare().color = tempColor;
renderer.mSquareX = (x / getWidth());
renderer.mSquareY = (y / getHeight());
...
I have three classes that handle creating the square, rendering, and the main activity, in that order: Square.java, MyGLRenderer.java, MyGLSurfaceView.java.
public class Square {
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
// The matrix must be included as a modifier of gl_Position.
// Note that the uMVPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
" gl_Position = uMVPMatrix * vPosition;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
private FloatBuffer vertexBuffer;
private ShortBuffer drawListBuffer;
private final int mProgram;
private int mPositionHandle;
private int mColorHandle;
private int mMVPMatrixHandle;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
static float squareCoords[] = {
0.5f, 0.5f, 0.0f, // top right
0.5f, -0.5f, 0.0f, // bottom right
-0.5f, -0.5f, 0.0f, // bottom left
-0.5f, 0.5f, 0.0f }; // top left
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
public float[] getColor() {
return color;
}
public void setColor(float[] color) {
this.color = color;
}
float color[] = { 0.2f, 0.709803922f, 0.898039216f, 1.0f };
public Square() {
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (# of coordinate values * 4 bytes per float)
squareCoords.length * 4);
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(squareCoords);
vertexBuffer.position(0);
// initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(
// (# of coordinate values * 2 bytes per short)
drawOrder.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
// prepare shaders and OpenGL program
int vertexShader = MyGLRenderer.loadShader(
GLES20.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(
GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram); // create OpenGL program executables
}
public void draw(float[] mvpMatrix) {
// Add program to OpenGL environment
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(
mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
//MyGLRenderer.checkGlError("glGetUniformLocation");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//MyGLRenderer.checkGlError("glUniformMatrix4fv");
// Draw the square
GLES20.glDrawElements(
GLES20.GL_TRIANGLES, drawOrder.length,
GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
public class MyGLRenderer implements GLSurfaceView.Renderer {
private Triangle mTriangle;
public Square getmSquare() {
return mSquare;
}
public void setmSquare(Square mSquare) {
this.mSquare = mSquare;
}
private Square mSquare;
private Circle mCircle;
// vPMatrix is an abbreviation for "Model View Projection Matrix"
private final float[] vPMatrix = new float[16];
private final float[] projectionMatrix = new float[16];
private final float[] viewMatrix = new float[16];
private float[] rotationMatrix = new float[16];
private float[] translationMatrix = new float[16];
private float[] scaleMatrix = new float[16];
public volatile float mAngle;
public float mSquareX = 1.5f;
public float mSquareY = 0.0f;
public float mRadius = 1.0f;
public float getAngle() {
return mAngle;
}
public void setAngle(float angle) {
mAngle = angle;
}
public void onSurfaceCreated(GL10 unused, EGLConfig eglconfig) {
// Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
// initialize a triangle
mTriangle = new Triangle();
// initialize a square
mSquare = new Square();
// initialize a square
mCircle = new Circle();
}
@Override
public void onDrawFrame(GL10 unused) {
float[] scratch = new float[16];
float[] movementSquare = new float[16];
float[] scaleCircle = new float[16];
float tempscaleFactor = 1.0f * mRadius;
// Redraw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix)
Matrix.setLookAtM(viewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(vPMatrix, 0, projectionMatrix, 0, viewMatrix, 0);
// Create a rotation transformation for the triangle
//long time = SystemClock.uptimeMillis() % 4000L;
//float angle = 0.090f * ((int) time);
Matrix.setRotateM(rotationMatrix, 0, mAngle, 0, 0, -1.0f);
Matrix.setIdentityM(translationMatrix,0);
Matrix.translateM(translationMatrix, 0, mSquareX, mSquareY,0);
//THIS PROBLEM HERE IS THAT MY CIRCLE TRANSLATES ON THE X-AXIS WHEN SCALING. MY GOAL IS TO TRY AND KEEP IT IN PLACE WHILE IT'S BEING SCALED. THE Y-AXIS HAS NO ISSUES
Matrix.setIdentityM(scaleMatrix, 0);
Matrix.scaleM(scaleMatrix, 0, mRadius, mRadius, 0);
if(mRadius != 1f)
Matrix.translateM(scaleMatrix, 0, -(1 + (mRadius / 2)),0,0);
// Combine the rotation matrix with the projection and camera view
// Note that the vPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
Matrix.multiplyMM(movementSquare, 0, vPMatrix, 0, translationMatrix, 0);
Matrix.multiplyMM(scratch, 0, vPMatrix, 0, rotationMatrix, 0);
Matrix.multiplyMM(scaleCircle, 0, vPMatrix, 0, scaleMatrix, 0);
// Draw shape
mTriangle.draw(scratch);
mSquare.draw(movementSquare);
mCircle.draw(scaleCircle);
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
// this projection matrix is applied to object coordinates
// in the onDrawFrame() method
Matrix.frustumM(projectionMatrix, 0, -ratio, ratio, -1, 1, 2, 7);
}
public static int loadShader(int type, String shaderCode){
// create a vertex shader type (GLES20.GL_VERTEX_SHADER)
// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
// add the source code to the shader and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}

OpenGL first steps with light and normals

I'm playing around with some basic OpenGL stuff and I'm trying to set up a simple square with lighting enabled, but the lighting is not correct, so I guess there is something wrong with my normals.
Or is my understanding of normals totally wrong?
Here's my rendering code (by the way, I'm using LWJGL):
public class Renderer {
DisplayMode displayMode;
int i;
int width;
int height;
private boolean drawAxes = false;
private float rotation = 40.0f;
private float zoom = -20f;
// ----------- Variables added for Lighting Test -----------//
private FloatBuffer matSpecular;
private FloatBuffer lightPosition;
private FloatBuffer whiteLight;
private FloatBuffer lModelAmbient;
public Renderer(int width, int height) {
this.width = width;
this.height = height;
}
public static Renderer start() throws LWJGLException {
Renderer r = new Renderer(800, 600);
r.initContext();
r.run();
return r;
}
private void initContext() throws LWJGLException {
Display.setFullscreen(false);
DisplayMode d[] = Display.getAvailableDisplayModes();
for (int i = 0; i < d.length; i++) {
if (d[i].getWidth() == width && d[i].getHeight() == height && d[i].getBitsPerPixel() == 32) {
displayMode = d[i];
break;
}
}
Display.setDisplayMode(displayMode);
Display.create();
}
private void run() {
initGL();
while (!Display.isCloseRequested()) {
preRender();
render();
Display.update();
Display.sync(60);
}
Display.destroy();
}
private void initGL() {
GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f); // Black Background
GL11.glClearDepth(1.0); // Depth Buffer Setup
GL11.glEnable(GL11.GL_DEPTH_TEST); // Enables Depth Testing
GL11.glDepthFunc(GL11.GL_LEQUAL); // The Type Of Depth Testing To Do
GL11.glMatrixMode(GL11.GL_PROJECTION); // Select The Projection Matrix
GL11.glLoadIdentity(); // Reset The Projection Matrix
// Calculate The Aspect Ratio Of The Window
GLU.gluPerspective(45.0f, (float) displayMode.getWidth() / (float) displayMode.getHeight(), 0.1f, 100.0f);
GL11.glMatrixMode(GL11.GL_MODELVIEW); // Select The Modelview Matrix
// Really Nice Perspective Calculations
GL11.glHint(GL11.GL_PERSPECTIVE_CORRECTION_HINT, GL11.GL_NICEST);
GL11.glPolygonMode(GL11.GL_FRONT_AND_BACK, GL11.GL_FILL);
initLightArrays();
glShadeModel(GL_SMOOTH);
glMaterial(GL_FRONT, GL_SPECULAR, matSpecular); // sets specular material color
glMaterialf(GL_FRONT, GL_SHININESS, 100.0f); // sets shininess
glLight(GL_LIGHT0, GL_POSITION, lightPosition); // sets light position
glLight(GL_LIGHT0, GL_SPECULAR, whiteLight); // sets specular light to white
glLight(GL_LIGHT0, GL_DIFFUSE, whiteLight); // sets diffuse light to white
glLightModel(GL_LIGHT_MODEL_AMBIENT, lModelAmbient); // global ambient light
glEnable(GL_LIGHTING); // enables lighting
glEnable(GL_LIGHT0); // enables light0
glEnable(GL_COLOR_MATERIAL); // enables opengl to use glColor3f to define material color
glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE); // tell opengl glColor3f effects the ambient and diffuse properties of material
}
private void preRender() {
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
GL11.glTranslatef(0f, 0f, zoom);
GL11.glRotatef(-60f, 1f, 0f, 0f);
GL11.glRotatef(rotation, 0f, 0f, 1f);
}
private void render() {
FloatBuffer cBuffer = BufferUtils.createFloatBuffer(6*3);
float[] cArray = { 1f,1f,1f,
1f,1f,1f,
1f,1f,1f,
1f,1f,1f,
1f,1f,1f,
1f,1f,1f};
cBuffer.put(cArray);
cBuffer.flip();
FloatBuffer vBuffer = BufferUtils.createFloatBuffer(6*3);
float[] vArray = { 1f,1f,0f,
-1f,-1f,0,
1f,-1f,0,
1f,1f,0f,
-1f,1f,0,
-1f,-1f,0};
vBuffer.put(vArray);
vBuffer.flip();
FloatBuffer nBuffer = BufferUtils.createFloatBuffer(6*3);
float[] nArray = { 0f,0f,1f,
0f,0f,1f,
0f,0f,1f,
0f,0f,1f,
0f,0f,1f,
0f,0f,1f};
nBuffer.put(nArray);
nBuffer.flip();
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glColorPointer(3, 0, cBuffer);
glVertexPointer(3, 0, vBuffer);
glNormalPointer(3, nBuffer);
glDrawArrays(GL_TRIANGLES, 0, 6);
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);
if (drawAxes) {
drawAxes(6);
}
glTranslatef(0.0f, 0.0f, 3);
glColor3f(0.1f, 0.4f, 0.9f);
}
public static void main(String[] args) throws LWJGLException {
System.setProperty("org.lwjgl.opengl.Display.allowSoftwareOpenGL", "true");
Renderer.start();
}
You are setting your normal pointer wrong:
glColorPointer(3, 0, cBuffer);
glVertexPointer(3, 0, vBuffer);
glNormalPointer(3, nBuffer);
The fixed-function GL always expects normals to be 3-dimensional vectors, hence the size parameter (which tells the GL how many values there are in each vector) is not present in glNormalPointer. The 3 you are setting here is the stride parameter, which specifies the byte offset between consecutive array elements. A stride of 3 does not make any sense: it would make the GL read the second normal beginning 3 bytes into the array, so it would combine the last byte of your first normal's x component with 3 bytes of your first normal's y component when it reads the second normal's x component, and so on...
Since your array is tightly packed, you can use the shortcut 0 here, like you do with the other pointers.
However, you must be aware that all of this has been deprecated for almost a decade; modern core versions of OpenGL do not support the fixed-function pipeline at all. If you are learning OpenGL nowadays, I strongly recommend learning modern, shader-based GL instead.
Without seeing more of your code, it's very difficult to see exactly what's going wrong.
However, I do see one thing that could be a problem:
FloatBuffer vBuffer = BufferUtils.createFloatBuffer(6*3);
float[] vArray = { 1f,1f,0f,
1f,-1f,0,
-1f,-1f,0,
1f,1f,0f,
-1f,1f,0,
-1f,-1f,0};
vBuffer.put(vArray);
vBuffer.flip();
The winding order of your triangles is not the same. The first triangle winds clockwise, whereas the second triangle winds counter-clockwise. You'll need to reorder the vertices to make sure that they wind in the same direction. OpenGL usually prefers things to wind counter-clockwise, so if I were you, I'd flip the first triangle.
If you're still getting the problem after you've done this, then post the rest of your draw code, as what you're showing here doesn't give a lot of information.

OpenGL ES 3.0: call an object by reference to add gravity and an onClick event

I am new to OpenGL and I am trying to find a way to give a drawn object, such as a triangle, an ID of some kind. This way I can refer to the object by its ID and give it motion, as well as add touch events.
I am not sure if this is the correct way or if there is a much better way to do this. I have objects drawn but do not know how to refer to them to give them motion or an onClick event. I have looked around, but a lot of the approaches seem to be outdated and no longer work, or the link is now dead.
I have a Renderer as such:
public class MyGLRenderer implements GLSurfaceView.Renderer {
private Triangle mTriangle;
// Called once to set up the view's opengl es environment
public void onSurfaceCreated(GL10 unused, EGLConfig config){
//Set the background frame color
GLES30.glClearColor(255.0f,255.0f,255.0f,0.0f);
mTriangle = new Triangle();
}
// Called for each redraw of the view
public void onDrawFrame(GL10 gl){
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
//Redraw background color
GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT);
mTriangle.draw();
}
// Called if the geometry of the view changes (example is when the screen orientation changes from landscape to portrait
public void onSurfaceChanged(GL10 unused, int width, int height){
// Called if the geometry of the viewport changes
GLES30.glViewport(0, 0, width, height);
}
public static int loadShader(int type, String shaderCode){
// create a vertex shader type (GLES30.GL_VERTEX_SHADER)
// or a fragment shader type (GLES30.GL_FRAGMENT_SHADER)
int shader = GLES30.glCreateShader(type);
// add the source code to the shader and compile it
GLES30.glShaderSource(shader, shaderCode);
GLES30.glCompileShader(shader);
return shader;
}
}
Surface View as Such:
public class MyGLSurfaceView extends GLSurfaceView {
private final MyGLRenderer mRenderer;
public MyGLSurfaceView(Context context, AttributeSet attrs){
super(context, attrs);
//Create an OpenGl 3.0 context
setEGLContextClientVersion(3);
mRenderer = new MyGLRenderer();
//Set the Renderer for drawing on the GLSurfaceView
setRenderer(mRenderer);
//Render the view only when there is a change in the drawing data
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
}
And a Triangle Class as such:
public class Triangle {
private FloatBuffer vertexBuffer;
private final String vertexShaderCode =
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vPosition;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
private final int mProgram;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
static float triangleCoords[] = { // in counterclockwise order:
0.0f, 0.622008459f, 0.0f, // top
-0.5f, -0.311004243f, 0.0f, // bottom left
0.5f, -0.311004243f, 0.0f // bottom right
};
// Set color with red, green, blue and alpha (opacity) values
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
public Triangle() {
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (number of coordinate values * 4 bytes per float)
triangleCoords.length * 4);
// use the device hardware's native byte order
bb.order(ByteOrder.nativeOrder());
// create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
// add the coordinates to the FloatBuffer
vertexBuffer.put(triangleCoords);
// set the buffer to read the first coordinate
vertexBuffer.position(0);
int vertexShader = MyGLRenderer.loadShader(GLES30.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(GLES30.GL_FRAGMENT_SHADER,
fragmentShaderCode);
// create empty OpenGL ES Program
mProgram = GLES30.glCreateProgram();
// add the vertex shader to program
GLES30.glAttachShader(mProgram, vertexShader);
// add the fragment shader to program
GLES30.glAttachShader(mProgram, fragmentShader);
// creates OpenGL ES program executables
GLES30.glLinkProgram(mProgram);
}
private int mPositionHandle;
private int mColorHandle;
private final int vertexCount = triangleCoords.length / COORDS_PER_VERTEX;
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
public void draw() {
// Add program to OpenGL ES environment
GLES30.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES30.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES30.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES30.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES30.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES30.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the triangle
GLES30.glUniform4fv(mColorHandle, 1, color, 0);
// Draw the triangle
GLES30.glDrawArrays(GLES30.GL_TRIANGLES, 0, vertexCount);
// Disable vertex array
GLES30.glDisableVertexAttribArray(mPositionHandle);
}
I would like to be able to do something like the following:
drawnTriangleObject_ID.Add gravity to move down or up
drawnTriangleObject_ID.OnClick( // Do Something when this object is clicked )
In your triangle class, add positioning data
float x = 0.0f;
float y = 0.0f;
float z = 0.0f;
and when drawing the triangle, you have to apply the translation
Matrix.setIdentityM(modelmatrix, 0);
Matrix.translateM(modelmatrix, 0, x, y, z);
then multiply the model matrix by the view matrix
Matrix.multiplyMM(resultmodelview, 0, viewmatrix, 0, modelmatrix, 0);
then multiply the result with projection matrix
Matrix.multiplyMM(resultresultprojection, 0, ProjectionMatrix, 0, resultmodelview, 0);
and publish it
World.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, resultresultprojection, 0);
This all assumes you have the frustum and the projection and view matrices built already (which, since you can see the triangle, may already be done)..
Good luck and have fun coding!
As for the "OnClick" it is a bit more tricky:
- Once the screen has been clicked, cast a ray from your XY plance (screen) to the 3D world. the ray (line) will have 2 coordinates, a start point and end point in form of X,Y,Z, or might have a start point and a vector(direction of the line)... On each frame you have to check if this ray is created, if it is, you need to use some Math to check if that line intersects your triangle (Don't forget to apply the rotation and translation of the triangle to it's vertices before checking intersection with the Ray). when the frame has been drawn don't forget to delete the ray

Animating Multiple sprites in Android OpenGL ES 2.0

I've spent days searching, trying tutorials, and not actually getting results with this, so here I am.
I'm trying, simply put, to animate a collection of objects (in Android Studio) on the screen in a 2D format, each with independent movement and rotation. However, when I try this, I'm either not getting the object rendered, or it renders skewed (as if rotated through the vertical Y axis).
I know the importance of the order in which objects are drawn (to give the correct Z-ordering appearance); however, I'm at a bit of a loss with the matrix manipulation.
Here is what I have so far:
Main Activity - standard stuff
private GLSurfaceView mGLSurfaceView;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mGLSurfaceView = new GLSurfaceView(this);
//check if device supports ES 2.0
final ActivityManager activityManager = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
final ConfigurationInfo configurationInfo = activityManager.getDeviceConfigurationInfo();
final boolean supportsEs2 = configurationInfo.reqGlEsVersion >= 0x20000;
if (supportsEs2) {
//Get the ES2 compatible context
mGLSurfaceView.setEGLContextClientVersion(2);
//set renderer to my renderer below
mGLSurfaceView.setRenderer(new MyGL20Renderer(this));
} else {
//no support
return;
}
//setContentView(R.layout.activity_main);
setContentView(mGLSurfaceView);
}
GL20Renderer class - Notice I'm now just manually adding 2 objects to my collection to render
public class MyGL20Renderer implements GLSurfaceView.Renderer
{
private final Context mActivityContext;
//Matrix Initializations
private final float[] mMVPMatrix = new float[16];
private final float[] mProjMatrix = new float[16];
private final float[] mVMatrix = new float[16];
private float[] mRotationMatrix = new float[16];
private final float[] mRotateMatrix = new float[16];
private final float[] mMoveMatrix = new float[16];
private final float[] mTempMatrix = new float[16];
private final float[] mModelMatrix = new float[16];
private int numObjects = 2;
private ArrayList<Sprite> spriteList = new ArrayList<Sprite>();
//Declare as volatile because we are updating it from another thread
public volatile float mAngle;
//private Triangle triangle;
//private Sprite sprite;
public MyGL20Renderer(final Context activityContext)
{
mActivityContext = activityContext;
}
public void onSurfaceCreated(GL10 unused, EGLConfig config)
{
//Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
//Set the camera position (View Matrix) //mtx, offset, eyex,y,z, centrex,y,z, upx,y,z
Matrix.setLookAtM(mVMatrix, 0,
0, 0, -1.5f, //Eye XYZ - position eye behind the origin
0f, 0f, -5.0f, //Look XYZ - We are looking toward the distance
0f, 1.0f, 0.0f); //Up XYZ - Up vector - where head would be pointing if holding the camera
//Initialize Shapes
//triangle = new Triangle();
//sprite = new Sprite(mActivityContext);
//Sprite newSprite;
float xMax = 2.0f;
float yMax = 2.0f;
//rand = 0->1
float newX = (new Random().nextFloat() * xMax * 2) - xMax; //2.0f; //-2 -> +2
float newY = (new Random().nextFloat() * yMax * 2) - yMax; //-3 -> +3
float newZ = 0f;
//for (int i=0; i<numObjects; i++) {
//newSprite = new Sprite(mActivityContext);
//spriteList.add(new Sprite(mActivityContext, newX, newY, newZ));
//}
spriteList.add(new Sprite(mActivityContext, -0.0f, -0.0f, 0.0f));
spriteList.add(new Sprite(mActivityContext, +0.5f, -0.5f, 0.0f));
//spriteList.add(new Sprite(mActivityContext, -1.0f, +1.0f, 0.0f));
//spriteList.add(new Sprite(mActivityContext, +1.0f, +1.0f, 0.0f));
}
public void onDrawFrame(GL10 unused)
{
//init
Sprite currSprite;
//Redraw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
//timing
float jFactor = 0.1f;
long time = SystemClock.uptimeMillis() % 10000L;
float angleInDegrees = (360.0f / 1000.0f) * ((int) time) * jFactor;
/*
//number 1
//Matrix.setIdentityM(mModelMatrix, 0);
//currSprite = spriteList.get(0);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//currSprite.Draw(mModelMatrix);
//number 2
Matrix.setIdentityM(mModelMatrix, 0);
currSprite = spriteList.get(1);
Matrix.translateM(mModelMatrix, 0, 0.0f, -0.1f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, 90.0f, 1.0f, 0.0f, 0.0f);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
currSprite.Draw(mModelMatrix);
//Matrix.translateM(mModelMatrix, 0, 0, 0, 4.0f);
*/
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
//zoom out a bit?
Matrix.translateM(mMVPMatrix, 0, 0, 0, 4.0f);
//number 1
//currSprite = spriteList.get(0);
//Matrix.setIdentityM(mMVPMatrix, 0);
//Matrix.rotateM(mMVPMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//Matrix.translateM(mMVPMatrix, 0, currSprite.coordX, 0.0f, 0.0f);
//currSprite.coordX += 0.01f;
//currSprite.Draw(mMVPMatrix);
//number 2
currSprite = spriteList.get(0);
Matrix.setIdentityM(mMVPMatrix, 0);
Matrix.translateM(mMVPMatrix, 0, 0.0f, 0.0f, 0.0f);
Matrix.rotateM(mMVPMatrix, 0, angleInDegrees, 0.0f, 0.0f, +1.0f);
//float[] mTempMatrix = new float[16];
//mTempMatrix = mModelMatrix.clone();
//Matrix.multiplyMM(mMVPMatrix, 0, mMVPMatrix, 0, mRotateMatrix, 0);
//mTempMatrix = mMVPMatrix.clone();
//Matrix.multiplyMM(mMVPMatrix, 0, mTempMatrix, 0, mModelMatrix, 0);
//Matrix.setIdentityM(mMVPMatrix, 0);
currSprite.Draw(mMVPMatrix);
/*
//Set the camera position (View Matrix) //mtx, offset, eyex,y,z, centrex,y,z, upx,y,z
Matrix.setLookAtM(mVMatrix, 0,
0, 0, -10,
0f, 0f, 0f,
0f, 1.0f, 0.0f);
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
//zoom out a bit?
Matrix.translateM(mMVPMatrix, 0, 0, 0, 4.0f);
for (int i=0; i<numObjects; i++) {
//Create a rotation transformation for the triangle
//Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, -1.0f);
Matrix.setRotateM(mRotationMatrix, 0, 0, 0, 0, -1.0f); //-1.0 = Z, for some reason need this. Grr
//Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0);
//Draw Shape
//triangle.Draw(mMVPMatrix);
//sprite.Draw(mMVPMatrix);
currSprite = spriteList.get(i);
//Move the object to the passed initial coordinates?
//Matrix.translateM(mMVPMatrix, 0, currSprite.coordX, currSprite.coordY, currSprite.coordZ);
currSprite.Draw(mMVPMatrix);
}
*/
}
public void onSurfaceChanged(GL10 unused, int width, int height)
{
GLES20.glViewport(0, 0, width, height);
if (height == 0) {
height = 1; //in case of divide-by-zero errors
}
float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.f;
//This Projection Matrix is applied to object coordinates in the onDrawFrame() method
//Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
Matrix.frustumM(mProjMatrix, 0, left, right, bottom, top, near, far);
}
public static int loadShader(int type, String shaderCode)
{
//Create a Vertex Shader Type Or a Fragment Shader Type (GLES20.GL_VERTEX_SHADER OR GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
//Add The Source Code and Compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
Please excuse the commented-out code in onDrawFrame(), where I've been experimenting, and failing.
Sprite Class
public class Sprite
{
//Reference to Activity Context
private final Context mActivityContext;
//Added for Textures
private final FloatBuffer mCubeTextureCoordinates;
private int mTextureUniformHandle;
private int mTextureCoordinateHandle;
private final int mTextureCoordinateDataSize = 2;
private int mTextureDataHandle;
private final String vertexShaderCode =
//Test
"attribute vec2 a_TexCoordinate;" +
"varying vec2 v_TexCoordinate;" +
//End Test
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vPosition * uMVPMatrix;" +
//Test
"v_TexCoordinate = a_TexCoordinate;" +
//End Test
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
//Test
"uniform sampler2D u_Texture;" +
"varying vec2 v_TexCoordinate;" +
//End Test
"void main() {" +
//"gl_FragColor = vColor;" +
"gl_FragColor = (vColor * texture2D(u_Texture, v_TexCoordinate));" +
"}";
private final int shaderProgram;
private final FloatBuffer vertexBuffer;
private final ShortBuffer drawListBuffer;
private int mPositionHandle;
private int mColorHandle;
private int mMVPMatrixHandle;
public float coordX;
public float coordY;
//public float coordZ;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 2;
static float spriteCoords[] = { -0.5f, 0.5f, // top left
-0.5f, -0.5f, // bottom left
0.5f, -0.5f, // bottom right
0.5f, 0.5f }; //top right
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; //Order to draw vertices
private final int vertexStride = COORDS_PER_VERTEX * 4; //Bytes per vertex
// Set color with red, green, blue and alpha (opacity) values
//float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
float color[] = { 1f, 1f, 1f, 1.0f };
public Sprite(final Context activityContext, float initX, float initY, float initZ)
{
mActivityContext = activityContext;
this.coordX = initX;
this.coordY = initY;
//this.coordZ = initZ;
//ergh - will do manually for now. Paxo n00b
//just a 2D array, no need for Z nonsense
for (int i=0; i<spriteCoords.length; i++) {
spriteCoords[i] -= (i%2==0) ? coordX : coordY; //- works better than +
}
//float newPosMatrix[] = { initX, initY, 0f };
//adjust the vector coords accordingly
//Matrix.multiplyMV(spriteCoords, 0, newPosMatrix, 0, spriteCoords, 0);
//Initialize Vertex Byte Buffer for Shape Coordinates / # of coordinate values * 4 bytes per float
ByteBuffer bb = ByteBuffer.allocateDirect(spriteCoords.length * 4);
//Use the Device's Native Byte Order
bb.order(ByteOrder.nativeOrder());
//Create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
//Add the coordinates to the FloatBuffer
vertexBuffer.put(spriteCoords);
//Set the Buffer to Read the first coordinate
vertexBuffer.position(0);
// S, T (or X, Y)
// Texture coordinate data.
// Because images have a Y axis pointing downward (values increase as you move down the image) while
// OpenGL has a Y axis pointing upward, we adjust for that here by flipping the Y axis.
// What's more is that the texture coordinates are the same for every face.
final float[] cubeTextureCoordinateData =
{
//Front face
/*0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f*/
/*-0.5f, 0.5f,
-0.5f, -0.5f,
0.5f, -0.5f,
0.5f, 0.5f*/
0f, 1f,
0f, 0f,
1f, 0f,
1f, 1f
};
mCubeTextureCoordinates = ByteBuffer.allocateDirect(cubeTextureCoordinateData.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
mCubeTextureCoordinates.put(cubeTextureCoordinateData).position(0);
//Initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(spriteCoords.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
int vertexShader = MyGL20Renderer.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = MyGL20Renderer.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
shaderProgram = GLES20.glCreateProgram();
GLES20.glAttachShader(shaderProgram, vertexShader);
GLES20.glAttachShader(shaderProgram, fragmentShader);
//Texture Code
GLES20.glBindAttribLocation(shaderProgram, 0, "a_TexCoordinate");
GLES20.glLinkProgram(shaderProgram);
//Load the texture
mTextureDataHandle = loadTexture(mActivityContext, R.drawable.cube);
}
public void Draw(float[] mvpMatrix)
{
//Add program to OpenGL ES Environment
GLES20.glUseProgram(shaderProgram);
//Get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
//Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
//Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
//Get Handle to Fragment Shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(shaderProgram, "vColor");
//Set the Color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
//Set Texture Handles and bind Texture
mTextureUniformHandle = GLES20.glGetAttribLocation(shaderProgram, "u_Texture");
mTextureCoordinateHandle = GLES20.glGetAttribLocation(shaderProgram, "a_TexCoordinate");
//Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
//Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle);
//Tell the texture uniform sampler to use this texture in the shader by binding to texture unit 0.
GLES20.glUniform1i(mTextureUniformHandle, 0);
//Pass in the texture coordinate information
mCubeTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(mTextureCoordinateHandle, mTextureCoordinateDataSize, GLES20.GL_FLOAT, false, 0, mCubeTextureCoordinates);
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
//Get Handle to Shape's Transformation Matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(shaderProgram, "uMVPMatrix");
//Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//glTranslatef(0f, 0f, 0f);
//Draw the triangle
GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
//Disable Vertex Array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
public static int loadTexture(final Context context, final int resourceId)
{
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
final Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
}
Now, I don't know if I'm going about this the right way at all, but I simply want to animate the collection of Sprite objects in spriteList.
More specifically: have a collection of 3 objects, then respond to screen touches and animate the objects to that location (but that will come later).
Initially, I just want to be able to correctly render these objects (with initial locations) and then rotate them about the centre point (about the Z axis).
For some reason, translateM is warping the texture (as if about the Y axis) and not actually moving the object along the X/Y planes.
Many thanks for any help you can offer. As you can see I'm fairly new to OpenGL and have had little luck with the limited tutorials out there that support Android Studio and GLES2.0.
Kind regards,
James
I think the problem is that you have not multiplied the translation matrices into your rotation matrices. A matrix multiply is required to combine them.

Android getOrientation() method returns bad results

I'm creating 3D Compass application.
I'm using the getOrientation method to get the orientation (almost the same implementation as here). If I place the phone on a table it works well, but when the top of the phone points to the sky (the minus Z axis in the picture; the sphere is the Earth), getOrientation starts giving really bad results. It gives values for the Z axis ranging from 0 to 180 degrees within a few real degrees of movement. Is there any way to suppress this behavior? I created a little video that describes the problem (sorry for the bad quality). Thanks in advance.
Solution:
When rotating the model, there is a difference between:
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH
and
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH
Well, I can see at least one problem with this approach of yours.
I assume that you combine a 3D vector corresponding to your magnetometer with an averaging low-pass filter to smooth the data. Although such an approach would work great for a sensor value which varies without discontinuities, such as raw data from the accelerometer, it doesn't work so well verbatim with the angular variables fetched from your magnetometer. Why, one might ask?
Because those angular variables (azimuth, pitch, roll) have an upper bound and a lower bound, which means that any value above 180 degrees, say 181 degrees, would wrap around to 181-360 = -179 degrees, and any value below -180 degrees would wrap around in the other direction. So when one of those angular variables gets close to those thresholds (180 or -180), it will tend to oscillate between values close to those two extremes. When you blindly apply a low-pass filter to those values, you get either a smooth decrease from 180 degrees towards -180 degrees, or a smooth increase from -180 towards 180 degrees. Either way, the result would look quite like your video above... As long as one directly applies an averaging buffer to the raw angle data from getOrientation(...), this problem will be present (and should be present not only in the case where the phone is upright, but also in cases where there are azimuth angle wraparounds too... Maybe you could test for those bugs as well...).
You say that you tested this with a buffer size of 1. Theoretically, the problem should not be present if there is no averaging at all, although in some implementations of a circular buffer I've seen in the past, it could mean that there is still averaging done with at least 1 past value, not that there is no averaging at all. If this is your case, we have found the root cause of your bug.
Unfortunately, there isn't much of an elegant solution that could be implemented while sticking with your standard averaging filter. What I usually do in this case is switch to another type of low pass filter, which doesn't need any deep buffer to operate: a simple IIR filter (order 1):
diff = x[n] - y[n-1]
y[n] - y[n-1] = alpha * (x[n] - y[n-1]) = alpha * diff
...where y is the filtered angle, x is the raw angle, and alpha<1 is analogous to a time constant, as alpha=1 corresponds to the no-filter case, and the frequency cutoff of the low-pass filter gets lowered as alpha approaches zero. An acute eye would probably have noticed by now that this corresponds to a simple Proportional Controller.
Such a filter allows the compensation of the wraparound of the angle value because we can add or subtract 360 to diff so as to ensure that abs(diff)<=180, which in turn ensures that the filtered angle value will always increase/decrease in the optimal direction to reach its "setpoint".
An example function call, which is to be scheduled periodically, that calculates a filtered angle value y for a given raw angle value x, could be something like this:
private float restrictAngle(float tmpAngle){
while(tmpAngle>=180) tmpAngle-=360;
while(tmpAngle<-180) tmpAngle+=360;
return tmpAngle;
}
//x is a raw angle value from getOrientation(...)
//y is the current filtered angle value
private float calculateFilteredAngle(float x, float y){
final float alpha = 0.1f;
float diff = x-y;
//here, we ensure that abs(diff)<=180
diff = restrictAngle(diff);
y += alpha*diff;
//ensure that y stays within [-180, 180[ bounds
y = restrictAngle(y);
return y;
}
The function calculateFilteredAngle(float x, float y) can then be called periodically using something like this (example for the azimuth angle from the getOrientation(...) function):
filteredAzimuth = calculateFilteredAngle(azimuth, filteredAzimuth);
Using this method, the filter would not misbehave like the averaging filter as mentioned by the OP.
As I could not load the .apk uploaded by the OP, I decided to implement my own test project in order to see if the corrections work. Here is the entire code (it does not use an XML file for the main layout, so I did not include one). Simply copy it into a test project to see if it works on a specific device (tested functional on an HTC Desire with Android 2.1):
File 1: Compass3DActivity.java:
package com.epichorns.compass3D;
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.ViewGroup;
import android.widget.LinearLayout;
import android.widget.TextView;
public class Compass3DActivity extends Activity {
//Textviews for showing angle data
TextView mTextView_azimuth;
TextView mTextView_pitch;
TextView mTextView_roll;
TextView mTextView_filtered_azimuth;
TextView mTextView_filtered_pitch;
TextView mTextView_filtered_roll;
float mAngle0_azimuth=0;
float mAngle1_pitch=0;
float mAngle2_roll=0;
float mAngle0_filtered_azimuth=0;
float mAngle1_filtered_pitch=0;
float mAngle2_filtered_roll=0;
private Compass3DView mCompassView;
private SensorManager sensorManager;
//sensor calculation values
float[] mGravity = null;
float[] mGeomagnetic = null;
float Rmat[] = new float[9];
float Imat[] = new float[9];
float orientation[] = new float[3];
SensorEventListener mAccelerometerListener = new SensorEventListener(){
public void onAccuracyChanged(Sensor sensor, int accuracy) {}
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER){
mGravity = event.values.clone();
processSensorData();
}
}
};
SensorEventListener mMagnetometerListener = new SensorEventListener(){
public void onAccuracyChanged(Sensor sensor, int accuracy) {}
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD){
mGeomagnetic = event.values.clone();
processSensorData();
update();
}
}
};
private float restrictAngle(float tmpAngle){
while(tmpAngle>=180) tmpAngle-=360;
while(tmpAngle<-180) tmpAngle+=360;
return tmpAngle;
}
//x is a raw angle value from getOrientation(...)
//y is the current filtered angle value
private float calculateFilteredAngle(float x, float y){
final float alpha = 0.3f;
float diff = x-y;
//here, we ensure that abs(diff)<=180
diff = restrictAngle(diff);
y += alpha*diff;
//ensure that y stays within [-180, 180[ bounds
y = restrictAngle(y);
return y;
}
public void processSensorData(){
if (mGravity != null && mGeomagnetic != null) {
boolean success = SensorManager.getRotationMatrix(Rmat, Imat, mGravity, mGeomagnetic);
if (success) {
SensorManager.getOrientation(Rmat, orientation);
mAngle0_azimuth = (float)Math.toDegrees((double)orientation[0]); // orientation contains: azimuth, pitch and roll
mAngle1_pitch = (float)Math.toDegrees((double)orientation[1]); //pitch
mAngle2_roll = -(float)Math.toDegrees((double)orientation[2]); //roll
mAngle0_filtered_azimuth = calculateFilteredAngle(mAngle0_azimuth, mAngle0_filtered_azimuth);
mAngle1_filtered_pitch = calculateFilteredAngle(mAngle1_pitch, mAngle1_filtered_pitch);
mAngle2_filtered_roll = calculateFilteredAngle(mAngle2_roll, mAngle2_filtered_roll);
}
mGravity=null; //oblige full new refresh
mGeomagnetic=null; //oblige full new refresh
}
}
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
LinearLayout ll = new LinearLayout(this);
LinearLayout.LayoutParams llParams = new LinearLayout.LayoutParams(LinearLayout.LayoutParams.FILL_PARENT, LinearLayout.LayoutParams.FILL_PARENT);
ll.setLayoutParams(llParams);
ll.setOrientation(LinearLayout.VERTICAL);
ViewGroup.LayoutParams txtParams = new ViewGroup.LayoutParams(ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT);
mTextView_azimuth = new TextView(this);
mTextView_azimuth.setLayoutParams(txtParams);
mTextView_pitch = new TextView(this);
mTextView_pitch.setLayoutParams(txtParams);
mTextView_roll = new TextView(this);
mTextView_roll.setLayoutParams(txtParams);
mTextView_filtered_azimuth = new TextView(this);
mTextView_filtered_azimuth.setLayoutParams(txtParams);
mTextView_filtered_pitch = new TextView(this);
mTextView_filtered_pitch.setLayoutParams(txtParams);
mTextView_filtered_roll = new TextView(this);
mTextView_filtered_roll.setLayoutParams(txtParams);
mCompassView = new Compass3DView(this);
ViewGroup.LayoutParams compassParams = new ViewGroup.LayoutParams(200,200);
mCompassView.setLayoutParams(compassParams);
ll.addView(mCompassView);
ll.addView(mTextView_azimuth);
ll.addView(mTextView_pitch);
ll.addView(mTextView_roll);
ll.addView(mTextView_filtered_azimuth);
ll.addView(mTextView_filtered_pitch);
ll.addView(mTextView_filtered_roll);
setContentView(ll);
sensorManager = (SensorManager) this.getSystemService(Context.SENSOR_SERVICE);
sensorManager.registerListener(mAccelerometerListener, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_UI);
sensorManager.registerListener(mMagnetometerListener, sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_UI);
update();
}
@Override
public void onDestroy(){
super.onDestroy();
sensorManager.unregisterListener(mAccelerometerListener);
sensorManager.unregisterListener(mMagnetometerListener);
}
private void update(){
mCompassView.changeAngles(mAngle1_filtered_pitch, mAngle2_filtered_roll, mAngle0_filtered_azimuth);
mTextView_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_azimuth));
mTextView_pitch.setText("Pitch: "+String.valueOf(mAngle1_pitch));
mTextView_roll.setText("Roll: "+String.valueOf(mAngle2_roll));
mTextView_filtered_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_filtered_azimuth));
mTextView_filtered_pitch.setText("Pitch: "+String.valueOf(mAngle1_filtered_pitch));
mTextView_filtered_roll.setText("Roll: "+String.valueOf(mAngle2_filtered_roll));
}
}
File 2: Compass3DView.java:
package com.epichorns.compass3D;
import android.content.Context;
import android.opengl.GLSurfaceView;
public class Compass3DView extends GLSurfaceView {
private Compass3DRenderer mRenderer;
public Compass3DView(Context context) {
super(context);
mRenderer = new Compass3DRenderer(context);
setRenderer(mRenderer);
}
public void changeAngles(float angle0, float angle1, float angle2){
mRenderer.setAngleX(angle0);
mRenderer.setAngleY(angle1);
mRenderer.setAngleZ(angle2);
}
}
File 3: Compass3DRenderer.java:
package com.epichorns.compass3D;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.opengl.GLSurfaceView;
public class Compass3DRenderer implements GLSurfaceView.Renderer {
Context mContext;
// a raw buffer to hold indices
ShortBuffer _indexBuffer;
// raw buffers to hold the vertices
FloatBuffer _vertexBuffer0;
FloatBuffer _vertexBuffer1;
FloatBuffer _vertexBuffer2;
FloatBuffer _vertexBuffer3;
FloatBuffer _vertexBuffer4;
FloatBuffer _vertexBuffer5;
int _numVertices = 3; //standard triangle vertices = 3
FloatBuffer _textureBuffer0123;
//private FloatBuffer _light0Position;
//private FloatBuffer _light0Ambient;
float _light0Position[] = new float[]{10.0f, 10.0f, 10.0f, 0.0f};
float _light0Ambient[] = new float[]{0.05f, 0.05f, 0.05f, 1.0f};
float _light0Diffuse[] = new float[]{0.5f, 0.5f, 0.5f, 1.0f};
float _light0Specular[] = new float[]{0.7f, 0.7f, 0.7f, 1.0f};
float _matAmbient[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };
float _matDiffuse[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };
private float _angleX=0f;
private float _angleY=0f;
private float _angleZ=0f;
Compass3DRenderer(Context context){
super();
mContext = context;
}
public void setAngleX(float angle) {
_angleX = angle;
}
public void setAngleY(float angle) {
_angleY = angle;
}
public void setAngleZ(float angle) {
_angleZ = angle;
}
//Allocate a direct buffer in native byte order (required by OpenGL ES) and fill it
FloatBuffer InitFloatBuffer(float[] src){
ByteBuffer bb = ByteBuffer.allocateDirect(4*src.length);
bb.order(ByteOrder.nativeOrder());
FloatBuffer inBuf = bb.asFloatBuffer();
inBuf.put(src);
return inBuf;
}
ShortBuffer InitShortBuffer(short[] src){
ByteBuffer bb = ByteBuffer.allocateDirect(2*src.length);
bb.order(ByteOrder.nativeOrder());
ShortBuffer inBuf = bb.asShortBuffer();
inBuf.put(src);
return inBuf;
}
//Init data for our rendered pyramid
private void initTriangles() {
//Side faces triangles
float[] coords = {
-0.25f, -0.5f, 0.25f,
0.25f, -0.5f, 0.25f,
0f, 0.5f, 0f
};
float[] coords1 = {
0.25f, -0.5f, 0.25f,
0.25f, -0.5f, -0.25f,
0f, 0.5f, 0f
};
float[] coords2 = {
0.25f, -0.5f, -0.25f,
-0.25f, -0.5f, -0.25f,
0f, 0.5f, 0f
};
float[] coords3 = {
-0.25f, -0.5f, -0.25f,
-0.25f, -0.5f, 0.25f,
0f, 0.5f, 0f
};
//Base triangles
float[] coords4 = {
-0.25f, -0.5f, 0.25f,
0.25f, -0.5f, -0.25f,
0.25f, -0.5f, 0.25f
};
float[] coords5 = {
-0.25f, -0.5f, 0.25f,
-0.25f, -0.5f, -0.25f,
0.25f, -0.5f, -0.25f
};
float[] textures0123 = {
// Mapping coordinates for the vertices (UV mapping CW)
0.0f, 0.0f, // bottom left
1.0f, 0.0f, // bottom right
0.5f, 1.0f, // top ctr
};
_vertexBuffer0 = InitFloatBuffer(coords);
_vertexBuffer0.position(0);
_vertexBuffer1 = InitFloatBuffer(coords1);
_vertexBuffer1.position(0);
_vertexBuffer2 = InitFloatBuffer(coords2);
_vertexBuffer2.position(0);
_vertexBuffer3 = InitFloatBuffer(coords3);
_vertexBuffer3.position(0);
_vertexBuffer4 = InitFloatBuffer(coords4);
_vertexBuffer4.position(0);
_vertexBuffer5 = InitFloatBuffer(coords5);
_vertexBuffer5.position(0);
_textureBuffer0123 = InitFloatBuffer(textures0123);
_textureBuffer0123.position(0);
short[] indices = {0, 1, 2};
_indexBuffer = InitShortBuffer(indices);
_indexBuffer.position(0);
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glEnable(GL10.GL_CULL_FACE); // enable the differentiation of which side may be visible
gl.glShadeModel(GL10.GL_SMOOTH);
gl.glFrontFace(GL10.GL_CCW); // which is the front? the one which is drawn counter clockwise
gl.glCullFace(GL10.GL_BACK); // which one should NOT be drawn
initTriangles();
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}
public void onDrawFrame(GL10 gl) {
gl.glPushMatrix();
gl.glClearColor(0, 0, 0, 1.0f); //clipping backdrop color
// clear the color buffer to show the ClearColor we called above...
gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
// set rotation
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH
//Draw our pyramid
//4 side faces
gl.glColor4f(0.5f, 0f, 0f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer0);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glColor4f(0.5f, 0.5f, 0f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer1);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glColor4f(0f, 0.5f, 0f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer2);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glColor4f(0f, 0.5f, 0.5f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer3);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
//Base face
gl.glColor4f(0f, 0f, 0.5f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer4);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer5);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glPopMatrix();
}
public void onSurfaceChanged(GL10 gl, int w, int h) {
gl.glViewport(0, 0, w, h);
}
}
Please note that this code does not compensate for the default landscape orientation of tablets, so it is only expected to work correctly on a phone (I didn't have a tablet close by to test any correction code).
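If you do need to support a landscape-default device, a minimal sketch of the usual correction (untested on my side; the axis pair below is an assumption you would need to verify against your device's default orientation) is to remap the rotation matrix before extracting the angles:
// Hypothetical landscape correction inside processSensorData(), applied
// after getRotationMatrix() succeeds. AXIS_Y/AXIS_MINUS_X assumes the
// screen is rotated 90 degrees from portrait; verify per device.
float[] remappedRmat = new float[9];
SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, remappedRmat);
SensorManager.getOrientation(remappedRmat, orientation);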
You should probably try a longer sensor delay, like SENSOR_DELAY_GAME, and/or keep or increase the size of your circular buffer. The sensors (accelerometer, compass, etc.) on mobile devices are inherently noisy, so when I asked about a 'low-pass filter', I meant: do you use more data to decrease the frequency of the updates your app actually consumes? Your video was shot indoors; I would also recommend going somewhere with less EM interference, such as a park, just to check that the behavior is consistent, and performing the standard compass reset action (rotate the device in a figure-8). In the end you may have to apply some heuristics to throw out the 'bad' data to make a smoother experience for the user.
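For what it's worth, a minimal sketch of the circular-buffer averaging I have in mind (the buffer size and method name are made up for illustration; note that naively averaging raw angles misbehaves near the ±180 wrap-around, which is exactly why the code above filters the angle difference instead):
// Illustrative only: keep the last BUFFER_SIZE raw readings and return their mean.
private static final int BUFFER_SIZE = 16; // bigger = smoother but laggier
private final float[] mBuffer = new float[BUFFER_SIZE];
private int mIndex = 0;
private int mCount = 0;
private float smooth(float rawValue){
    mBuffer[mIndex] = rawValue;
    mIndex = (mIndex + 1) % BUFFER_SIZE;
    if (mCount < BUFFER_SIZE) mCount++;
    float sum = 0f;
    for (int i = 0; i < mCount; i++) sum += mBuffer[i];
    return sum / mCount;
}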
Well, I had exactly the same problem when retrieving the orientation. The thing is, I never got it solved (I had to impose a constraint on how the device is held while retrieving it), and I don't know if you'll ever be able to.
Pick up a magnetic compass and try to get a north reading while holding it in the situation you describe - you will get the same nonsensical results. So you can't really expect the device's compass to do any better!
A few words about filtering, with your permission.
I would suggest averaging the magnetic field vector itself before turning it into angles.
It is wrong to average/smooth only the angles without using some sort of magnitude. Angles alone do not provide enough data to determine a direction/heading/bearing.
Example: when you want to know the average wind direction over a whole day, you must weight by the strength of the wind, not use the angles alone. If you average only the angles you will get a completely wrong wind direction.
As for bearing direction, I would use the speed as the magnitude.
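To make that concrete, a minimal sketch of such vector averaging (the field and method names are illustrative; the point is to accumulate the raw geomagnetic vector and only derive angles from the averaged vector, via getRotationMatrix()/getOrientation()):
// Illustrative only: average the magnetic field vector component-wise,
// then feed the averaged vector into getRotationMatrix()/getOrientation().
private final float[] mGeoSum = new float[3];
private int mGeoSamples = 0;
private void accumulateGeomagnetic(float[] geomagnetic){
    for (int i = 0; i < 3; i++) mGeoSum[i] += geomagnetic[i];
    mGeoSamples++;
}
private float[] averagedGeomagnetic(){
    float[] avg = new float[3];
    if (mGeoSamples == 0) return avg; // no samples yet
    for (int i = 0; i < 3; i++) avg[i] = mGeoSum[i] / mGeoSamples;
    return avg;
}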
