I'm creating a 3D compass application.
I'm using the getOrientation method to get the orientation (almost the same implementation as here). If I place the phone on a table it works well, but when the top of the phone points to the sky (the minus Z axis on the picture; the sphere is the Earth), getOrientation starts giving really bad results: it gives values for the Z axis that swing between 0 and 180 degrees within just a few real degrees of movement. Is there any way to suppress this behavior? I created a little video that describes the problem (sorry for the bad quality). Thanks in advance.
Solution:
When you rotate the model, there is a difference between this order:
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH
and this order:
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH
Rotations do not commute, so applying roll before elevation yields a different final orientation than applying elevation before roll.
Well, I can see at least one problem with this approach of yours.
I assume that you combine the 3D vector from your magnetometer with an averaging low-pass filter to smooth the data. Although such an approach works great for a sensor value that varies without discontinuities, such as raw accelerometer data, it doesn't work so well verbatim with the angular variables computed from your magnetometer. Why, one might ask?
Because those angular variables (azimuth, pitch, roll) have an upper bound and a lower bound: any value above 180 degrees, say 181 degrees, wraps around to 181-360 = -179 degrees, and any value below -180 degrees wraps around in the other direction. So when one of those angular variables gets close to those thresholds (180 or -180), it tends to oscillate between values close to the two extremes. When you blindly apply a low-pass filter to those values, you get either a smooth decrease from 180 degrees towards -180 degrees, or a smooth increase from -180 towards 180 degrees. Either way, the result looks quite like your video above... As long as you directly apply an averaging buffer to the raw angle data from getOrientation(...), this problem will be present (and not only when the phone is upright, but also whenever the azimuth angle wraps around... maybe you could test for those bugs as well...).
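To make that concrete, here is a minimal, hypothetical sketch (plain Java, independent of any sensor API) showing a naive arithmetic mean of two headings that straddle the +/-180 boundary landing near 0 instead of near +/-180:
public class NaiveAngleAverage {
    public static void main(String[] args) {
        // Two raw headings just on either side of the wraparound point.
        float a = 179.0f;
        float b = -179.0f;
        // Naive arithmetic mean: (179 + (-179)) / 2 = 0 degrees,
        // which points in exactly the opposite direction of both samples.
        float naiveMean = (a + b) / 2.0f;
        System.out.println("naive mean = " + naiveMean + " (expected about +/-180)");
    }
}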
You say that you tested this with a buffer size of 1. Theoretically, the problem should not be present if there is no averaging at all, although in some implementations of a circular buffer I've seen in the past, a size of 1 could mean that averaging is still done with at least one past value, rather than not at all. If that is your case, we have found the root cause of your bug.
Unfortunately, there isn't much of an elegant solution that could be implemented while sticking with your standard averaging filter. What I usually do in this case is switch to another type of low pass filter, which doesn't need any deep buffer to operate: a simple IIR filter (order 1):
diff = x[n] - y[n-1]
y[n] - y[n-1] = alpha * (x[n] - y[n-1]) = alpha * diff
...where y is the filtered angle, x is the raw angle, and alpha<1 is analogous to a time constant, as alpha=1 corresponds to the no-filter case, and the frequency cutoff of the low-pass filter gets lowered as alpha approaches zero. An acute eye would probably have noticed by now that this corresponds to a simple Proportional Controller.
Such a filter allows the compensation of the wraparound of the angle value because we can add or subtract 360 to diff so as to ensure that abs(diff)<=180, which in turn ensures that the filtered angle value will always increase/decrease in the optimal direction to reach its "setpoint".
An example function call, which is to be scheduled periodically, that calculates a filtered angle value y for a given raw angle value x, could be something like this:
private float restrictAngle(float tmpAngle){
    while(tmpAngle>=180) tmpAngle-=360;
    while(tmpAngle<-180) tmpAngle+=360;
    return tmpAngle;
}
//x is a raw angle value from getOrientation(...)
//y is the current filtered angle value
private float calculateFilteredAngle(float x, float y){
    final float alpha = 0.1f;
    float diff = x-y;
    //here, we ensure that abs(diff)<=180
    diff = restrictAngle(diff);
    y += alpha*diff;
    //ensure that y stays within [-180, 180[ bounds
    y = restrictAngle(y);
    return y;
}
The function calculateFilteredAngle(float x, float y) can then be called periodically, using something like this (example for the azimuth angle from the getOrientation(...) function):
filteredAzimuth = calculateFilteredAngle(azimuth, filteredAzimuth);
Using this method, the filter will not misbehave like the averaging filter mentioned by the OP.
As I could not load the .apk uploaded by the OP, I decided to implement my own test project in order to see if the corrections work. Here is the entire code (it does not use an XML file for the main layout, so I did not include one). Simply copy it into a test project to see if it works on a specific device (tested working on an HTC Desire with Android 2.1):
File 1: Compass3DActivity.java:
package com.epichorns.compass3D;
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.ViewGroup;
import android.widget.LinearLayout;
import android.widget.TextView;
public class Compass3DActivity extends Activity {
//Textviews for showing angle data
TextView mTextView_azimuth;
TextView mTextView_pitch;
TextView mTextView_roll;
TextView mTextView_filtered_azimuth;
TextView mTextView_filtered_pitch;
TextView mTextView_filtered_roll;
float mAngle0_azimuth=0;
float mAngle1_pitch=0;
float mAngle2_roll=0;
float mAngle0_filtered_azimuth=0;
float mAngle1_filtered_pitch=0;
float mAngle2_filtered_roll=0;
private Compass3DView mCompassView;
private SensorManager sensorManager;
//sensor calculation values
float[] mGravity = null;
float[] mGeomagnetic = null;
float Rmat[] = new float[9];
float Imat[] = new float[9];
float orientation[] = new float[3];
SensorEventListener mAccelerometerListener = new SensorEventListener(){
public void onAccuracyChanged(Sensor sensor, int accuracy) {}
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER){
mGravity = event.values.clone();
processSensorData();
}
}
};
SensorEventListener mMagnetometerListener = new SensorEventListener(){
public void onAccuracyChanged(Sensor sensor, int accuracy) {}
public void onSensorChanged(SensorEvent event) {
if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD){
mGeomagnetic = event.values.clone();
processSensorData();
update();
}
}
};
private float restrictAngle(float tmpAngle){
while(tmpAngle>=180) tmpAngle-=360;
while(tmpAngle<-180) tmpAngle+=360;
return tmpAngle;
}
//x is a raw angle value from getOrientation(...)
//y is the current filtered angle value
private float calculateFilteredAngle(float x, float y){
final float alpha = 0.3f;
float diff = x-y;
//here, we ensure that abs(diff)<=180
diff = restrictAngle(diff);
y += alpha*diff;
//ensure that y stays within [-180, 180[ bounds
y = restrictAngle(y);
return y;
}
public void processSensorData(){
if (mGravity != null && mGeomagnetic != null) {
boolean success = SensorManager.getRotationMatrix(Rmat, Imat, mGravity, mGeomagnetic);
if (success) {
SensorManager.getOrientation(Rmat, orientation);
mAngle0_azimuth = (float)Math.toDegrees((double)orientation[0]); // orientation contains: azimuth, pitch and roll
mAngle1_pitch = (float)Math.toDegrees((double)orientation[1]); //pitch
mAngle2_roll = -(float)Math.toDegrees((double)orientation[2]); //roll
mAngle0_filtered_azimuth = calculateFilteredAngle(mAngle0_azimuth, mAngle0_filtered_azimuth);
mAngle1_filtered_pitch = calculateFilteredAngle(mAngle1_pitch, mAngle1_filtered_pitch);
mAngle2_filtered_roll = calculateFilteredAngle(mAngle2_roll, mAngle2_filtered_roll);
}
mGravity=null; //force a full refresh
mGeomagnetic=null; //force a full refresh
}
}
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
LinearLayout ll = new LinearLayout(this);
LinearLayout.LayoutParams llParams = new LinearLayout.LayoutParams(LinearLayout.LayoutParams.FILL_PARENT, LinearLayout.LayoutParams.FILL_PARENT);
ll.setLayoutParams(llParams);
ll.setOrientation(LinearLayout.VERTICAL);
ViewGroup.LayoutParams txtParams = new ViewGroup.LayoutParams(ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT);
mTextView_azimuth = new TextView(this);
mTextView_azimuth.setLayoutParams(txtParams);
mTextView_pitch = new TextView(this);
mTextView_pitch.setLayoutParams(txtParams);
mTextView_roll = new TextView(this);
mTextView_roll.setLayoutParams(txtParams);
mTextView_filtered_azimuth = new TextView(this);
mTextView_filtered_azimuth.setLayoutParams(txtParams);
mTextView_filtered_pitch = new TextView(this);
mTextView_filtered_pitch.setLayoutParams(txtParams);
mTextView_filtered_roll = new TextView(this);
mTextView_filtered_roll.setLayoutParams(txtParams);
mCompassView = new Compass3DView(this);
ViewGroup.LayoutParams compassParams = new ViewGroup.LayoutParams(200,200);
mCompassView.setLayoutParams(compassParams);
ll.addView(mCompassView);
ll.addView(mTextView_azimuth);
ll.addView(mTextView_pitch);
ll.addView(mTextView_roll);
ll.addView(mTextView_filtered_azimuth);
ll.addView(mTextView_filtered_pitch);
ll.addView(mTextView_filtered_roll);
setContentView(ll);
sensorManager = (SensorManager) this.getSystemService(Context.SENSOR_SERVICE);
sensorManager.registerListener(mAccelerometerListener, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_UI);
sensorManager.registerListener(mMagnetometerListener, sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_UI);
update();
}
@Override
public void onDestroy(){
super.onDestroy();
sensorManager.unregisterListener(mAccelerometerListener);
sensorManager.unregisterListener(mMagnetometerListener);
}
private void update(){
mCompassView.changeAngles(mAngle1_filtered_pitch, mAngle2_filtered_roll, mAngle0_filtered_azimuth);
mTextView_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_azimuth));
mTextView_pitch.setText("Pitch: "+String.valueOf(mAngle1_pitch));
mTextView_roll.setText("Roll: "+String.valueOf(mAngle2_roll));
mTextView_filtered_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_filtered_azimuth));
mTextView_filtered_pitch.setText("Pitch: "+String.valueOf(mAngle1_filtered_pitch));
mTextView_filtered_roll.setText("Roll: "+String.valueOf(mAngle2_filtered_roll));
}
}
File 2: Compass3DView.java:
package com.epichorns.compass3D;
import android.content.Context;
import android.opengl.GLSurfaceView;
public class Compass3DView extends GLSurfaceView {
private Compass3DRenderer mRenderer;
public Compass3DView(Context context) {
super(context);
mRenderer = new Compass3DRenderer(context);
setRenderer(mRenderer);
}
public void changeAngles(float angle0, float angle1, float angle2){
mRenderer.setAngleX(angle0);
mRenderer.setAngleY(angle1);
mRenderer.setAngleZ(angle2);
}
}
File 3: Compass3DRenderer.java:
package com.epichorns.compass3D;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.opengl.GLSurfaceView;
public class Compass3DRenderer implements GLSurfaceView.Renderer {
Context mContext;
// a raw buffer to hold indices
ShortBuffer _indexBuffer;
// raw buffers to hold the vertices
FloatBuffer _vertexBuffer0;
FloatBuffer _vertexBuffer1;
FloatBuffer _vertexBuffer2;
FloatBuffer _vertexBuffer3;
FloatBuffer _vertexBuffer4;
FloatBuffer _vertexBuffer5;
int _numVertices = 3; //standard triangle vertices = 3
FloatBuffer _textureBuffer0123;
//private FloatBuffer _light0Position;
//private FloatBuffer _light0Ambient;
float _light0Position[] = new float[]{10.0f, 10.0f, 10.0f, 0.0f};
float _light0Ambient[] = new float[]{0.05f, 0.05f, 0.05f, 1.0f};
float _light0Diffuse[] = new float[]{0.5f, 0.5f, 0.5f, 1.0f};
float _light0Specular[] = new float[]{0.7f, 0.7f, 0.7f, 1.0f};
float _matAmbient[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };
float _matDiffuse[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };
private float _angleX=0f;
private float _angleY=0f;
private float _angleZ=0f;
Compass3DRenderer(Context context){
super();
mContext = context;
}
public void setAngleX(float angle) {
_angleX = angle;
}
public void setAngleY(float angle) {
_angleY = angle;
}
public void setAngleZ(float angle) {
_angleZ = angle;
}
FloatBuffer InitFloatBuffer(float[] src){
ByteBuffer bb = ByteBuffer.allocateDirect(4*src.length);
bb.order(ByteOrder.nativeOrder());
FloatBuffer inBuf = bb.asFloatBuffer();
inBuf.put(src);
return inBuf;
}
ShortBuffer InitShortBuffer(short[] src){
ByteBuffer bb = ByteBuffer.allocateDirect(2*src.length);
bb.order(ByteOrder.nativeOrder());
ShortBuffer inBuf = bb.asShortBuffer();
inBuf.put(src);
return inBuf;
}
//Init data for our rendered pyramid
private void initTriangles() {
//Side faces triangles
float[] coords = {
-0.25f, -0.5f, 0.25f,
0.25f, -0.5f, 0.25f,
0f, 0.5f, 0f
};
float[] coords1 = {
0.25f, -0.5f, 0.25f,
0.25f, -0.5f, -0.25f,
0f, 0.5f, 0f
};
float[] coords2 = {
0.25f, -0.5f, -0.25f,
-0.25f, -0.5f, -0.25f,
0f, 0.5f, 0f
};
float[] coords3 = {
-0.25f, -0.5f, -0.25f,
-0.25f, -0.5f, 0.25f,
0f, 0.5f, 0f
};
//Base triangles
float[] coords4 = {
-0.25f, -0.5f, 0.25f,
0.25f, -0.5f, -0.25f,
0.25f, -0.5f, 0.25f
};
float[] coords5 = {
-0.25f, -0.5f, 0.25f,
-0.25f, -0.5f, -0.25f,
0.25f, -0.5f, -0.25f
};
float[] textures0123 = {
// Mapping coordinates for the vertices (UV mapping CW)
0.0f, 0.0f, // bottom left
1.0f, 0.0f, // bottom right
0.5f, 1.0f, // top ctr
};
_vertexBuffer0 = InitFloatBuffer(coords);
_vertexBuffer0.position(0);
_vertexBuffer1 = InitFloatBuffer(coords1);
_vertexBuffer1.position(0);
_vertexBuffer2 = InitFloatBuffer(coords2);
_vertexBuffer2.position(0);
_vertexBuffer3 = InitFloatBuffer(coords3);
_vertexBuffer3.position(0);
_vertexBuffer4 = InitFloatBuffer(coords4);
_vertexBuffer4.position(0);
_vertexBuffer5 = InitFloatBuffer(coords5);
_vertexBuffer5.position(0);
_textureBuffer0123 = InitFloatBuffer(textures0123);
_textureBuffer0123.position(0);
short[] indices = {0, 1, 2};
_indexBuffer = InitShortBuffer(indices);
_indexBuffer.position(0);
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glEnable(GL10.GL_CULL_FACE); // enable the differentiation of which side may be visible
gl.glShadeModel(GL10.GL_SMOOTH);
gl.glFrontFace(GL10.GL_CCW); // which is the front? the one which is drawn counter clockwise
gl.glCullFace(GL10.GL_BACK); // which one should NOT be drawn
initTriangles();
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}
public void onDrawFrame(GL10 gl) {
gl.glPushMatrix();
gl.glClearColor(0, 0, 0, 1.0f); //clipping backdrop color
// clear the color buffer to show the ClearColor we called above...
gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
// set rotation
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH
//Draw our pyramid
//4 side faces
gl.glColor4f(0.5f, 0f, 0f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer0);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glColor4f(0.5f, 0.5f, 0f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer1);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glColor4f(0f, 0.5f, 0f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer2);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glColor4f(0f, 0.5f, 0.5f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer3);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
//Base face
gl.glColor4f(0f, 0f, 0.5f, 0.5f);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer4);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer5);
gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
gl.glPopMatrix();
}
public void onSurfaceChanged(GL10 gl, int w, int h) {
gl.glViewport(0, 0, w, h);
gl.glViewport(0, 0, w, h);
}
}
Please note that this code does not compensate for tablet default landscape orientation, so it is only expected to work correctly on a phone (I didn't have a tablet close by to test any correction code).
You should probably try a longer delay such as Game, and/or keep or increase the size of your circular buffer. The sensors (accelerometer, compass, etc.) on mobile devices are inherently noisy, so when I asked about a 'low pass filter', I meant: do you use more data to decrease the frequency of your app's usable updates? Your video was shot indoors; I would also recommend going to a place with less EM interference, such as a park, just to check that the behavior is consistent, as well as performing the standard compass reset action (rotate the device in a figure-8). In the end you may have to apply some heuristics to throw out the 'bad' data to make a smoother experience for the user.
Well, I had exactly the same problem when retrieving the orientation. The thing is that I never got it solved (I had to set a constraint on the device's position when retrieving the orientation), and I don't know if you'll ever be able to.
Pick up a magnetic compass and try to get a north heading while the compass is in the situation you describe: you will get the same nonsense results. So you can't really expect the device's compass to do any better!
A few words about filtering, with your permission.
I would suggest doing the averaging on the magnetic field vector itself, before turning it into angles.
It is wrong to average/smooth only the angles without using some sort of magnitude; angles by themselves do not provide enough data to determine a direction/heading/bearing.
Example: when you want to know the average wind direction over a whole day, you must use the strength of the wind, not just the angles. If you average only the angles, you will get an absolutely wrong wind direction.
As for the bearing direction, I would use the speed as the magnitude.
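As a minimal sketch of that idea (a hypothetical helper, not part of any sensor API; it averages the raw field components and only then derives an angle):
// Averages magnetometer samples component-wise, then converts the mean
// vector to a heading angle. Averaging the vector instead of the angle
// avoids the +/-180 wraparound problem entirely, because samples near
// the boundary average to a vector pointing near the boundary, not to zero.
public class VectorAverager {
    private float sumX = 0f, sumY = 0f;
    private int count = 0;

    public void addSample(float x, float y) {
        sumX += x;
        sumY += y;
        count++;
    }

    /** Heading of the averaged vector, in degrees within [-180, 180]. */
    public float averageHeadingDegrees() {
        if (count == 0) return 0f;
        return (float) Math.toDegrees(Math.atan2(sumY / count, sumX / count));
    }
}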
Related
I'm playing around with some basic OpenGL stuff, and I'm trying to set up a simple square with lighting enabled, but the lighting is not correct, so I guess there is something wrong with my normals.
Or is my understanding of normals totally wrong?
Here's my rendering code (by the way, I'm using LWJGL):
public class Renderer {
DisplayMode displayMode;
int i;
int width;
int height;
private boolean drawAxes = false;
private float rotation = 40.0f;
private float zoom = -20f;
// ----------- Variables added for Lighting Test -----------//
private FloatBuffer matSpecular;
private FloatBuffer lightPosition;
private FloatBuffer whiteLight;
private FloatBuffer lModelAmbient;
public Renderer(int width, int height) {
this.width = width;
this.height = height;
}
public static Renderer start() throws LWJGLException {
Renderer r = new Renderer(800, 600);
r.initContext();
r.run();
return r;
}
private void initContext() throws LWJGLException {
Display.setFullscreen(false);
DisplayMode d[] = Display.getAvailableDisplayModes();
for (int i = 0; i < d.length; i++) {
if (d[i].getWidth() == width && d[i].getHeight() == height && d[i].getBitsPerPixel() == 32) {
displayMode = d[i];
break;
}
}
Display.setDisplayMode(displayMode);
Display.create();
}
private void run() {
initGL();
while (!Display.isCloseRequested()) {
preRender();
render();
Display.update();
Display.sync(60);
}
Display.destroy();
}
private void initGL() {
GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f); // Black Background
GL11.glClearDepth(1.0); // Depth Buffer Setup
GL11.glEnable(GL11.GL_DEPTH_TEST); // Enables Depth Testing
GL11.glDepthFunc(GL11.GL_LEQUAL); // The Type Of Depth Testing To Do
GL11.glMatrixMode(GL11.GL_PROJECTION); // Select The Projection Matrix
GL11.glLoadIdentity(); // Reset The Projection Matrix
// Calculate The Aspect Ratio Of The Window
GLU.gluPerspective(45.0f, (float) displayMode.getWidth() / (float) displayMode.getHeight(), 0.1f, 100.0f);
GL11.glMatrixMode(GL11.GL_MODELVIEW); // Select The Modelview Matrix
// Really Nice Perspective Calculations
GL11.glHint(GL11.GL_PERSPECTIVE_CORRECTION_HINT, GL11.GL_NICEST);
GL11.glPolygonMode(GL11.GL_FRONT_AND_BACK, GL11.GL_FILL);
initLightArrays();
glShadeModel(GL_SMOOTH);
glMaterial(GL_FRONT, GL_SPECULAR, matSpecular); // sets specular material color
glMaterialf(GL_FRONT, GL_SHININESS, 100.0f); // sets shininess
glLight(GL_LIGHT0, GL_POSITION, lightPosition); // sets light position
glLight(GL_LIGHT0, GL_SPECULAR, whiteLight); // sets specular light to white
glLight(GL_LIGHT0, GL_DIFFUSE, whiteLight); // sets diffuse light to white
glLightModel(GL_LIGHT_MODEL_AMBIENT, lModelAmbient); // global ambient light
glEnable(GL_LIGHTING); // enables lighting
glEnable(GL_LIGHT0); // enables light0
glEnable(GL_COLOR_MATERIAL); // enables opengl to use glColor3f to define material color
glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE); // tell opengl glColor3f effects the ambient and diffuse properties of material
}
private void preRender() {
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
GL11.glTranslatef(0f, 0f, zoom);
GL11.glRotatef(-60f, 1f, 0f, 0f);
GL11.glRotatef(rotation, 0f, 0f, 1f);
}
private void render() {
FloatBuffer cBuffer = BufferUtils.createFloatBuffer(6*3);
float[] cArray = { 1f,1f,1f,
1f,1f,1f,
1f,1f,1f,
1f,1f,1f,
1f,1f,1f,
1f,1f,1f};
cBuffer.put(cArray);
cBuffer.flip();
FloatBuffer vBuffer = BufferUtils.createFloatBuffer(6*3);
float[] vArray = { 1f,1f,0f,
-1f,-1f,0,
1f,-1f,0,
1f,1f,0f,
-1f,1f,0,
-1f,-1f,0};
vBuffer.put(vArray);
vBuffer.flip();
FloatBuffer nBuffer = BufferUtils.createFloatBuffer(6*3);
float[] nArray = { 0f,0f,1f,
0f,0f,1f,
0f,0f,1f,
0f,0f,1f,
0f,0f,1f,
0f,0f,1f};
nBuffer.put(nArray);
nBuffer.flip();
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glColorPointer(3, 0, cBuffer);
glVertexPointer(3, 0, vBuffer);
glNormalPointer(3, nBuffer);
glDrawArrays(GL_TRIANGLES, 0, 6);
glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);
if (drawAxes) {
drawAxes(6);
}
glTranslatef(0.0f, 0.0f, 3);
glColor3f(0.1f, 0.4f, 0.9f);
}
public static void main(String[] args) throws LWJGLException {
System.setProperty("org.lwjgl.opengl.Display.allowSoftwareOpenGL", "true");
Renderer.start();
}
}
You are setting your normal pointer wrong:
glColorPointer(3, 0, cBuffer);
glVertexPointer(3, 0, vBuffer);
glNormalPointer(3, nBuffer);
The fixed-function GL always expects normals to be 3-dimensional vectors, hence the size parameter (which tells the GL how many values there are in each vector) is not present in glNormalPointer. The 3 you are passing here is the stride parameter, which specifies the byte offset between consecutive array elements. A stride of 3 does not make any sense: it would mean the second normal begins 3 bytes into the array, so when the GL reads the second normal's x component it combines the last byte of your first normal's x component with 3 bytes of your first normal's y component, and so on...
Since your array is tightly packed, you can use the shortcut 0 here, like you do with the other pointers.
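Concretely, the corrected pointer setup could look like this (assuming the LWJGL 2 overloads used in the question; only the normal pointer call changes):
glColorPointer(3, 0, cBuffer);   // 3 components per color, tightly packed
glVertexPointer(3, 0, vBuffer);  // 3 components per vertex, tightly packed
glNormalPointer(0, nBuffer);     // normals are always 3 components; stride 0 = tightly packed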
However, you must be aware that all of this has been deprecated in OpenGL for almost a decade; modern core versions of OpenGL do not support the fixed-function pipeline at all. If you are learning OpenGL nowadays, I strongly recommend learning modern, shader-based GL instead.
Without seeing more of your code, it's very difficult to see exactly what's going wrong.
However, I do see one thing that could be a problem:
FloatBuffer vBuffer = BufferUtils.createFloatBuffer(6*3);
float[] vArray = { 1f,1f,0f,
1f,-1f,0,
-1f,-1f,0,
1f,1f,0f,
-1f,1f,0,
-1f,-1f,0};
vBuffer.put(vArray);
vBuffer.flip();
The winding order of your triangles is not the same. The first triangle winds clockwise, whereas the second triangle winds counter-clockwise. You'll need to reorder the vertices to make sure that they wind in the same direction. OpenGL usually prefers things to wind counter-clockwise, so if I were you, I'd flip the first triangle, as sketched below.
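For instance, reversing the vertex order of the first triangle (a sketch based on the buffer quoted above) makes both triangles wind counter-clockwise:
float[] vArray = { -1f,-1f,0,   // first triangle, order reversed: now counter-clockwise
                   1f,-1f,0,
                   1f,1f,0f,
                   1f,1f,0f,    // second triangle, already counter-clockwise
                   -1f,1f,0,
                   -1f,-1f,0};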
If you're still getting the problem after you've done this, then post the rest of your draw code, as what you're showing here doesn't give a lot of information.
I've been trying to implement a 3D animation of a solar system in OpenGL (using JOGL). So far I have 5 planets of different sizes, but the problem I seem to be having is that I can't map an Earth texture onto a sphere. Can anybody help me with how it's done?
This is the code I have so far in my display method:
@Override
public void display(GLAutoDrawable drawable) {
GL2 gl = drawable.getGL().getGL2();
GLU glu = new GLU();
gl.glClear(GL.GL_COLOR_BUFFER_BIT);
//make sure we are in model_view mode
gl.glMatrixMode(GL2.GL_MODELVIEW);
gl.glLoadIdentity();
glu.gluLookAt(10,20,20,0,3,0,0, 20, 0);
//gl.glMatrixMode(GL2.GL_PROJECTION);
//glu.gluPerspective(45,1,1,25);
//render ground plane
gl.glPushMatrix();
gl.glTranslatef(-10.75f, 3.0f, -1.0f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth, GLU.GLU_FILL);
glu.gluQuadricNormals(earth, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth, GLU.GLU_OUTSIDE);
final float radius = 3.378f;
final int slices = 89;
final int stacks = 16;
glu.gluSphere(earth, radius, slices, stacks);
glu.gluDeleteQuadric(earth);
Texture earths;
try {
earths = TextureIO.newTexture(new File("earth.png"), true);
}
catch (IOException e) {
javax.swing.JOptionPane.showMessageDialog(null, e);
}
gl.glPopMatrix();
//gl.glEnd();
gl.glPushMatrix();
gl.glTranslatef(2.75f, 3.0f, -0.0f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth1 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth1, GLU.GLU_FILL);
glu.gluQuadricNormals(earth1, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth1, GLU.GLU_OUTSIDE);
final float radius1 = 3.378f;
final int slices1 = 90;
final int stacks1 = 63;
glu.gluSphere(earth1, radius1, slices1, stacks1);
glu.gluDeleteQuadric(earth1);
gl.glPopMatrix();
gl.glPushMatrix();
gl.glTranslatef(3.75f, 6.0f, -7.20f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth3 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth3, GLU.GLU_FILL);
glu.gluQuadricNormals(earth3, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth1, GLU.GLU_OUTSIDE);
final float radius3 = 1.878f;
final int slices3 = 89;
final int stacks3 = 16;
glu.gluSphere(earth3, radius3, slices3, stacks3);
glu.gluDeleteQuadric(earth3);
gl.glPopMatrix();
gl.glPushMatrix();
gl.glTranslatef(12.75f, 2.0f, -7.20f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth4 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth4, GLU.GLU_FILL);
glu.gluQuadricNormals(earth4, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth4, GLU.GLU_OUTSIDE);
final float radius4 = 1.078f;
final int slices4 = 89;
final int stacks4 = 16;
glu.gluSphere(earth4, radius4, slices4, stacks4);
glu.gluDeleteQuadric(earth4);
gl.glPopMatrix();
gl.glPushMatrix();
gl.glTranslatef(2.75f, -6.0f, -0.0f);
gl.glColor3f(0.3f, 0.5f, 1f);
GLUquadric earth5 = glu.gluNewQuadric();
glu.gluQuadricDrawStyle(earth5, GLU.GLU_FILL);
glu.gluQuadricNormals(earth5, GLU.GLU_FLAT);
glu.gluQuadricOrientation(earth5, GLU.GLU_OUTSIDE);
final float radius5 = 3.778f;
final int slices5 = 90;
final int stacks5 = 63;
glu.gluSphere(earth5, radius5, slices5, stacks5);
glu.gluDeleteQuadric(earth5);
gl.glPopMatrix();
}
create your own sphere mesh
a simple 2D loop through 2 angles (spherical coordinate system to Cartesian). You can easily add ellipsoid properties (the Earth is not a sphere) if you want more precision. If not, then you can use a single sphere mesh for all planets and just scale it before use...
Let a be the longitude and b the latitude, so loop a from 0 to 2*PI [rad] and b from -0.5*PI to +0.5*PI [rad], where PI = 3.1415... (in C++'s math.h it is called M_PI). If your math API uses degrees, then convert: PI [rad] = 180.0 [deg].
add necessary info per vertex
normals for lighting
// just unit sphere
nx=cos(b)*cos(a);
ny=cos(b)*sin(a);
nz=sin(b);
texture coordinates (assuming a rectangular, non-distorted image)
// just convert a,b to <0,1> range
tx=a/(2.0*PI);
ty=(b/PI)+0.5;
vertex position
// just sphere(rx=ry=rz=r) or ellipsoid (rx=ry=equatorial and rz=polar radius)
// can also use rx*nx,ry*ny,rz*nz instead ...
x=rx*cos(b)*cos(a);
y=ry*cos(b)*sin(a);
z=rz*sin(b);
send all of this to OpenGL
so store all of the above in some memory space (CPU or GPU) and then send it to rendering. You can use legacy glBegin(GL_QUAD_STRIP); ... glEnd(); or a display list/VBO/VAO. Bind the right texture before each planet/body, and do not forget to update the ModelView matrix too. This is what my coordinate systems look like (image not included here).
Also have a look at these related Q/As:
realistic n-body solar system
sphere mesh by subdivision
[edit1] C++ example
//---------------------------------------------------------------------------
const int nb=15; // slices
const int na=nb<<1; // points per equator
class planet
{
public:
bool _init; // has been initiated ?
GLfloat x0,y0,z0; // center of planet [GCS]
GLfloat pos[na][nb][3]; // vertex
GLfloat nor[na][nb][3]; // normal
GLfloat txr[na][nb][2]; // texcoord
GLuint txrid; // texture id
GLfloat t; // daily rotation angle [deg]
planet() { _init=false; txrid=0; x0=0.0; y0=0.0; z0=0.0; t=0.0; }
~planet() { if (_init) glDeleteTextures(1,&txrid); }
void init(GLfloat r,AnsiString texture); // call after OpenGL is already working !!!
void draw();
};
void planet::init(GLfloat r,AnsiString texture)
{
if (!_init) { _init=true; glGenTextures(1,&txrid); }
GLfloat x,y,z,a,b,da,db;
GLfloat tx0,tdx,ty0,tdy;// just correction if CLAMP_TO_EDGE is not available
int ia,ib;
// a,b to texture coordinate system
tx0=0.0;
ty0=0.5;
tdx=0.5/M_PI;
tdy=1.0/M_PI;
// load texture to GPU memory
if (texture!="")
{
Byte q;
unsigned int *pp;
int xs,ys,x,y,adr,*txr;
union { unsigned int c32; Byte db[4]; } c;
Graphics::TBitmap *bmp=new Graphics::TBitmap; // new bmp
bmp->LoadFromFile(texture); // load from file
bmp->HandleType=bmDIB; // allow direct access to pixels
bmp->PixelFormat=pf32bit; // set pixel to 32bit so int is the same size as pixel
xs=bmp->Width; // resolution should be power of 2
ys=bmp->Height;
txr=new int[xs*ys];
for(adr=0,y=0;y<ys;y++)
{
pp=(unsigned int*)bmp->ScanLine[y];
for(x=0;x<xs;x++,adr++)
{
// rgb2bgr and copy bmp -> txr[]
c.c32=pp[x];
q =c.db[2];
c.db[2]=c.db[0];
c.db[0]=q;
txr[adr]=c.c32;
}
}
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,txrid);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE,GL_MODULATE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, xs, ys, 0, GL_RGBA, GL_UNSIGNED_BYTE, txr);
glDisable(GL_TEXTURE_2D);
delete bmp;
delete[] txr;
// texture coordinates by 1 pixel from each edge (GL_CLAMP_TO_EDGE)
tx0+=1.0/GLfloat(xs);
ty0+=1.0/GLfloat(ys);
tdx*=GLfloat(xs-2)/GLfloat(xs);
tdy*=GLfloat(ys-2)/GLfloat(ys);
}
// correct texture coordinate system (invert x)
tx0=1.0-tx0; tdx=-tdx;
da=(2.0*M_PI)/GLfloat(na-1);
db= M_PI /GLfloat(nb-1);
for (ib=0,b=-0.5*M_PI;ib<nb;ib++,b+=db)
for (ia=0,a= 0.0 ;ia<na;ia++,a+=da)
{
x=cos(b)*cos(a);
y=cos(b)*sin(a);
z=sin(b);
nor[ia][ib][0]=x;
nor[ia][ib][1]=y;
nor[ia][ib][2]=z;
pos[ia][ib][0]=r*x;
pos[ia][ib][1]=r*y;
pos[ia][ib][2]=r*z;
txr[ia][ib][0]=tx0+(a*tdx);
txr[ia][ib][1]=ty0+(b*tdy);
}
}
void planet::draw()
{
if (!_init) return;
int ia,ib0,ib1;
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glTranslatef(x0,y0,z0);
glRotatef(90.0,1.0,0.0,0.0); // rotate planets z axis (North) to OpenGL y axis (Up)
glRotatef(-t,0.0,0.0,1.0); // daily rotation around the planet's axis
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,txrid);
glColor3f(1.0,1.0,1.0);
for (ib0=0,ib1=1;ib1<nb;ib0=ib1,ib1++)
{
glBegin(GL_QUAD_STRIP);
for (ia=0;ia<na;ia++)
{
glNormal3fv (nor[ia][ib0]);
glTexCoord2fv(txr[ia][ib0]);
glVertex3fv (pos[ia][ib0]);
glNormal3fv (nor[ia][ib1]);
glTexCoord2fv(txr[ia][ib1]);
glVertex3fv (pos[ia][ib1]);
}
glEnd();
}
glDisable(GL_TEXTURE_2D);
glMatrixMode(GL_MODELVIEW);
glPopMatrix();
}
//---------------------------------------------------------------------------
usage:
// variable to store planet (global)
planet earth;
// init after OpenGL initialisation
earth.init(1.0,"earth.bmp");
// position update
earth.x0= 0.0;
earth.y0= 0.0;
earth.z0=-20.0;
// add this to render loop
earth.draw(); // draws the planet
earth.t+=2.5; // just rotate planet by 2.5 deg each frame...
I know it's ugly, but it does not use any funny stuff, just legacy OpenGL and math.h (cos(), sin(), M_PI), plus VCL for the bitmap loading. So rewrite it for your environment and you will be fine. Do not forget that each planet has its own texture, so you need one txrid per planet: either keep each planet as a separate planet variable, or rewrite...
I've spent days searching, trying tutorials, and not actually getting results in this, so here I am.
I'm trying, simply put, to animate a collection of objects (Android Studio) on the screen, in a 2D format, each with independent movement and rotation. However, when I try this, I'm either not getting the object rendered, or it renders skewed (as if rotated through the vertical Y axis).
I know the importance of the order in which objects are drawn (to give the correct Z-ordering appearance); however, I'm at a bit of a loss with the matrix manipulation.
Here is what I have so far:
Main Activity - standard stuff
private GLSurfaceView mGLSurfaceView;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mGLSurfaceView = new GLSurfaceView(this);
//check if device supports ES 2.0
final ActivityManager activityManager = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
final ConfigurationInfo configurationInfo = activityManager.getDeviceConfigurationInfo();
final boolean supportsEs2 = configurationInfo.reqGlEsVersion >= 0x20000;
if (supportsEs2) {
//Get the ES2 compatible context
mGLSurfaceView.setEGLContextClientVersion(2);
//set renderer to my renderer below
mGLSurfaceView.setRenderer(new MyGL20Renderer(this));
} else {
//no support
return;
}
//setContentView(R.layout.activity_main);
setContentView(mGLSurfaceView);
}
GL20Renderer class - Notice I'm now just manually adding 2 objects to my collection to render
public class MyGL20Renderer implements GLSurfaceView.Renderer
{
private final Context mActivityContext;
//Matrix Initializations
private final float[] mMVPMatrix = new float[16];
private final float[] mProjMatrix = new float[16];
private final float[] mVMatrix = new float[16];
private float[] mRotationMatrix = new float[16];
private final float[] mRotateMatrix = new float[16];
private final float[] mMoveMatrix = new float[16];
private final float[] mTempMatrix = new float[16];
private final float[] mModelMatrix = new float[16];
private int numObjects = 2;
private ArrayList<Sprite> spriteList = new ArrayList<Sprite>();
//Declare as volatile because we are updating it from another thread
public volatile float mAngle;
//private Triangle triangle;
//private Sprite sprite;
public MyGL20Renderer(final Context activityContext)
{
mActivityContext = activityContext;
}
public void onSurfaceCreated(GL10 unused, EGLConfig config)
{
//Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
//Set the camera position (View Matrix) //mtx, offset, eyex,y,z, centrex,y,z, upx,y,z
Matrix.setLookAtM(mVMatrix, 0,
0, 0, -1.5f, //Eye XYZ - position eye behind the origin
0f, 0f, -5.0f, //Look XYZ - We are looking toward the distance
0f, 1.0f, 0.0f); //Up XYZ - Up vector - where head would be pointing if holding the camera
//Initialize Shapes
//triangle = new Triangle();
//sprite = new Sprite(mActivityContext);
//Sprite newSprite;
float xMax = 2.0f;
float yMax = 2.0f;
//rand = 0->1
float newX = (new Random().nextFloat() * xMax * 2) - xMax; //2.0f; //-2 -> +2
float newY = (new Random().nextFloat() * yMax * 2) - yMax; //-3 -> +3
float newZ = 0f;
//for (int i=0; i<numObjects; i++) {
//newSprite = new Sprite(mActivityContext);
//spriteList.add(new Sprite(mActivityContext, newX, newY, newZ));
//}
spriteList.add(new Sprite(mActivityContext, -0.0f, -0.0f, 0.0f));
spriteList.add(new Sprite(mActivityContext, +0.5f, -0.5f, 0.0f));
//spriteList.add(new Sprite(mActivityContext, -1.0f, +1.0f, 0.0f));
//spriteList.add(new Sprite(mActivityContext, +1.0f, +1.0f, 0.0f));
}
public void onDrawFrame(GL10 unused)
{
//init
Sprite currSprite;
//Redraw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
//timing
float jFactor = 0.1f;
long time = SystemClock.uptimeMillis() % 10000L;
float angleInDegrees = (360.0f / 1000.0f) * ((int) time) * jFactor;
/*
//number 1
//Matrix.setIdentityM(mModelMatrix, 0);
//currSprite = spriteList.get(0);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//currSprite.Draw(mModelMatrix);
//number 2
Matrix.setIdentityM(mModelMatrix, 0);
currSprite = spriteList.get(1);
Matrix.translateM(mModelMatrix, 0, 0.0f, -0.1f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, 90.0f, 1.0f, 0.0f, 0.0f);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
currSprite.Draw(mModelMatrix);
//Matrix.translateM(mModelMatrix, 0, 0, 0, 4.0f);
*/
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
//zoom out a bit?
Matrix.translateM(mMVPMatrix, 0, 0, 0, 4.0f);
//number 1
//currSprite = spriteList.get(0);
//Matrix.setIdentityM(mMVPMatrix, 0);
//Matrix.rotateM(mMVPMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//Matrix.translateM(mMVPMatrix, 0, currSprite.coordX, 0.0f, 0.0f);
//currSprite.coordX += 0.01f;
//currSprite.Draw(mMVPMatrix);
//number 2
currSprite = spriteList.get(0);
Matrix.setIdentityM(mMVPMatrix, 0);
Matrix.translateM(mMVPMatrix, 0, 0.0f, 0.0f, 0.0f);
Matrix.rotateM(mMVPMatrix, 0, angleInDegrees, 0.0f, 0.0f, +1.0f);
//float[] mTempMatrix = new float[16];
//mTempMatrix = mModelMatrix.clone();
//Matrix.multiplyMM(mMVPMatrix, 0, mMVPMatrix, 0, mRotateMatrix, 0);
//mTempMatrix = mMVPMatrix.clone();
//Matrix.multiplyMM(mMVPMatrix, 0, mTempMatrix, 0, mModelMatrix, 0);
//Matrix.setIdentityM(mMVPMatrix, 0);
currSprite.Draw(mMVPMatrix);
/*
//Set the camera position (View Matrix) //mtx, offset, eyex,y,z, centrex,y,z, upx,y,z
Matrix.setLookAtM(mVMatrix, 0,
0, 0, -10,
0f, 0f, 0f,
0f, 1.0f, 0.0f);
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
//zoom out a bit?
Matrix.translateM(mMVPMatrix, 0, 0, 0, 4.0f);
for (int i=0; i<numObjects; i++) {
//Create a rotation transformation for the triangle
//Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, -1.0f);
Matrix.setRotateM(mRotationMatrix, 0, 0, 0, 0, -1.0f); //-1.0 = Z, for some reason need this. Grr
//Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0);
//Draw Shape
//triangle.Draw(mMVPMatrix);
//sprite.Draw(mMVPMatrix);
currSprite = spriteList.get(i);
//Move the object to the passed initial coordinates?
//Matrix.translateM(mMVPMatrix, 0, currSprite.coordX, currSprite.coordY, currSprite.coordZ);
currSprite.Draw(mMVPMatrix);
}
*/
}
public void onSurfaceChanged(GL10 unused, int width, int height)
{
GLES20.glViewport(0, 0, width, height);
if (height == 0) {
height = 1; //incase of div 0 errors
}
float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.f;
//This Projection Matrix is applied to object coordinates in the onDrawFrame() method
//Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
Matrix.frustumM(mProjMatrix, 0, left, right, bottom, top, near, far);
}
public static int loadShader(int type, String shaderCode)
{
//Create a Vertex Shader Type Or a Fragment Shader Type (GLES20.GL_VERTEX_SHADER OR GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
//Add The Source Code and Compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
Please excuse the commented-out code in onDrawFrame(), where I've been experimenting, and failing.
Sprite Class
public class Sprite
{
//Reference to Activity Context
private final Context mActivityContext;
//Added for Textures
private final FloatBuffer mCubeTextureCoordinates;
private int mTextureUniformHandle;
private int mTextureCoordinateHandle;
private final int mTextureCoordinateDataSize = 2;
private int mTextureDataHandle;
private final String vertexShaderCode =
//Test
"attribute vec2 a_TexCoordinate;" +
"varying vec2 v_TexCoordinate;" +
//End Test
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vPosition * uMVPMatrix;" +
//Test
"v_TexCoordinate = a_TexCoordinate;" +
//End Test
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
//Test
"uniform sampler2D u_Texture;" +
"varying vec2 v_TexCoordinate;" +
//End Test
"void main() {" +
//"gl_FragColor = vColor;" +
"gl_FragColor = (vColor * texture2D(u_Texture, v_TexCoordinate));" +
"}";
private final int shaderProgram;
private final FloatBuffer vertexBuffer;
private final ShortBuffer drawListBuffer;
private int mPositionHandle;
private int mColorHandle;
private int mMVPMatrixHandle;
public float coordX;
public float coordY;
//public float coordZ;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 2;
static float spriteCoords[] = { -0.5f, 0.5f, // top left
-0.5f, -0.5f, // bottom left
0.5f, -0.5f, // bottom right
0.5f, 0.5f }; //top right
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; //Order to draw vertices
private final int vertexStride = COORDS_PER_VERTEX * 4; //Bytes per vertex
// Set color with red, green, blue and alpha (opacity) values
//float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
float color[] = { 1f, 1f, 1f, 1.0f };
public Sprite(final Context activityContext, float initX, float initY, float initZ)
{
mActivityContext = activityContext;
this.coordX = initX;
this.coordY = initY;
//this.coordZ = initZ;
//ergh - will do manually for now. Paxo n00b
//just a 2D array, no need for Z nonsense
for (int i=0; i<spriteCoords.length; i++) {
spriteCoords[i] -= (i%2==0) ? coordX : coordY; //- works better than +
}
//float newPosMatrix[] = { initX, initY, 0f };
//adjust the vector coords accordingly
//Matrix.multiplyMV(spriteCoords, 0, newPosMatrix, 0, spriteCoords, 0);
//Initialize Vertex Byte Buffer for Shape Coordinates / # of coordinate values * 4 bytes per float
ByteBuffer bb = ByteBuffer.allocateDirect(spriteCoords.length * 4);
//Use the Device's Native Byte Order
bb.order(ByteOrder.nativeOrder());
//Create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
//Add the coordinates to the FloatBuffer
vertexBuffer.put(spriteCoords);
//Set the Buffer to Read the first coordinate
vertexBuffer.position(0);
// S, T (or X, Y)
// Texture coordinate data.
// Because images have a Y axis pointing downward (values increase as you move down the image) while
// OpenGL has a Y axis pointing upward, we adjust for that here by flipping the Y axis.
// What's more is that the texture coordinates are the same for every face.
final float[] cubeTextureCoordinateData =
{
//Front face
/*0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f*/
/*-0.5f, 0.5f,
-0.5f, -0.5f,
0.5f, -0.5f,
0.5f, 0.5f*/
0f, 1f,
0f, 0f,
1f, 0f,
1f, 1f
};
mCubeTextureCoordinates = ByteBuffer.allocateDirect(cubeTextureCoordinateData.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
mCubeTextureCoordinates.put(cubeTextureCoordinateData).position(0);
//Initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(spriteCoords.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
int vertexShader = MyGL20Renderer.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = MyGL20Renderer.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
shaderProgram = GLES20.glCreateProgram();
GLES20.glAttachShader(shaderProgram, vertexShader);
GLES20.glAttachShader(shaderProgram, fragmentShader);
//Texture Code
GLES20.glBindAttribLocation(shaderProgram, 0, "a_TexCoordinate");
GLES20.glLinkProgram(shaderProgram);
//Load the texture
mTextureDataHandle = loadTexture(mActivityContext, R.drawable.cube);
}
public void Draw(float[] mvpMatrix)
{
//Add program to OpenGL ES Environment
GLES20.glUseProgram(shaderProgram);
//Get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
//Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
//Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
//Get Handle to Fragment Shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(shaderProgram, "vColor");
//Set the Color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
//Set Texture Handles and bind Texture
mTextureUniformHandle = GLES20.glGetAttribLocation(shaderProgram, "u_Texture");
mTextureCoordinateHandle = GLES20.glGetAttribLocation(shaderProgram, "a_TexCoordinate");
//Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
//Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle);
//Tell the texture uniform sampler to use this texture in the shader by binding to texture unit 0.
GLES20.glUniform1i(mTextureUniformHandle, 0);
//Pass in the texture coordinate information
mCubeTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(mTextureCoordinateHandle, mTextureCoordinateDataSize, GLES20.GL_FLOAT, false, 0, mCubeTextureCoordinates);
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
//Get Handle to Shape's Transformation Matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(shaderProgram, "uMVPMatrix");
//Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//glTranslatef(0f, 0f, 0f);
//Draw the triangle
GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
//Disable Vertex Array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
public static int loadTexture(final Context context, final int resourceId)
{
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
final Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
}
Now, I don't know if I'm going about this the right way at all, but I simply want to just animate the collection of Sprite objects in spriteList.
More specifically, I want to have a collection of 3 objects and then respond to screen touches, animating the objects to the touched location (but that will come later).
Initially, I just want to be able to correctly render these objects (with initial locations) and then rotate them on the centre point (about the Z axis).
For some reason, translateM is warping the texture (as if about the Y axis) and not actually moving the object in the X/Y plane.
Many thanks for any help you can offer. As you can see I'm fairly new to OpenGL and have had little luck with the limited tutorials out there that support Android Studio and GLES2.0.
Kind regards,
James
I think the problem is that you have not multiplied the translation matrices into your rotation matrices. A matrix multiply is required to combine those.
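As a rough sketch of what that could look like in onDrawFrame() (reusing names from the question such as mProjMatrix, mVMatrix, mModelMatrix, spriteList and angleInDegrees; treat it as an illustration, not a drop-in fix):
// Combined projection * view, computed once per frame.
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);

for (Sprite currSprite : spriteList) {
    // Per-sprite model matrix: translate to the sprite's position,
    // then rotate around the Z axis for a 2D spin.
    Matrix.setIdentityM(mModelMatrix, 0);
    Matrix.translateM(mModelMatrix, 0, currSprite.coordX, currSprite.coordY, 0f);
    Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0f, 0f, 1f);

    // Combine into MVP = (projection * view) * model via a multiply,
    // rather than overwriting mMVPMatrix with setIdentityM().
    float[] mvp = new float[16];
    Matrix.multiplyMM(mvp, 0, mMVPMatrix, 0, mModelMatrix, 0);

    currSprite.Draw(mvp);
}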
I've been making an Android OpenGLES2.0 2D game engine for the past week or so, and after a few bumps in the road, I've largely been successful. I've got the ModelMatrix, ProjectionMatrix, ViewMatrix, LightMatrix, shaders, 2D planes, and textures implemented. However, although my data is seemingly passing through this jungle of pipeline just fine, my textures do not appear, and are instead a solid black.
Most, if not all of my code was derived from this source, and it is ultimately the same, except that I created my own shader class, bounding box class, room class, and game object class to simplify the process of instantiating objects in-game. Renderer takes Room, Room takes GameObject(s) (SpaceShip extends game object), and GameObject takes BoundingBox, then Renderer renders the room's objects in a for loop. To do this, I moved the exact code from the example around so that certain handles are elements of some of the classes I created, instead of being elements of the renderer. This hasn't caused any problems with matrix multiplication or my data reaching the end of the pipeline, so I doubt moving the handles is the problem, but I felt it was important to know.
Things I've tried:
Changing the bitmap
Changed it to a bitmap with no alpha channel; both were 32x32 (2^5) and were .png files.
Changing the order of operations
I moved glBindTexture in my implementation, so I moved it back, then back again.
Changing the texture parameters
I tried several combinations, none with mip-mapping
Changing the way I load the image
Went from BitmapFactory.decodeResource to BitmapFactory.decodeStream
Moved the texture to all drawable folders
Also tried it in the raw folder
Tried it on another device
My friend's DROID (Froyo 2.2), My rooted NextBook (Gingerbread 2.3). Both support OpenGLES2.0.
Things I haven't tried (that I'm aware of):
Changing the texture coordinates
They came directly from the example. I just took one face of the cube.
Changing my shader
It also came directly from the example (aside from it being it's own class now).
Restructuring my program to be just two (3, 4... x) classes
Dude...
I've been testing on the emulator (Eclipse Indigo, AVD, Intel Atom x86, ICS 4.2.2, API level 17) for some time now, and right about the time I got all the matrices working, the emulator stopped rendering anything. It used to render just fine (when the projection was all screwy); now it just shows up black with a title bar. This has made debugging incredibly difficult. I'm not sure if this is something related to what I've done (it probably is) or if it is related to the emulator sucking at OpenGL.
Sorry to be so long winded and include so much code, but I don't know how to use a show/hide button.
Any ideas?
Edit: I was using the wrong shader from the example. The naming was very misleading. I wasn't passing in the color info. I still don't have texture, but the emulator works again. :)
OpenGLES20_2DRenderer
package mycompany.OpenGLES20_2DEngine;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
public class OpenGLES20_2DRenderer implements GLSurfaceView.Renderer {
/** Used for debug logs. */
private static final String TAG = "Renderer";
//Matrix Declarations*************************
/**
* Store the model matrix. This matrix is used to move models from object space (where each model can be thought
* of being located at the center of the universe) to world space.
*/
private float[] mModelMatrix = new float[16];
/**
* Store the view matrix. This can be thought of as our camera. This matrix transforms world space to eye space;
* it positions things relative to our eye.
*/
private float[] mViewMatrix = new float[16];
/** Store the projection matrix. This is used to project the scene onto a 2D viewport. */
private float[] mProjectionMatrix = new float[16];
/** Allocate storage for the final combined matrix. This will be passed into the shader program. */
private float[] mMVPMatrix = new float[16];
/**
* Stores a copy of the model matrix specifically for the light position.
*/
private float[] mLightModelMatrix = new float[16];
//********************************************
//Global Variable Declarations****************
//Shader
Shader shader;
//PointShader
PointShader pointShader;
//Application Context
Context context;
//A room to add objects to
Room room;
//********************************************
public OpenGLES20_2DRenderer(Context ctx) {
context = ctx;
}
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
//Initialize GLES20***************************
// Set the background frame color
GLES20.glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
// Use culling to remove back faces.
GLES20.glEnable(GLES20.GL_CULL_FACE);
// Enable depth testing
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
// Position the eye in front of the origin.
final float eyeX = 0.0f;
final float eyeY = 0.0f;
final float eyeZ = -0.5f;
// We are looking toward the distance
final float lookX = 0.0f;
final float lookY = 0.0f;
final float lookZ = -5.0f;
// Set our up vector. This is where our head would be pointing were we holding the camera.
final float upX = 0.0f;
final float upY = 1.0f;
final float upZ = 0.0f;
// Set the view matrix. This matrix can be said to represent the camera position.
// NOTE: In OpenGL 1, a ModelView matrix is used, which is a combination of a model and
// view matrix. In OpenGL 2, we can keep track of these matrices separately if we choose.
Matrix.setLookAtM(mViewMatrix, 0, eyeX, eyeY, eyeZ, lookX, lookY, lookZ, upX, upY, upZ);
//********************************************
//Initialize Shaders**************************
shader = new Shader();
pointShader = new PointShader();
//********************************************
//Load The Level******************************
//Create a new room
room = new Room(800,600, 0);
//Load game objects
SpaceShip user = new SpaceShip();
//Load sprites
for(int i=0;i<room.numberOfGameObjects;i++) {
room.gameObjects[i].spriteGLIndex = room.gameObjects[i].loadSprite(context, room.gameObjects[i].spriteResId);
}
//Add them to the room
room.addGameObject(user);
//********************************************
}
public void onDrawFrame(GL10 unused) {
//Caclulate MVPMatrix*************************
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// Set our per-vertex lighting program.
GLES20.glUseProgram(shader.mProgram);
// Set program handles for object drawing.
shader.mMVPMatrixHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_MVPMatrix");
shader.mMVMatrixHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_MVMatrix");
shader.mLightPosHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_LightPos");
shader.mTextureUniformHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_Texture");
shader.mPositionHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_Position");
shader.mColorHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_Color");
shader.mNormalHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_Normal");
shader.mTextureCoordinateHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_TexCoordinate");
// Calculate position of the light. Rotate and then push into the distance.
Matrix.setIdentityM(mLightModelMatrix, 0);
Matrix.translateM(mLightModelMatrix, 0, 0.0f, 0.0f, -5.0f);
Matrix.rotateM(mLightModelMatrix, 0, 0, 0.0f, 1.0f, 0.0f);
Matrix.translateM(mLightModelMatrix, 0, 0.0f, 0.0f, 2.0f);
Matrix.multiplyMV(shader.mLightPosInWorldSpace, 0, mLightModelMatrix, 0, shader.mLightPosInModelSpace, 0);
Matrix.multiplyMV(shader.mLightPosInEyeSpace, 0, mViewMatrix, 0, shader.mLightPosInWorldSpace, 0);
//********************************************
//Draw****************************************
//Draw the background
//room.drawBackground(mMVPMatrix);
// Draw game objects
for(int i=0;i<room.numberOfGameObjects;i++) {
// Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, room.gameObjects[i].spriteGLIndex);
// Tell the texture uniform sampler to use this texture in the shader by binding to texture unit 0.
GLES20.glUniform1i(shader.mTextureUniformHandle, 0);
//Set up the model matrix
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 4.0f, 0.0f, -7.0f);
Matrix.rotateM(mModelMatrix, 0, room.gameObjects[i].rotation, 1.0f, 0.0f, 0.0f);
//Draw the object
room.gameObjects[i].draw(mModelMatrix, mViewMatrix, mProjectionMatrix, mMVPMatrix, shader);
}
//********************************************
// Draw a point to indicate the light.********
drawLight();
//********************************************
}
public void onSurfaceChanged(GL10 unused, int width, int height) {
//Initialize Projection Matrix****************
// Set the OpenGL viewport to the same size as the surface.
GLES20.glViewport(0, 0, width, height);
// Create a new perspective projection matrix. The height will stay the same
// while the width will vary as per aspect ratio.
final float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.0f;
Matrix.frustumM(mProjectionMatrix, 0, left, right, bottom, top, near, far);
//********************************************
}
// Draws a point representing the position of the light.
private void drawLight()
{
GLES20.glUseProgram(pointShader.mProgram);
final int pointMVPMatrixHandle = GLES20.glGetUniformLocation(pointShader.mProgram, "u_MVPMatrix");
final int pointPositionHandle = GLES20.glGetAttribLocation(pointShader.mProgram, "a_Position");
// Pass in the position.
GLES20.glVertexAttrib3f(pointPositionHandle, shader.mLightPosInModelSpace[0], shader.mLightPosInModelSpace[1], shader.mLightPosInModelSpace[2]);
// Since we are not using a buffer object, disable vertex arrays for this attribute.
GLES20.glDisableVertexAttribArray(pointPositionHandle);
// Pass in the transformation matrix.
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mLightModelMatrix, 0);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(pointMVPMatrixHandle, 1, false, mMVPMatrix, 0);
// Draw the point.
GLES20.glDrawArrays(GLES20.GL_POINTS, 0, 1);
}
}
Shader
package mycompany.OpenGLES20_2DEngine;
import android.opengl.GLES20;
import android.util.Log;
public class Shader {
/** Used for debug logs. */
private static final String TAG = "Shader";
//Shaders*************************************
public int vertexShader;
public int fragmentShader;
//********************************************
//Handles*************************************
/** This will be used to pass in model position information. */
public int mPositionHandle;
/** This will be used to pass in model color information. */
public int mColorHandle;
/** This will be used to pass in model normal information. */
public int mNormalHandle;
/** This will be used to pass in model texture coordinate information. */
public int mTextureCoordinateHandle;
/** This will be used to pass in the transformation matrix. */
public int mMVPMatrixHandle;
/** This will be used to pass in the modelview matrix. */
public int mMVMatrixHandle;
/** This will be used to pass in the light position. */
public int mLightPosHandle;
/** This will be used to pass in the texture. */
public int mTextureUniformHandle;
/** Used to hold a light centered on the origin in model space. We need a 4th coordinate so we can get translations to work when
* we multiply this by our transformation matrices. */
public final float[] mLightPosInModelSpace = new float[] {0.0f, 0.0f, 0.0f, 1.0f};
/** Used to hold the current position of the light in world space (after transformation via model matrix). */
public final float[] mLightPosInWorldSpace = new float[4];
/** Used to hold the transformed position of the light in eye space (after transformation via modelview matrix) */
public final float[] mLightPosInEyeSpace = new float[4];
//********************************************
//GL Code For Shaders*************************
public final String vertexShaderCode =
// A constant representing the combined model/view/projection matrix.
"uniform mat4 u_MVPMatrix;" + "\n" +
// A constant representing the combined model/view matrix.
"uniform mat4 u_MVMatrix;" + "\n" +
// Per-vertex position information we will pass in.
"attribute vec4 a_Position;" + "\n" +
// Per-vertex normal information we will pass in.
"attribute vec3 a_Normal;" + "\n" +
// Per-vertex texture coordinate information we will pass in.
"attribute vec2 a_TexCoordinate;" + "\n" +
// This will be passed into the fragment shader.
"varying vec3 v_Position;" + "\n" +
// This will be passed into the fragment shader.
"varying vec3 v_Normal;" + "\n" +
// This will be passed into the fragment shader.
"varying vec2 v_TexCoordinate;" + "\n" +
// The entry point for our vertex shader.
"void main()" + "\n" +
"{" + "\n" +
// Transform the vertex into eye space.
"v_Position = vec3(u_MVMatrix * a_Position);" + "\n" +
// Pass through the texture coordinate.
"v_TexCoordinate = a_TexCoordinate;" + "\n" +
// Transform the normal's orientation into eye space.
"v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));" + "\n" +
// gl_Position is a special variable used to store the final position.
// Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
"gl_Position = u_MVPMatrix * a_Position;" + "\n" +
"}";
public final String fragmentShaderCode =
"precision mediump float;" + "\n" + // Set the default precision to medium. We don't need as high of a
// precision in the fragment shader.
"uniform vec3 u_LightPos;" + "\n" + // The position of the light in eye space.
"uniform sampler2D u_Texture;" + "\n" + // The input texture.
"varying vec3 v_Position;" + "\n" + // Interpolated position for this fragment.
"varying vec3 v_Normal;" + "\n" + // Interpolated normal for this fragment.
"varying vec2 v_TexCoordinate;" + "\n" + // Interpolated texture coordinate per fragment.
// The entry point for our fragment shader.
"void main()" + "\n" +
"{" + "\n" +
// Will be used for attenuation.
"float distance = length(u_LightPos - v_Position);" + "\n" +
// Get a lighting direction vector from the light to the vertex.
"vec3 lightVector = normalize(u_LightPos - v_Position);" + "\n" +
// Calculate the dot product of the light vector and vertex normal. If the normal and light vector are
// pointing in the same direction then it will get max illumination.
"float diffuse = max(dot(v_Normal, lightVector), 0.0);" + "\n" +
// Add attenuation.
"diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance)));" + "\n" +
// Add ambient lighting
"diffuse = diffuse + 0.7;" + "\n" +
// Multiply the color by the diffuse illumination level and texture value to get final output color.
"gl_FragColor = (diffuse * texture2D(u_Texture, v_TexCoordinate));" + "\n" +
"}";
//********************************************
//GL Program Handle***************************
public int mProgram;
//********************************************
public Shader() {
//Load Shaders********************************
vertexShader = compileShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
fragmentShader = compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
//********************************************
//Create GL Program***************************
mProgram = createAndLinkProgram(vertexShader, fragmentShader, new String[] {"a_Position", "a_Color", "a_Normal", "a_TexCoordinate"});
//********************************************
}
/**
* Helper function to compile a shader.
*
* @param shaderType The shader type.
* @param shaderSource The shader source code.
* @return An OpenGL handle to the shader.
*/
public static int compileShader(final int shaderType, final String shaderSource)
{
int shaderHandle = GLES20.glCreateShader(shaderType);
if (shaderHandle != 0)
{
// Pass in the shader source.
GLES20.glShaderSource(shaderHandle, shaderSource);
// Compile the shader.
GLES20.glCompileShader(shaderHandle);
// Get the compilation status.
final int[] compileStatus = new int[1];
GLES20.glGetShaderiv(shaderHandle, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
// If the compilation failed, delete the shader.
if (compileStatus[0] == 0)
{
Log.e(TAG, "Error compiling shader " /*+ GLES20.glGetShaderInfoLog(shaderHandle)*/);
GLES20.glDeleteShader(shaderHandle);
shaderHandle = 0;
}
}
if (shaderHandle == 0)
{
throw new RuntimeException("Error creating shader.");
}
return shaderHandle;
}
/**
* Helper function to compile and link a program.
*
* @param vertexShaderHandle An OpenGL handle to an already-compiled vertex shader.
* @param fragmentShaderHandle An OpenGL handle to an already-compiled fragment shader.
* @param attributes Attributes that need to be bound to the program.
* @return An OpenGL handle to the program.
*/
public static int createAndLinkProgram(final int vertexShaderHandle, final int fragmentShaderHandle, final String[] attributes)
{
int programHandle = GLES20.glCreateProgram();
if (programHandle != 0)
{
// Bind the vertex shader to the program.
GLES20.glAttachShader(programHandle, vertexShaderHandle);
// Bind the fragment shader to the program.
GLES20.glAttachShader(programHandle, fragmentShaderHandle);
// Bind attributes
if (attributes != null)
{
final int size = attributes.length;
for (int i = 0; i < size; i++)
{
GLES20.glBindAttribLocation(programHandle, i, attributes[i]);
}
}
// Link the two shaders together into a program.
GLES20.glLinkProgram(programHandle);
// Get the link status.
final int[] linkStatus = new int[1];
GLES20.glGetProgramiv(programHandle, GLES20.GL_LINK_STATUS, linkStatus, 0);
// If the link failed, delete the program.
if (linkStatus[0] == 0)
{
Log.e(TAG, "Error compiling program " /*+ GLES20.glGetProgramInfoLog(programHandle)*/);
GLES20.glDeleteProgram(programHandle);
programHandle = 0;
}
}
if (programHandle == 0)
{
throw new RuntimeException("Error creating program.");
}
return programHandle;
}
}
GameObject
package mycompany.OpenGLES20_2DEngine;
import java.io.IOException;
import java.io.InputStream;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLES20;
import android.opengl.GLUtils;
import android.opengl.Matrix;
import android.util.Log;
public class GameObject {
/** Used for debug logs. */
private static final String TAG = "GameObject";
//Declare Variables****************************
//Position
public int x;
public int y;
public int z;
//Size
public int width;
public int height;
//Movement
double thrustX;
double thrustY;
//Rotation
public int rotation;
public int rotationSpeed;
//Unique Identifier
public int UID;
//Sprite Resource ID
int spriteResId;
//GL Texture Reference
int spriteGLIndex;
//Bounding Box
BoundingBox boundingBox;
//********************************************
GameObject() {
}
public int loadSprite(final Context context, final int resourceId) {
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
InputStream is = context.getResources()
.openRawResource(resourceId);
Bitmap bitmap = null;
try {
bitmap = BitmapFactory.decodeStream(is);
is.close();
} catch(IOException e) {
Log.e(TAG, "Could not load the texture");
}
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
//TODO: Offending Line - Makes textures black because of parameters
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
public void setUID(int uid) {
UID = uid;
}
public int getUID() {
return UID;
}
public void draw(float[] mModelMatrix, float[] mViewMatrix, float[] mProjectionMatrix, float[] mMVPMatrix, Shader shader) {
// Pass in the position information
boundingBox.mPositions.position(0);
GLES20.glVertexAttribPointer(shader.mPositionHandle, boundingBox.mPositionDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mPositions);
GLES20.glEnableVertexAttribArray(shader.mPositionHandle);
// Pass in the color information
boundingBox.mColors.position(0);
GLES20.glVertexAttribPointer(shader.mColorHandle, boundingBox.mColorDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mColors);
GLES20.glEnableVertexAttribArray(shader.mColorHandle);
// Pass in the normal information
boundingBox.mNormals.position(0);
GLES20.glVertexAttribPointer(shader.mNormalHandle, boundingBox.mNormalDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mNormals);
GLES20.glEnableVertexAttribArray(shader.mNormalHandle);
// Pass in the texture coordinate information
boundingBox.mTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(shader.mTextureCoordinateHandle, boundingBox.mTextureCoordinateDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mTextureCoordinates);
GLES20.glEnableVertexAttribArray(shader.mTextureCoordinateHandle);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// Pass in the modelview matrix.
GLES20.glUniformMatrix4fv(shader.mMVMatrixHandle, 1, false, mMVPMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
// Pass in the combined matrix.
GLES20.glUniformMatrix4fv(shader.mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
// Pass in the light position in eye space.
GLES20.glUniform3f(shader.mLightPosHandle, shader.mLightPosInEyeSpace[0], shader.mLightPosInEyeSpace[1], shader.mLightPosInEyeSpace[2]);
// Draw the object
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 6);
}
}
BoundingBox
package mycompany.OpenGLES20_2DEngine;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
//TODO: make this dynamic, both the constructor and the coordinates.
class BoundingBox {
//Variable Declarations***********************
/** How many bytes per float. */
private final int mBytesPerFloat = 4;
/** Store our model data in a float buffer. */
public final FloatBuffer mPositions;
public final FloatBuffer mColors;
public final FloatBuffer mNormals;
public final FloatBuffer mTextureCoordinates;
//Number of coordinates per vertex in this array
final int COORDS_PER_VERTEX = 3;
//Coordinates
float[] positionData;
//Texture Coordinates
float[] textureCoordinateData;
//Vertex Color
float[] colorData;
float[] normalData;
//Vertex Stride
final int vertexStride = COORDS_PER_VERTEX * 4;
/** Size of the position data in elements. */
public final int mPositionDataSize = 3;
/** Size of the color data in elements. */
public final int mColorDataSize = 4;
/** Size of the normal data in elements. */
public final int mNormalDataSize = 3;
/** Size of the texture coordinate data in elements. */
public final int mTextureCoordinateDataSize = 2;
//********************************************
public BoundingBox(float[] coords) {
//TODO: Normalize values
//Set Coordinates and Texture Coordinates*****
if(coords==null) {
float[] newPositionData = {
// Front face
-1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f
};
positionData = newPositionData;
float[] newColorData = {
// Front face (red)
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f
};
colorData = newColorData;
float[] newTextureCoordinateData =
{
// Front face
0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f,
};
textureCoordinateData = newTextureCoordinateData;
float[] newNormalData = {
// Front face
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f
};
normalData = newNormalData;
}
else {
positionData = coords;
//TODO:Reverse coords HERE
textureCoordinateData = coords;
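//NOTE: colorData and normalData are never assigned in this branch, so the
//buffer allocations below will throw a NullPointerException when coords != null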
}
//********************************************
//Initialize Buffers**************************
mPositions = ByteBuffer.allocateDirect(positionData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mPositions.put(positionData).position(0);
mColors = ByteBuffer.allocateDirect(colorData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mColors.put(colorData).position(0);
mNormals = ByteBuffer.allocateDirect(normalData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mNormals.put(normalData).position(0);
mTextureCoordinates = ByteBuffer.allocateDirect(textureCoordinateData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mTextureCoordinates.put(textureCoordinateData).position(0);
//********************************************
}
}
SpaceShip
package mycompany.OpenGLES20_2DEngine;
public class SpaceShip extends GameObject{
public SpaceShip() {
spriteResId = R.drawable.spaceship;
boundingBox = new BoundingBox(null);
}
}
Got it. I added the spaceship to the room AFTER I loaded its bitmap (from the room), so the sprite-loading loop, which iterates over the room's objects, never loaded the spaceship's texture:
//Load The Level******************************
//Create a new room
room = new Room(800,600, 0);
//Load game objects
SpaceShip user = new SpaceShip();
//Load sprites (BUG: the spaceship has not been added to the room yet)
for(int i=0;i<room.numberOfGameObjects;i++) {
room.gameObjects[i].spriteGLIndex = room.gameObjects[i].loadSprite(context, room.gameObjects[i].spriteResId);
}
//Add them to the room
room.addGameObject(user);
//********************************************
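In other words, the spaceship has to be in the room before the sprite-loading loop runs. A minimal sketch of the corrected order (the same code with addGameObject moved above the loop):
//Load The Level******************************
//Create a new room
room = new Room(800,600, 0);
//Load game objects
SpaceShip user = new SpaceShip();
//Add them to the room FIRST
room.addGameObject(user);
//THEN load sprites for every object the room now contains
for(int i=0;i<room.numberOfGameObjects;i++) {
room.gameObjects[i].spriteGLIndex = room.gameObjects[i].loadSprite(context, room.gameObjects[i].spriteResId);
}
//********************************************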
Would anyone share an example of how to implement a GradientRectangle that has a different color for each vertex?
I've tried calling glColorPointer from GL10 with a float buffer, and from GL11 using a selectOnHardware approach similar to the one used for vertices, but both methods failed for me...
On the AndEngine forum I found this code, but it does not work; however, maybe it will help someone find a better solution.
That example does not work for you because the author has not shown the piece of code responsible for setting up the vertices.
Here is my example (it is long, but that's OpenGL...).
NOTE - remember to set up the viewport correctly (a minimal renderer sketch follows the listings below).
public static void drawGradientRectangle(GL10 gl, float centerX, float centerY,
float width, float height) {
gl.glPushMatrix();
gl.glDisable(GL10.GL_TEXTURE_2D);
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY); //just in case if you have not done that before
gl.glFrontFace(GL10.GL_CCW); //Set the face
gl.glTranslatef(centerX, centerY, 0);
if (width != 1 || height != 1) {
gl.glScalef(width, height, 1);
}
gl.glVertexPointer(2, GL10.GL_FLOAT, 0, GLDrawConstants.vertexBuffer0_5);
gl.glColorPointer(4, GL10.GL_FLOAT, 0, GLDrawConstants.gradOrangeWhiteBuffer);
// Draw the vertices as triangle strip
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);
gl.glDisableClientState(GL10.GL_COLOR_ARRAY);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glPopMatrix();
}
GLDrawConstants class:
import java.nio.FloatBuffer;
public class GLDrawConstants {
public static final FloatBuffer gradOrangeWhiteBuffer;
public static final FloatBuffer vertexBuffer0_5;
private static final float vertices0_5[] = {
-0.5f, -0.5f,// Bottom Left
0.5f, -0.5f,// Bottom right
-0.5f, 0.5f,// Top Left
0.5f, 0.5f// Top Right
};
private static final float gradOrangeWhiteColor[] = {
255/255f, 239/255f, 196/255f, 0f, // Bottom Left
255/255f, 239/255f, 196/255f, 0f, // Bottom right
250/255f, 200/255f, 62/255f, 0.3f, // Top Left
250/255f, 200/255f, 62/255f, 0.3f // Top Right
};
static {
gradOrangeWhiteBuffer = WDUtils.floatBuffer(gradOrangeWhiteColor);
vertexBuffer0_5 = WDUtils.floatBuffer(vertices0_5);
}
}
WDUtils class:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
public class WDUtils {
/**
* Make a direct NIO FloatBuffer from an array of floats
*
* @param arr
* The array
* @return The newly created FloatBuffer
*/
public static final FloatBuffer floatBuffer(float[] arr) {
ByteBuffer bb = ByteBuffer.allocateDirect(arr.length * 4);
bb.order(ByteOrder.nativeOrder());
FloatBuffer fb = bb.asFloatBuffer();
fb.put(arr);
fb.position(0);
return fb;
}
}
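For completeness, here is a minimal sketch of how these pieces might be wired into a GL10 renderer. This is only an illustration: the class GLDrawUtils (assumed to hold drawGradientRectangle), the pixel-based orthographic projection, and the center/size arguments are my assumptions, not part of the answer above.
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLSurfaceView;
public class GradientRenderer implements GLSurfaceView.Renderer {
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glClearColor(0f, 0f, 0f, 1f);
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
//Set up the viewport correctly, as the NOTE above warns
gl.glViewport(0, 0, width, height);
gl.glMatrixMode(GL10.GL_PROJECTION);
gl.glLoadIdentity();
//Simple 2D orthographic projection in pixel coordinates (an assumption)
gl.glOrthof(0, width, 0, height, -1, 1);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
}
public void onDrawFrame(GL10 gl) {
gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
//Draw a 200x100 gradient rectangle centered at (160, 240);
//GLDrawUtils is a hypothetical holder class for drawGradientRectangle()
GLDrawUtils.drawGradientRectangle(gl, 160f, 240f, 200f, 100f);
}
}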
Or you can use (or backport) the Gradient class from the GLES2-AnchorCenter branch, which gives you a super easy-to-use and feature-rich API without having to worry about OpenGL. =)