So, I'm trying to make a simple 3D engine based on OpenGL ES 2 (Android). First (apart from reading many tutorials and such), I copy-pasted the code from the developer.android.com tutorial step by step. Everything worked fine. Then I started to modify it. I changed the Mesh class's fields and constructor so that the vertex coordinates, shader code, color and number of vertices are not preset (all the code is at the end of the question).
I changed the Mesh field in the GL20Renderer class to private List<Mesh> meshs = new ArrayList<Mesh>(); and created functions for adding and removing meshes. The only add function used in my code at the moment is the following:
public void addMesh( Mesh meshToAdd ) {
meshs.add( meshToAdd );
}
The only change in GL20Renderer.onDrawFrame() is that the code there now loops through the mesh list and calls the Mesh.draw() method for every mesh:
for( Mesh meshToDraw : meshs ) {
meshToDraw.draw(scratch);
}
Then I tried to add one mesh from the GL20Renderer.onSurfaceCreated() method - it worked fine. Adding another mesh from the same method also works fine.
And then I tried to add a mesh from the GL20Activity.onCreate() method (while removing the same code from GL20Renderer.onSurfaceCreated()):
float[] vertexCoords = { 0.0f, 0.622008459f, 0.0f, -0.5f, -0.311004243f, 0.0f, 0.5f, -0.311004243f, 0.0f };
int[] drawOrder = { 1, 2, 3 };
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
String vertexShaderCode = "uniform mat4 uMVPMatrix;" + "attribute vec4 vPosition;" + "void main() {" + " gl_Position = uMVPMatrix * vPosition;" + "}";
String fragmentShaderCode = "precision mediump float;" + "uniform vec4 vColor;" + "void main() {" + " gl_FragColor = vColor;" + "}";
renderer.addMesh( new Mesh( 3, vertexCoords, drawOrder, 1, color, vertexShaderCode, fragmentShaderCode) );
The mesh gets added to the mesh list but doesn't get displayed for some reason (its properties are the same as the tutorial mesh's). If meshes are added from both GL20Activity.onCreate() and GL20Renderer.onSurfaceCreated(), none of them gets displayed, but they're still on the mesh list. Then I added an onRendererInitialized() method to GL20Activity, which is called from the very end of the GL20Renderer.onCreate() method. onRendererInitialized() just adds a mesh, and that mesh gets displayed, but only if no mesh is added from GL20Activity.onCreate().
The question is: why does the mesh not get drawn if it's added from the GL20Activity.onCreate() method? I tried changing the vertex coordinates, so they are not the cause of the problem. I have also added some code to GL20Renderer.onDrawFrame() to make sure that the meshToDraw.draw( scratch ); line runs properly, and it does, but the mesh still doesn't get drawn. But since a mesh with the same vertex coordinates and color added from GL20Activity.onRendererInitialized() and/or from GL20Renderer.onSurfaceCreated() does get drawn, there shouldn't be any problem with drawing a mesh that is added from GL20Activity.onCreate().
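For reference, a minimal sketch of that callback arrangement (an illustration only, assuming the renderer keeps a reference to the activity via its activity field, and that the callback is fired from the end of onSurfaceCreated()):
// In GL20Renderer: notify the activity once the GL context actually exists (sketch only).
public void onSurfaceCreated(GL10 gl, EGLConfig glConfig) {
    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    if (activity != null) {
        activity.onRendererInitialized(); // runs on the GL thread with a current context
    }
}
// In GL20Activity: constructing a Mesh here is safe, because the EGL context is current.
public void onRendererInitialized() {
    // vertexCoords, drawOrder, color and the shader sources are the same data as in onCreate()
    renderer.addMesh(new Mesh(3, vertexCoords, drawOrder, 1, color, vertexShaderCode, fragmentShaderCode));
}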
MainActivity.java:
package com.Reaper.VisionEngine;
import android.app.*;
import android.content.*;
import android.os.*;
public class MainActivity extends Activity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
Intent intent = new Intent( this, GL20Activity.class );
startActivity( intent );
}
}
GL20Activity.java:
package com.Reaper.VisionEngine;
import android.app.*;
import android.os.*;
import android.widget.*;
import com.Reaper.VisionEngine.GLRenderer.*;
import com.Reaper.VisionEngine.GLSurfaceView.*;
import com.Reaper.VisionEngine.Mesh.*;
import java.io.*;
public class GL20Activity extends Activity {
GL20SurfaceView GLView;
GL20Renderer renderer;
@Override
protected void onCreate( Bundle savedInstanceState ) {
super.onCreate(savedInstanceState);
GLView = new GL20SurfaceView(this);
setContentView(GLView);
renderer = GLView.getRenderer();
float[] vertexCoords = { 0.0f, 0.622008459f, 0.0f, -0.5f, -0.311004243f, 0.0f, 0.5f, -0.311004243f, 0.0f };
int[] drawOrder = { 1, 2, 3 };
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
String vertexShaderCode = "uniform mat4 uMVPMatrix;" + "attribute vec4 vPosition;" + "void main() {" + " gl_Position = uMVPMatrix * vPosition;" + "}";
String fragmentShaderCode = "precision mediump float;" + "uniform vec4 vColor;" + "void main() {" + " gl_FragColor = vColor;" + "}";
renderer.addMesh( new Mesh( 3, vertexCoords, drawOrder, 1, color, vertexShaderCode, fragmentShaderCode) );
}
}
GL20Renderer.java:
package com.Reaper.VisionEngine.GLRenderer;
import android.opengl.*;
import com.Reaper.VisionEngine.*;
import com.Reaper.VisionEngine.Mesh.*;
import java.util.*;
import javax.microedition.khronos.egl.*;
import javax.microedition.khronos.opengles.*;
import javax.microedition.khronos.egl.EGLConfig;
public class GL20Renderer implements GLSurfaceView.Renderer {
private List<Mesh> meshs = new ArrayList<Mesh>();
private final float[] mMVPMatrix = new float[16];
private final float[] mProjectionMatrix = new float[16];
private final float[] mViewMatrix = new float[16];
private float[] mRotationMatrix = new float[16];
public volatile float mAngle;
private List<Mesh> meshQueue = new ArrayList<Mesh>();
private GL20Activity activity;
public void onSurfaceCreated(GL10 gl, EGLConfig glConfig) {
GLES20.glClearColor( 0.0f, 0.0f, 0.0f, 1.0f );
}
public void onDrawFrame(GL10 gl) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
float[] scratch = new float[16];
Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, -1.0f);
Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);
for( Mesh meshToDraw : meshs ) {
meshToDraw.draw(scratch);
}
}
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
}
public static int loadShader(int type, String shaderCode){
int shader = GLES20.glCreateShader(type);
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public void addMesh( Mesh meshToAdd ) {
meshs.add( meshToAdd );
}
}
GL20SurfaceView.java:
package com.Reaper.VisionEngine.GLSurfaceView;
import android.content.*;
import android.opengl.*;
import android.view.*;
import com.Reaper.VisionEngine.*;
import com.Reaper.VisionEngine.GLRenderer.*;
public class GL20SurfaceView extends GLSurfaceView {
private final GL20Renderer Renderer;
public GL20SurfaceView(Context context) {
super(context);
setEGLContextClientVersion(2);
Renderer = new GL20Renderer();
setRenderer(Renderer);
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
public boolean onTouchEvent(MotionEvent e) { // TODO: CHANGE THIS
switch (e.getAction()) {
case MotionEvent.ACTION_MOVE:
requestRender();
}
return true;
}
public GL20Renderer getRenderer() {
return Renderer;
}
}
Mesh.java:
package com.Reaper.VisionEngine.Mesh;
import android.opengl.*;
import com.Reaper.VisionEngine.GLRenderer.*;
import java.nio.*;
public class Mesh {
private FloatBuffer vertexBuffer;
public final int GL20Program;
private int PositionHandle;
private int ColorHandle;
private int MVPMatrixHandle;
private int colorOverridesTexture = 0;
private int vertexCount = 1;
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
private int[] drawOrder; // TODO: Implement
static final int COORDS_PER_VERTEX = 3;
private float vertexCoords[];
float color[] = new float[3];
private String vertexShaderCode;
private String fragmentShaderCode;
private final String defaultVertexShaderCode = "uniform mat4 uMVPMatrix;" + "attribute vec4 vPosition;" + "void main() {" + " gl_Position = uMVPMatrix * vPosition;" + "}";
private final String defaultFragmentShaderCode = "precision mediump float;" + "uniform vec4 vColor;" + "void main() {" + " gl_FragColor = vColor;" + "}";
public Mesh( int newVertexCount, float[] newVertexCoords, int[] newDrawOrder, int useColor, float[] newColor, String newVertexShaderCode, String newFragmentShaderCode ) {
vertexCount = newVertexCount;
vertexCoords = newVertexCoords;
drawOrder = newDrawOrder;
colorOverridesTexture = useColor;
color = newColor;
vertexShaderCode = newVertexShaderCode;
fragmentShaderCode = newFragmentShaderCode;
ByteBuffer bb = ByteBuffer.allocateDirect( vertexCoords.length * 4 );
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put( vertexCoords );
vertexBuffer.position( 0 );
int vertexShader = GL20Renderer.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = GL20Renderer.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
GL20Program = GLES20.glCreateProgram();
GLES20.glAttachShader(GL20Program, vertexShader);
GLES20.glAttachShader(GL20Program, fragmentShader);
GLES20.glLinkProgram(GL20Program);
}
public void draw(float[] mvpMatrix) {
MVPMatrixHandle = GLES20.glGetUniformLocation(GL20Program, "uMVPMatrix");
GLES20.glUniformMatrix4fv(MVPMatrixHandle, 1, false, mvpMatrix, 0);
GLES20.glUseProgram(GL20Program);
PositionHandle = GLES20.glGetAttribLocation(GL20Program, "vPosition");
GLES20.glEnableVertexAttribArray(PositionHandle);
GLES20.glVertexAttribPointer(PositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
ColorHandle = GLES20.glGetUniformLocation(GL20Program, "vColor");
GLES20.glUniform4fv(ColorHandle, 1, color, 0);
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, vertexCount);
GLES20.glDisableVertexAttribArray(PositionHandle);
}
}
Sorry for my bad English.
I think that at the GL20Activity.onCreate() stage, the EGL context is not yet available or not made current on that thread. Hence, for a mesh added there, the gl* commands in Mesh's constructor will fail. Even though the mesh object is added to the list, the corresponding OpenGL objects are never initialized, so the draw will fail or draw nothing. You can verify this by checking for OpenGL errors after the gl* statements in Mesh's constructor when it is called from GL20Activity.onCreate().
Now when you do the same in GL20Renderer.onSurfaceCreated() (I think that's what you mean by GL20Renderer.onCreate()), the EGL context is available and valid, so all gl* commands in Mesh's constructor execute, the mesh's OpenGL components exist, and it is displayed correctly.
Since this method is called at the beginning of rendering, as well as every time the EGL context is lost, this method is a convenient place to put code to create resources that need to be created when the rendering starts, and that need to be recreated when the EGL context is lost. Textures are an example of a resource that you might want to create here.
Refer to https://developer.android.com/reference/android/opengl/GLSurfaceView.Renderer.html for more info.
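Building on that, one way to allow adding meshes from the activity (a sketch, not the asker's code): have the renderer queue plain mesh descriptions and construct the actual Mesh objects only on the GL thread, once the context exists, e.g. in onSurfaceCreated(). The unused meshQueue field in GL20Renderer could play this role if its element type is changed to a plain data holder (here a hypothetical MeshData carrying the constructor arguments and issuing no gl* calls):
// Sketch: defer all GL object creation until the EGL context is current.
private final List<MeshData> pendingMeshes = new ArrayList<MeshData>();
public void addMesh(MeshData data) { // safe to call from Activity.onCreate()
    synchronized (pendingMeshes) {
        pendingMeshes.add(data);
    }
}
public void onSurfaceCreated(GL10 gl, EGLConfig glConfig) {
    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    synchronized (pendingMeshes) {
        for (MeshData d : pendingMeshes) { // context is current here, so glCreateShader/glCreateProgram succeed
            meshs.add(new Mesh(d.vertexCount, d.vertexCoords, d.drawOrder, d.useColor, d.color, d.vertexShaderCode, d.fragmentShaderCode));
        }
        pendingMeshes.clear();
    }
}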
So, I received this code from another post on SO. It seems to be the most complete working code for adding textures in OpenGL ES 2.0 with shaders on Android. How does he know what integer to pass into textureResourceId, and how do I set up a resource ID for the .png files I put into my project?
Here's the link to the SO post I'm referring to:
Android OpenGL|ES 2.0 Texturing
And here's the Square Code:
public class Square{
private static final String TAG = "Square";
public float[] rotation = {0.0f,0.0f,45.0f};
public float[] scale = {100.0f,100f,100f};
public float[] position = {0.0f,0.0f,100f};
public float[] color = { 0.0f, 0.0f, 1.0f, 1.0f };
private int textureRef = -1;
private int mMVPMatrixHandle;
protected int DRAW_MODE = GLES20.GL_TRIANGLES;
protected int mProgram;
protected int mPositionHandle;
protected Vertex vertices;
protected Vertex texture;
private int mColorHandle;
private int vsTextureCoord;
private int fsTexture;
protected float[] result_matrix = new float[16];
private final String vertexShaderCode =
"uniform mat4 uMVPMatrix;" +
"attribute vec3 vPosition;" +
"attribute vec2 TexCoordIn;" +
"varying vec2 TexCoordOut;" +
"void main() {" +
//the matrix must be included as a modifier of gl_Position
" gl_Position = uMVPMatrix * vec4(vPosition,1.0);" +
" TexCoordOut = TexCoordIn;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"uniform sampler2D Texture;" +
"varying lowp vec2 TexCoordOut;" +
"void main() {" +
" gl_FragColor = vColor*TexCoordOut*Texture;" +
"}";
//I am fully aware that I am not using the texture by assigning the colour, but until I can actually SEND the texture through, there would be no point.
static float squareCoords[] = { -0.5f, 0.5f, 0.0f, // top left
-0.5f, -0.5f, 0.0f, // bottom left
0.5f, -0.5f, 0.0f, // bottom right
0.5f, 0.5f, 0.0f }; // top right
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices
public Square(int textureResourceId) {
int vertexShader = GFXUtils.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = GFXUtils.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL ES Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram); // creates OpenGL ES program executables
textureRef = GFXUtils.textures.get(textureResourceId);
// initialize vertex byte buffer for shape coordinates
vertices = new Vertex(squareCoords, drawOrder, GFXUtils.COORDS_PER_VERTEX);
texture = new Vertex (new float[]
{
1.0f, 0.0f,
0.0f, 0.0f,
1.0f, 1.0f,
0.0f, 1.0f,
}, GFXUtils.COORDS_PER_TEXTURE);
DRAW_MODE = GLES20.GL_TRIANGLE_FAN;
}
private void getHandles()
{
//get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
if (mPositionHandle == -1) Log.e(TAG, "vPosition not found");
//get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
if (mColorHandle == -1) Log.e(TAG, "vColor not found");
//get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
if (mMVPMatrixHandle == -1) Log.e(TAG, "uMVPMatrix not found");
//get handle to texture coordinate variable
vsTextureCoord = GLES20.glGetAttribLocation(mProgram, "TexCoordIn");
if (vsTextureCoord == -1) Log.e(TAG, "TexCoordIn not found");
//get handle to shape's texture reference
fsTexture = GLES20.glGetUniformLocation(mProgram, "Texture");
if (fsTexture == -1) Log.e(TAG, "Texture not found");
}
private void translateRotateScale(float[] matrix, float[] perspectiveMatrix)
{
for (int i= 0; i < perspectiveMatrix.length;i++)
matrix[i] = perspectiveMatrix[i];
Matrix.translateM(matrix, 0, position[0], position[1], position[2]);
Matrix.rotateM(matrix, 0, rotation[0], 1.0f, 0.0f, 0.0f);
Matrix.rotateM(matrix, 0, rotation[1], 0.0f, 1.0f, 0.0f);
Matrix.rotateM(matrix, 0, rotation[2], 0.0f, 0.0f, 1.0f);
Matrix.scaleM(matrix, 0, scale[0], scale[1], scale[2]);
}
public void draw(float[] mvpMatrix) {
rotation[2]+=0.5;
// Add program to OpenGL ES environment
GLES20.glUseProgram(mProgram);
GFXUtils.checkGlError("using program");
//Housekeeping
getHandles();
translateRotateScale(result_matrix, mvpMatrix);
//end housekeeping
// Set color for drawing the shape
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, result_matrix, 0);
GFXUtils.checkGlError("glUniformMatrix4fv");
// Prepare the shape coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, GFXUtils.COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
GFXUtils.vertexStride, vertices.floatBuffer);
GFXUtils.checkGlError("load vertex buffer");
GLES20.glVertexAttribPointer(vsTextureCoord, GFXUtils.COORDS_PER_TEXTURE,
GLES20.GL_FLOAT, false,
GFXUtils.textureStride, texture.floatBuffer);
GFXUtils.checkGlError("load texture buffer - " + vsTextureCoord);
// Enable a handle to the shape vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
GFXUtils.checkGlError("enable position handle");
GLES20.glEnableVertexAttribArray(vsTextureCoord);
GFXUtils.checkGlError("enable texture handle");
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GFXUtils.checkGlError("activtexture");
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureRef);
GFXUtils.checkGlError("bindtexture");
GLES20.glUniform1i(fsTexture, 0);
GFXUtils.checkGlError("uniformi");
//Draw the shape
GLES20.glDrawElements(DRAW_MODE, vertices.numIndeces, GLES20.GL_UNSIGNED_SHORT, vertices.indexBuffer);
GFXUtils.checkGlError("glDrawArrays with " + vertices.numVertices + " vertices");
//Disable vertex array
GLES20.glDisableVertexAttribArray(vsTextureCoord);
GLES20.glDisableVertexAttribArray(mPositionHandle);
GFXUtils.checkGlError("glDisableVertexAttribArray for position");
}
}
My Main Activity Class:
public class MainActivity extends ActionBarActivity {
GLSurfaceView myGL;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
myGL=new MySurface(this);
setContentView(myGL);
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
/**
* A placeholder fragment containing a simple view.
*/
public static class PlaceholderFragment extends Fragment {
public PlaceholderFragment() {
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
View rootView = inflater.inflate(R.layout.fragment_main, container,
false);
return rootView;
}
}
//---------------------------------Open GL-------------------------------------------------------
class MySurface extends GLSurfaceView
{
private MyRenderer myRend;
public MySurface(Context context)
{
super(context);
myRend=new MyRenderer();
setEGLContextClientVersion(2);
setRenderer(myRend);
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
}
Square mSquare;
float [] mViewMatrix=new float[16];
float [] mMVPMatrix=new float[16];
float [] mProjectionMatrix=new float[16];
private float[] mRotationMatrix=new float[16];
class MyRenderer implements GLSurfaceView.Renderer
{
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1f);
mSquare=new Square(R.drawable.brick_texture);
}
@Override
public void onDrawFrame(GL10 gl) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -5, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
mSquare.draw(mMVPMatrix);
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 2, 50);
}
}
}
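As a side note on the textureResourceId question above (general Android behaviour, not something specific to the linked post): placing a file such as brick_texture.png under res/drawable/ makes the build generate an int constant R.drawable.brick_texture, and that generated constant is the integer passed in as textureResourceId. For example:
Square wall = new Square(R.drawable.brick_texture); // same constant used in onSurfaceCreated() above
// The same resource ID can also be decoded into a Bitmap for uploading with GLUtils.texImage2D:
Bitmap bmp = BitmapFactory.decodeResource(getResources(), R.drawable.brick_texture);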
Hello StackOverflow community. I have been working on a live wallpaper for Android using OpenGL and have been successful so far in getting OpenGL to render on the live wallpaper; I have also been able to load and bind a single texture onto a quad. However, when I load more than one texture, the last bound texture (a crate texture) ends up on the first quad, which should have a dirt texture, and the second quad tries to bind to a texture that does not exist.
This is my quad class, which draws the quads and binds whatever texture the texture object stores:
public class Quad {
/** The quad's location in space. */
private float[] vector;
private Texture texture;
/** Buffer holding the vertices. */
private FloatBuffer vertexBuffer;
/** Array holding the quad's size. */
private float[] vertices;
public Quad(GL10 gl, Context context, float[] vector, Texture texture) {
this.vector = vector;
this.texture = texture;
// Check the length of the vector to make sure it is valid.
if(vector.length != 8) {
throw new IllegalArgumentException("Please pass a vector with a length of 8. (x, y, z) (w, h) (rotX, rotY, rotZ)");
}
vertices = new float[] {
0, 0, 0.0f, // V1 - bottom left
0, vector[4], 0.0f, // V2 - top left
vector[3], 0, 0.0f, // V3 - bottom right
vector[3], vector[4], 0.0f // V4 - top right
};
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(vertices.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
vertexBuffer = byteBuffer.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0);
}
public void draw(GL10 gl) {
// Reset the Modelview Matrix
gl.glLoadIdentity();
float[] cameraVector = Camera.getLocationVector();
// Translate the object
gl.glTranslatef(vector[0] - cameraVector[0], vector[1] - cameraVector[1], vector[2] - cameraVector[2]);
// Rotate the object
gl.glRotatef(vector[5] - cameraVector[3], 1f, 0f, 0f);
gl.glRotatef(vector[6] - cameraVector[4], 0f, 1f, 0f);
gl.glRotatef(vector[7] - cameraVector[5], 0f, 0f, 1f);
// Bind the previously generated texture
int crap = texture.getTextureID();
gl.glBindTexture(GL10.GL_TEXTURE_2D, texture.getTextureID());
// Point to our buffers
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
// Set the face rotation
gl.glFrontFace(GL10.GL_CW);
// Point to our vertex buffer
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, texture.getBuffer());
// Draw the vertices as triangle strip
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
// Disable the client state before leaving
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}
}
This is how I initialize my quads:
Texture.loadTextures(gl, context);
quad = new Quad(gl, context, new float[] { 0.0f, 0.0f, -5.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f, }, Texture.TEXTURE_DIRT);
quad2 = new Quad(gl, context, new float[] { -1.0f, 0.0f, -10.0f, 1.0f, 1.0f, 0.0f, 45.0f, 0.0f, }, Texture.TEXTURE_CRATE);
This is the texture class that manages all of the wallpaper's textures, loaded from Android bitmaps:
public class Texture {
/** Buffer holding the texture coordinates. */
private FloatBuffer textureBuffer;
private final float textureCoords[] = {
0.0f, 1.0f, // Top left (V2)
0.0f, 0.0f, // Bottom left (V1)
1.0f, 1.0f, // Top right (V4)
1.0f, 0.0f // Bottom right (V3)
};
/** The texture this specific instance is pointing to. */
private int texture;
/** The id of the texture to be grabbed from android bitmap loader. */
private int textureID;
/** Stores all texture data. */
private static ArrayList<Texture> textureList;
/** List of all registered textures */
private static int lastRegisteredTexture;
/** Array of all loaded texture pointers. */
private static int[] textures;
/** Flag to indicate whether the class has been initialized properly. */
private static boolean isInitialized;
private static boolean texturesGenned;
/////////////////////////////////////////////////////////////////////////////////////////
// All recycled textures loaded on app launch
public static Texture TEXTURE_DIRT;
public static Texture TEXTURE_CRATE;
/**
* Create a new opengl texture.
* @param gl
* @param context
* @param id
*/
private Texture(GL10 gl, Context context, int id) {
if(isInitialized) {
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(textureCoords.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
textureBuffer = byteBuffer.asFloatBuffer();
textureBuffer.put(textureCoords);
textureBuffer.position(0);
textureID = id;
// Add the texture into the list before loading the texture so the
// size of the texture pointer array can be determined and not have to be
// reinitialized
textureList.add(this);
} else {
throw new IllegalStateException("Not yet initialized.");
}
}
/**
* Load the texture into opengl.
* @param gl
* @param context
* @param id
*/
private static void loadGLTexture(GL10 gl, Context context) {
gl.glGenTextures(textures.length, textures, 0);
for(int i = 0; i < textureList.size(); i++) {
// Loading texture without scaling
BitmapFactory.Options opts = new BitmapFactory.Options(); opts.inScaled = false;
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), textureList.get(i).getID(), opts);
// ...And bind it to our array
gl.glBindTexture(GL10.GL_TEXTURE_2D, textureList.get(i).getTextureID());
// Create nearest filtered texture
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_NEAREST);
// Use Android GLUtils to specify a two-dimensional texture image from our bitmap
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
textureList.get(i).texture = lastRegisteredTexture;
lastRegisteredTexture++;
// Clean up native resources
bitmap.recycle();
}
Log.d("Test", "Test2");
}
public int getTextureID() {
return textures[texture];
}
public FloatBuffer getBuffer() {
return textureBuffer;
}
private int getID() {
return textureID;
}
public static void loadTextures(GL10 gl, Context context) {
if(!isInitialized) {
isInitialized = true;
textureList = new ArrayList<Texture>();
// Load all static textures
TEXTURE_DIRT = new Texture(gl, context, R.raw.dirt);
TEXTURE_CRATE = new Texture(gl, context, R.drawable.crate);
// Initialize the texture pointer to match the number of textures
textures = new int[textureList.size()];
loadGLTexture(gl, context);
} else {
throw new IllegalStateException("Already initialized.");
}
}
}
My suspicion is that OpenGL is somehow loading one texture over the other due to a flaw in my logic, and I believe the problem is happening somewhere in loadGLTexture(). Thanks ahead of time.
You called glBindTexture first.
gl.glBindTexture(GL10.GL_TEXTURE_2D,
textureList.get(i).getTextureID());
But the texture field is only set later:
textureList.get(i).texture = lastRegisteredTexture;
So the first texture (textures[0]) gets replaced and the second is never bound.
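A sketch of the reordered loop inside loadGLTexture(), with the index assigned before the bind (names taken from the question's code):
gl.glGenTextures(textures.length, textures, 0);
for (int i = 0; i < textureList.size(); i++) {
    // Assign the index first, so getTextureID() returns the right entry of textures[]
    textureList.get(i).texture = lastRegisteredTexture;
    lastRegisteredTexture++;
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inScaled = false;
    Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), textureList.get(i).getID(), opts);
    // Now this binds textures[i] instead of textures[0] every time
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textureList.get(i).getTextureID());
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
    gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_NEAREST);
    GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
    bitmap.recycle();
}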
I'm trying to learn OpenGL ES as part of my foray into Android development.
So far, I've created the following Android application by cutting and pasting from various tutorials I found.
The application is supposed to create 2 coloured squares (1 red square and 1 blue square) and rotate them around a central point.
So during part of the rotation, the red square should be in front while during another part of the rotation, the blue square should be in front.
When I run my application in the Android emulator, however, it only shows the blue square in front.
Does anyone know what I'm missing?
package hello.world;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.util.ArrayList;
import java.util.List;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.app.Activity;
import android.content.Context;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
public class HelloActivity extends Activity {
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(new HelloView(this));
}
private class HelloView extends GLSurfaceView {
private HelloRenderer renderer;
public HelloView(Context context) {
super(context);
renderer = new HelloRenderer(context);
setRenderer(renderer);
}
}
private class HelloRenderer implements GLSurfaceView.Renderer {
public float xrot; //X Rotation ( NEW )
public float yrot; //Y Rotation ( NEW )
public float zrot; //Z Rotation ( NEW )
private List<ColoredQuad> quads;
public HelloRenderer(Context context) {
quads = new ArrayList<ColoredQuad>();
quads.add(new ColoredQuad(
new Vertex3D(-1.0f, -1.0f, 1.0f),
new Vertex3D(1.0f, -1.0f, 1.0f),
new Vertex3D(-1.0f, 1.0f, 1.0f),
new Vertex3D(1.0f, 1.0f, 1.0f),
new RGBA(1.0f, 0.0f, 0.0f)));
quads.add(new ColoredQuad(
new Vertex3D(-1.0f, -1.0f, -1.0f),
new Vertex3D(1.0f, -1.0f, -1.0f),
new Vertex3D(-1.0f, 1.0f, -1.0f),
new Vertex3D(1.0f, 1.0f, -1.0f),
new RGBA(0.0f, 0.0f, 1.0f)));
}
/**
* Called whenever drawing is needed.
*/
public void onDrawFrame(GL10 gl) {
// clear screen and depth buffer
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glLoadIdentity();
//Drawing
gl.glTranslatef(0.0f, 0.0f, -5.0f); //move 5 units into the screen
gl.glScalef(0.5f, 0.5f, 0.5f); //scale the objects to 50 percent of original size
//Rotate around the axis based on the rotation matrix (rotation, x, y, z)
gl.glRotatef(xrot, 1.0f, 0.0f, 0.0f); //X
gl.glRotatef(yrot, 0.0f, 1.0f, 0.0f); //Y
gl.glRotatef(zrot, 0.0f, 0.0f, 1.0f); //Z
for (ColoredQuad quad : quads) {
quad.draw(gl);
}
//Change rotation factors (nice rotation)
xrot += 3.0f;
yrot += 2.0f;
zrot += 1.0f;
}
/**
* Called when the surface has changed.
* For example, when switching from portrait to landscape view.
*/
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glViewport(0, 0, width, height);
}
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
gl.glEnable(GL10.GL_SMOOTH); // enable smooth shading
gl.glClearColor(0.0f, 0.0f, 0.0f, 0.5f); // black background
gl.glClearDepthf(GL10.GL_DEPTH_TEST); // enable depth testing
gl.glDepthFunc(GL10.GL_LEQUAL); // type of depth testing to do
//Really Nice Perspective Calculations
gl.glHint(GL10.GL_PERSPECTIVE_CORRECTION_HINT, GL10.GL_NICEST);
}
}
private class Vertex3D {
public float x;
public float y;
public float z;
public Vertex3D(float x, float y, float z) {
this.x = x;
this.y = y;
this.z = z;
}
}
public class RGBA {
public float red;
public float blue;
public float green;
public float alpha;
public RGBA(float red, float green, float blue) {
this.red = red;
this.blue = blue;
this.green = green;
this.alpha = 1.0f;
}
}
private ByteBuffer makeByteBuffer(byte[] array)
{
ByteBuffer bb = ByteBuffer.allocateDirect(array.length);
bb.put(array);
bb.position(0);
return bb;
}
private FloatBuffer makeFloatBuffer(float[] array)
{
ByteBuffer bb = ByteBuffer.allocateDirect(array.length * 4);
bb.order(ByteOrder.nativeOrder());
FloatBuffer fb = bb.asFloatBuffer();
fb.put(array);
fb.position(0);
return fb;
}
private class ColoredQuad {
private FloatBuffer vertexBuffer;
private FloatBuffer colorBuffer;
private ByteBuffer indexBuffer;
private float[] vertices = new float[12]; // 4 vertices * XYZ (12)
private float[] colors = new float[16]; // 4 vertices * RGBA (16)
private byte[] indices = {
0, 1, 2, 1, 2, 3
};
public ColoredQuad(Vertex3D bottomLeft, Vertex3D bottomRight, Vertex3D topLeft, Vertex3D topRight, RGBA color) {
vertices[0] = bottomLeft.x; vertices[1] = bottomLeft.y; vertices[2] = bottomLeft.z;
vertices[3] = bottomRight.x; vertices[4] = bottomRight.y; vertices[5] = bottomRight.z;
vertices[6] = topLeft.x; vertices[7] = topLeft.y; vertices[8] = topLeft.z;
vertices[9] = topRight.x; vertices[10]= topRight.y; vertices[11]= topRight.z;
colors[0] = color.red; colors[1] = color.green; colors[2] = color.blue; colors[3] = color.alpha;
colors[4] = color.red; colors[5] = color.green; colors[6] = color.blue; colors[7] = color.alpha;
colors[8] = color.red; colors[9] = color.green; colors[10]= color.blue; colors[11]= color.alpha;
colors[12]= color.red; colors[13]= color.green; colors[14]= color.blue; colors[15]= color.alpha;
vertexBuffer = makeFloatBuffer(vertices);
colorBuffer = makeFloatBuffer(colors);
indexBuffer = makeByteBuffer(indices);
}
public void draw(GL10 gl) {
//Point to our buffers
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_COLOR_ARRAY);
//Set the face rotation
gl.glFrontFace(GL10.GL_CCW);
//Enable the vertex and texture state
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glColorPointer(4, GL10.GL_FLOAT, 0, colorBuffer);
//Draw the vertices as triangles, based on the Index Buffer information
gl.glDrawElements(GL10.GL_TRIANGLES, indices.length, GL10.GL_UNSIGNED_BYTE, indexBuffer);
//Disable the client state before leaving
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}
}
}
The reason you only see the blue one in front is that depth testing never actually gets enabled, so the blue one is simply drawn last (over everything else).
Where you say
gl.glClearDepthf(GL10.GL_DEPTH_TEST); // enable depth testing
you probably mean
gl.glEnable(GL10.GL_DEPTH_TEST);
and possibly
gl.glClearDepthf(1.0f);
but that's the default anyway.
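In other words, a sketch of the depth-test setup in onSurfaceCreated() would be:
gl.glEnable(GL10.GL_DEPTH_TEST); // actually enable depth testing
gl.glClearDepthf(1.0f);          // clear-depth value; 1.0f is the default anyway
gl.glDepthFunc(GL10.GL_LEQUAL);  // type of depth testing to do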
Cheers, Aert.
I'm new to OpenGL and I'm teaching myself by making a 2D game for Android with ES 2.0. I am starting off by creating a "Sprite" class that creates a plane and renders a texture onto it. To practice, I have two Sprite objects that are drawn alternately in the same place. I got this much working just fine with ES 1.0, but now that I've switched to 2.0, I am getting a black screen with no errors. I'm exhausted trying to figure out what I'm doing wrong, but I have a strong feeling it has to do with my shaders. I'm going to dump all the relevant code here and hopefully somebody can give me an answer or some advice as to what I'm doing wrong. And if it's not immediately apparent what I'm doing wrong, perhaps some advice on how to figure it out? Thanks in advance for looking through all the code I'm about to post.
The three classes I'm posting are:
GameRenderer - the renderer for my GLSurfaceView
Shader - creates a shader program object
Sprite - creates a square and draws a texture on it
Also, I'll post my vertex and fragment shader source.
Related classes I didn't think were relevant enough to post:
GameActivity
GameView - A GLSurfaceView
GameLoopThread - My main game loop
FPSCounter - outputs the average FPS to logcat every 100 frames.
GameRenderer class:
package com.detour.raw;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.graphics.Bitmap;
import android.opengl.GLES20;
import android.opengl.GLU;
import android.opengl.Matrix;
import android.opengl.GLSurfaceView;
public class GameRenderer implements GLSurfaceView.Renderer{
private static final String LOG_TAG = GameRenderer.class.getSimpleName();
Context mContext;
Bitmap bitmap;
private float red = 0.0f;
private float green = 0.0f;
private float blue = 0.0f;
Shader shader;
FPSCounter fps;
Sprite sprite;
Sprite sprite2;
int x = 0;
private float[] mProjMatrix = new float[16];
private float[] mVMatrix = new float[16];
//int[] vertexShader;
//int[] fragmentShader;
//int program;
//String vShaderSource = "";
//String fShaderSource = "";
public GameRenderer(Context context){
mContext = context;
//create objects/sprites
sprite = new Sprite(mContext);
sprite2 = new Sprite(mContext);
fps = new FPSCounter();
}
@Override
public void onDrawFrame(GL10 gl) {
GLES20.glClearColor(red, green, blue, 1.0f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
if(x>3){
x=0;
}
if(x%2==0){
sprite.draw(gl);
}else{
sprite2.draw(gl);
}
x++;
fps.calculate();
//fps.draw(gl);
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float)(width/height);
Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 0.5f, 10);
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
// TODO Auto-generated method stub
GLES20.glEnable(GLES20.GL_TEXTURE_2D);
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glClearDepthf(1.0f);
GLES20.glDepthFunc(GLES20.GL_LEQUAL);
GLES20.glDepthMask(true);
GLES20.glEnable(GLES20.GL_CULL_FACE);
GLES20.glCullFace(GLES20.GL_BACK);
GLES20.glClearColor(red, green, blue, 1.0f);
//load sprite/object textures (preferably loop through an array of all sprites).
sprite.loadGLTexture(gl, mContext, R.drawable.raw1);
sprite2.loadGLTexture(gl, mContext, R.drawable.raw2);
Matrix.setLookAtM(mVMatrix, 0, 0, 0, -5.0f, 0.0f, 0f, 0f, 0f, 0.0f, 0.0f);
System.gc();
}
}
Shader class:
package com.detour.raw;
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import android.content.Context;
import android.opengl.GLES20;
import android.util.Log;
public class Shader {
public static final String TAG = Shader.class.getSimpleName();
int program;
int vertexShader;
int fragmentShader;
String vShaderSource;
String fShaderSource;
public Shader(){
//blank constructor
//createProgram();
}
public Shader(String vs_source, String fs_source){
this.vShaderSource = vs_source;
this.fShaderSource = fs_source;
createProgram();
}
public Shader(int vs_source_id, int fs_source_id, Context context) {
StringBuffer vs = new StringBuffer();
StringBuffer fs = new StringBuffer();
try{
InputStream inputStream = context.getResources().openRawResource(vs_source_id);
BufferedReader in = new BufferedReader(new InputStreamReader(inputStream));
String read = in.readLine();
while (read != null) {
vs.append(read + "\n");
read = in.readLine();
}
vs.deleteCharAt(vs.length() - 1);
inputStream = context.getResources().openRawResource(fs_source_id);
in = new BufferedReader(new InputStreamReader(inputStream));
read = in.readLine();
while (read != null) {
fs.append(read + "\n");
read = in.readLine();
}
fs.deleteCharAt(fs.length() - 1);
}catch (Exception e){
Log.d("ERROR-readingShader", "Could not read shader: " + e.getLocalizedMessage());
}
this.vShaderSource = vs.toString();
this.fShaderSource = fs.toString();
createProgram();
}
private void createProgram(){
program = GLES20.glCreateProgram();
if(program!=0){
vertexShader = createShader(GLES20.GL_VERTEX_SHADER, vShaderSource);
fragmentShader = createShader(GLES20.GL_FRAGMENT_SHADER, fShaderSource);
GLES20.glAttachShader(program, vertexShader);
GLES20.glAttachShader(program, fragmentShader);
GLES20.glLinkProgram(program);
}else{
Log.e(TAG, "Couldn't create program.");
}
}
private int createShader(int type, String source){
int shader = GLES20.glCreateShader(type);
if(shader!=0){
GLES20.glShaderSource(shader, source);
GLES20.glCompileShader(shader);
}
return shader;
}
public int getProgram(){
return program;
}
}
Sprite class:
package com.detour.raw;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;
import java.nio.ShortBuffer;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLES20;
import android.opengl.GLUtils;
public class Sprite {
//public static final int FRAME_WIDTH = 64;
//public static final int FRAME_HEIGHT = 64;
private static final String LOG_TAG = Sprite.class.getSimpleName();
Context mContext;
Bitmap bitmap;
private int textureLoc;
private int vertexLoc;
private int[] textures = new int[1];
//private int[] pixels;
/*private float textureCoordinates[] = {
0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f};*/
private float vertices[] = {
-1.0f, 1.0f,// 0.0f,
-1.0f, -1.0f,// 0.0f,
1.0f, -1.0f,// 0.0f,
1.0f, 1.0f// 0.0f
};
private short[] indices = {
0, 1, 2,
0, 2, 3};
private FloatBuffer vertexBuffer;
//private IntBuffer textureBuffer;
private ShortBuffer indexBuffer;
Shader shader;
int program;
String vShaderSource = "";
String fShaderSource = "";
public Sprite(Context context){
mContext = context;
ByteBuffer vbb = ByteBuffer.allocateDirect(vertices.length * 4);
vbb.order(ByteOrder.nativeOrder());
vertexBuffer = vbb.asFloatBuffer();
vertexBuffer.put(vertices);
vertexBuffer.position(0);
ByteBuffer ibb = ByteBuffer.allocateDirect(indices.length * 2);
ibb.order(ByteOrder.nativeOrder());
indexBuffer = ibb.asShortBuffer();
indexBuffer.put(indices);
indexBuffer.position(0);
}
public void draw(GL10 gl) {
GLES20.glDrawElements(GLES20.GL_TRIANGLES, indices.length, GLES20.GL_FLOAT, indexBuffer);
}
public void loadGLTexture(GL10 gl, Context context, int id){
shader = new Shader(R.raw.sprite_vs, R.raw.sprite_fs, mContext);
program = shader.getProgram();
GLES20.glUseProgram(program);
vertexLoc = GLES20.glGetAttribLocation(program, "a_position");
textureLoc = GLES20.glGetUniformLocation(program, "u_texture"); //texture
InputStream is = context.getResources().openRawResource(id);
try {
bitmap = BitmapFactory.decodeStream(is);
} finally {
try {
is.close();
is = null;
} catch (IOException e) {
}
}
//pixels = new int[(bitmap.getWidth()*bitmap.getHeight())];
//bitmap.getPixels(pixels, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
/*ByteBuffer byteBuf = ByteBuffer.allocateDirect(pixels.length * 4);
byteBuf.order(ByteOrder.nativeOrder());
textureBuffer = byteBuf.asIntBuffer();
textureBuffer.put(pixels);
textureBuffer.position(0);*/
GLES20.glDeleteTextures(1, textures, 0);
GLES20.glGenTextures(1, textures, 0);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
GLES20.glUniform1i(textureLoc, 0);
GLES20.glEnableVertexAttribArray(vertexLoc);
GLES20.glVertexAttribPointer(vertexLoc, 2, GLES20.GL_FLOAT, false, 0, vertexBuffer);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
//GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, FRAME_WIDTH, FRAME_HEIGHT, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, byteBuf);//(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();
}
}
Vertex shader (sprite_vs.txt):
#version 110
attribute vec2 a_position;
varying vec2 v_texcoord;
void main()
{
gl_Position = vec4(a_position, 0.0, 1.0);
v_texcoord = a_position * vec2(0.5) + vec2(0.5);
}
Fragment (pixel) shader (sprite_fs.txt):
#version 110
uniform sampler2D u_texture;
varying vec2 v_texcoord;
void main()
{
gl_FragColor = texture2D(u_texture, v_texcoord);
}
Thank you so much if you actually took the time to look through this! Hopefully someone else can also use this as a resource in the future.
A few observations/questions:
I don't know how you changed the fragment shader, but the version that is currently posted needs a precision specifier. Just add:
precision mediump float;
to the top, and it should work. Now, regarding the black screen, here are some questions:
When you change the glClearColor to something not black and comment out all the draw commands, does it still look black? If so, then you have a bigger problem than textures.
Second, if you ignore the texture output and try drawing each sprite as just a flat colored rectangle with no texture data, what do you get? You should be able to see some colored rectangle on the screen.
Finally, you need to bind the texture before you call glDrawElements. (Though this shouldn't matter in this example since you haven't changed the state yet.)
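On the first point, a compile-status check in the Shader class's createShader() would surface this kind of failure in logcat instead of a silent black screen (a sketch; the posted code does not include it):
private int createShader(int type, String source) {
    int shader = GLES20.glCreateShader(type);
    if (shader != 0) {
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);
        int[] compiled = new int[1];
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            // The driver's info log will point at the problem (for example, the missing precision qualifier)
            Log.e(TAG, "Shader compile failed: " + GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            shader = 0;
        }
    }
    return shader;
}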