Android OpenGLES2.0 - Solid Black Textures When Rendered - java

I've been building an Android OpenGL ES 2.0 2D game engine for the past week or so, and after a few bumps in the road I've largely been successful. I've got the model matrix, projection matrix, view matrix, light matrix, shaders, 2D planes, and textures implemented. However, although my data seems to pass through this jungle of a pipeline just fine, my textures do not appear; they render as solid black.
Most, if not all, of my code was derived from this source, and it is ultimately the same, except that I created my own shader class, bounding box class, room class, and game object class to simplify instantiating objects in-game. The Renderer takes a Room, the Room takes GameObjects (SpaceShip extends GameObject), and each GameObject takes a BoundingBox; the Renderer then renders the room's objects in a for loop. To do this, I moved the exact code from the example around so that certain handles are members of the classes I created instead of members of the renderer. This hasn't caused any problems with matrix multiplication or with my data reaching the end of the pipeline, so I doubt moving the handles is the problem, but it seemed important to mention.
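Room itself isn't pasted below; a minimal sketch of its relevant parts, reconstructed only from how the renderer uses it (the internals here, such as the array capacity and the background resource field, are assumptions), would be:
public class Room {
    //Game objects currently in the room
    public GameObject[] gameObjects;
    public int numberOfGameObjects;
    //Room dimensions and background resource (assumed meaning of the constructor arguments)
    int width;
    int height;
    int backgroundResId;
    public Room(int width, int height, int backgroundResId) {
        this.width = width;
        this.height = height;
        this.backgroundResId = backgroundResId;
        gameObjects = new GameObject[64]; //arbitrary capacity for this sketch
        numberOfGameObjects = 0;
    }
    public void addGameObject(GameObject obj) {
        obj.setUID(numberOfGameObjects);
        gameObjects[numberOfGameObjects] = obj;
        numberOfGameObjects++;
    }
}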
Things I've tried:
Changing the bitmap
Changed it to a bitmap with no alpha channel; both were 32x32 (2^5) and were .png.
Changing the order of operations
I had moved glBindTexture in my implementation, so I moved it back to its original spot, then tried moving it again.
Changing the texture parameters
I tried several combinations, none with mip-mapping
Changing the way I load the image
Went from BitmapFactory.decodeResource to BitmapFactory.decodeStream
Moved the texture to all drawable folders
Also tried it in the raw folder
Tried it on another device
My friend's DROID (Froyo 2.2) and my rooted NextBook (Gingerbread 2.3). Both support OpenGL ES 2.0.
Things I haven't tried (that I'm aware of):
Changing the texture coordinates
They came directly from the example. I just took one face of the cube.
Changing my shader
It also came directly from the example (aside from it now being its own class).
Restructuring my program to be just two (3, 4... x) classes
Dude...
I've been testing on the emulator (Eclipse Indigo, AVD, Intel Atom x86, ICS 4.2.2, API level 17) for some time now, and right about the time I got all the matrices working, the emulator stopped rendering anything. It used to render just fine (when the projection was all screwy); now it just shows black with a title bar. This has made debugging incredibly difficult. I'm not sure if this is something I've done (probably) or if the emulator is just bad at OpenGL.
Sorry to be so long-winded and include so much code, but I don't know how to use a show/hide button.
Any ideas?
Edit: I was using the wrong shader from the example (the naming was very misleading), so I wasn't passing in the color info. I still don't have textures, but the emulator works again. :)
OpenGLES20_2DRenderer
package mycompany.OpenGLES20_2DEngine;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.Matrix;
import android.util.Log;
public class OpenGLES20_2DRenderer implements GLSurfaceView.Renderer {
/** Used for debug logs. */
private static final String TAG = "Renderer";
//Matrix Declarations*************************
/**
* Store the model matrix. This matrix is used to move models from object space (where each model can be thought
* of being located at the center of the universe) to world space.
*/
private float[] mModelMatrix = new float[16];
/**
* Store the view matrix. This can be thought of as our camera. This matrix transforms world space to eye space;
* it positions things relative to our eye.
*/
private float[] mViewMatrix = new float[16];
/** Store the projection matrix. This is used to project the scene onto a 2D viewport. */
private float[] mProjectionMatrix = new float[16];
/** Allocate storage for the final combined matrix. This will be passed into the shader program. */
private float[] mMVPMatrix = new float[16];
/**
* Stores a copy of the model matrix specifically for the light position.
*/
private float[] mLightModelMatrix = new float[16];
//********************************************
//Global Variable Declarations****************
//Shader
Shader shader;
//PointShader
PointShader pointShader;
//Application Context
Context context;
//A room to add objects to
Room room;
//********************************************
public OpenGLES20_2DRenderer(Context ctx) {
context = ctx;
}
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
//Initialize GLES20***************************
// Set the background frame color
GLES20.glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
// Use culling to remove back faces.
GLES20.glEnable(GLES20.GL_CULL_FACE);
// Enable depth testing
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
// Position the eye in front of the origin.
final float eyeX = 0.0f;
final float eyeY = 0.0f;
final float eyeZ = -0.5f;
// We are looking toward the distance
final float lookX = 0.0f;
final float lookY = 0.0f;
final float lookZ = -5.0f;
// Set our up vector. This is where our head would be pointing were we holding the camera.
final float upX = 0.0f;
final float upY = 1.0f;
final float upZ = 0.0f;
// Set the view matrix. This matrix can be said to represent the camera position.
// NOTE: In OpenGL 1, a ModelView matrix is used, which is a combination of a model and
// view matrix. In OpenGL 2, we can keep track of these matrices separately if we choose.
Matrix.setLookAtM(mViewMatrix, 0, eyeX, eyeY, eyeZ, lookX, lookY, lookZ, upX, upY, upZ);
//********************************************
//Initialize Shaders**************************
shader = new Shader();
pointShader = new PointShader();
//********************************************
//Load The Level******************************
//Create a new room
room = new Room(800,600, 0);
//Load game objects
SpaceShip user = new SpaceShip();
//Load sprites
for(int i=0;i<room.numberOfGameObjects;i++) {
room.gameObjects[i].spriteGLIndex = room.gameObjects[i].loadSprite(context, room.gameObjects[i].spriteResId);
}
//Add them to the room
room.addGameObject(user);
//********************************************
}
public void onDrawFrame(GL10 unused) {
//Caclulate MVPMatrix*************************
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// Set our per-vertex lighting program.
GLES20.glUseProgram(shader.mProgram);
// Set program handles for object drawing.
shader.mMVPMatrixHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_MVPMatrix");
shader.mMVMatrixHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_MVMatrix");
shader.mLightPosHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_LightPos");
shader.mTextureUniformHandle = GLES20.glGetUniformLocation(shader.mProgram, "u_Texture");
shader.mPositionHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_Position");
shader.mColorHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_Color");
shader.mNormalHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_Normal");
shader.mTextureCoordinateHandle = GLES20.glGetAttribLocation(shader.mProgram, "a_TexCoordinate");
// Calculate position of the light. Rotate and then push into the distance.
Matrix.setIdentityM(mLightModelMatrix, 0);
Matrix.translateM(mLightModelMatrix, 0, 0.0f, 0.0f, -5.0f);
Matrix.rotateM(mLightModelMatrix, 0, 0, 0.0f, 1.0f, 0.0f);
Matrix.translateM(mLightModelMatrix, 0, 0.0f, 0.0f, 2.0f);
Matrix.multiplyMV(shader.mLightPosInWorldSpace, 0, mLightModelMatrix, 0, shader.mLightPosInModelSpace, 0);
Matrix.multiplyMV(shader.mLightPosInEyeSpace, 0, mViewMatrix, 0, shader.mLightPosInWorldSpace, 0);
//********************************************
//Draw****************************************
//Draw the background
//room.drawBackground(mMVPMatrix);
// Draw game objects
for(int i=0;i<room.numberOfGameObjects;i++) {
// Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, room.gameObjects[i].spriteGLIndex);
// Tell the texture uniform sampler to use this texture in the shader by binding to texture unit 0.
GLES20.glUniform1i(shader.mTextureUniformHandle, 0);
//Set up the model matrix
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, 4.0f, 0.0f, -7.0f);
Matrix.rotateM(mModelMatrix, 0, room.gameObjects[i].rotation, 1.0f, 0.0f, 0.0f);
//Draw the object
room.gameObjects[i].draw(mModelMatrix, mViewMatrix, mProjectionMatrix, mMVPMatrix, shader);
}
//********************************************
// Draw a point to indicate the light.********
drawLight();
//********************************************
}
public void onSurfaceChanged(GL10 unused, int width, int height) {
//Initialize Projection Matrix****************
// Set the OpenGL viewport to the same size as the surface.
GLES20.glViewport(0, 0, width, height);
// Create a new perspective projection matrix. The height will stay the same
// while the width will vary as per aspect ratio.
final float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.0f;
Matrix.frustumM(mProjectionMatrix, 0, left, right, bottom, top, near, far);
//********************************************
}
// Draws a point representing the position of the light.
private void drawLight()
{
GLES20.glUseProgram(pointShader.mProgram);
final int pointMVPMatrixHandle = GLES20.glGetUniformLocation(pointShader.mProgram, "u_MVPMatrix");
final int pointPositionHandle = GLES20.glGetAttribLocation(pointShader.mProgram, "a_Position");
// Pass in the position.
GLES20.glVertexAttrib3f(pointPositionHandle, shader.mLightPosInModelSpace[0], shader.mLightPosInModelSpace[1], shader.mLightPosInModelSpace[2]);
// Since we are not using a buffer object, disable vertex arrays for this attribute.
GLES20.glDisableVertexAttribArray(pointPositionHandle);
// Pass in the transformation matrix.
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mLightModelMatrix, 0);
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(pointMVPMatrixHandle, 1, false, mMVPMatrix, 0);
// Draw the point.
GLES20.glDrawArrays(GLES20.GL_POINTS, 0, 1);
}
}
Shader
package mycompany.OpenGLES20_2DEngine;
import android.opengl.GLES20;
import android.util.Log;
public class Shader {
/** Used for debug logs. */
private static final String TAG = "Shader";
//Shaders*************************************
public int vertexShader;
public int fragmentShader;
//********************************************
//Handles*************************************
/** This will be used to pass in model position information. */
public int mPositionHandle;
/** This will be used to pass in model color information. */
public int mColorHandle;
/** This will be used to pass in model normal information. */
public int mNormalHandle;
/** This will be used to pass in model texture coordinate information. */
public int mTextureCoordinateHandle;
/** This will be used to pass in the transformation matrix. */
public int mMVPMatrixHandle;
/** This will be used to pass in the modelview matrix. */
public int mMVMatrixHandle;
/** This will be used to pass in the light position. */
public int mLightPosHandle;
/** This will be used to pass in the texture. */
public int mTextureUniformHandle;
/** Used to hold a light centered on the origin in model space. We need a 4th coordinate so we can get translations to work when
* we multiply this by our transformation matrices. */
public final float[] mLightPosInModelSpace = new float[] {0.0f, 0.0f, 0.0f, 1.0f};
/** Used to hold the current position of the light in world space (after transformation via model matrix). */
public final float[] mLightPosInWorldSpace = new float[4];
/** Used to hold the transformed position of the light in eye space (after transformation via modelview matrix) */
public final float[] mLightPosInEyeSpace = new float[4];
//********************************************
//GL Code For Shaders*************************
public final String vertexShaderCode =
// A constant representing the combined model/view/projection matrix.
"uniform mat4 u_MVPMatrix;" + "\n" +
// A constant representing the combined model/view matrix.
"uniform mat4 u_MVMatrix;" + "\n" +
// Per-vertex position information we will pass in.
"attribute vec4 a_Position;" + "\n" +
// Per-vertex normal information we will pass in.
"attribute vec3 a_Normal;" + "\n" +
// Per-vertex texture coordinate information we will pass in.
"attribute vec2 a_TexCoordinate;" + "\n" +
// This will be passed into the fragment shader.
"varying vec3 v_Position;" + "\n" +
// This will be passed into the fragment shader.
"varying vec3 v_Normal;" + "\n" +
// This will be passed into the fragment shader.
"varying vec2 v_TexCoordinate;" + "\n" +
// The entry point for our vertex shader.
"void main()" + "\n" +
"{" + "\n" +
// Transform the vertex into eye space.
"v_Position = vec3(u_MVMatrix * a_Position);" + "\n" +
// Pass through the texture coordinate.
"v_TexCoordinate = a_TexCoordinate;" + "\n" +
// Transform the normal's orientation into eye space.
"v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));" + "\n" +
// gl_Position is a special variable used to store the final position.
// Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
"gl_Position = u_MVPMatrix * a_Position;" + "\n" +
"}";
public final String fragmentShaderCode =
"precision mediump float;" + "\n" + // Set the default precision to medium. We don't need as high of a
// precision in the fragment shader.
"uniform vec3 u_LightPos;" + "\n" + // The position of the light in eye space.
"uniform sampler2D u_Texture;" + "\n" + // The input texture.
"varying vec3 v_Position;" + "\n" + // Interpolated position for this fragment.
"varying vec3 v_Normal;" + "\n" + // Interpolated normal for this fragment.
"varying vec2 v_TexCoordinate;" + "\n" + // Interpolated texture coordinate per fragment.
// The entry point for our fragment shader.
"void main()" + "\n" +
"{" + "\n" +
// Will be used for attenuation.
"float distance = length(u_LightPos - v_Position);" + "\n" +
// Get a lighting direction vector from the light to the vertex.
"vec3 lightVector = normalize(u_LightPos - v_Position);" + "\n" +
// Calculate the dot product of the light vector and vertex normal. If the normal and light vector are
// pointing in the same direction then it will get max illumination.
"float diffuse = max(dot(v_Normal, lightVector), 0.0);" + "\n" +
// Add attenuation.
"diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance)));" + "\n" +
// Add ambient lighting
"diffuse = diffuse + 0.7;" + "\n" +
// Multiply the color by the diffuse illumination level and texture value to get final output color.
"gl_FragColor = (diffuse * texture2D(u_Texture, v_TexCoordinate));" + "\n" +
"}";
//********************************************
//GL Program Handle***************************
public int mProgram;
//********************************************
public Shader() {
//Load Shaders********************************
vertexShader = compileShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
fragmentShader = compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
//********************************************
//Create GL Program***************************
mProgram = createAndLinkProgram(vertexShader, fragmentShader, new String[] {"a_Position", "a_Color", "a_Normal", "a_TexCoordinate"});
//********************************************
}
/**
* Helper function to compile a shader.
*
* @param shaderType The shader type.
* @param shaderSource The shader source code.
* @return An OpenGL handle to the shader.
*/
public static int compileShader(final int shaderType, final String shaderSource)
{
int shaderHandle = GLES20.glCreateShader(shaderType);
if (shaderHandle != 0)
{
// Pass in the shader source.
GLES20.glShaderSource(shaderHandle, shaderSource);
// Compile the shader.
GLES20.glCompileShader(shaderHandle);
// Get the compilation status.
final int[] compileStatus = new int[1];
GLES20.glGetShaderiv(shaderHandle, GLES20.GL_COMPILE_STATUS, compileStatus, 0);
// If the compilation failed, delete the shader.
if (compileStatus[0] == 0)
{
Log.e(TAG, "Error compiling shader " /*+ GLES20.glGetShaderInfoLog(shaderHandle)*/);
GLES20.glDeleteShader(shaderHandle);
shaderHandle = 0;
}
}
if (shaderHandle == 0)
{
throw new RuntimeException("Error creating shader.");
}
return shaderHandle;
}
/**
* Helper function to compile and link a program.
*
* @param vertexShaderHandle An OpenGL handle to an already-compiled vertex shader.
* @param fragmentShaderHandle An OpenGL handle to an already-compiled fragment shader.
* @param attributes Attributes that need to be bound to the program.
* @return An OpenGL handle to the program.
*/
public static int createAndLinkProgram(final int vertexShaderHandle, final int fragmentShaderHandle, final String[] attributes)
{
int programHandle = GLES20.glCreateProgram();
if (programHandle != 0)
{
// Bind the vertex shader to the program.
GLES20.glAttachShader(programHandle, vertexShaderHandle);
// Bind the fragment shader to the program.
GLES20.glAttachShader(programHandle, fragmentShaderHandle);
// Bind attributes
if (attributes != null)
{
final int size = attributes.length;
for (int i = 0; i < size; i++)
{
GLES20.glBindAttribLocation(programHandle, i, attributes[i]);
}
}
// Link the two shaders together into a program.
GLES20.glLinkProgram(programHandle);
// Get the link status.
final int[] linkStatus = new int[1];
GLES20.glGetProgramiv(programHandle, GLES20.GL_LINK_STATUS, linkStatus, 0);
// If the link failed, delete the program.
if (linkStatus[0] == 0)
{
Log.e(TAG, "Error compiling program " /*+ GLES20.glGetProgramInfoLog(programHandle)*/);
GLES20.glDeleteProgram(programHandle);
programHandle = 0;
}
}
if (programHandle == 0)
{
throw new RuntimeException("Error creating program.");
}
return programHandle;
}
}
GameObject
package mycompany.OpenGLES20_2DEngine;
import java.io.IOException;
import java.io.InputStream;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLES20;
import android.opengl.GLUtils;
import android.opengl.Matrix;
import android.util.Log;
public class GameObject {
/** Used for debug logs. */
private static final String TAG = "GameObject";
//Declare Variables****************************
//Position
public int x;
public int y;
public int z;
//Size
public int width;
public int height;
//Movement
double thrustX;
double thrustY;
//Rotation
public int rotation;
public int rotationSpeed;
//Unique Identifier
public int UID;
//Sprite Resource ID
int spriteResId;
//GL Texture Reference
int spriteGLIndex;
//Bounding Box
BoundingBox boundingBox;
//********************************************
GameObject() {
}
public int loadSprite(final Context context, final int resourceId) {
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
InputStream is = context.getResources()
.openRawResource(resourceId);
Bitmap bitmap = null;
try {
bitmap = BitmapFactory.decodeStream(is);
is.close();
} catch(IOException e) {
Log.e(TAG, "Could not load the texture");
}
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
//TODO: Offending Line - Makes textures black because of parameters
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
public void setUID(int uid) {
UID = uid;
}
public int getUID() {
return UID;
}
public void draw(float[] mModelMatrix, float[] mViewMatrix, float[] mProjectionMatrix, float[] mMVPMatrix, Shader shader) {
{
// Pass in the position information
boundingBox.mPositions.position(0);
GLES20.glVertexAttribPointer(shader.mPositionHandle, boundingBox.mPositionDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mPositions);
GLES20.glEnableVertexAttribArray(shader.mPositionHandle);
// Pass in the color information
boundingBox.mColors.position(0);
GLES20.glVertexAttribPointer(shader.mColorHandle, boundingBox.mColorDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mColors);
GLES20.glEnableVertexAttribArray(shader.mColorHandle);
// Pass in the normal information
boundingBox.mNormals.position(0);
GLES20.glVertexAttribPointer(shader.mNormalHandle, boundingBox.mNormalDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mNormals);
GLES20.glEnableVertexAttribArray(shader.mNormalHandle);
// Pass in the texture coordinate information
boundingBox.mTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(shader.mTextureCoordinateHandle, boundingBox.mTextureCoordinateDataSize, GLES20.GL_FLOAT, false,
0, boundingBox.mTextureCoordinates);
GLES20.glEnableVertexAttribArray(shader.mTextureCoordinateHandle);
// This multiplies the view matrix by the model matrix, and stores the result in the MVP matrix
// (which currently contains model * view).
Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0);
// Pass in the modelview matrix.
GLES20.glUniformMatrix4fv(shader.mMVMatrixHandle, 1, false, mMVPMatrix, 0);
// This multiplies the modelview matrix by the projection matrix, and stores the result in the MVP matrix
// (which now contains model * view * projection).
Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0);
// Pass in the combined matrix.
GLES20.glUniformMatrix4fv(shader.mMVPMatrixHandle, 1, false, mMVPMatrix, 0);
// Pass in the light position in eye space.
GLES20.glUniform3f(shader.mLightPosHandle, shader.mLightPosInEyeSpace[0], shader.mLightPosInEyeSpace[1], shader.mLightPosInEyeSpace[2]);
// Draw the object
GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 6);
}
}
}
BoundingBox
package mycompany.OpenGLES20_2DEngine;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
//TODO: make this dynamic, both the constructor and the coordinates.
class BoundingBox {
//Variable Declarations***********************
/** How many bytes per float. */
private final int mBytesPerFloat = 4;
/** Store our model data in a float buffer. */
public final FloatBuffer mPositions;
public final FloatBuffer mColors;
public final FloatBuffer mNormals;
public final FloatBuffer mTextureCoordinates;
//Number of coordinates per vertex in this array
final int COORDS_PER_VERTEX = 3;
//Coordinates
float[] positionData;
//Texture Coordinates
float[] textureCoordinateData;
//Vertex Color
float[] colorData;
float[] normalData;
//Vertex Stride
final int vertexStride = COORDS_PER_VERTEX * 4;
/** Size of the position data in elements. */
public final int mPositionDataSize = 3;
/** Size of the color data in elements. */
public final int mColorDataSize = 4;
/** Size of the normal data in elements. */
public final int mNormalDataSize = 3;
/** Size of the texture coordinate data in elements. */
public final int mTextureCoordinateDataSize = 2;
//********************************************
public BoundingBox(float[] coords) {
//TODO: Normalize values
//Set Coordinates and Texture Coordinates*****
if(coords==null) {
float[] newPositionData = {
// Front face
-1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f,
-1.0f, -1.0f, 1.0f,
1.0f, -1.0f, 1.0f,
1.0f, 1.0f, 1.0f
};
positionData = newPositionData;
float[] newColorData = {
// Front face (red)
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f
};
colorData = newColorData;
float[] newTextureCoordinateData =
{
// Front face
0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f,
};
textureCoordinateData = newTextureCoordinateData;
float[] newNormalData = {
// Front face
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f
};
normalData = newNormalData;
}
else {
positionData = coords;
//TODO:Reverse coords HERE
textureCoordinateData = coords;
}
//********************************************
//Initialize Buffers**************************
mPositions = ByteBuffer.allocateDirect(positionData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mPositions.put(positionData).position(0);
mColors = ByteBuffer.allocateDirect(colorData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mColors.put(colorData).position(0);
mNormals = ByteBuffer.allocateDirect(normalData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mNormals.put(normalData).position(0);
mTextureCoordinates = ByteBuffer.allocateDirect(textureCoordinateData.length * mBytesPerFloat)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mTextureCoordinates.put(textureCoordinateData).position(0);
//********************************************
}
}
SpaceShip
package mycompany.OpenGLES20_2DEngine;
public class SpaceShip extends GameObject{
public SpaceShip() {
spriteResId = R.drawable.spaceship;
boundingBox = new BoundingBox(null);
}
}

Got it. I was adding the spaceship to the room AFTER the loop that loads its bitmap (from the room), so the loop never actually loaded the sprite.
//Load The Level******************************
//Create a new room
room = new Room(800,600, 0);
//Load game objects
SpaceShip user = new SpaceShip();
//Load sprites
for(int i=0;i<room.numberOfGameObjects;i++) {
room.gameObjects[i].spriteGLIndex = room.gameObjects[i].loadSprite(context, room.gameObjects[i].spriteResId);
}
//Add them to the room
room.addGameObject(user);
//********************************************
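In other words, the fix is simply to add the object to the room before the sprite-loading loop runs:
//Load The Level******************************
//Create a new room
room = new Room(800,600, 0);
//Load game objects
SpaceShip user = new SpaceShip();
//Add them to the room first
room.addGameObject(user);
//Now the loop actually sees the object and loads its sprite
for(int i=0;i<room.numberOfGameObjects;i++) {
room.gameObjects[i].spriteGLIndex = room.gameObjects[i].loadSprite(context, room.gameObjects[i].spriteResId);
}
//********************************************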

Related

How do I change my OpenGL object color only when I touch it?

I am trying to change my OpenGL square's color only when it is touched. I looked around online at some good sources to see how I could find the coordinates needed to change its color, e.g. Converting pixel co-ordinates to normalized co-ordinates at draw time in OpenGL 3.0. However, I am still confused about how to get my square's coordinates, or my onTouchEvent input coordinates, translated into the OpenGL code (vertexShaderCode). I have tried to directly track my square's coordinates in the onTouchEvent handler, but it tracks the position incorrectly since I am working with two different coordinate systems (OpenGL and Android).
//THIS IS NOT MY FULL CODE
public boolean onTouchEvent(MotionEvent e) {
// MotionEvent reports input details from the touch screen
// and other input controls. In this case, you are only
// interested in events where the touch position changed.
float x = e.getX();
float y = e.getY();
colorHolder = renderer.getmSquare().getColor();
switch (e.getAction()) {
case MotionEvent.ACTION_DOWN:
//THIS IS MY PROBLEM. I DON'T KNOW A GOOD WAY OF TRACKING THE SQUARE'S POSITION BESIDES
//ADDING VARIABLES TO ITS MAIN CLASS AND THEN REFERENCING THEM HERE
if(renderer.mSquareY > (y / getHeight()) && renderer.mSquareX > (x / getWidth()))
renderer.getmSquare().color = tempColor;
case MotionEvent.ACTION_MOVE:
float dx = x - previousX;
float dy = y - previousY;
float tempHeight = y / getHeight();
float tempWidth = x / getWidth();
//THIS IS MY PROBLEM. I DON'T KNOW A GOOD WAY OF TRACKING THE SQUARE'S POSITION BESIDES ADDING VARIABLES TO ITS MAIN CLASS AND THEN REFERENCING THEM HERE
if(renderer.mSquareY < (y / getHeight()) && renderer.mSquareX < (x / getWidth()))
renderer.getmSquare().color = tempColor;
renderer.mSquareX = (x / getWidth());
renderer.mSquareY = (y / getHeight());
...
I have three classes that handle creating the square, rendering, and the main activity, in that order: Square.java, MyGLRenderer.java, MyGLSurfaceView.java.
public class Square {
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
// The matrix must be included as a modifier of gl_Position.
// Note that the uMVPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
" gl_Position = uMVPMatrix * vPosition;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
private FloatBuffer vertexBuffer;
private ShortBuffer drawListBuffer;
private final int mProgram;
private int mPositionHandle;
private int mColorHandle;
private int mMVPMatrixHandle;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
static float squareCoords[] = {
0.5f, 0.5f, 0.0f, // top left
0.5f, -0.5f, 0.0f, // bottom left
-0.5f, -0.5f, 0.0f, // bottom right
-0.5f, 0.5f, 0.0f }; // top right
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
public float[] getColor() {
return color;
}
public void setColor(float[] color) {
this.color = color;
}
float color[] = { 0.2f, 0.709803922f, 0.898039216f, 1.0f };
public Square() {
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (# of coordinate values * 4 bytes per float)
squareCoords.length * 4);
bb.order(ByteOrder.nativeOrder());
vertexBuffer = bb.asFloatBuffer();
vertexBuffer.put(squareCoords);
vertexBuffer.position(0);
// initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(
// (# of coordinate values * 2 bytes per short)
drawOrder.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
// prepare shaders and OpenGL program
int vertexShader = MyGLRenderer.loadShader(
GLES20.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(
GLES20.GL_FRAGMENT_SHADER,
fragmentShaderCode);
mProgram = GLES20.glCreateProgram(); // create empty OpenGL Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
GLES20.glLinkProgram(mProgram); // create OpenGL program executables
}
public void draw(float[] mvpMatrix) {
// Add program to OpenGL environment
GLES20.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(
mPositionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
// get handle to shape's transformation matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
//MyGLRenderer.checkGlError("glGetUniformLocation");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//MyGLRenderer.checkGlError("glUniformMatrix4fv");
// Draw the square
GLES20.glDrawElements(
GLES20.GL_TRIANGLES, drawOrder.length,
GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
// Disable vertex array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
}
public class MyGLRenderer implements GLSurfaceView.Renderer {
private Triangle mTriangle;
public Square getmSquare() {
return mSquare;
}
public void setmSquare(Square mSquare) {
this.mSquare = mSquare;
}
private Square mSquare;
private Circle mCircle;
// vPMatrix is an abbreviation for "Model View Projection Matrix"
private final float[] vPMatrix = new float[16];
private final float[] projectionMatrix = new float[16];
private final float[] viewMatrix = new float[16];
private float[] rotationMatrix = new float[16];
private float[] translationMatrix = new float[16];
private float[] scaleMatrix = new float[16];
public volatile float mAngle;
public float mSquareX = 1.5f;
public float mSquareY = 0.0f;
public float mRadius = 1.0f;
public float getAngle() {
return mAngle;
}
public void setAngle(float angle) {
mAngle = angle;
}
public void onSurfaceCreated(GL10 unused, EGLConfig eglconfig) {
// Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
// initialize a triangle
mTriangle = new Triangle();
// initialize a square
mSquare = new Square();
// initialize a square
mCircle = new Circle();
}
@Override
public void onDrawFrame(GL10 unused) {
float[] scratch = new float[16];
float[] movementSquare = new float[16];
float[] scaleCircle = new float[16];
float tempscaleFactor = 1.0f * mRadius;
// Redraw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
// Set the camera position (View matrix)
Matrix.setLookAtM(viewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(vPMatrix, 0, projectionMatrix, 0, viewMatrix, 0);
// Create a rotation transformation for the triangle
//long time = SystemClock.uptimeMillis() % 4000L;
//float angle = 0.090f * ((int) time);
Matrix.setRotateM(rotationMatrix, 0, mAngle, 0, 0, -1.0f);
Matrix.setIdentityM(translationMatrix,0);
Matrix.translateM(translationMatrix, 0, mSquareX, mSquareY,0);
//THE PROBLEM HERE IS THAT MY CIRCLE TRANSLATES ON THE X-AXIS WHEN SCALING. MY GOAL IS TO KEEP IT IN PLACE WHILE IT IS BEING SCALED. THE Y-AXIS HAS NO ISSUES
Matrix.setIdentityM(scaleMatrix, 0);
Matrix.scaleM(scaleMatrix, 0, mRadius, mRadius, 0);
if(mRadius != 1f)
Matrix.translateM(scaleMatrix, 0, -(1 + (mRadius / 2)),0,0);
// Combine the rotation matrix with the projection and camera view
// Note that the vPMatrix factor *must be first* in order
// for the matrix multiplication product to be correct.
Matrix.multiplyMM(movementSquare, 0, vPMatrix, 0, translationMatrix, 0);
Matrix.multiplyMM(scratch, 0, vPMatrix, 0, rotationMatrix, 0);
Matrix.multiplyMM(scaleCircle, 0, vPMatrix, 0, scaleMatrix, 0);
// Draw shape
mTriangle.draw(scratch);
mSquare.draw(movementSquare);
mCircle.draw(scaleCircle);
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
// this projection matrix is applied to object coordinates
// in the onDrawFrame() method
Matrix.frustumM(projectionMatrix, 0, -ratio, ratio, -1, 1, 2, 7);
}
public static int loadShader(int type, String shaderCode){
// create a vertex shader type (GLES20.GL_VERTEX_SHADER)
// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
// add the source code to the shader and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
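A common way to bridge the two coordinate systems is to convert the touch point into normalized device coordinates before comparing. The sketch below is not from the code above; it assumes the square's center (squareX, squareY) and half extent (halfSize) are tracked in NDC on the renderer, and it ignores the perspective projection:
// Inside MyGLSurfaceView.onTouchEvent(); x and y are e.getX()/e.getY().
float ndcX = (2.0f * x) / getWidth() - 1.0f;   // pixels -> [-1, 1], x to the right
float ndcY = 1.0f - (2.0f * y) / getHeight();  // pixels -> [-1, 1], y flipped so it points up
boolean touched = Math.abs(ndcX - squareX) <= halfSize
               && Math.abs(ndcY - squareY) <= halfSize;
if (touched) {
    renderer.getmSquare().setColor(tempColor);
    requestRender(); // only needed if the view renders on demand
}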

OpenGL ES 3.0: call object by reference to add gravity and OnClick event

I am new to OpenGL and I am trying to find a way to give a drawn object, such as a triangle, an ID of some kind. This way I can reference the ID and give it motion as well as add touch events.
I am not sure if this is the correct way or if there is a much better way to do this. I have the objects made but do not know how to reference them and give them motion or an onClick event. I have looked around, but a lot of the approaches seem to be outdated and no longer work, or the link is now dead.
I have a Renderer as such:
public class MyGLRenderer implements GLSurfaceView.Renderer {
private Triangle mTriangle;
// Called once to set up the view's opengl es environment
public void onSurfaceCreated(GL10 unused, EGLConfig config){
//Set the background frame color
GLES30.glClearColor(255.0f,255.0f,255.0f,0.0f);
mTriangle = new Triangle();
}
// Called for each redraw of the view
public void onDrawFrame(GL10 gl){
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
//Redraw background color
GLES30.glClear(GLES30.GL_COLOR_BUFFER_BIT);
mTriangle.draw();
}
// Called when the geometry of the view changes (for example, when the screen orientation changes from landscape to portrait)
public void onSurfaceChanged(GL10 unused, int width, int height){
// Called if the geometry of the viewport changes
GLES30.glViewport(0, 0, width, height);
}
public static int loadShader(int type, String shaderCode){
// create a vertex shader type (GLES30.GL_VERTEX_SHADER)
// or a fragment shader type (GLES30.GL_FRAGMENT_SHADER)
int shader = GLES30.glCreateShader(type);
// add the source code to the shader and compile it
GLES30.glShaderSource(shader, shaderCode);
GLES30.glCompileShader(shader);
return shader;
}
}
Surface View as Such:
public class MyGLSurfaceView extends GLSurfaceView {
private final MyGLRenderer mRenderer;
public MyGLSurfaceView(Context context, AttributeSet attrs){
super(context, attrs);
//Create an OpenGl 3.0 context
setEGLContextClientVersion(3);
mRenderer = new MyGLRenderer();
//Set the Renderer for drawing on the GLSurfaceView
setRenderer(mRenderer);
//Render the view only when there is a change in the drawing data
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
}
And a Triangle Class as such:
public class Triangle {
private FloatBuffer vertexBuffer;
private final String vertexShaderCode =
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vPosition;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";
private final int mProgram;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 3;
static float triangleCoords[] = { // in counterclockwise order:
0.0f, 0.622008459f, 0.0f, // top
-0.5f, -0.311004243f, 0.0f, // bottom left
0.5f, -0.311004243f, 0.0f // bottom right
};
// Set color with red, green, blue and alpha (opacity) values
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
public Triangle() {
// initialize vertex byte buffer for shape coordinates
ByteBuffer bb = ByteBuffer.allocateDirect(
// (number of coordinate values * 4 bytes per float)
triangleCoords.length * 4);
// use the device hardware's native byte order
bb.order(ByteOrder.nativeOrder());
// create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
// add the coordinates to the FloatBuffer
vertexBuffer.put(triangleCoords);
// set the buffer to read the first coordinate
vertexBuffer.position(0);
int vertexShader = MyGLRenderer.loadShader(GLES30.GL_VERTEX_SHADER,
vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(GLES30.GL_FRAGMENT_SHADER,
fragmentShaderCode);
// create empty OpenGL ES Program
mProgram = GLES30.glCreateProgram();
// add the vertex shader to program
GLES30.glAttachShader(mProgram, vertexShader);
// add the fragment shader to program
GLES30.glAttachShader(mProgram, fragmentShader);
// creates OpenGL ES program executables
GLES30.glLinkProgram(mProgram);
}
private int mPositionHandle;
private int mColorHandle;
private final int vertexCount = triangleCoords.length / COORDS_PER_VERTEX;
private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
public void draw() {
// Add program to OpenGL ES environment
GLES30.glUseProgram(mProgram);
// get handle to vertex shader's vPosition member
mPositionHandle = GLES30.glGetAttribLocation(mProgram, "vPosition");
// Enable a handle to the triangle vertices
GLES30.glEnableVertexAttribArray(mPositionHandle);
// Prepare the triangle coordinate data
GLES30.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
GLES30.GL_FLOAT, false,
vertexStride, vertexBuffer);
// get handle to fragment shader's vColor member
mColorHandle = GLES30.glGetUniformLocation(mProgram, "vColor");
// Set color for drawing the triangle
GLES30.glUniform4fv(mColorHandle, 1, color, 0);
// Draw the triangle
GLES30.glDrawArrays(GLES30.GL_TRIANGLES, 0, vertexCount);
// Disable vertex array
GLES30.glDisableVertexAttribArray(mPositionHandle);
}
}
I would like something to do the following:
drawnTriangleObject_ID.Add gravity to move down or up
drawnTriangleObject_ID.OnClick( // Do Something when this object is clicked )
In your triangle class, add positioning data
float x = 0.0f;
float y = 0.0f;
float z = 0.0f;
and when drawing the triangle, you have to apply the translation
Matrix.setIdentityM(modelmatrix, 0);
Matrix.translateM(modelmatrix, 0, x, y, z);
then multiply the model matrix by the view matrix
Matrix.multiplyMM(resultmodelview, 0, viewmatrix, 0, modelmatrix, 0);
then multiply the result with projection matrix
Matrix.multiplyMM(resultresultprojection, 0, ProjectionMatrix, 0, resultmodelview, 0);
and publish it
World.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, resultresultprojection, 0);
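Putting those fragments together, a minimal sketch of the per-draw matrix work (same variable names as the fragments above, with GLES30.glUniformMatrix4fv used for the upload; mMVPMatrixHandle is assumed to be the uniform location queried from the linked program):
// Compose model -> view -> projection and upload the result before drawing.
float[] modelmatrix = new float[16];
float[] resultmodelview = new float[16];
float[] resultresultprojection = new float[16];
Matrix.setIdentityM(modelmatrix, 0);
Matrix.translateM(modelmatrix, 0, x, y, z); // the triangle's position fields
Matrix.multiplyMM(resultmodelview, 0, viewmatrix, 0, modelmatrix, 0);
Matrix.multiplyMM(resultresultprojection, 0, ProjectionMatrix, 0, resultmodelview, 0);
GLES30.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, resultresultprojection, 0);
// ...then issue the glDrawArrays call as in Triangle.draw().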
This is all assuming you have the frustum, projection, and view matrices built already (which, if you can see the triangle, is probably already the case).
Good luck and have fun coding!
As for the "OnClick" it is a bit more tricky:
- Once the screen has been clicked, cast a ray from your XY plance (screen) to the 3D world. the ray (line) will have 2 coordinates, a start point and end point in form of X,Y,Z, or might have a start point and a vector(direction of the line)... On each frame you have to check if this ray is created, if it is, you need to use some Math to check if that line intersects your triangle (Don't forget to apply the rotation and translation of the triangle to it's vertices before checking intersection with the Ray). when the frame has been drawn don't forget to delete the ray
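A minimal sketch of casting that ray with android.opengl.Matrix, assuming you have the projection and view matrices and the viewport size at hand (touchX, touchY, viewWidth, viewHeight and the matrix names here are illustrative, not from the code above):
// Build a world-space ray from a touch point by inverting projection * view.
float[] vpMatrix = new float[16];
float[] invVP = new float[16];
Matrix.multiplyMM(vpMatrix, 0, projectionMatrix, 0, viewMatrix, 0);
Matrix.invertM(invVP, 0, vpMatrix, 0);
// Touch point in normalized device coordinates (y is flipped).
float ndcX = (2.0f * touchX) / viewWidth - 1.0f;
float ndcY = 1.0f - (2.0f * touchY) / viewHeight;
// Corresponding points on the near and far clip planes.
float[] nearClip = { ndcX, ndcY, -1.0f, 1.0f };
float[] farClip  = { ndcX, ndcY,  1.0f, 1.0f };
float[] nearWorld = new float[4];
float[] farWorld  = new float[4];
Matrix.multiplyMV(nearWorld, 0, invVP, 0, nearClip, 0);
Matrix.multiplyMV(farWorld, 0, invVP, 0, farClip, 0);
// Perspective divide; the ray runs from nearWorld towards farWorld.
for (int i = 0; i < 3; i++) {
    nearWorld[i] /= nearWorld[3];
    farWorld[i]  /= farWorld[3];
}
// rayOrigin = nearWorld, rayDirection = farWorld - nearWorld; intersect this
// segment with the triangle's transformed vertices.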

OpenGL ES 2.0 2D-Image displaying

I have followed the official OpenGL ES tutorial to create a working OpenGL environment. I have been able to do everything I wanted to except display 2D images. To do so I worked through this tutorial and came up with the following:
Here is my Sprite class:
public class Sprite
{
//Reference to Activity Context
private final Context mActivityContext;
//Added for Textures
private final FloatBuffer mCubeTextureCoordinates;
private int mTextureUniformHandle;
private int mTextureCoordinateHandle;
private final int mTextureCoordinateDataSize = 2;
private int mTextureDataHandle;
private final String vertexShaderCode =
"attribute vec2 a_TexCoordinate;" +
"varying vec2 v_TexCoordinate;" +
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vPosition * uMVPMatrix;" +
"v_TexCoordinate = a_TexCoordinate" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"uniform sampler2D u_Texture;" +
"varying vec2 v_TexCoordinate;" +
"void main() {" +
"gl_FragColor = (vColor * texture2D(u_Texture, v_TexCoordinate));" +
"}";
private final int shaderProgram;
private final FloatBuffer vertexBuffer;
private final ShortBuffer drawListBuffer;
private int mPositionHandle;
private int mColorHandle;
private int mMVPMatrixHandle;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 2;
static float spriteCoords[] = { -0.5f, 0.5f, // top left
-0.5f, -0.5f, // bottom left
0.5f, -0.5f, // bottom right
0.5f, 0.5f }; //top right
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; //Order to draw vertices
private final int vertexStride = COORDS_PER_VERTEX * 4; //Bytes per vertex
// Set color with red, green, blue and alpha (opacity) values
float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
public Sprite(final Context activityContext)
{
mActivityContext = activityContext;
//Initialize Vertex Byte Buffer for Shape Coordinates / # of coordinate values * 4 bytes per float
ByteBuffer bb = ByteBuffer.allocateDirect(spriteCoords.length * 4);
//Use the Device's Native Byte Order
bb.order(ByteOrder.nativeOrder());
//Create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
//Add the coordinates to the FloatBuffer
vertexBuffer.put(spriteCoords);
//Set the Buffer to Read the first coordinate
vertexBuffer.position(0);
// S, T (or X, Y)
// Texture coordinate data.
// Because images have a Y axis pointing downward (values increase as you move down the image) while
// OpenGL has a Y axis pointing upward, we adjust for that here by flipping the Y axis.
// What's more is that the texture coordinates are the same for every face.
final float[] cubeTextureCoordinateData =
{
-0.5f, 0.5f,
-0.5f, -0.5f,
0.5f, -0.5f,
0.5f, 0.5f
};
mCubeTextureCoordinates = ByteBuffer.allocateDirect(cubeTextureCoordinateData.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
mCubeTextureCoordinates.put(cubeTextureCoordinateData).position(0);
//Initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(spriteCoords.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
int vertexShader = MyGLRenderer.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = MyGLRenderer.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
shaderProgram = GLES20.glCreateProgram();
GLES20.glAttachShader(shaderProgram, vertexShader);
GLES20.glAttachShader(shaderProgram, fragmentShader);
//Texture Code
GLES20.glBindAttribLocation(shaderProgram, 0, "a_TexCoordinate");
GLES20.glLinkProgram(shaderProgram);
//Load the texture
mTextureDataHandle = loadTexture(mActivityContext, R.drawable.ic_launcher);
}
public void draw(float[] mvpMatrix)
{
//Add program to OpenGL ES Environment
GLES20.glUseProgram(shaderProgram);
//Get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
//Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
//Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
//Get Handle to Fragment Shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(shaderProgram, "vColor");
//Set the Color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
//Set Texture Handles and bind Texture
mTextureUniformHandle = GLES20.glGetAttribLocation(shaderProgram, "u_Texture");
mTextureCoordinateHandle = GLES20.glGetAttribLocation(shaderProgram, "a_TexCoordinate");
//Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
//Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle);
//Tell the texture uniform sampler to use this texture in the shader by binding to texture unit 0.
GLES20.glUniform1i(mTextureUniformHandle, 0);
//Pass in the texture coordinate information
mCubeTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(mTextureCoordinateHandle, mTextureCoordinateDataSize, GLES20.GL_FLOAT, false, 0, mCubeTextureCoordinates);
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
//Get Handle to Shape's Transformation Matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(shaderProgram, "uMVPMatrix");
//Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//Draw the triangle
GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
//Disable Vertex Array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
public static int loadTexture(final Context context, final int resourceId)
{
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
final Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
}
Here is MyGLRenderer.java:
public class MyGLRenderer implements GLSurfaceView.Renderer {
private static final String TAG = "MyGLRenderer";
private Context context;
private Sprite sprite;
// mMVPMatrix is an abbreviation for "Model View Projection Matrix"
private final float[] mMVPMatrix = new float[16];
private final float[] mProjectionMatrix = new float[16];
private final float[] mViewMatrix = new float[16];
public MyGLRenderer(Context ctx) {
this.context = ctx;
}
@Override
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
// Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
sprite = new Sprite(context);
}
@Override
public void onDrawFrame(GL10 unused) {
sprite.draw(mMVPMatrix);
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
// Adjust the viewport based on geometry changes,
// such as screen rotation
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
// this projection matrix is applied to object coordinates
// in the onDrawFrame() method
Matrix.frustumM(mProjectionMatrix, 0, -ratio, ratio, -1, 1, 2, 7);
}
public static int loadShader(int type, String shaderCode){
// create a vertex shader type (GLES20.GL_VERTEX_SHADER)
// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
// add the source code to the shader and compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public static void checkGlError(String glOperation) {
int error;
while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
Log.e(TAG, glOperation + ": glError " + error);
throw new RuntimeException(glOperation + ": glError " + error);
}
}
//NEW
public static int loadTexture(final Context context, final int resourceId)
{
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
final Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
}
The problem that I have is that the app crashes with a
java.lang.RuntimeException: glGetUniformLocation: glError 1282
before it even opens. Thanks for any help in advance!
EDIT:
I don't know if this helps, but when I start my app in an AVD I get the following errors:
../../sdk/emulator/opengl//host/libs/Translator/GLES_V2//GLESv2Imp.cpp:glGetAttribLocation:857 error 0x502
../../sdk/emulator/opengl//host/libs/Translator/GLES_V2//GLESv2Imp.cpp:glGetUniformLocation:1442 error 0x502
../../sdk/emulator/opengl//host/libs/Translator/GLES_V2//GLESv2Imp.cpp:glGetUniformLocation:1442 error 0x502
../../sdk/emulator/opengl//host/libs/Translator/GLES_V2//GLESv2Imp.cpp:glGetAttribLocation:857 error 0x502
../../sdk/emulator/opengl//host/libs/Translator/GLES_V2//GLESv2Imp.cpp:glGetUniformLocation:1442 error 0x502
Lacking compile-time checking of the GLSL strings, I had forgotten a semicolon. So this
"v_TexCoordinate = a_TexCoordinate" +
should be this:
"v_TexCoordinate = a_TexCoordinate;" +
Alper Cinar was right with his answer, too.
I did not check all of it, but I think I've found a problem in your code.
u_Texture is defined as a uniform in your fragment shader, but you try to get its location as if it were an attribute.
Change this line in the draw method of your Sprite class:
mTextureUniformHandle = GLES20.glGetAttribLocation(shaderProgram, "u_Texture");
to this
mTextureUniformHandle = GLES20.glGetUniformLocation(shaderProgram, "u_Texture");
The vector order here seems to be wrong:
"gl_FragColor = (vColor * texture2D(u_Texture, v_TexCoordinate));" +
You may want to change that to
"gl_FragColor = texture2D(u_Texture, v_TexCoordinate) * vColor;" +

Transformations are weird in OpenGL ES 2.0

I'm developing an application for Android that uses OpenGL ES 2.0
Since it's my first time with OpenGL (I used to use WebGL), I made a custom and pretty simple API like THREE.js, which consists of Object3D and Geometry objects.
Basically, what I did was store shapes inside the Geometry object and create Mesh objects with a geometry instance inside. Also, inside Mesh, I have Vector3 objects for position, scale, and rotation.
I created a circle to test, and here is what is happening:
If I don't change anything, the circle is perfect on the screen. If I change the vertex positions when the circle is created, the circle is still OK.
But when I apply some transformation (changing the position, scale, or rotation attribute) to the Object3D (in this case, Mesh), the circle becomes stretched.
So I think there is some problem with the projection matrix, but the circle is fine as long as I don't transform it.
Is there a problem with my matrix code? Should I send separate rotation, translation, and scale matrices to the GPU?
Perhaps I'm complicating things, but since this is my first time using OpenGL after reading lots of information, that's to be expected...
Here is the Object3D code:
public class Object3D {
public Vector3 position = new Vector3();
public Vector3 rotation = new Vector3();
public Vector3 scale = new Vector3();
public Color color = new Color();
public float[] getMVMatrix(){
// Initialize matrix with Identity
float[] mvMatrix = new float[16];
Matrix.setIdentityM(mvMatrix, 0);
// apply scale
Matrix.scaleM(mvMatrix, 0, scale.x, scale.y, scale.z);
// set rotation
Matrix.setRotateM(mvMatrix, 0, rotation.x, 1f, 0, 0);
Matrix.setRotateM(mvMatrix, 0, rotation.y, 0, 1f, 0);
Matrix.setRotateM(mvMatrix, 0, rotation.z, 0, 0, 1f);
// apply translation
Matrix.translateM(mvMatrix, 0, position.x, position.y, position.z);
return mvMatrix;
}
}
This is the Geometry class, that simplifies the use of Triangles:
public class Geometry {
// Public, to allow modifications
public ArrayList<Vector3> vertices;
public ArrayList<Face3> faces;
// Type of Geometry
public int triangleType = GLES20.GL_TRIANGLES;
[...]
public FloatBuffer getVerticesBuffer(){
if(verticesBuffer == null || verticesBufferNeedsUpdate){
/*
* Cache faces
*/
int size = vertices.size();
// (size of Vector3 list) * (3 for each object) * (4 bytes per float)
ByteBuffer bb = ByteBuffer.allocateDirect( size * 3 * 4 );
// use the device hardware's native byte order
bb.order(ByteOrder.nativeOrder());
// Get the ByteBuffer as a floatBuffer
verticesBuffer = bb.asFloatBuffer();
for(int i = 0; i < size; i++)
verticesBuffer.put(vertices.get(i).toArray());
verticesBufferNeedsUpdate = false;
}
verticesBuffer.position(0);
return verticesBuffer;
}
public ShortBuffer getFacesBuffer(){
if(facesBuffer == null || facesBufferNeedsUpdate){
/*
* Cache faces
*/
int size = faces.size();
// Log.i(TAG, "FACES Size: "+size);
// (size of Vector3 list) * (3 for each object) * (2 bytes per short)
ByteBuffer bb = ByteBuffer.allocateDirect( size * 3 * 2 );
// use the device hardware's native byte order
bb.order(ByteOrder.nativeOrder());
// Get the ByteBuffer as a floatBuffer
facesBuffer = bb.asShortBuffer();
for(int i = 0; i < size; i++)
facesBuffer.put(faces.get(i).toArray());
facesBufferNeedsUpdate = false;
}
facesBuffer.position(0);
return facesBuffer;
}
}
Also, the Mesh class, responsible for rendering Geometry objects:
public class Mesh extends Object3D{
[...]
public void draw(float[] projectionMatrix, int shaderProgram){
float[] MVMatrix = getMVMatrix();
Matrix.multiplyMM(projectionMatrix, 0, projectionMatrix, 0, MVMatrix, 0);
// Check if geometry is set
if(geometry == null){
Log.i(TAG, "Geometry is null. skiping");
return;
}
// Add program to OpenGL environment
GLES20.glUseProgram(shaderProgram);
// Get, enable and Set the position attribute
positionHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
GLES20.glEnableVertexAttribArray(positionHandle);
// Prepare the triangles coordinate data
Buffer vertexBuffer = geometry.getVerticesBuffer();
GLES20.glVertexAttribPointer(positionHandle, COORDS_PER_VERTEX,
GLES20.GL_FLOAT, false,
COORDS_PER_VERTEX*4,
vertexBuffer);
// get handle to fragment shader's vColor member
int mColorHandle = GLES20.glGetUniformLocation(shaderProgram, "vColor");
// Set color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color.toArray(), 0);
// get handle to shape's transformation matrix
int mMVPMatrixHandle = GLES20.glGetUniformLocation(shaderProgram, "uMVPMatrix");
ChwaziSurfaceView.checkGlError("glGetUniformLocation");
// Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, projectionMatrix, 0);
ChwaziSurfaceView.checkGlError("glUniformMatrix4fv");
// Draw the triangles
if(geometry.triangleType == GLES20.GL_TRIANGLES){
Buffer indexesBuffer = geometry.getFacesBuffer();
GLES20.glDrawElements(
GLES20.GL_TRIANGLES,
geometry.faces.size()*3,
GL10.GL_UNSIGNED_SHORT,
indexesBuffer);
}else{
GLES20.glDrawArrays(geometry.triangleType, 0, geometry.vertices.size());
ChwaziSurfaceView.checkGlError("glDrawArrays");
}
// Disable vertex array
GLES20.glDisableVertexAttribArray(positionHandle);
}
}
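Also worth flagging: draw() multiplies into projectionMatrix in place, so the caller's matrix is both an input and the output of multiplyMM (which the documentation warns leaves the result undefined) and it accumulates the model transform every frame. A small sketch of the same step using a scratch field (mvpMatrix is a name I made up):
private final float[] mvpMatrix = new float[16]; // scratch, reused every frame

public void draw(float[] viewProjectionMatrix, int shaderProgram){
    // MVP = (Projection * View) * Model, written to a separate array so the
    // caller's matrix is left untouched and transforms don't pile up
    Matrix.multiplyMM(mvpMatrix, 0, viewProjectionMatrix, 0, getMVMatrix(), 0);
    // ... attribute and color setup exactly as above ...
    int mMVPMatrixHandle = GLES20.glGetUniformLocation(shaderProgram, "uMVPMatrix");
    GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
    // ... draw calls unchanged ...
}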
This is the sample code I made to test whether it's working properly (just a translation):
// Inside my Renderer...
@Override
public void onDrawFrame(GL10 unused) {
// Draw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
GLES20.glCullFace(GLES20.GL_FRONT_AND_BACK);
// Set the camera position (View matrix)
Matrix.setLookAtM(mVMatrix, 0,
0, 0, -3,
0f, 0f, 0f,
0f, 1.0f, 0.0f);
// Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
// Create a rotation for the triangle
long time = SystemClock.uptimeMillis();// % 4000L;
myMesh.position.x = (time%4000)/4000f;
myMesh.draw(mMVPMatrix, shaderProgram.getProgram());
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
// this projection matrix is applied to object coordinates
Matrix.orthoM(mProjMatrix, 0, -1, 1, -1, 1, 0, 10);
}
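Note that in onSurfaceChanged the ratio variable is computed but never used, so the -1..1 ortho volume gets stretched to the screen's aspect ratio. A sketch of feeding it into orthoM (scaling left/right by the aspect is just one common choice):
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
    GLES20.glViewport(0, 0, width, height);
    float ratio = (float) width / height;
    // keep one world unit the same size along x and y
    Matrix.orthoM(mProjMatrix, 0, -ratio, ratio, -1, 1, 0, 10);
}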
EDIT
Shader code:
private final String vertexShaderCode =
// This matrix member variable provides a hook to manipulate
// the coordinates of the objects that use this vertex shader
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
// the matrix must be included as a modifier of gl_Position
" gl_Position = vPosition * uMVPMatrix;" +
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
"void main() {" +
" gl_FragColor = vColor;" +
"}";

Android: OpenGL ES2 Texture not working

UPDATE: I got rid of the line GLES20.glEnable(GLES20.GL_TEXTURE_2D); but the line GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, 256, 256, 0, GLES20.GL_RGB, GLES20.GL_BYTE, ByteBuffer.wrap(pixels)); gives GL_INVALID_ENUM... The pixel buffer length is 196608.
Project files: http://godofgod.co.uk/my_files/NightCamPrj.zip
I am trying to get camera data into an OpenGL ES2 shader. The camera stuff appears to work, but I cannot get the texture to work even when I try my own values: I get a black screen. Here is the code:
package com.matthewmitchell.nightcam;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;
import java.util.Scanner;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.content.Context;
import android.content.res.AssetManager;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
public class MyRenderer implements GLSurfaceView.Renderer{
private FloatBuffer vertices;
private FloatBuffer texcoords;
private int mProgram;
private int maPositionHandle;
private int gvTexCoordHandle;
private int gvSamplerHandle;
private static Context context;
int[] camera_texture;
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
initShapes();
GLES20.glClearColor(0.0f, 1.0f, 0.2f, 1.0f);
Debug.out("Hello init.");
//Shaders
int vertexShader = 0;
int fragmentShader = 0;
try {
vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, readFile("vertex.vsh"));
fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, readFile("fragment.fsh"));
} catch (IOException e) {
Debug.out("The shaders could not be found.");
e.printStackTrace();
}
mProgram = GLES20.glCreateProgram(); // create empty OpenGL Program
GLES20.glAttachShader(mProgram, vertexShader); // add the vertex shader to program
Debug.out("VS LOG: " + GLES20.glGetShaderInfoLog(vertexShader));
GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
Debug.out("FS LOG: " + GLES20.glGetShaderInfoLog(fragmentShader));
GLES20.glLinkProgram(mProgram); // creates OpenGL program executables
Debug.out("PROG LOG: " + GLES20.glGetProgramInfoLog(mProgram));
// get handles
maPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
gvTexCoordHandle = GLES20.glGetAttribLocation(mProgram, "a_texCoord");
gvSamplerHandle = GLES20.glGetAttribLocation(mProgram, "s_texture");
camera_texture = null;
GLES20.glEnable(GLES20.GL_TEXTURE_2D);
}
private void initShapes(){
float triangleCoords[] = {
// X, Y, Z
-1.0f, -1.0f, 0.0f,
1.0f, -1.0f, 0.0f,
-1.0f, 1.0f, 0.0f,
1.0f, 1.0f, 0.0f,
};
float texcoordf[] = {
// X, Y
-1.0f,-1.0f,
1.0f,-1.0f,
-1.0f,1.0f,
1.0f,1.0f,
}; // Even if these are the wrong way around, they should still produce a texture with these coordinates on the screen.
// initialize vertex Buffer for vertices
ByteBuffer vbb = ByteBuffer.allocateDirect(triangleCoords.length * 4);
vbb.order(ByteOrder.nativeOrder());// use the device hardware's native byte order
vertices = vbb.asFloatBuffer(); // create a floating point buffer from the ByteBuffer
vertices.put(triangleCoords); // add the coordinates to the FloatBuffer
vertices.position(0); // set the buffer to read the first coordinate
// initialize vertex Buffer for texcoords
vbb = ByteBuffer.allocateDirect(texcoordf.length * 4);
vbb.order(ByteOrder.nativeOrder());// use the device hardware's native byte order
texcoords = vbb.asFloatBuffer(); // create a floating point buffer from the ByteBuffer
texcoords.put(texcoordf); // add the coordinates to the FloatBuffer
texcoords.position(0); // set the buffer to read the first coordinate
}
private static String readFile(String path) throws IOException {
//Load file from assets folder using context given by the activity class
AssetManager assetManager = context.getAssets();
InputStream stream = assetManager.open(path);
try {
return new Scanner(stream).useDelimiter("\\A").next();
}
finally {
stream.close();
}
}
private int loadShader(int type, String shaderCode){
int shader = GLES20.glCreateShader(type);
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
public void onDrawFrame(GL10 unused) {
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
if(camera_texture == null){
return;
}
// Add program to OpenGL environment
GLES20.glUseProgram(mProgram);
// Prepare the triangle data
GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT, false, 0, vertices);
GLES20.glVertexAttribPointer(gvTexCoordHandle, 2, GLES20.GL_FLOAT, false, 0, texcoords);
GLES20.glEnableVertexAttribArray(maPositionHandle);
GLES20.glEnableVertexAttribArray(gvTexCoordHandle);
//Bind texture
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, camera_texture[0]);
GLES20.glUniform1i(gvSamplerHandle, 0);
// Draw the triangle
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
//Disable arrays
GLES20.glDisableVertexAttribArray(maPositionHandle);
GLES20.glDisableVertexAttribArray(gvTexCoordHandle);
}
public void onSurfaceChanged(GL10 unused, int width, int height) {
GLES20.glViewport(0, 0, width, height);
}
public void takeContext(Context mcontext) {
context = mcontext;
}
void bindCameraTexture(byte[] data,int w,int h) {
//Takes pixel data from camera and makes texture
byte[] pixels = new byte[256*256*3]; //Testing simple 256x256 texture. Will update for camera resolution
for(int x = 0;x < 256;x++){
for(int y = 0;y < 256;y++){
//Ignore camera data, use test values.
pixels[(x*256+y)*3] = 0;
pixels[(x*256+y)*3+1] = 100;
pixels[(x*256+y)*3+2] = 120;
}
}
//Debug.out("PX:" + pixels[0] + " " + pixels[1] + " " + pixels[2]);
//Make new texture for new data
if (camera_texture == null){
camera_texture = new int[1];
}else{
GLES20.glDeleteTextures(1, camera_texture, 0);
}
GLES20.glGenTextures(1, camera_texture, 0);
int tex = camera_texture[0];
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, 256, 256, 0, GLES20.GL_RGB, GLES20.GL_BYTE, ByteBuffer.wrap(pixels));
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
}
}
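For what it's worth, the GL_INVALID_ENUM mentioned in the update is consistent with the type argument of glTexImage2D: in OpenGL ES 2.0, GLES20.GL_BYTE is not an accepted pixel type, while GL_UNSIGNED_BYTE is the usual one for 8-bit RGB data. A sketch of the upload under that assumption (same 256x256 buffer):
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex);
// GL_UNSIGNED_BYTE is a legal type for an RGB texture; GL_BYTE is not
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, 256, 256, 0,
        GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, ByteBuffer.wrap(pixels));
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);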
Here is the vertex shader code:
attribute vec4 vPosition;
attribute vec2 a_texCoord;
varying vec2 v_texCoord;
void main(){
gl_Position = vPosition;
v_texCoord = a_texCoord;
}
Here is the fragment shader code:
precision mediump float;
varying vec2 v_texCoord;
uniform sampler2D s_texture;
void main(){
gl_FragColor = texture2D(s_texture, v_texCoord);
}
We can ignore the camera stuff here because I'm using test values: a simple 256x256 test texture. I've done everything I've seen in the examples.
Why is it black, and how can I make it show up?
I see that you're using glGetAttribLocation() to retrieve the location of s_texture. This is a uniform variable, not an attribute. Try using glGetUniformLocation() instead for this one.
I don't know if this will solve all of your problems, but it's something that needs to be done for sure.
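A minimal sketch of that change, reusing the handle names from the question:
// s_texture is declared as a uniform sampler2D in the fragment shader,
// so its location comes from glGetUniformLocation, not glGetAttribLocation
gvSamplerHandle = GLES20.glGetUniformLocation(mProgram, "s_texture");
// later, in onDrawFrame: bind texture unit 0 and point the sampler at it
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, camera_texture[0]);
GLES20.glUniform1i(gvSamplerHandle, 0);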
It isn't visible from the code you posted, but it seems to me that you're not calling bindCameraTexture from a place where there is a rendering context (you should do that in onSurfaceCreated or onSurfaceChanged).
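One common way to do that, assuming bindCameraTexture is currently called from the camera preview callback (a non-GL thread), is to queue the work onto the GL thread with GLSurfaceView.queueEvent; glSurfaceView and renderer below stand in for whatever references are available:
// in the camera preview callback
public void onPreviewFrame(final byte[] data, Camera camera) {
    glSurfaceView.queueEvent(new Runnable() {
        @Override
        public void run() {
            // now on the GL thread, where the EGL context is current
            renderer.bindCameraTexture(data, 256, 256);
        }
    });
}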
I finished a sample with the camera preview as the texture. The key differences from your code are:
I use SurfaceTexture to connect the camera preview to the texture used in OpenGL ES (see the sketch after this answer).
I use the matrix generated by SurfaceTexture to adjust the output of the camera preview; otherwise there is a black, flickering area.
I do not call the glBindTexture() explicitly on the texture used for camera preview.
Good Luck.
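A rough sketch of that approach, assuming API 11+ where SurfaceTexture is available (variable names are mine):
// 1. create a texture id and hand it to a SurfaceTexture
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);

// 2. connect the camera preview to it instead of uploading frames yourself
camera.setPreviewTexture(surfaceTexture);
camera.startPreview();

// 3. every frame, on the GL thread:
surfaceTexture.updateTexImage();              // latest camera frame into the texture
float[] texMatrix = new float[16];
surfaceTexture.getTransformMatrix(texMatrix); // pass to the shader to avoid the black flicker area
// the fragment shader then samples a samplerExternalOES bound as GL_TEXTURE_EXTERNAL_OES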
