OpenGL shader not passing correct texture coordinates - java

After testing a few things in my OpenGL application, I know that my textures are not displaying correctly because the texture coordinates are not making it from the vertex shader to the fragment shader (or at least they all arrive as (0, 0)). I just don't know why.
Here are my positions, indices and texture coordinates for a square:
private static final float[] VERTICES = {
-0.5f, 0.5f, 0f,
-0.5f, -0.5f, 0,
0.5f, -0.5f, 0f,
0.5f, 0.5f, 0f
};
private static final int[] INDICES = {
0, 1, 2,
3, 0, 2
};
private static final float[] TEXTURE_COORDINATES = {
0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f
};
Here is how I load the texture coordinates into the VAO at attribute index 1. For brevity I won't show the other parts of my VAO loader, since they are in working order.
int vboTexCoordID = GL15.glGenBuffers();
ModelManager.recordVBO(vboTexCoordID);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboTexCoordID);
FloatBuffer texCoordBuffer = BufferUtils.createFloatBuffer(texCoord.length);
verticesBuffer.put(texCoord);
verticesBuffer.flip();
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, texCoordBuffer, GL15.GL_STATIC_DRAW);
GL20.glVertexAttribPointer(1,2,GL11.GL_FLOAT, false, 0,0);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
This is my render method:
model.getShader().start(); // this calls GL20.glUseProgram()
GL30.glBindVertexArray(model.getVaoID());
GL20.glEnableVertexAttribArray(0);
GL20.glEnableVertexAttribArray(1);
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, model.getTexId());
GL11.glDrawElements(GL11.GL_TRIANGLES, model.getVerticesCount(), GL11.GL_UNSIGNED_INT,0);
GL20.glDisableVertexAttribArray(0);
GL20.glDisableVertexAttribArray(1);
GL30.glBindVertexArray(0);
model.getShader().stop();
And finally, my vertex shader and fragment shader, in that order:
#version 330 core
in vec3 position;
in vec2 texCoords;
out vec2 pass_texCoords;
void main(void){
gl_Position = vec4(position.x, position.y, position.z, 1.0);
pass_texCoords = texCoords;
}
fragment:
#version 330 core
in vec2 pass_texCoords;
out vec4 out_Color;
uniform sampler2D textureSampler;
void main(void){
out_Color = texture(textureSampler, pass_texCoords);
}

You never add data to texCoordBuffer:
FloatBuffer texCoordBuffer = BufferUtils.createFloatBuffer(texCoord.length);
verticesBuffer.put(texCoord);
verticesBuffer.flip();
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, texCoordBuffer, GL15.GL_STATIC_DRAW);
texCoordBuffer is still empty when you call glBufferData(), because you put the texture coordinates into verticesBuffer instead. You need to change this to:
FloatBuffer texCoordBuffer = BufferUtils.createFloatBuffer(texCoord.length);
texCoordBuffer.put(texCoord);
texCoordBuffer.flip();
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, texCoordBuffer, GL15.GL_STATIC_DRAW);
You also need to make sure that the locations you use for the vertex attributes match up with the vertex shader. The easiest way to do this is to use layout directives in the vertex shader code:
layout(location=0) in vec3 position;
layout(location=1) in vec2 texCoords;
This specifies that the vertex shader will read the positions from attribute 0, and the texture coordinates from attribute 1.
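Applied to the vertex shader from the question, the declarations would look like this (the body of the shader stays the same):
#version 330 core
layout(location = 0) in vec3 position;   // matches attribute index 0 (positions)
layout(location = 1) in vec2 texCoords;  // matches attribute index 1 (texture coordinates)
out vec2 pass_texCoords;
void main(void){
gl_Position = vec4(position.x, position.y, position.z, 1.0);
pass_texCoords = texCoords;
}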

Related

Draw triangle VBO with and without shader

I am trying to draw a simple red triangle on screen. Without using a VBO the code works as intended. When trying to draw it using a VBO (with or without a shader), it simply has no effect.
My code:
//Works
glBegin(GL_TRIANGLES);
glVertex3f(0f, 0f, 0.0f);
glVertex3f(0.5f, 0f, 0.0f);
glVertex3f(0.5f, 0.5f, 0.0f);
glEnd();
//Does not work
int vertexArrayID = glGenVertexArrays();
glBindVertexArray(vertexArrayID);
float[] g_vertex_buffer_data = new float[]{
0f, 0f, 0.0f,
0.5f, 00f, 0.0f,
0.5f, 0.5f, 0.0f,
};
int vertexbuffer = glGenBuffers();
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, FloatBuffer.wrap(g_vertex_buffer_data), GL_STATIC_DRAW);
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glVertexAttribPointer(
0, // attribute 0. No particular reason for 0, but must match the layout in the shader.
3, // size
GL_FLOAT, // type
false, // normalized?
0, // stride
0 // array buffer offset
);
glDrawArrays(GL_TRIANGLES, 0, 3); // 3 indices starting at 0 -> 1 triangle
glDisableVertexAttribArray(0);
glDeleteBuffers(vertexbuffer);
glDeleteVertexArrays(vertexArrayID);
System.out.println(glGetError());
glGetError() always returns 0.
I use a default, tutorial-copied shader to test my code (I link and bind the program before running the above code):
#version 330 core
// Input vertex data, different for all executions of this shader.
layout(location = 0) in vec3 vertexPosition_modelspace;
void main(){
gl_Position.xyz = vertexPosition_modelspace;
gl_Position.w = 1.0;
}
#version 330 core
// Output data
out vec3 color;
void main()
{
// Output color = red
color = vec3(1,0,0);
}

Lighting a textured object in OpenGL 2.0+

I've been developing a cube program that provides a number of cubes with desired qualities. However, whenever I try to light a textured cube, my cube becomes very dark. The lighting works well with a non-textured cube, and a simple textured cube without lighting also works, so I'm led to believe each part is done properly on its own. There doesn't seem to be significant documentation on how to solve this in OpenGL 2.0+, but there are a few things pertaining to older versions.
The following link offers information as to why my cube is behaving as it is, but I'm having trouble translating the solution to a newer version, especially within my shader code, where I'm unsure if further changes should occur. I am using Android Studio 2.1.3, in case that or its bundled emulators could affect the desired result. If anyone could offer any advice, I'd greatly appreciate it. I have a separate (large) renderer that calls for the Cube to be drawn; let me know if that code would be helpful in addition to my Cube. Below is my Cube:
public class TexturedLightCube {
/** Cube vertices */
private static final float VERTICES[] = {
-0.3f, -0.3f, -0.3f, //top front right
0.3f, -0.3f, -0.3f, //bottom front right
0.3f, 0.3f, -0.3f, //bottom front left
-0.3f, 0.3f, -0.3f, //top front left
-0.3f, -0.3f, 0.3f, //top back right
0.3f, -0.3f, 0.3f, //bottom back right
0.3f, 0.3f, 0.3f, //bottom back left
-0.3f, 0.3f, 0.3f // top back left
};
/** Vertex colors. */
private static final float COLORS[] = {
0.0f, 1.0f, 1.0f, 1.0f,
1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 1.0f, 0.0f, 1.0f,
0.0f, 1.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f,
1.0f, 1.0f, 1.0f, 1.0f,
0.0f, 1.0f, 1.0f, 1.0f,
};
/** Order to draw vertices as triangles. */
private static final byte INDICES[] = {
0, 1, 3, 3, 1, 2, // Front face.
0, 1, 4, 4, 5, 1, // Bottom face.
1, 2, 5, 5, 6, 2, // Right face.
2, 3, 6, 6, 7, 3, // Top face.
3, 7, 4, 4, 3, 0, // Left face.
4, 5, 7, 7, 6, 5, // Rear face.
};
private static final float TEXTURECOORDS[] =
{
0.0f, 1.0f, //left-bottom
0.0f, 0.0f, //right bottom
1.0f, 0.0f, //left top
1.0f, 1.0f, //right top
0.0f, 1.0f, //left-bottom
0.0f, 0.0f, //right bottom
1.0f, 0.0f, //left top
1.0f, 1.0f, //right top
};
private static final float NORMALS[] = {
//set all normals to all light for testing
1.0f, 1.0f, 1.0f, //top front right
1.0f, 0.0f, 1.0f, //bottom front right
0.0f, 0.0f, 1.0f, //bottom front left
0.0f, 1.0f, 1.0f, //top front left
1.0f, 1.0f, 0.0f, //top back right
1.0f, 0.0f, 0.0f, //bottom back right
0.0f, 0.0f, 0.0f, //bottom back left
0.0f, 1.0f, 0.0f //top back left
};
static final int COORDS_PER_VERTEX = 3;
private static final int VALUES_PER_COLOR = 4;
/** Vertex size in bytes. */
final int VERTEX_STRIDE = COORDS_PER_VERTEX * 4;
/** Color size in bytes. */
private final int COLOR_STRIDE = VALUES_PER_COLOR * 4;
/** Shader code for the vertex. */
private static final String VERTEX_SHADER_CODE =
"uniform mat4 uMVPMatrix;" +
"uniform mat4 uMVMatrix;" +
"uniform vec3 u_LightPos;" +
"attribute vec4 vPosition;" +
"attribute vec4 a_Color;" +
"attribute vec3 a_Normal;" +
"varying vec4 v_Color;" +
"attribute vec2 a_TexCoordinate;" +
"varying vec2 v_TexCoordinate;" +
"void main() {" +
"vec3 modelViewVertex = vec3(uMVMatrix * vPosition);"+
"vec3 modelViewNormal = vec3(uMVMatrix * vec4(a_Normal, 0.0));" +
"float distance = length(u_LightPos - modelViewVertex);" +
"vec3 lightVector = normalize(u_LightPos - modelViewVertex);" +
"float diffuse = max(dot(modelViewNormal, lightVector), 0.1);" +
"diffuse = diffuse * (1.0/(1.0 + (0.00000000000002 * distance * distance)));" + //attenuation factor
"v_Color = a_Color * a_Color * diffuse;" +
"gl_Position = uMVPMatrix * vPosition;" +
"v_TexCoordinate = a_TexCoordinate;" +
"}";
/** Shader code for the fragment. */
private static final String FRAGMENT_SHADER_CODE =
"precision mediump float;" +
"varying vec4 v_Color;" +
"uniform sampler2D u_Texture;"+ //The input texture
"varying vec2 v_TexCoordinate;" +
"void main() {" +
" gl_FragColor = v_Color * texture2D(u_Texture, v_TexCoordinate) ;" + //still works with just color
"}";
private int mTextureUniformHandle; //Pass in texture.
private int mTextureCoordinateHandle; //Pass in model texture coordinate information.
private final int mTextureCoordinateDataSize = 2; //Size of texture coordinate data in elements
public static int mTextureDataHandle; //Handle to texturedata;
private final FloatBuffer mTextureBuffer; //Store model data in float buffer.
private final FloatBuffer mVertexBuffer;
private final FloatBuffer mColorBuffer;
private final FloatBuffer mNormalBuffer;
private final ByteBuffer mIndexBuffer;
private final int mProgram;
private final int mPositionHandle;
private final int mColorHandle;
private final int mMVPMatrixHandle;
private final int mNormalHandle;
public static int mLightPosHandle;
public final int mMVMatrixHandle;
public static int loadTexture(final Context context, final int resourceId) {
//Get the texture from the Android resource directory
final int[] textureHandle = new int[1];
InputStream is = context.getResources().openRawResource(+ R.drawable.teneighty);
Bitmap bitmap = null;
try {
//BitmapFactory is an Android graphics utility for images
bitmap = BitmapFactory.decodeStream(is);
} finally {
//Always clear and close
try {
is.close();
is = null;
} catch (IOException e) {
}
}
//Generate one texture pointer...
GLES20.glGenTextures(1, textureHandle, 0);
//and bind it to our array.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
//Create Nearest Filtered Texture.
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
//Accounting for different texture parameters.
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
//Use the Android GLUtils to specify a two-dimensional texture image from our bitmap.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
//Clean up
bitmap.recycle();
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture");
}
return textureHandle[0];
}
public TexturedLightCube() {
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(VERTICES.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
mVertexBuffer = byteBuffer.asFloatBuffer();
mVertexBuffer.put(VERTICES);
mVertexBuffer.position(0);
byteBuffer = ByteBuffer.allocateDirect(COLORS.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
mColorBuffer = byteBuffer.asFloatBuffer();
mColorBuffer.put(COLORS);
mColorBuffer.position(0);
byteBuffer = ByteBuffer.allocateDirect(NORMALS.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
mNormalBuffer = byteBuffer.asFloatBuffer();
mNormalBuffer.put(NORMALS);
mNormalBuffer.position(0);
byteBuffer = ByteBuffer.allocateDirect(TEXTURECOORDS.length * 4);
byteBuffer.order(ByteOrder.nativeOrder());
mTextureBuffer = byteBuffer.asFloatBuffer();
mTextureBuffer.put(TEXTURECOORDS);
mTextureBuffer.position(0);
mIndexBuffer = ByteBuffer.allocateDirect(INDICES.length);
mIndexBuffer.put(INDICES);
mIndexBuffer.position(0);
mProgram = GLES20.glCreateProgram();
GLES20.glAttachShader(mProgram, loadShader(GLES20.GL_VERTEX_SHADER, VERTEX_SHADER_CODE));
GLES20.glAttachShader(mProgram, loadShader(GLES20.GL_FRAGMENT_SHADER, FRAGMENT_SHADER_CODE));
GLES20.glLinkProgram(mProgram);
mTextureDataHandle = GLES20.glGetUniformLocation(mProgram, "u_Texture");
mTextureCoordinateHandle = GLES20.glGetAttribLocation(mProgram, "a_TexCoordinate");
mTextureUniformHandle = GLES20.glGetUniformLocation(mProgram, "u_Texture");
mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
mMVMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVMatrix");
mLightPosHandle = GLES20.glGetUniformLocation(mProgram, "u_LightPos");
mNormalHandle = GLES20.glGetAttribLocation(mProgram, "a_Normal");
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
mColorHandle = GLES20.glGetAttribLocation(mProgram, "a_Color");
}
/**
* Encapsulates the OpenGL ES instructions for drawing this shape.
*
* @param mvpMatrix The Model-View-Projection matrix in which to draw this shape
*/
public void draw(float[] mvpMatrix) {
// Add program to OpenGL environment.
GLES20.glUseProgram(mProgram);
//set active texture unit to texture unit 0.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle);
// Prepare the cube coordinate data.
GLES20.glEnableVertexAttribArray(mPositionHandle);
GLES20.glVertexAttribPointer(mPositionHandle, 3, GLES20.GL_FLOAT, false, VERTEX_STRIDE, mVertexBuffer);
// Prepare the cube color data.
GLES20.glEnableVertexAttribArray(mColorHandle);
GLES20.glVertexAttribPointer(mColorHandle, 4, GLES20.GL_FLOAT, false, COLOR_STRIDE, mColorBuffer);
//Will have the same size as Vertex as we are implementing per vertex lighting
GLES20.glEnableVertexAttribArray(mNormalHandle);
GLES20.glVertexAttribPointer(mNormalHandle, 3, GLES20.GL_FLOAT, false, VERTEX_STRIDE, mNormalBuffer);
// Prepare the cube texture data.
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
//Pass texture coordinate information.
GLES20.glVertexAttribPointer(mTextureCoordinateHandle,4, GLES20.GL_FLOAT, false, mTextureCoordinateDataSize, mTextureBuffer);
// Apply the projection and view transformation.
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
GLES20.glUniform3f(LightCube.mLightPosHandle, MyGLRenderer.mLightPosInEyeSpace[0], MyGLRenderer.mLightPosInEyeSpace[1], MyGLRenderer.mLightPosInEyeSpace[2]);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glUniform1i(mTextureUniformHandle, 0);
// Draw the cube.
GLES20.glDrawElements(GLES20.GL_TRIANGLES, INDICES.length, GLES20.GL_UNSIGNED_BYTE, mIndexBuffer); //-removed indices-
// Disable vertex arrays.
GLES20.glDisableVertexAttribArray(mPositionHandle);
GLES20.glDisableVertexAttribArray(mTextureCoordinateHandle);
GLES20.glDisableVertexAttribArray(mColorHandle);
GLES20.glDisableVertexAttribArray(mNormalHandle);
}
/** Loads the provided shader in the program. */
private static int loadShader(int type, String shaderCode){
int shader = GLES20.glCreateShader(type);
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
You're missing an ambient component to your lighting, which emulates second order (and higher) reflections you would get in real life, but can't get directly in a rasterizer.
Not sure why you are squaring a_Color in your vertex shader (v_Color = a_Color * a_Color * diffuse). This will definitely make things darker, because all values are between 0 and 1; e.g. 0.1^2 == 0.01.
Remember that your dot product might be negative, so you want to clamp out negative diffuse components (e.g. no light intensity on surfaces which are facing away from the light).
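As a rough sketch of how those points could look inside the VERTEX_SHADER_CODE string above (these would replace the corresponding lines; the 0.3 ambient strength and the attenuation constant are assumed values to tune, not taken from the question):
"float diffuse = max(dot(modelViewNormal, lightVector), 0.0);" +    // clamp away negative diffuse
"diffuse = diffuse * (1.0 / (1.0 + 0.05 * distance * distance));" + // a less extreme attenuation factor
"float ambient = 0.3;" +                                            // assumed ambient strength
"v_Color = a_Color * min(diffuse + ambient, 1.0);" +                // multiply a_Color once, not twice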

LWJGL - Quad not being rendered

I've been trying to make this piece of code work for a while now and I still can't figure out what I did wrong. (LWJGL - Java)
I have compared it against other people's code on the web, but I can't find any major difference. I actually learned OpenGL with C++, so my mind might be stuck on that, which might be why I can't find my errors.
Here is the init code (called once):
FloatBuffer vertices = BufferUtils.createFloatBuffer(4 * 5);
vertices.put(new float[]{
// pos // Color
0.5f, 0.5f, 0.5f, 0.0f, 0.5f,
0.5f, -0.5f, 0.5f, 0.0f, 0.75f,
-0.5f, -0.5f, 0.0f, 1.0f, 0.0f,
-0.5f, 0.5f, 0.5f, 0.5f, 1.0f
});
vertices.flip();
ByteBuffer indices = BufferUtils.createByteBuffer(2 * 3);
indices.put(new byte[]{
0, 1, 3,
1, 2, 3
});
indices.flip();
// VAO
VAO = GL30.glGenVertexArrays();
GL30.glBindVertexArray(VAO);
// VBO
VBO = glGenBuffers();
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, vertices, GL_STATIC_DRAW);
// EBO
EBO = glGenBuffers();
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, EBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, indices, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glEnableVertexAttribArray(0);
// v - position in layout (see shader)
// v - Nb of component per vertex (2 for 2D (x, y))
// v - Normalized ? (between 0 - 1)
// v - Offset between things (size of a line)
// v - Where to start ?
glVertexAttribPointer(0, 2, GL11.GL_FLOAT, false, 5 * Float.SIZE , 0);
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 3, GL11.GL_FLOAT, false, 5 * Float.SIZE , 2 * Float.SIZE);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, EBO);
// Unbinds the VAO
GL30.glBindVertexArray(0);
glDisableVertexAttribArray(0);
glDisableVertexAttribArray(1);
And here is the render function:
shaderProgram.bind();
GL30.glBindVertexArray(VAO);
GL11.glDrawElements(GL11.GL_TRIANGLES, 6, GL11.GL_BYTE, 0);
GL30.glBindVertexArray(0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
Shaders:
Vertex:
#version 330 core
layout(location = 0) in vec2 position;
layout(location = 1) in vec3 color;
out vec4 Color;
void main()
{
gl_Position = vec4(position, 0.0, 1.0);
Color = vec4(color, 1.0);
}
Fragment:
#version 330 core
in vec4 Color;
out vec4 color;
void main()
{
color = Color;
}
In the official Java documentation, Float.SIZE is defined as:
The number of bits used to represent a float value.
Since glVertexAttribPointer() expects the stride and offset arguments in bytes, you will have to divide this by 8, and use (Float.SIZE / 8) instead.
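A minimal sketch of the corrected calls, assuming the same interleaved layout as in the question (two position floats followed by three color floats per vertex); Float.BYTES, available since Java 8, is equivalent to Float.SIZE / 8:
int stride = 5 * Float.BYTES; // 20 bytes per vertex, not 160 bits
glVertexAttribPointer(0, 2, GL11.GL_FLOAT, false, stride, 0);               // position (x, y)
glVertexAttribPointer(1, 3, GL11.GL_FLOAT, false, stride, 2 * Float.BYTES); // color (r, g, b)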

OpenGL es 2.0 colors

I drew a cube in OpenGL ES 2.0.
Right now it has just two faces, for testing purposes (the front and the back). So basically there are two planes in space, both with the same color.
Now I want to apply a different color to each face. I thought that expanding the color array would be sufficient, but the colors are not changing (there's just the original color).
Do I have to change the shader? Or pass a specific function to the draw method?
The class should explain it better:
public class Cube {
private FloatBuffer mVer;
private FloatBuffer colMem;
private ShortBuffer ordVer;
private float vertici[] = {
-0.2f, 0.2f, 0.2f, //p1 upper left front plane (0)
-0.2f, -0.2f, 0.2f, //p2 lower left front plane (1)
0.2f, -0.2f, 0.2f, //p3 lower right front plane (2)
0.2f, 0.2f, 0.2f, //p4 upper right front plane (3)
-0.2f, 0.2f, -0.2f, //p1 upper left front plane (4)
-0.2f, -0.2f, -0.2f, //p2 lower left front plane (5)
0.2f, -0.2f, -0.2f, //p3 lower right front plane (6)
0.2f, 0.2f, -0.2f, //p4 upper right front plane (7)
};
private short order[] = {
0, 1, 2, 0, 2, 3, //front face
7, 6, 5, 7, 5, 4, //back face
//3, 2, 6, 3, 6, 7, //right face
// 0, 1, 5, 0, 5, 4 //left face
};
private float color [] = {
0.8f, 0.8f, 0.1f, 1.0f,//color1
0.8f, 0.8f, 0.1f, 1.0f,
0.8f, 0.8f, 0.1f, 1.0f,
0.8f, 0.8f, 0.1f, 1.0f,
0.1f, 0.2f, 0.5f, 1.0f,//color2
0.1f, 0.2f, 0.5f, 1.0f,
0.1f, 0.2f, 0.5f, 1.0f,
0.1f, 0.2f, 0.5f, 1.0f
};
private final String vertCode =
"uniform mat4 uMVPMatrix;"+
"attribute vec4 vPosition;"+
"void main() {"+
"gl_Position = uMVPMatrix * vPosition;"+
"}";
private final String fragCode =
"precision mediump float;"+
"uniform vec4 vColor;"+
"void main() {"+
"gl_FragColor = vColor;"+
"}";
private int prog;
private int pos;
private int col;
private int mHandle;
public Cube () {
mVer = ByteBuffer.allocateDirect(vertici.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
mVer.put(vertici).position(0);
ordVer = ByteBuffer.allocateDirect(order.length * 2).order(ByteOrder.nativeOrder()).asShortBuffer();
ordVer.put(order).position(0);
colMem = ByteBuffer.allocateDirect(color.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
colMem.put(color).position(0);
int vertexShader = Render.loadShader (GLES20.GL_VERTEX_SHADER, vertCode);
int fragmentShader = Render.loadShader (GLES20.GL_FRAGMENT_SHADER, fragCode);
prog = GLES20.glCreateProgram();
GLES20.glAttachShader(prog, vertexShader);
GLES20.glAttachShader(prog, fragmentShader);
GLES20.glLinkProgram(prog);
}
public void draw (float[] mVMatrix) {
GLES20.glUseProgram(prog);
pos = GLES20.glGetAttribLocation(prog, "vPosition");
GLES20.glEnableVertexAttribArray(pos);
GLES20.glVertexAttribPointer(pos, 3, GLES20.GL_FLOAT, false, 12, mVer);
col = GLES20.glGetUniformLocation(prog, "vColor");
GLES20.glUniform4fv(col, 1, color, 0);
mHandle = GLES20.glGetUniformLocation(prog, "uMVPMatrix");
GLES20.glUniformMatrix4fv(mHandle, 1, false, mVMatrix, 0);
GLES20.glEnable(GLES20.GL_CULL_FACE);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, order.length, GLES20.GL_UNSIGNED_SHORT, ordVer);
GLES20.glDisableVertexAttribArray(pos);
}
}
You have a few options:
If you want to pass in the color as a uniform, which seems to be where you were headed, you need to draw each face with a separate draw call. You can't just pass in an array of colors for the uniform and expect the colors to be applied to the triangles in order. You would call glUniform4fv with the first color, call glDrawElements with just the indices of the first face, and then repeat these two calls for each face. This is fairly inefficient.
You can make the colors an attribute instead of a uniform, very similar to what you do for the vertex positions. You have to be careful when using this approach because you need an OpenGL vertex for each combination of position and color. For a cube, you typically end up with 24 vertices. You should be able to find details if you search for older questions about similar topics.
There's another method called "instanced rendering" that could be applied, but that is only available in ES 3.0.
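For example, here is a rough sketch of the first option applied to the draw() method above; it assumes a hypothetical faceColors array holding one RGBA value (4 floats) per face, which is not in the original code:
// one uniform upload and one draw call per face (6 indices each)
for (int face = 0; face < order.length / 6; face++) {
    GLES20.glUniform4fv(col, 1, faceColors, face * 4); // faceColors is hypothetical: 4 floats per face
    ordVer.position(face * 6);                         // jump to this face's indices
    GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, ordVer);
}
ordVer.position(0);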

Animating Multiple sprites in Android OpenGL ES 2.0

I've spent days searching, trying tutorials, and not actually getting results in this, so here I am.
I'm trying, simply put, to animate a collection of objects (Android Studio) on the screen, in 2D, with each having independent movement and rotation. However, when I try this, I'm either not getting the object rendered at all, or it renders skewed (as if rotated about the vertical Y axis).
I know the importance of the order in which objects are drawn (to give a correct Z-ordering appearance); however, I'm at a bit of a loss with the matrix manipulation.
Here is what I have so far:
Main Activity - standard stuff
private GLSurfaceView mGLSurfaceView;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mGLSurfaceView = new GLSurfaceView(this);
//check if device supports ES 2.0
final ActivityManager activityManager = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
final ConfigurationInfo configurationInfo = activityManager.getDeviceConfigurationInfo();
final boolean supportsEs2 = configurationInfo.reqGlEsVersion >= 0x20000;
if (supportsEs2) {
//Get the ES2 compatible context
mGLSurfaceView.setEGLContextClientVersion(2);
//set renderer to my renderer below
mGLSurfaceView.setRenderer(new MyGL20Renderer(this));
} else {
//no support
return;
}
//setContentView(R.layout.activity_main);
setContentView(mGLSurfaceView);
}
GL20Renderer class - Notice I'm now just manually adding 2 objects to my collection to render
public class MyGL20Renderer implements GLSurfaceView.Renderer
{
private final Context mActivityContext;
//Matrix Initializations
private final float[] mMVPMatrix = new float[16];
private final float[] mProjMatrix = new float[16];
private final float[] mVMatrix = new float[16];
private float[] mRotationMatrix = new float[16];
private final float[] mRotateMatrix = new float[16];
private final float[] mMoveMatrix = new float[16];
private final float[] mTempMatrix = new float[16];
private final float[] mModelMatrix = new float[16];
private int numObjects = 2;
private ArrayList<Sprite> spriteList = new ArrayList<Sprite>();
//Declare as volatile because we are updating it from another thread
public volatile float mAngle;
//private Triangle triangle;
//private Sprite sprite;
public MyGL20Renderer(final Context activityContext)
{
mActivityContext = activityContext;
}
public void onSurfaceCreated(GL10 unused, EGLConfig config)
{
//Set the background frame color
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
//Set the camera position (View Matrix) //mtx, offset, eyex,y,z, centrex,y,z, upx,y,z
Matrix.setLookAtM(mVMatrix, 0,
0, 0, -1.5f, //Eye XYZ - position eye behind the origin
0f, 0f, -5.0f, //Look XYZ - We are looking toward the distance
0f, 1.0f, 0.0f); //Up XYZ - Up vector - where head would be pointing if holding the camera
//Initialize Shapes
//triangle = new Triangle();
//sprite = new Sprite(mActivityContext);
//Sprite newSprite;
float xMax = 2.0f;
float yMax = 2.0f;
//rand = 0->1
float newX = (new Random().nextFloat() * xMax * 2) - xMax; //2.0f; //-2 -> +2
float newY = (new Random().nextFloat() * yMax * 2) - yMax; //-3 -> +3
float newZ = 0f;
//for (int i=0; i<numObjects; i++) {
//newSprite = new Sprite(mActivityContext);
//spriteList.add(new Sprite(mActivityContext, newX, newY, newZ));
//}
spriteList.add(new Sprite(mActivityContext, -0.0f, -0.0f, 0.0f));
spriteList.add(new Sprite(mActivityContext, +0.5f, -0.5f, 0.0f));
//spriteList.add(new Sprite(mActivityContext, -1.0f, +1.0f, 0.0f));
//spriteList.add(new Sprite(mActivityContext, +1.0f, +1.0f, 0.0f));
}
public void onDrawFrame(GL10 unused)
{
//init
Sprite currSprite;
//Redraw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
//timing
float jFactor = 0.1f;
long time = SystemClock.uptimeMillis() % 10000L;
float angleInDegrees = (360.0f / 1000.0f) * ((int) time) * jFactor;
/*
//number 1
//Matrix.setIdentityM(mModelMatrix, 0);
//currSprite = spriteList.get(0);
//Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//currSprite.Draw(mModelMatrix);
//number 2
Matrix.setIdentityM(mModelMatrix, 0);
currSprite = spriteList.get(1);
Matrix.translateM(mModelMatrix, 0, 0.0f, -0.1f, 0.0f);
//Matrix.rotateM(mModelMatrix, 0, 90.0f, 1.0f, 0.0f, 0.0f);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
currSprite.Draw(mModelMatrix);
//Matrix.translateM(mModelMatrix, 0, 0, 0, 4.0f);
*/
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
//zoom out a bit?
Matrix.translateM(mMVPMatrix, 0, 0, 0, 4.0f);
//number 1
//currSprite = spriteList.get(0);
//Matrix.setIdentityM(mMVPMatrix, 0);
//Matrix.rotateM(mMVPMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
//Matrix.translateM(mMVPMatrix, 0, currSprite.coordX, 0.0f, 0.0f);
//currSprite.coordX += 0.01f;
//currSprite.Draw(mMVPMatrix);
//number 2
currSprite = spriteList.get(0);
Matrix.setIdentityM(mMVPMatrix, 0);
Matrix.translateM(mMVPMatrix, 0, 0.0f, 0.0f, 0.0f);
Matrix.rotateM(mMVPMatrix, 0, angleInDegrees, 0.0f, 0.0f, +1.0f);
//float[] mTempMatrix = new float[16];
//mTempMatrix = mModelMatrix.clone();
//Matrix.multiplyMM(mMVPMatrix, 0, mMVPMatrix, 0, mRotateMatrix, 0);
//mTempMatrix = mMVPMatrix.clone();
//Matrix.multiplyMM(mMVPMatrix, 0, mTempMatrix, 0, mModelMatrix, 0);
//Matrix.setIdentityM(mMVPMatrix, 0);
currSprite.Draw(mMVPMatrix);
/*
//Set the camera position (View Matrix) //mtx, offset, eyex,y,z, centrex,y,z, upx,y,z
Matrix.setLookAtM(mVMatrix, 0,
0, 0, -10,
0f, 0f, 0f,
0f, 1.0f, 0.0f);
//Calculate the projection and view transformation
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);
//zoom out a bit?
Matrix.translateM(mMVPMatrix, 0, 0, 0, 4.0f);
for (int i=0; i<numObjects; i++) {
//Create a rotation transformation for the triangle
//Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, -1.0f);
Matrix.setRotateM(mRotationMatrix, 0, 0, 0, 0, -1.0f); //-1.0 = Z, for some reason need this. Grr
//Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0);
//Draw Shape
//triangle.Draw(mMVPMatrix);
//sprite.Draw(mMVPMatrix);
currSprite = spriteList.get(i);
//Move the object to the passed initial coordinates?
//Matrix.translateM(mMVPMatrix, 0, currSprite.coordX, currSprite.coordY, currSprite.coordZ);
currSprite.Draw(mMVPMatrix);
}
*/
}
public void onSurfaceChanged(GL10 unused, int width, int height)
{
GLES20.glViewport(0, 0, width, height);
if (height == 0) {
height = 1; //incase of div 0 errors
}
float ratio = (float) width / height;
final float left = -ratio;
final float right = ratio;
final float bottom = -1.0f;
final float top = 1.0f;
final float near = 1.0f;
final float far = 10.f;
//This Projection Matrix is applied to object coordinates in the onDrawFrame() method
//Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
Matrix.frustumM(mProjMatrix, 0, left, right, bottom, top, near, far);
}
public static int loadShader(int type, String shaderCode)
{
//Create a Vertex Shader Type Or a Fragment Shader Type (GLES20.GL_VERTEX_SHADER OR GLES20.GL_FRAGMENT_SHADER)
int shader = GLES20.glCreateShader(type);
//Add The Source Code and Compile it
GLES20.glShaderSource(shader, shaderCode);
GLES20.glCompileShader(shader);
return shader;
}
}
Please excuse the commented-out code in onDrawFrame() where I've been experimenting, and failing.
Sprite Class
public class Sprite
{
//Reference to Activity Context
private final Context mActivityContext;
//Added for Textures
private final FloatBuffer mCubeTextureCoordinates;
private int mTextureUniformHandle;
private int mTextureCoordinateHandle;
private final int mTextureCoordinateDataSize = 2;
private int mTextureDataHandle;
private final String vertexShaderCode =
//Test
"attribute vec2 a_TexCoordinate;" +
"varying vec2 v_TexCoordinate;" +
//End Test
"uniform mat4 uMVPMatrix;" +
"attribute vec4 vPosition;" +
"void main() {" +
" gl_Position = vPosition * uMVPMatrix;" +
//Test
"v_TexCoordinate = a_TexCoordinate;" +
//End Test
"}";
private final String fragmentShaderCode =
"precision mediump float;" +
"uniform vec4 vColor;" +
//Test
"uniform sampler2D u_Texture;" +
"varying vec2 v_TexCoordinate;" +
//End Test
"void main() {" +
//"gl_FragColor = vColor;" +
"gl_FragColor = (vColor * texture2D(u_Texture, v_TexCoordinate));" +
"}";
private final int shaderProgram;
private final FloatBuffer vertexBuffer;
private final ShortBuffer drawListBuffer;
private int mPositionHandle;
private int mColorHandle;
private int mMVPMatrixHandle;
public float coordX;
public float coordY;
//public float coordZ;
// number of coordinates per vertex in this array
static final int COORDS_PER_VERTEX = 2;
static float spriteCoords[] = { -0.5f, 0.5f, // top left
-0.5f, -0.5f, // bottom left
0.5f, -0.5f, // bottom right
0.5f, 0.5f }; //top right
private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; //Order to draw vertices
private final int vertexStride = COORDS_PER_VERTEX * 4; //Bytes per vertex
// Set color with red, green, blue and alpha (opacity) values
//float color[] = { 0.63671875f, 0.76953125f, 0.22265625f, 1.0f };
float color[] = { 1f, 1f, 1f, 1.0f };
public Sprite(final Context activityContext, float initX, float initY, float initZ)
{
mActivityContext = activityContext;
this.coordX = initX;
this.coordY = initY;
//this.coordZ = initZ;
//ergh - will do manually for now. Paxo n00b
//just a 2D array, no need for Z nonsense
for (int i=0; i<spriteCoords.length; i++) {
spriteCoords[i] -= (i%2==0) ? coordX : coordY; //- works better than +
}
//float newPosMatrix[] = { initX, initY, 0f };
//adjust the vector coords accordingly
//Matrix.multiplyMV(spriteCoords, 0, newPosMatrix, 0, spriteCoords, 0);
//Initialize Vertex Byte Buffer for Shape Coordinates / # of coordinate values * 4 bytes per float
ByteBuffer bb = ByteBuffer.allocateDirect(spriteCoords.length * 4);
//Use the Device's Native Byte Order
bb.order(ByteOrder.nativeOrder());
//Create a floating point buffer from the ByteBuffer
vertexBuffer = bb.asFloatBuffer();
//Add the coordinates to the FloatBuffer
vertexBuffer.put(spriteCoords);
//Set the Buffer to Read the first coordinate
vertexBuffer.position(0);
// S, T (or X, Y)
// Texture coordinate data.
// Because images have a Y axis pointing downward (values increase as you move down the image) while
// OpenGL has a Y axis pointing upward, we adjust for that here by flipping the Y axis.
// What's more is that the texture coordinates are the same for every face.
final float[] cubeTextureCoordinateData =
{
//Front face
/*0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f*/
/*-0.5f, 0.5f,
-0.5f, -0.5f,
0.5f, -0.5f,
0.5f, 0.5f*/
0f, 1f,
0f, 0f,
1f, 0f,
1f, 1f
};
mCubeTextureCoordinates = ByteBuffer.allocateDirect(cubeTextureCoordinateData.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
mCubeTextureCoordinates.put(cubeTextureCoordinateData).position(0);
//Initialize byte buffer for the draw list
ByteBuffer dlb = ByteBuffer.allocateDirect(spriteCoords.length * 2);
dlb.order(ByteOrder.nativeOrder());
drawListBuffer = dlb.asShortBuffer();
drawListBuffer.put(drawOrder);
drawListBuffer.position(0);
int vertexShader = MyGL20Renderer.loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = MyGL20Renderer.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
shaderProgram = GLES20.glCreateProgram();
GLES20.glAttachShader(shaderProgram, vertexShader);
GLES20.glAttachShader(shaderProgram, fragmentShader);
//Texture Code
GLES20.glBindAttribLocation(shaderProgram, 0, "a_TexCoordinate");
GLES20.glLinkProgram(shaderProgram);
//Load the texture
mTextureDataHandle = loadTexture(mActivityContext, R.drawable.cube);
}
public void Draw(float[] mvpMatrix)
{
//Add program to OpenGL ES Environment
GLES20.glUseProgram(shaderProgram);
//Get handle to vertex shader's vPosition member
mPositionHandle = GLES20.glGetAttribLocation(shaderProgram, "vPosition");
//Enable a handle to the triangle vertices
GLES20.glEnableVertexAttribArray(mPositionHandle);
//Prepare the triangle coordinate data
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
//Get Handle to Fragment Shader's vColor member
mColorHandle = GLES20.glGetUniformLocation(shaderProgram, "vColor");
//Set the Color for drawing the triangle
GLES20.glUniform4fv(mColorHandle, 1, color, 0);
//Set Texture Handles and bind Texture
mTextureUniformHandle = GLES20.glGetAttribLocation(shaderProgram, "u_Texture");
mTextureCoordinateHandle = GLES20.glGetAttribLocation(shaderProgram, "a_TexCoordinate");
//Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
//Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureDataHandle);
//Tell the texture uniform sampler to use this texture in the shader by binding to texture unit 0.
GLES20.glUniform1i(mTextureUniformHandle, 0);
//Pass in the texture coordinate information
mCubeTextureCoordinates.position(0);
GLES20.glVertexAttribPointer(mTextureCoordinateHandle, mTextureCoordinateDataSize, GLES20.GL_FLOAT, false, 0, mCubeTextureCoordinates);
GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);
//Get Handle to Shape's Transformation Matrix
mMVPMatrixHandle = GLES20.glGetUniformLocation(shaderProgram, "uMVPMatrix");
//Apply the projection and view transformation
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
//glTranslatef(0f, 0f, 0f);
//Draw the triangle
GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
//Disable Vertex Array
GLES20.glDisableVertexAttribArray(mPositionHandle);
}
public static int loadTexture(final Context context, final int resourceId)
{
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // No pre-scaling
// Read in the resource
final Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
// Recycle the bitmap, since its data has been loaded into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
}
Now, I don't know if I'm going about this the right way at all, but I simply want to just animate the collection of Sprite objects in spriteList.
More specifically, I want a collection of 3 objects that respond to screen touches and animate to the touched location (but that will come later).
Initially, I just want to be able to correctly render these objects (with their initial locations) and then rotate them about their centre point (around the Z axis).
For some reason, translateM is warping the texture (as if about the Y axis) rather than actually moving the object along the X/Y plane.
Many thanks for any help you can offer. As you can see I'm fairly new to OpenGL and have had little luck with the limited tutorials out there that support Android Studio and GLES2.0.
Kind regards,
James
I think the problem is that you have not multiplied the translation matrices into your rotation matrices. A matrix multiply is required to combine those.
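As a rough sketch of that idea for one sprite in onDrawFrame(), using the matrices already declared in the renderer (the translation and rotation values are placeholders):
// build a per-sprite model matrix: translate to the sprite's position, then rotate about Z
Matrix.setIdentityM(mModelMatrix, 0);
Matrix.translateM(mModelMatrix, 0, currSprite.coordX, currSprite.coordY, 0.0f);
Matrix.rotateM(mModelMatrix, 0, angleInDegrees, 0.0f, 0.0f, 1.0f);
// combine into MVP = Projection * View * Model before handing it to Draw()
Matrix.multiplyMM(mTempMatrix, 0, mVMatrix, 0, mModelMatrix, 0);
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mTempMatrix, 0);
currSprite.Draw(mMVPMatrix);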
