Texture on mesh doesn't render, just shows black (libGDX GL20, Java)

Edit: I've updated my code after TenFour04's answer, but it still just shows black.

I've updated my libGDX, which now requires me to use GL20, and that has led me to make a few changes. Most of it works fine, except texturing the mesh: currently the surface mesh shows as black and the ground mesh doesn't show at all, and with some changes I can get it to show both the surface and ground meshes as black.
I've played around with the binding order of surTexture.bind and grdTexture.bind, with unit numbers less than 16, and with the render order, and I've managed to get the surface texture used as the texture for everything except the surface and ground.
Can anyone see where I might be going wrong?
// creating a mesh with maxVertices and maxIndices set to vertices.size*3
groundMesh = new Mesh(true, vertices.size*3, vertices.size*3,
        new VertexAttribute(Usage.Position, 2, ShaderProgram.POSITION_ATTRIBUTE),
        new VertexAttribute(Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE));
groundMesh.setVertices(temp);

short[] indices = new short[vertices.size*2];
for (int i = 0; i < vertices.size*2; i++) {
    indices[i] = (short) i;
}
groundMesh.setIndices(indices);

surfaceMesh = new Mesh(true, vertices.size*3, vertices.size*3,
        new VertexAttribute(Usage.Position, 3, ShaderProgram.POSITION_ATTRIBUTE),
        new VertexAttribute(Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE));
...
grdTexture = new Texture(Gdx.files.internal("data/img/leveltest/ground.png"));
// Gdx.graphics.getGL20().glActiveTexture(GL20.GL_TEXTURE16);
// setWrap and setFilter are documented to bind the texture, so I thought I
// might have to set the active texture here, but it makes no difference.
grdTexture.setWrap(TextureWrap.Repeat, TextureWrap.Repeat);
grdTexture.setFilter(TextureFilter.Linear, TextureFilter.Linear);

surTexture = new Texture(Gdx.files.internal("data/img/leveltest/surface.png"));
// Gdx.graphics.getGL20().glActiveTexture(GL20.GL_TEXTURE17);
surTexture.setWrap(TextureWrap.Repeat, TextureWrap.ClampToEdge);
// TODO: change these filters for better quality
surTexture.setFilter(TextureFilter.Linear, TextureFilter.Linear);
drawWorld() is called from inside render():
public void drawWorld(SpriteBatch batch, OrthographicCamera camera) {
    batch.begin();
    batch.setProjectionMatrix(camera.combined);
    layers.drawLayers(batch);
    if ((spatials != null) && (spatials.size > 0)) {
        for (int i = 0; i < spatials.size; i++) {
            spatials.get(i).render(batch);
        }
    }
    batch.end();
    drawGround(camera);
}
private void drawGround(OrthographicCamera camera) {
    shader.begin();
    shader.setUniformMatrix("u_projTrans", camera.combined);

    grdTexture.bind(0);
    shader.setUniformi("u_texture", 0);
    // changed GL_TRIANGLES to GL_TRIANGLE_STRIP to render meshes correctly after changing to GL20
    groundMesh.render(shader, GL20.GL_TRIANGLE_STRIP);

    surTexture.bind(0);
    shader.setUniformi("u_texture", 0);
    surfaceMesh.render(shader, GL20.GL_TRIANGLE_STRIP);

    shader.end();
}
fragment.glsl
#ifdef GL_ES
#define LOWP lowp
precision mediump float;
#else
#define LOWP
#endif
varying LOWP vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
}
vertex.glsl
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main() {
    v_color = a_color;
    v_color.a = v_color.a * (256.0/255.0);
    v_texCoords = a_texCoord;
    gl_Position = u_projTrans * a_position;
}

In your vertex shader, you are using a_texCoord, but in your mesh constructor, you have effectively named your attributes a_texCoord16 and a_texCoord17 by using ShaderProgram.TEXCOORD_ATTRIBUTE+"16" and ShaderProgram.TEXCOORD_ATTRIBUTE+"17".
Since you are not multi-texturing, I would just replace those with "a_texCoord".
It looks like you may be conflating attribute-name suffixes with texture units, although the two concepts are not necessarily related. The reason you might add number suffixes to your texCoords is that your mesh has multiple UVs per vertex because it is multi-textured, but really you can use any naming scheme you like. The reason you might bind to a unit other than 0 is that you are multi-texturing a single mesh and need multiple textures bound at once. So if you actually were multi-texturing, using attribute suffixes that match texture unit numbers might help avoid confusion when matching UVs to textures in the fragment shader.
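For contrast, here is a minimal sketch of what an actual multi-texturing setup might look like in libGDX. The names (u_texture0, u_texture1, maxVertices, maxIndices) and the second UV set are illustrative, not taken from the question:

// Two UV sets, suffixed to match the units their textures will be bound to.
Mesh mesh = new Mesh(true, maxVertices, maxIndices,
        new VertexAttribute(Usage.Position, 2, ShaderProgram.POSITION_ATTRIBUTE),
        new VertexAttribute(Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE + "0"),
        new VertexAttribute(Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE + "1"));

grdTexture.bind(1);                   // unit 1, sampled through a_texCoord1
surTexture.bind(0);                   // unit 0, sampled through a_texCoord0
shader.begin();
shader.setUniformi("u_texture0", 0);  // fragment shader: uniform sampler2D u_texture0;
shader.setUniformi("u_texture1", 1);  // fragment shader: uniform sampler2D u_texture1;
mesh.render(shader, GL20.GL_TRIANGLE_STRIP);
shader.end();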

OK, so it turns out the problem was the vertex shader. The code from here doesn't work, presumably because the mesh declares only position and texture-coordinate attributes, so a_color never receives data and v_color comes out black.
Here is the working shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main() {
    v_color = vec4(1, 1, 1, 1);
    v_texCoords = a_texCoord;
    gl_Position = u_projTrans * a_position;
}
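If the per-vertex tint is ever needed again, the alternative fix is to give the mesh a color attribute so a_color actually receives data instead of being left unbound. A sketch, reusing the mesh from the question:

groundMesh = new Mesh(true, vertices.size*3, vertices.size*3,
        new VertexAttribute(Usage.Position, 2, ShaderProgram.POSITION_ATTRIBUTE),
        new VertexAttribute(Usage.ColorPacked, 4, ShaderProgram.COLOR_ATTRIBUTE),
        new VertexAttribute(Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE));
// Each vertex then needs one extra float in the vertex array, e.g.
// Color.WHITE.toFloatBits(), between its position and UV components.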

Related

Unable to load Attribute into shader

I would like to have some custom data stored inside the shader, to be re-used across multiple frames during rendering.
As a first step, I am trying to store a color buffer, to see if this can be done.
vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
vec2 u_mouse;
uniform float u_time;
attribute vec3 my_data;
varying vec3 frag_data;
void main() {
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
    frag_data = my_data;
}
fragment shader:
#ifdef GL_ES
precision mediump float;
#endif
uniform vec2 u_resolution;
uniform vec2 u_mouse;
uniform float u_time;
varying vec3 frag_data;
void main() {
    gl_FragColor = vec4(vec3(frag_data), 1.0);
}
The shaders compile and work as intended with the exception of the usage of vec3 my_data:
ShaderProgram.pedantic = false;
vertex_shader = Gdx.files.internal("vertex_shader.glsl").readString();
fragment_shader = Gdx.files.internal("fragment_shader.glsl").readString();
shader_program = new ShaderProgram(vertex_shader, fragment_shader);
I try to set the variable from a button-click, like so:
goBtn = new TextButton("Reset", textButtonStyle);
goBtn.setSize(128, 128);
goBtn.addListener(new ChangeListener() {
    @Override
    public void changed(ChangeEvent event, Actor actor) {
        FloatBuffer myArray = BufferUtils.newFloatBuffer(3);
        myArray.put(1.0f); myArray.put(1.0f); myArray.put(1.0f);
        shader_program.begin();
        shader_program.setVertexAttribute("my_data", 3, GL20.GL_FLOAT, false, 0, myArray);
        shader_program.end();
    }
});
But it has no effect.
What am I doing wrong here? What would be the correct way to do this?
I've never tried setting a vertex attribute via the ShaderProgram. I'm not even sure why that exists as an option, because vertex data is supposed to be per-vertex. Since you are setting this for the whole mesh, it should be a uniform. And since it's a uniform, you don't have to pass it via a varying either.
So remove the varying and attribute from your vertex shader. And change the varying in the fragment shader to a uniform.
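A sketch of that shader-side change, trimmed to the lines that matter (the remaining uniforms from the question are unaffected); the source is inlined as a Java string only for brevity:

String fragment_shader =
      "#ifdef GL_ES\n"
    + "precision mediump float;\n"
    + "#endif\n"
    + "uniform vec3 my_data;  // was: varying vec3 frag_data\n"
    + "void main() {\n"
    + "    gl_FragColor = vec4(my_data, 1.0);\n"
    + "}\n";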
And to pass the data:
public void changed(ChangeEvent event, Actor actor) {
    shader_program.begin();
    shader_program.setUniformf("my_data", 1.0f, 1.0f, 1.0f);
    shader_program.end();
}
But you should also call this code once during initial setup to give the uniform a default value; its behavior may be undefined before you set it.
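For example, right after creating the program (a sketch reusing the names from the question):

shader_program = new ShaderProgram(vertex_shader, fragment_shader);
if (!shader_program.isCompiled()) {
    Gdx.app.error("shader", shader_program.getLog());
}
// Give the uniform a defined value before the first frame is rendered.
shader_program.begin();
shader_program.setUniformf("my_data", 0.0f, 0.0f, 0.0f);
shader_program.end();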

Shader gives wrong fragment color

I've run into the strangest problem I've ever had in my shader experience. I created some test shader code (see below) and ran it on a simple texture.
Basically what I was trying to do is make my shader check the color of a fragment, and if it was within a certain range it would color that fragment according to a uniform color variable. The problem I'm having is that my shader does not correctly recognize the color of a fragment. I even went as far as to check if the red portion of the color is equal to one and it always returns true for every fragment. Yet if I use the same shader to draw the original texture it works just fine.
Why is this happening? The shader compiles without any errors whatsoever. I feel like I'm missing something obvious...
Code (if you have access to LibGDX you can run this code for yourself).
// Vertex
#version 330
in vec4 a_position;
in vec4 a_color;
in vec2 a_texCoord0;
out vec4 v_color;
out vec2 v_texCoord0;
uniform mat4 u_projTrans;
void main() {
    v_color = a_color;
    v_texCoord0 = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}
// Fragment
#version 330
in vec4 v_color;
in vec2 v_texCoord0;
out vec4 outColor;
uniform vec4 color;
uniform sampler2D u_texture;
void main() {
    // This is always true for some reason...
    if (v_color.r == 1.0) {
        outColor = color;
    } else {
        // But if I run this code it draws the original texture just fine with the correct colors.
        outColor = v_color * texture2D(u_texture, v_texCoord0);
    }
}
// Java code.
// Creating sprite, shader and sprite batch.
SpriteBatch batch = new SpriteBatch();
Sprite sprite = new Sprite(new Texture("testTexture.png"));
ShaderProgram shader = new ShaderProgram(Gdx.files.internal("vertex.vert"),
        Gdx.files.internal("fragment.frag"));
// Check to see if the shader has logged any errors. Prints nothing.
System.out.println(shader.getLog());
// We start the rendering.
batch.begin();
// We begin the shader so we can load uniforms into it.
shader.begin();
// Set the uniform (works fine).
shader.setUniformf("color", Color.RED);
// We end the shader to tell it we've finished loading uniforms.
shader.end();
// We then tell our renderer to use this shader.
batch.setShader(shader);
// Then we draw our sprite.
sprite.draw(batch);
// And finally we tell our renderer that we're finished drawing.
batch.end();
// Dispose to release resources.
shader.dispose();
batch.dispose();
sprite.getTexture().dispose();
The texture:
You have two input colors in your fragment shader: one passed as a vertex attribute and one sampled from a texture. You intend to check the texture color, but instead you check the color value sent as the vertex attribute.
// Vertex
#version 330
in vec4 a_position;
in vec4 a_color; // <--- A color value is passed as a vertex attribute
in vec2 a_texCoord0;
out vec4 v_color;
out vec2 v_texCoord0;
uniform mat4 u_projTrans;
void main() {
    v_color = a_color; // <--- you are sending it to fragment shader
    v_texCoord0 = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}
// Fragment
#version 330
in vec4 v_color; // <--- coming from vertex buffer
in vec2 v_texCoord0;
out vec4 outColor;
uniform vec4 color;
uniform sampler2D u_texture;
void main() {
    // This is always true for some reason...
    if (v_color.r == 1.0) { // <--- and then checking this vertex attribute color
        outColor = color;   // instead of what you sampled from the texture
    } else {
        ...
    }
}
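A sketch of the fix: sample the texture first and test the texel instead. Exact floating-point equality on sampled colors is fragile, so a tolerance is used here (the 0.99 threshold is an arbitrary choice, and the source is inlined as a Java string only for brevity):

String fragmentShader =
      "#version 330\n"
    + "in vec4 v_color;\n"
    + "in vec2 v_texCoord0;\n"
    + "out vec4 outColor;\n"
    + "uniform vec4 color;\n"
    + "uniform sampler2D u_texture;\n"
    + "void main() {\n"
    + "    vec4 texel = texture(u_texture, v_texCoord0);\n"
    + "    if (texel.r > 0.99) {  // test the sampled texture color\n"
    + "        outColor = color;\n"
    + "    } else {\n"
    + "        outColor = v_color * texel;\n"
    + "    }\n"
    + "}\n";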

Libgdx shaders in a Stage?

In libgdx, I have a shader loaded, and I want to make my Stage object use that shader to draw. I tried setting a SpriteBatch's shader to my shader, and then the Stage's sprite batch to that one, but it shows up as a black screen. Why doesn't this work:
ShaderProgram shader = new ShaderProgram(Gdx.files.internal("shader.vert"), Gdx.files.internal("shader.frag"));
SpriteBatch batch = new SpriteBatch();
batch.setShader(shader);
Stage stage = new Stage(new StretchViewport(768, 576), batch);
...
stage.act();
shader.begin();
stage.draw();
shader.end();
My shaders look like:
shader.vert
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main()
{
    v_color = a_color;
    v_color.a = v_color.a * (256.0/255.0);
    v_texCoords = a_texCoord + 0;
    gl_Position = u_projTrans * a_position;
}
shader.frag
#ifdef GL_ES
#define LOWP lowp
precision mediump float;
#else
#define LOWP
#endif
varying LOWP vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
}
It seems to work with only one texture; every other texture doesn't render. Also, making the textures power-of-two doesn't make a difference.
SpriteBatch expects certain attribute names. You are using a_texCoord, but it expects a_texCoord0, so the shader treats all the UVs as if they were (0, 0).
You mention setting the shader on both the SpriteBatch and the Stage's SpriteBatch, but in your code, they are one and the same.
You don't need to call begin and end on the shader, because SpriteBatch (and therefore Stage) does this automatically. I'm not sure if that could cause problems.
I can't explain why what you did would have worked with one of your textures. I would expect it to just draw everything as the same color as the lower left pixel of the texture.
There might be other issues afoot, but I'm assuming it was drawing correctly before you inserted your own shader.
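Putting those points together, a sketch of the corrected setup (assuming shader.vert has been renamed to use a_texCoord0):

ShaderProgram shader = new ShaderProgram(
        Gdx.files.internal("shader.vert"), Gdx.files.internal("shader.frag"));
if (!shader.isCompiled()) throw new GdxRuntimeException(shader.getLog());

SpriteBatch batch = new SpriteBatch();
batch.setShader(shader);  // the batch begins/ends the shader itself
Stage stage = new Stage(new StretchViewport(768, 576), batch);

// ... later, in render():
stage.act(Gdx.graphics.getDeltaTime());
stage.draw();  // no shader.begin()/shader.end() around this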

'texture2D' : No matching overloaded function found OpenGL ES2?

I was working on a project, and for that project I had to work through a book called "OpenGL ES 2 For Android: A quick start guide".
When I got to texturing, I got the following error:
'texture2D' : No matching overloaded function found
...when I compile the shader.
The shader code:
// Fragment shader
precision mediump float;
uniform sampler2D u_TextureUnit;
varying vec4 v_TextureCoordinates;
void main()
{
    gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);
}
// Vertex shader
uniform mat4 u_Matrix;
attribute vec4 a_Position;
attribute vec4 a_TextureCoordinates;
varying vec4 v_TextureCoordinates;
void main()
{
    gl_Position = u_Matrix * a_Position;
    v_TextureCoordinates = a_TextureCoordinates;
}
I tried the same shaders in my own project, with exactly the same code as in the book, but it still gives me the same error when I compile the shader, and the viewport on the Android device stays blank; only the clear color I set is shown.
varying vec4 v_TextureCoordinates;
        ^^^^
There are exactly two texture2D() overloads in ES 2.0:
vec4 texture2D(sampler2D sampler, vec2 coord)
vec4 texture2D(sampler2D sampler, vec2 coord, float bias)
...neither of which accepts a vec4 for coord.
Slice off the last two vector components of v_TextureCoordinates using a swizzle:
gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates.xy );
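Alternatively, declare the varying as vec2 in both shaders so no swizzle is needed. A sketch of the vertex-shader side, inlined as a Java string for brevity (the fragment shader's varying declaration must also be changed to vec2 to match):

String vertexShader =
      "uniform mat4 u_Matrix;\n"
    + "attribute vec4 a_Position;\n"
    + "attribute vec2 a_TextureCoordinates;\n"  // vec2: a 2D texture needs only (s, t)
    + "varying vec2 v_TextureCoordinates;\n"
    + "void main()\n"
    + "{\n"
    + "    gl_Position = u_Matrix * a_Position;\n"
    + "    v_TextureCoordinates = a_TextureCoordinates;\n"
    + "}\n";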

Getting world position for deferred rendering light pass

I have recently begun to build a deferred rendering pipeline for the engine I am working on, but I'm stuck at reconstructing the world position from depth. I have looked at quite a few examples which explain that you need either a world-position texture or a depth texture for the correct distance and direction calculation of the light.
My problem is that the so-called position texture, which presumably holds the world position, doesn't seem to give me correct data. I therefore tried to find alternative ways of getting a world position, and some have suggested that I should use a depth texture instead, but then what?
To make it all more clear this picture shows the textures that I currently have stored:
Position(Top left), Normal(Top right), Diffuse(Bottom left) and Depth(Bottom right).
For the light pass I am trying to use a method that works fine when used in the first pass. When I try the same method in the light pass, with exactly the same variables, it stops working.
Here's my Geometry Vertex Shader:
#version 150
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
in vec4 in_Position;
in vec3 in_Normal;
in vec2 in_TextureCoord;
out vec3 pass_Normals;
out vec4 pass_Position;
out vec2 pass_TextureCoord;
out vec4 pass_Diffuse;
void main(void) {
    pass_Position = viewMatrix * modelMatrix * in_Position;
    pass_Normals = (viewMatrix * modelMatrix * vec4(in_Normal, 0.0)).xyz;
    pass_Diffuse = vec4(1, 1, 1, 1);
    gl_Position = projectionMatrix * pass_Position;
}
Geometry Fragment shader:
#version 150 core
uniform sampler2D texture_diffuse;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
in vec4 pass_Position;
in vec3 pass_Normals;
in vec2 pass_TextureCoord;
in vec4 pass_Diffuse;
out vec4 out_Diffuse;
out vec4 out_Position;
out vec4 out_Normals;
void main(void) {
    out_Position = pass_Position;
    out_Normals = vec4(pass_Normals, 1.0);
    out_Diffuse = pass_Diffuse;
}
Light Vertex Shader:
#version 150
in vec4 in_Position;
in vec2 in_TextureCoord;
out vec2 pass_TextureCoord;
void main(void)
{
    gl_Position = in_Position;
    pass_TextureCoord = in_TextureCoord;
}
Light Fragment Shader:
#version 150 core
uniform sampler2D texture_Diffuse;
uniform sampler2D texture_Normals;
uniform sampler2D texture_Position;
uniform vec3 cameraPosition;
uniform mat4 viewMatrix;
in vec2 pass_TextureCoord;
out vec4 frag_Color;
void main(void)
{
    frag_Color = vec4(1, 1, 1, 1);
    vec4 image = texture(texture_Diffuse, pass_TextureCoord);
    vec3 position = texture(texture_Position, pass_TextureCoord).rgb;
    vec3 normal = texture(texture_Normals, pass_TextureCoord).rgb;
    frag_Color = image;

    vec3 LightPosition_worldspace = vec3(0, 2, 0);
    vec3 vertexPosition_cameraspace = position;
    vec3 EyeDirection_cameraspace = vec3(0, 0, 0) - vertexPosition_cameraspace;
    vec3 LightPosition_cameraspace = (viewMatrix * vec4(LightPosition_worldspace, 1)).xyz;
    vec3 LightDirection_cameraspace = LightPosition_cameraspace + EyeDirection_cameraspace;

    vec3 n = normal;
    vec3 l = normalize(LightDirection_cameraspace);
    float cosTheta = max(dot(n, l), 0.0);
    float distance = distance(LightPosition_cameraspace, vertexPosition_cameraspace);
    frag_Color = vec4((vec3(10, 10, 10) * cosTheta) / (distance * distance), 1.0);
}
And finally, here's the current result:
So my question is: can anyone explain this result, or tell me what I should do to get a correct one? I would also appreciate good resources on the area.
Yes, using the depth buffer to reconstruct position is your best bet. This will significantly cut down on memory bandwidth / storage requirements. Modern hardware is biased towards doing shader calculations rather than memory fetches (this was not always the case), and the instructions necessary to reconstruct position per-fragment will always finish quicker than if you were to fetch the position from a texture with adequate precision. Now, you just have to realize what the hardware depth buffer stores (understand how depth range and perspective distribution work) and you will be good to go.
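For reference, a common view-space reconstruction from a hardware depth texture looks something like the sketch below; texture_Depth and u_invProjection (the inverse of the geometry-pass projection matrix) are assumed names, and the GLSL is inlined as a Java string only for illustration:

String reconstructGlsl =
      "uniform sampler2D texture_Depth;\n"
    + "uniform mat4 u_invProjection;\n"
    + "vec3 viewPositionFromDepth(vec2 uv) {\n"
    + "    float z = texture(texture_Depth, uv).r * 2.0 - 1.0;  // [0,1] -> NDC\n"
    + "    vec4 clip = vec4(uv * 2.0 - 1.0, z, 1.0);\n"
    + "    vec4 view = u_invProjection * clip;\n"
    + "    return view.xyz / view.w;  // undo the perspective divide\n"
    + "}\n";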
I do not see any attempt at reconstructing the world/view-space position from the depth buffer in the code your question lists; you are just sampling from a buffer that stores the position in view space. Since you are not performing reconstruction in this example, the problem has to do with sampling the view-space position. Can you update your question to include the internal formats of the G-Buffer textures? In particular, are you using a format that can represent negative values? (This is necessary to express position; otherwise negative values are clamped to 0.)
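On the format point: if the position attachment is a plain unsigned 8-bit RGBA texture, negative view-space coordinates get clamped to zero. A signed, higher-precision format avoids that; a sketch assuming LWJGL-style bindings (width and height are your G-buffer dimensions):

import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL30.*;

// A 16-bit floating-point attachment can store negative view-space positions.
int positionTex = glGenTextures();
glBindTexture(GL_TEXTURE_2D, positionTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
        GL_RGBA, GL_FLOAT, (java.nio.ByteBuffer) null);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
        GL_TEXTURE_2D, positionTex, 0);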
On a final note, your position is in view space, not world space; a trained eye can tell this immediately from the way the colors in your position buffer are black in the lower-left corner. If you want to debug your position/normal buffers, you should bias/scale the sampled colors into the visible range:
([-1.0, 1.0] -> [0.0, 1.0]) // Vec = Vec * 0.5 + 0.5
You may need to do this when you output some of the buffers if you want to store the normal G-Buffer more efficiently (e.g. in an 8-bit fixed-point texture instead of floating-point).
