I'm attempting to get the uniform locations of variables in my shader.
@Override
public void getAllUniformLocations(){
    location_transformationMatrix = super.getUniformLocation("transformationMatrix");
    location_lightPosition = super.getUniformLocation("lightPosition");
    location_lightColour = super.getUniformLocation("lightColour");
    System.out.println(location_transformationMatrix + " | "
            + location_lightPosition + " | " + location_lightColour);
}

public int getUniformLocation(String name){
    return GL20.glGetUniformLocation(programID, name);
}
However, when it prints out, the values are completely wrong:
0 | -1 | -1
0 is for the fixed-function pipeline, so having something bound there is going to completely crash the program, and the other two are simply returning -1, which is a null value.
Here is my shader loading code:
public ShaderProgram(String vertexFile, String fragmentFile){
    System.out.println("Compiling shader!");
    vertexShaderID = loadShader(vertexFile, GL20.GL_VERTEX_SHADER);
    fragmentShaderID = loadShader(fragmentFile, GL20.GL_FRAGMENT_SHADER);
    programID = GL20.glCreateProgram();
    GL20.glAttachShader(programID, vertexShaderID);
    GL20.glAttachShader(programID, fragmentShaderID);
    bindAttributes();
    getAllUniformLocations();
    GL20.glLinkProgram(programID);
    GL20.glValidateProgram(programID);
    start();
    System.out.println(GL11.glGetError());
    System.out.println("Compiled shader!");
}
The shaders bind, attach, validate, compile, etc. completely successfully, and the game does run at this point. However, the uniform locations are missing, so I am unable to implement things like lighting.
Here is my vertex shader:
#version 130
in vec3 position;
in vec2 textureCoords;
in vec3 normal;
out vec2 pass_textureCoords;
out vec3 surfaceNormal;
out vec3 toLightVector;
uniform mat4 transformationMatrix;
uniform vec3 lightPosition;
void main(void){
    gl_Position = ftransform();
    pass_textureCoords = textureCoords;
    surfaceNormal = (transformationMatrix * vec4(normal, 0.0)).xyz;
    toLightVector = (vec3(0, 20000, 0)) - (transformationMatrix * vec4(position, 1.0)).xyz;
}
Variables 0, 1, 2, and 3 are all bound to attributes, so I would expect the numbers to return something along the lines of "4 | 5 | 6".
Here is my fragment shader:
#version 130
in vec2 pass_textureCoords;
in vec3 surfaceNormal;
in vec3 toLightVector;
out vec4 out_Color;
uniform sampler2D textureSampler;
uniform vec3 lightColour;
void main(void){
    vec3 unitNormal = normalize(surfaceNormal);
    vec3 unitLightVector = normalize(toLightVector);
    float nDot1 = dot(unitNormal, unitLightVector);
    float brightness = max(nDot1, 0.5);
    brightness = brightness * 1.5;
    vec3 diffuse = brightness * vec3(1, 1, 1);
    vec4 text = texture(textureSampler, pass_textureCoords);
    vec4 textureColor = vec4(diffuse, 1.0) * text;
    if(textureColor.a < 0.01){
        discard;
    }
    out_Color = vec4(textureColor.r, textureColor.g, textureColor.b, textureColor.a);
}
The results you are getting are absolutely correct; your interpretations of them are not:
0 is for the fixed-function pipeline, so having something bound there
is going to completely crash the program,
No. 0 is a perfectly valid uniform location, which has nothing to do with the fixed-function pipeline. You seem to be confusing the program object 0 (which represents the fixed-function pipeline in legacy GL and compatibility profiles) with per-program uniform locations.
and the other two are simply
returning -1, which is a null value.
Well, more or less. -1 means that there is no active uniform of that name. Note that calling glUniform*() functions with location -1 is explicitly defined to be no error - such calls just have no effect.
The shader code you have pasted simply does not use the lightPosition and lightColour uniforms, so they are not there. You might intend to use them in the fragment shader, but I'd bet that you are not using them there in a way that matters - i.e. even if you declare and reference them in the code, they might still have no effect on the shader outputs, and so are still not active.
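Concretely, in the shaders posted above, `lightPosition` is declared but the vertex shader uses the hard-coded `vec3(0, 20000, 0)` instead, and `lightColour` is declared but the diffuse term uses `vec3(1, 1, 1)`. A minimal sketch of the two lines that would make both uniforms active (adapting the question's own code, not a finished lighting model):

```glsl
// vertex shader: use the uniform instead of the hard-coded light position
toLightVector = lightPosition - (transformationMatrix * vec4(position, 1.0)).xyz;

// fragment shader: let lightColour drive the diffuse term instead of vec3(1, 1, 1)
vec3 diffuse = brightness * lightColour;
```

Once both uniforms actually influence a shader output, the linker keeps them as active resources and glGetUniformLocation returns valid locations for them.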
Related
Why are my uniform vector and floats not being initialized? My shader compiles properly, but when I try to get the uniform location of my vec4 lightDirection, and of my floats specularFactor and diffuseFactor, it gives me an error. Note that I haven't actually used these uniforms for anything yet; however, that shouldn't matter.
Here is my vertex shader, which gets all uniform locations properly:
#version 330
layout (location = 0) in vec3 position;
layout (location = 1) in vec2 texture_coord;
layout (location = 2) in vec3 normal;
layout (location = 3) in vec3 fNormal;
uniform mat4 worldMat;
uniform mat4 projection;
uniform mat4 transform;
out vec2 tex;
out vec3 n;
out vec3 fn;
void main() {
    fn = fNormal;
    n = normal;
    tex = texture_coord;
    gl_Position = projection * worldMat * transform * vec4(position, 1);
}
Here is my fragment shader, which only gets the texture sample uniform location:
#version 330
uniform sampler2D sampleTexture;
uniform vec4 lightDirection;
uniform float specularFactor;
uniform float diffuseFactor;
in vec2 tex;
in vec3 n;
in vec3 fn;
out vec4 fragColor;
void main() {
    fragColor = texture(sampleTexture, tex);
}
Here is the method I use to get uniform locations (I am using Java):
public int loadUniform(String uniformName) throws Exception {
    int iD = glGetUniformLocation(program, uniformName);
    System.out.println("PROGRAM: " + program + " UNIFORM: " + uniformName + " " + iD);
    if (iD == -1) {
        throw new Exception("uniform:" + uniformName + " not initialized");
    }
    return iD;
}
Now, here is what is printed to the console by the print statement and the exception. The numbers definitely seem wrong to me, and I don't understand what is going on.
java.lang.Exception: uniform:lightDirection not initialized
    at shader_src.Shader.loadUniform(Shader.java:273)
    at shader_src.StaticShader.initValues(StaticShader.java:51)
    at application.Main.main(Main.java:94)
PROGRAM: 4 UNIFORM: worldMat 9
PROGRAM: 4 UNIFORM: projection 5
PROGRAM: 4 UNIFORM: transform 0
PROGRAM: 4 UNIFORM: sampleTexture 4
PROGRAM: 4 UNIFORM: lightDirection -1
The GLSL compiler and linker optimize the code: unnecessary code is removed, and unnecessary uniforms and attributes do not become active program resources. If uniform variables are not used, or are only used in a part of the code that is itself optimized away, they do not become active program resources. lightDirection, specularFactor and diffuseFactor are unused variables, so they are not active resources, and therefore you cannot get a uniform location for them.
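For illustration, once the uniforms actually feed into the fragment output they become active and their locations resolve. A sketch of a placeholder lighting term using the question's own names (not a finished lighting model, just enough to keep the uniforms alive):

```glsl
void main() {
    vec4 base = texture(sampleTexture, tex);
    // any contribution to fragColor keeps these uniforms from being optimized away
    float diff = diffuseFactor * max(dot(normalize(n), normalize(-lightDirection.xyz)), 0.0);
    fragColor = base * (diff + specularFactor);
}
```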
I would like to have some custom data stored inside the shader to be re-used for multiple frames during rendering.
At first, what I try to do is store a color buffer, to see if this can be done.
vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
vec2 u_mouse;
uniform float u_time;
attribute vec3 my_data;
varying vec3 frag_data;
void main() {
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
    frag_data = my_data;
}
fragment shader:
#ifdef GL_ES
precision mediump float;
#endif
uniform vec2 u_resolution;
uniform vec2 u_mouse;
uniform float u_time;
varying vec3 frag_data;
void main(){
    gl_FragColor = vec4(vec3(frag_data), 1.0);
}
The shaders compile and work as intended with the exception of the usage of vec3 my_data:
ShaderProgram.pedantic = false;
vertex_shader = Gdx.files.internal("vertex_shader.glsl").readString();
fragment_shader = Gdx.files.internal("fragment_shader.glsl").readString();
shader_program = new ShaderProgram(vertex_shader, fragment_shader);
I try to set the variable from a button-click, like so:
goBtn = new TextButton("Reset", textButtonStyle);
goBtn.setSize(128, 128);
goBtn.addListener(new ChangeListener() {
    @Override
    public void changed(ChangeEvent event, Actor actor) {
        FloatBuffer myArray = BufferUtils.newFloatBuffer(3);
        myArray.put(1.0f); myArray.put(1.0f); myArray.put(1.0f);
        shader_program.begin();
        shader_program.setVertexAttribute("my_data", 3, GL20.GL_FLOAT, false, 0, myArray);
        shader_program.end();
    }
});
But it has no effect.
What am I doing wrong here?
What would be the correct way to do this?
I've never tried setting a vertex attribute via the ShaderProgram. I'm not even sure why that exists as an option, because vertex data is supposed to be per-vertex. Since you are setting this for the whole mesh, it should be a uniform. And since it's a uniform, you don't have to pass it via a varying either.
So remove the varying and attribute from your vertex shader. And change the varying in the fragment shader to a uniform.
And to pass the data:
public void changed(ChangeEvent event, Actor actor) {
    shader_program.begin();
    shader_program.setUniformf("my_data", 1.0f, 1.0f, 1.0f);
    shader_program.end();
}
But you should also call this code to set a default value for this uniform when you are first setting things up. Behavior might be undefined before you set its value.
edit--
I've updated my code after TenFour04's answer, but it still just shows black.
I've updated my libgdx version, which has required me to use GL20 and led me to make a few changes.
Most of it works fine, except when trying to texture the mesh. The code currently shows the surface mesh as black and doesn't show the ground mesh at all. With some changes I can get it to show both the surface and ground meshes as black.
I've played around with the binding order of surTexture.bind and grdTexture.bind, with unit numbers less than 16, and with the render order, and I've managed to get it to use the surface texture for everything except the surface and ground.
Can anyone see where I might be going wrong with this?
// creating a mesh with maxVertices set to vertices.size*3
groundMesh = new Mesh(true, vertices.size*3, vertices.size*3,
        new VertexAttribute(Usage.Position, 2, ShaderProgram.POSITION_ATTRIBUTE),
        new VertexAttribute(Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE));
groundMesh.setVertices(temp);

short[] indices = new short[vertices.size*2];
for(int i = 0; i < vertices.size*2; i++){
    indices[i] = (short) i;
}
groundMesh.setIndices(indices);

surfaceMesh = new Mesh(true, vertices.size*3, vertices.size*3,
        new VertexAttribute(Usage.Position, 3, ShaderProgram.POSITION_ATTRIBUTE),
        new VertexAttribute(Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE));
...
grdTexture = new Texture(Gdx.files.internal("data/img/leveltest/ground.png"));
// Gdx.graphics.getGL20().glActiveTexture(GL20.GL_TEXTURE16);
// The docs say that setWrap and setFilter bind the texture, so I thought I might
// have to set the active texture here, but it does nothing.
grdTexture.setWrap(TextureWrap.Repeat, TextureWrap.Repeat);
grdTexture.setFilter(TextureFilter.Linear, TextureFilter.Linear);

surTexture = new Texture(Gdx.files.internal("data/img/leveltest/surface.png"));
// Gdx.graphics.getGL20().glActiveTexture(GL20.GL_TEXTURE17);
surTexture.setWrap(TextureWrap.Repeat, TextureWrap.ClampToEdge);
// TODO change these filters for better quality
surTexture.setFilter(TextureFilter.Linear, TextureFilter.Linear);
drawWorld gets called inside render()
public void drawWorld(SpriteBatch batch, OrthographicCamera camera) {
    batch.begin();
    batch.setProjectionMatrix(camera.combined);
    layers.drawLayers(batch);
    if ((spatials != null) && (spatials.size > 0)){
        for (int i = 0; i < spatials.size; i++){
            spatials.get(i).render(batch);
        }
    }
    batch.end();
    drawGround(camera);
}

private void drawGround(OrthographicCamera camera){
    shader.begin();
    shader.setUniformMatrix("u_projTrans", camera.combined);
    grdTexture.bind(0);
    shader.setUniformi("u_texture", 0);
    // changed GL_TRIANGLES to GL_TRIANGLE_STRIP to render meshes correctly after changing to GL20
    groundMesh.render(shader, GL20.GL_TRIANGLE_STRIP);
    surTexture.bind(0);
    shader.setUniformi("u_texture", 0);
    surfaceMesh.render(shader, GL20.GL_TRIANGLE_STRIP);
    shader.end();
}
fragment.glsl
#ifdef GL_ES
#define LOWP lowp
precision mediump float;
#else
#define LOWP
#endif
varying LOWP vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
}
vertex.glsl
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main() {
    v_color = a_color;
    v_color.a = v_color.a * (256.0/255.0);
    v_texCoords = a_texCoord;
    gl_Position = u_projTrans * a_position;
}
In your vertex shader, you are using a_texCoord, but in your mesh constructor, you have effectively named your attributes a_texCoord16 and a_texCoord17 by using ShaderProgram.TEXCOORD_ATTRIBUTE+"16" and ShaderProgram.TEXCOORD_ATTRIBUTE+"17".
Since you are not multi-texturing, I would just replace those with "a_texCoord".
It looks like maybe you are conflating attribute name suffixes with texture units, although the two concepts are not necessarily related. The reason you might want to add number suffixes to your texCoords is if your mesh has multiple UVs for each vertex because it is multi-textured; but really, you can use any naming scheme you like. The reason you might want to bind to a unit other than 0 is if you're multi-texturing a single mesh and therefore need multiple textures bound at once. So if you actually were multi-texturing, using attribute suffixes that match texture unit numbers might help avoid confusion when matching UVs to textures in the fragment shader.
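If you did later move to multi-texturing a single mesh, the fragment shader would declare one sampler per bound unit. A sketch (the uniform and varying names here are made up for illustration; you would bind each texture to its unit and set each sampler uniform to that unit number):

```glsl
uniform sampler2D u_texture0; // texture bound to unit 0
uniform sampler2D u_texture1; // texture bound to unit 1
varying vec2 v_texCoords;

void main() {
    vec4 ground  = texture2D(u_texture0, v_texCoords);
    vec4 surface = texture2D(u_texture1, v_texCoords);
    gl_FragColor = mix(ground, surface, 0.5); // blend however your effect requires
}
```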
OK, so it turns out the problem was the vertex shader. The code from here doesn't work.
Here is the working shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main() {
    v_color = vec4(1, 1, 1, 1);
    v_texCoords = a_texCoord;
    gl_Position = u_projTrans * a_position;
}
I have recently begun to build some kind of deferred rendering pipeline for the engine I am working on, but I'm stuck at reconstructing the world position from depth. I have looked at quite a few examples which explain that you need either a world position texture or a depth texture for the correct distance and direction calculation of the light.
My problem is that the so-called position texture, which presumably holds the world position, doesn't seem to give me correct data. Therefore I tried to find alternative ways of getting a world position, and some have suggested that I should use a depth texture instead - but then what?
To make it all more clear this picture shows the textures that I currently have stored:
Position(Top left), Normal(Top right), Diffuse(Bottom left) and Depth(Bottom right).
For the light pass I am trying to use a method which works fine if used in the first pass. When I try the same method for the light pass with the exact same variables it stops working.
Here's my Geometry Vertex Shader:
#version 150
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
in vec4 in_Position;
in vec3 in_Normal;
in vec2 in_TextureCoord;
out vec3 pass_Normals;
out vec4 pass_Position;
out vec2 pass_TextureCoord;
out vec4 pass_Diffuse;
void main(void) {
    pass_Position = viewMatrix * modelMatrix * in_Position;
    pass_Normals = (viewMatrix * modelMatrix * vec4(in_Normal, 0.0)).xyz;
    pass_Diffuse = vec4(1, 1, 1, 1);
    gl_Position = projectionMatrix * pass_Position;
}
Geometry Fragment shader:
#version 150 core
uniform sampler2D texture_diffuse;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
in vec4 pass_Position;
in vec3 pass_Normals;
in vec2 pass_TextureCoord;
in vec4 pass_Diffuse;
out vec4 out_Diffuse;
out vec4 out_Position;
out vec4 out_Normals;
void main(void) {
    out_Position = pass_Position;
    out_Normals = vec4(pass_Normals, 1.0);
    out_Diffuse = pass_Diffuse;
}
Light Vertex Shader:
#version 150
in vec4 in_Position;
in vec2 in_TextureCoord;
out vec2 pass_TextureCoord;
void main( void )
{
    gl_Position = in_Position;
    pass_TextureCoord = in_TextureCoord;
}
Light Fragment Shader:
#version 150 core
uniform sampler2D texture_Diffuse;
uniform sampler2D texture_Normals;
uniform sampler2D texture_Position;
uniform vec3 cameraPosition;
uniform mat4 viewMatrix;
in vec2 pass_TextureCoord;
out vec4 frag_Color;
void main( void )
{
    frag_Color = vec4(1, 1, 1, 1);
    vec4 image = texture(texture_Diffuse, pass_TextureCoord);
    vec3 position = texture(texture_Position, pass_TextureCoord).rgb;
    vec3 normal = texture(texture_Normals, pass_TextureCoord).rgb;
    frag_Color = image;
    vec3 LightPosition_worldspace = vec3(0, 2, 0);
    vec3 vertexPosition_cameraspace = position;
    vec3 EyeDirection_cameraspace = vec3(0, 0, 0) - vertexPosition_cameraspace;
    vec3 LightPosition_cameraspace = (viewMatrix * vec4(LightPosition_worldspace, 1)).xyz;
    vec3 LightDirection_cameraspace = LightPosition_cameraspace + EyeDirection_cameraspace;
    vec3 n = normal;
    vec3 l = normalize(LightDirection_cameraspace);
    float cosTheta = max(dot(n, l), 0);
    float distance = distance(LightPosition_cameraspace, vertexPosition_cameraspace);
    frag_Color = vec4((vec3(10, 10, 10) * cosTheta) / (distance * distance), 1);
}
And finally, here's the current result:
So my question is: can anyone explain this result, or tell me what I should do to get a correct one? I would also appreciate good resources on the topic.
Yes, using the depth buffer to reconstruct position is your best bet. This will significantly cut down on memory bandwidth / storage requirements. Modern hardware is biased towards doing shader calculations rather than memory fetches (this was not always the case), and the instructions necessary to reconstruct position per-fragment will always finish quicker than if you were to fetch the position from a texture with adequate precision. Now, you just have to realize what the hardware depth buffer stores (understand how depth range and perspective distribution work) and you will be good to go.
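For reference, a common view-space reconstruction looks roughly like the following sketch. Here `texture_Depth` and `invProjectionMatrix` are hypothetical uniforms you would need to add (a depth texture attachment and the inverse of your projection matrix), and the code assumes the default [0, 1] depth range:

```glsl
// sample the hardware depth buffer and undo the [0,1] -> NDC mapping
float depth = texture(texture_Depth, pass_TextureCoord).r;
vec4 ndc = vec4(vec3(pass_TextureCoord, depth) * 2.0 - 1.0, 1.0);

// unproject with the inverse projection matrix and divide by w
vec4 view = invProjectionMatrix * ndc;
vec3 viewPosition = view.xyz / view.w;
```

From there, multiplying by the inverse view matrix would take you to world space if that is the space your lighting works in.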
I do not see any attempt at reconstructing the world/view-space position from the depth buffer in the code your question lists; you are just sampling from a buffer that stores the position in view space. Since you are not performing reconstruction in this example, the problem has to do with sampling the view-space position... Can you update your question to include the internal formats of the G-Buffer textures? In particular, are you using a format that can represent negative values? This is necessary to express position; otherwise negative values are clamped to 0.
On a final note, your position is also view-space and not world-space, a trained eye can tell this immediately by the way the colors in your position buffer are black in the lower-left corner. If you want to debug your position/normal, you should bias/scale the sampled colors into the visible range:
([-1.0, 1.0] -> [0.0, 1.0]) // Vec = Vec * 0.5 + 0.5
You may need to do this when you output some of the buffers if you want to store the normal G-Buffer more efficiently (e.g. in an 8-bit fixed-point texture instead of floating-point).
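The bias/scale above is simple enough to check in isolation; here it is written out as a trivial Java helper (purely illustrative, not part of the engine code above):

```java
public class BiasScale {
    // maps a component from the signed [-1.0, 1.0] range into the visible [0.0, 1.0] range
    public static float toVisible(float v) {
        return v * 0.5f + 0.5f;
    }

    public static void main(String[] args) {
        System.out.println(toVisible(-1.0f)); // 0.0
        System.out.println(toVisible( 0.0f)); // 0.5
        System.out.println(toVisible( 1.0f)); // 1.0
    }
}
```

The same one-liner applied per channel in the shader is what makes a signed normal or position buffer viewable as a color image.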
At the beginning of my project, I used simple Strings to fill both of my shaders with code. It looked like this:
public final static String chunkDefaultVertexInit = ""
        + constantParameters
        + "precision mediump float;"
        + "uniform mat4 mPMatrix;"
        + "uniform mat4 mVMatrix;"
        + "uniform mat4 mMMatrix;"
        + "uniform mat4 mMVMatrix;"
        + "attribute vec4 Vertex;"
        + "attribute vec3 Normal;"
        + "attribute vec2 TexCoord;"
        + "varying vec3 normal;"
        + "varying vec2 uv;"
        + "varying vec4 positionM;"
        + "varying vec4 positionMV;";
etc....
This worked for me, but it was not really clear. So I thought about how I could make my code a bit cleaner and clearer for everybody. My idea was to put the whole bunch of code into a real .cc file and move it into the res/raw folder. No sooner said than done.
I wanted to read the code back in via InputStreams and save it into a String. That also worked fine, and so I fed the shader the String source.
So... now there happens to be a problem, and as I said, I haven't figured it out yet. I even got a little angry at myself, because I assumed there was an easy way to fix it that I simply don't see.
I even logged the source code I put in... but it looks correct! o.O
Log.i("Llama3D Shader",shaderCode);
(Don't worry about the weird "Debug ID"; it's the project's name.)
Here's the source code for the shaders:
Vertexshader:
//vertexshader
precision mediump float;
uniform mat4 mPMatrix;
uniform mat4 mVMatrix;
uniform mat4 mMMatrix;
uniform mat4 mMVMatrix;
attribute vec4 aVertex;
attribute vec3 aNormal;
attribute vec2 aTexCoord;
varying vec2 vecTexCoord;
varying vec3 vecNormal;
varying vec4 vecVertex[2];
void main() {
    gl_Position = mPMatrix * mMVMatrix * aVertex;
    vecVertex[0] = mMMatrix * aVertex;
    vecVertex[1] = mMVMatrix * aVertex;
    vecTexCoord = aTexCoord;
    vecNormal = normalize(vec3(mMMatrix * -vec4(aNormal, 0.0)));
}
Fragmentshader:
#define MAX_POINT_LIGHTS 4
precision mediump float;
varying vec2 vecTexCoord;
varying vec3 vecNormal;
varying vec4 vecVertex[2];
uniform vec3 uVecEye;
uniform vec3 uPointLightPosition[MAX_POINT_LIGHTS];
uniform vec3 uPointLightColor[MAX_POINT_LIGHTS];
uniform sampler2D textureHandle;
vec3 V = normalize(uVecEye.xyz-vecVertex[1].xyz);
vec3 N = vNormal;
vec3 vecLight[MAX_POINT_LIGHTS];
vec4 pointDiffuse = vec4(0.0);
vec4 pointSpecular = vec4(0.0);
vec4 ambient = vec4(0.2,0.2,0.2,1.0);
vec4 color = vec4(1.0,1.0,1.0,1.0);
vec4 matSpec = vec4(1.0,1.0,1.0,1.0);
vec4 lightSpec = vec4(1.0,1.0,1.0,1.0);
vec4 spec = matSpec * lightSpec;
float shininess = 20.0;
void main() {
    for (int i = 0; i < MAX_POINT_LIGHTS; i++) {
        vecLight[i].xyz = vecVertex[0].xyz - uPointLightPosition[i].xyz;
        float vecDistance = length(vecLight[i].xyz);
        if (vecDistance <= 25.0) {
            vecDistance = 1.0 - max(0.0, vecDistance)/25.0;
            vec3 L = normalize(vecLight[i]);
            vec3 R = normalize(reflect(L, N));
            float LND = max(0.0, dot(N, L)) * vecDistance;
            pointDiffuse += color * vec4(uPointLightColor[i].xyz, 0.0) * LND;
            if (shininess != 0.0 && spec != 0.0) {
                pointSpecular += spec * pow(max(0.0, dot(R, V)), shininess) * LND;
            } else {
                pointSpecular += vec4(0.0, 0.0, 0.0, 0.0);
            }
        }
    }
    vec4 colorTexture = texture2D(textureHandle, vec2(+vTexCoord.x, -vTexCoord.y));
    gl_FragColor = ambient + colorTexture * pointDiffuse + pointSpecular;
}
Every time I try to run the program, the shader info log and program info log tell me:
Invalid fragment shader. Link cannot proceed.
Am I crazy or just blind?!
I hope you know an answer... I really don't know any... please help me!
The log you got is from the Program linking stage, glGetProgramInfoLog.
What you need to debug is the Fragment Shader log, glGetShaderInfoLog.
Something along these lines:
def _compile(self, source):
    ptr = cast(c_char_p(source), POINTER(c_char))
    glShaderSource(self.id, 1, byref(ptr), None)
    glCompileShader(self.id)
    status = c_int(0)
    glGetShaderiv(self.id, GL_COMPILE_STATUS, byref(status))
    log = self.check()
    print(log),
    if not status.value:
        raise Exception(log)

def check(self):
    length = c_int(0)
    glGetShaderiv(self.id, GL_INFO_LOG_LENGTH, byref(length))
    log = create_string_buffer(length.value)
    glGetShaderInfoLog(self.id, length.value, None, log)
    return log.value
Though this is not in java but in python, it should give you an idea of how to get your shader compile log.
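In Java with LWJGL (which your shader-loading code appears to use), the equivalent check is roughly the following sketch; it assumes a current GL context and LWJGL 2-style GL20/GL11 bindings, and would run right after glCompileShader:

```java
// after GL20.glCompileShader(shaderID):
int status = GL20.glGetShaderi(shaderID, GL20.GL_COMPILE_STATUS);
int length = GL20.glGetShaderi(shaderID, GL20.GL_INFO_LOG_LENGTH);
String log = GL20.glGetShaderInfoLog(shaderID, length);
System.out.println(log); // compiler warnings/errors for this shader object
if (status == GL11.GL_FALSE) {
    throw new RuntimeException("Shader failed to compile:\n" + log);
}
```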
Compiling your shaders in my environment gives me this log which may or may not be useful to you:
Vertex shader was successfully compiled to run on hardware.
WARNING: 0:2: warning(#260) Keyword 'precision' is supported in GLSL 1.3
Fragment shader failed to compile with the following errors:
WARNING: 0:2: warning(#260) Keyword 'precision' is supported in GLSL 1.3
ERROR: 0:14: error(#143) Undeclared identifier vNormal
WARNING: 0:14: warning(#402) Implicit truncation of vector from size 1 to size 3.
ERROR: 0:50: error(#143) Undeclared identifier vTexCoord
ERROR: 0:50: error(#216) Vector field selection out of range 'y'
ERROR: error(#273) 4 compilation errors. No code generated