How to properly draw a texture with opacity in Android OpenGL? - java

I'm trying to draw 5 different textures on the screen, but I can't seem to make the alpha work. Textures are rendered fine if there is no alpha, but when there is, there's a really weird "effect".
OK, first off I call draw on all 5 of my textures with these opacities (in the given order): 0.5, 1.0, 0.5, 1.0, 0.5. But when I start the app I actually get 1.0, 0.5, 1.0, 0.5, 1.0, as if there were an offset of 1. That's not all that's weird. About half a second after my app launches, the very first texture gets an opacity of 0.5; even weirder is that this doesn't happen on the second call to draw, but somewhere between the first and second call. How is this even possible?
This is the final result (it should be 0.5,1.0,0.5,1.0,0.5, but as you can see it's not even close):
Now for some code (I've skipped some irrelevant parts).
Creating the surface:
public void onSurfaceCreated(GL10 gl, EGLConfig config)
{
// Textures are being loaded in here, which is just fine (code skipped)
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glShadeModel(GL10.GL_SMOOTH);
gl.glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
gl.glClearDepthf(1.0f);
gl.glEnable(GL10.GL_DEPTH_TEST);
gl.glDepthFunc(GL10.GL_LEQUAL);
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
}
Updating the surface:
public void onSurfaceChanged(GL10 gl, int width, int height)
{
// Here I pass width & height to my texture objects for future reference (code skipped)
gl.glViewport(0, 0, width, height);
gl.glOrthof(0, width, height, 0, 0, 1);
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
}
Drawing:
public void onDrawFrame(GL10 gl)
{
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
gl.glLoadIdentity();
// Here I call draw of each texture object
}
Draw method of the texture object:
public void draw(GL10 gl)
{
gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glFrontFace(GL10.GL_CW);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE, GL10.GL_MODULATE);
gl.glColor4f(1.0f, 1.0f, 1.0f, this.alpha); // This is where the magic doesn't happen (:
}
I've skipped the part where I create vertex/texture buffers and load the textures, that part follows this tutorial and seems to work just fine.
So.. what do you think am I missing? What could be the cause to this weird issue?
If I had to guess, I'd say it's some weird hardcore OpenGL issue/bug, or more likely a flag I forgot to raise or something like that. It can't be the order of drawing or anything like that; I've double-checked it all. I've also tried setting, say, 0.5f as the opacity of every texture and it works perfectly; the problem only happens when the opacities of the textures differ from each other. I also don't think the weird between-draw flicker can be caused by user code, it's got to be some OpenGL weirdness.
I must point out that I am using a 3rd party library to pack all of this GL magic into a live wallpaper, it's this awesome lib: GLWallpaperService

Why isn't the place where the magic happens before the draw call? The default colour value is (1, 1, 1, 1), so at first you start with alpha at 1.0; then you draw the texture and set the alpha to 0.5, and the second texture is drawn with that value still set... Ergo the strange offset effect.
When the last texture is drawn you have an alpha of 1.0 (from the one before), and then you set it to 0.5, which is used to draw the first texture on the next refresh: since the alpha changes from 1.0 to 0.5 on the topmost element, it flickers. In the end you get exactly the result seen in the image you posted.
So this truly is where the magic happens:
gl.glColor4f(1.0f, 1.0f, 1.0f, this.alpha); // This is where the magic doesn't happen
So remove that line from the end and try it like this instead:
gl.glColor4f(1.0f, 1.0f, 1.0f, this.alpha); // This is where the magic does happen
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
gl.glColor4f(1.0f, 1.0f, 1.0f, 1.0f); // Low mana, stop the magic
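Applied to the draw() method from the question, the reordered version would look roughly like this (same fields and buffers as above; only the colour call moves before the draw and a reset is added afterwards):
public void draw(GL10 gl)
{
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    gl.glFrontFace(GL10.GL_CW);
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
    gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
    gl.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE, GL10.GL_MODULATE);
    // Set the per-texture alpha BEFORE the draw call so it affects this texture...
    gl.glColor4f(1.0f, 1.0f, 1.0f, this.alpha);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
    // ...and reset it afterwards so the value doesn't leak into the next draw.
    gl.glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
    gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
}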

Related

Using filledcircle and pixmap in libgdx

I am using Libgdx.
I want to simulate fog in my game using a Pixmap, but I have a problem generating the "fogless" circle. First, I create a pixmap filled with black (it is slightly transparent). After filling it I want to draw a filled circle onto it, but the result is not what I expected.
this.pixmap = new Pixmap(640, 640, Format.LuminanceAlpha);
Pixmap.setBlending(Blending.None); // disable Blending
this.pixmap.setColor(0, 0, 0, 0.9f);
this.pixmap.fill();
//this.pixmap.setColor(0, 0, 0, 0);
this.pixmap.fillCircle(200, 200, 100);
this.pixmapTexture = new Texture(pixmap, Format.LuminanceAlpha, false);
In the procedure render()
public void render() {
mapRenderer.render();
batch.begin();
batch.draw(pixmapTexture, 0, 0);
batch.end();
}
If I use Format.Alpha when creating the Pixmap and Texture, I don't see the more translucent circle either.
Here is a screenshot of my problem.
Could somebody help me? What should I do, and what should I initialize, before drawing a fully transparent circle? Thanks.
UPDATE
I have found the answer to my problem: I have to disable blending to avoid it.
Now my code:
FrameBuffer fbo = new FrameBuffer(Format.RGBA8888, 620, 620, false);
Texture tex = EnemyOnRadar.assetManager.get("data/sugar.png", Texture.class);
batch.begin();
// others
batch.end();
fbo.begin();
batch.setColor(1,1,1,0.7f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.begin();
batch.draw( tex, 100, 100);
batch.end();
fbo.end();
But I don't see the circle (it's a PNG image with a transparent background and a white filled circle).
I am not sure if this works for you, but I'll share it anyway:
You could use FrameBuffers and do the following:
Draw everything you want to draw on screen.
End your SpriteBatch and begin your FrameBuffer, begin the SpriteBatch again.
Draw the fog, which fills the whole "screen" (FrameBuffer) with a black, non-transparent color.
Draw the "fogless" circle as a white circle at the position where you want to remove the fog.
Set the FrameBuffer's alpha channel (transparency) to 0.7 or something like that.
End the SpriteBatch and the FrameBuffer to draw it to the screen.
What happens? You draw the normal scene, without fog. You create a "virtual screen", fill it with black and overdraw the black with a white circle. Now you set a transparency on this "virtual screen" and overdraw your real screen with it. The part of the screen that is under the white circle appears bright, while the black rest makes your scene darker.
Something to read: 2D Fire effect with libgdx, more or less the same as fog.
My question to this: Libgdx lighting without box2d
EDIT: another Tutorial.
Let me know if it helps!
EDIT: Some Pseudocode:
In create:
fbo = new FrameBuffer(Format.RGBA8888, width, height, false);
In render:
fbo.begin();
Gdx.gl.glClearColor(0f, 0f, 0f, 1f); // Set the clear color to black, non-transparent
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT); // Clear the "virtual screen" with the clear color
spriteBatch.begin(); // Start the SpriteBatch
// Draw the filled circles somehow // Draw your Circle Texture as a white, not transparent Texture
spriteBatch.end(); // End the spritebatch
fbo.end(); // End the FrameBuffer
spriteBatch.begin(); // start the spriteBatch, which now draws to the real screen
// draw your textures, sprites, whatever
spriteBatch.setColor(1f, 1f, 1f, 0.7f); // Sets a global alpha on the SpriteBatch; it may also apply to the stuff you have already drawn. If so, just call spriteBatch.end() before and then spriteBatch.begin() again.
spriteBatch.draw(fbo.getColorBufferTexture(), 0, 0); // draws the FBO's colour texture to the screen
spriteBatch.end();
tell me if it works
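For reference, here is a slightly more concrete version of that render loop. It is a sketch only: spriteBatch, fbo and a hypothetical white-circle texture called circleTex are assumed to exist, and note that you draw the FrameBuffer's colour texture rather than the FrameBuffer object itself, which may also be why the circle never shows up in the updated code above.
// Render the fog layer into the FBO: opaque black, with a white circle punched in.
fbo.begin();
Gdx.gl.glClearColor(0f, 0f, 0f, 1f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
spriteBatch.begin();
spriteBatch.draw(circleTex, 100, 100); // white circle = fog-free area
spriteBatch.end();
fbo.end();
// Now draw the real scene, then the fog layer on top with a global alpha.
spriteBatch.begin();
// ... draw the map, enemies, etc. here ...
spriteBatch.setColor(1f, 1f, 1f, 0.7f);
// getColorBufferTexture() may come out vertically flipped depending on the camera;
// if so, flip the texture region or the V coordinates when drawing it.
spriteBatch.draw(fbo.getColorBufferTexture(), 0, 0);
spriteBatch.setColor(1f, 1f, 1f, 1f); // reset the batch colour
spriteBatch.end();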

JOGL - lighting/camera

I've added a light source to my JOGL project, which seems to work quite well when the object is stationary. When I move the camera it gradually gets darker as it rotates, which is what I'd expect, but as soon as it rotates 90 degrees the screen goes completely black. Does anyone know why this is? Do I need another light source for the other side? I was hoping it would act like the sun, i.e. light up the whole scene but be slightly darker when the camera is on the other side of the object.
Lighting
float light_ambient[] = { 0.0f, 0.0f, 0.0f, 1.0f };
float light_diffuse[] = { 1.0f, 1.0f, 1.0f, 1.0f };
float light_specular[] = { 1.0f, 1.0f, 1.0f, 1.0f };
float light_position[] = { 1.0f, 1.0f, 1.0f, 0.0f };
gl.glLightfv(GL2.GL_LIGHT0, GL2.GL_AMBIENT, light_ambient, 0);
gl.glLightfv(GL2.GL_LIGHT0, GL2.GL_DIFFUSE, light_diffuse, 0);
gl.glLightfv(GL2.GL_LIGHT0, GL2.GL_SPECULAR, light_specular, 0);
gl.glLightfv(GL2.GL_LIGHT0, GL2.GL_POSITION, light_position, 0);
gl.glEnable(GL2.GL_LIGHTING);
gl.glEnable(GL2.GL_LIGHT0);
gl.glDepthFunc(GL.GL_LESS);
gl.glEnable(GL.GL_DEPTH_TEST);
Secondly, when the camera rotates some of the shapes seem to deform and look like completely different shapes, i.e. cubes turning pincushion-like, sides being stretched an incredible amount, and it makes my whole object look slightly deformed. Is there an easy way to change this? I've tried messing with gluPerspective and that doesn't seem to change what I want either. Is there any way around this?
You have added diffuse and specular light to your scene, but these will not reach surfaces that are facing away from the light source. You could add some ambient light (currently set to 0, 0, 0 in your code snippet) so that all surfaces receive some illumination.
As for the deformed shapes, that is really a separate question, and there is not enough detail given to know why this is happening.
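For example, a minimal sketch of adding some ambient light; the values here are just starting points to experiment with, not definitive:
// Give LIGHT0 an ambient component so faces turned away from the light are dim, not black.
float light_ambient[] = { 0.3f, 0.3f, 0.3f, 1.0f };
gl.glLightfv(GL2.GL_LIGHT0, GL2.GL_AMBIENT, light_ambient, 0);
// Alternatively, set a global ambient term that is independent of any single light source.
float global_ambient[] = { 0.2f, 0.2f, 0.2f, 1.0f };
gl.glLightModelfv(GL2.GL_LIGHT_MODEL_AMBIENT, global_ambient, 0);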

Blended textures show up fine, but ordinary coloured shapes do not

For some reason enabling alpha blending results in me not being able to draw run-of-the-mill coloured shapes. The order in which everything is drawn makes no difference. Even if the only thing being drawn is the coloured shape, it still won't show.
Disabling alpha blending fixes this, but disables alpha blending (obviously). This leads me to believe the problem is in how I'm initializing openGL.
The textured objects are contained in the world, which is commented out. Commenting "world.run();" out makes no difference, only disabling alpha blending does.
public class Core {
int width=800, height=600;
//World world;
public void Start(){
try {
Display.setDisplayMode(new DisplayMode(width,height));
Display.create();
} catch (LWJGLException e) {
e.printStackTrace();
System.exit(0);
}
initGL();
System.out.println("OpenGL version: " + GL11.glGetString(GL11.GL_VERSION));
boolean Close = Display.isCloseRequested();
//world = new World(width, height);
while(!Close){
GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
if(Keyboard.isKeyDown(Keyboard.KEY_ESCAPE) || Display.isCloseRequested())
Close = true;
//world.run();
GL11.glColor4d(1, 0, 0, 1);
GL11.glBegin(GL11.GL_QUADS);
GL11.glVertex2d(0, 0);
GL11.glVertex2d(0, 50);
GL11.glVertex2d(50, 50);
GL11.glVertex2d(50, 0);
GL11.glEnd();
Display.update();
//Display.sync(60);
}
}
public void initGL(){
GL11.glEnable(GL11.GL_TEXTURE_2D);
GL11.glClearColor(1.0f, 1.0f, 1.0f, 0.0f);
// enable alpha blending
GL11.glEnable(GL11.GL_BLEND);
GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
GL11.glViewport(0,0,width,height);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glOrtho(0, width, height, 0, 1, -1);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
}
public static void main(String args[]){
Core m = new Core();
m.Start();
}
}
This is for a 2D app where I'm trying to draw metaballs behind the texture of a black-and-white world map.
"Run-of-the-mill coloured shapes" refers to the following:
GL11.glColor4d(1, 0, 0, 1);
GL11.glBegin(GL11.GL_QUADS);
GL11.glVertex2d(0, 0);
GL11.glVertex2d(0, 50);
GL11.glVertex2d(50, 50);
GL11.glVertex2d(50, 0);
GL11.glEnd();
Even if drawn on its own, as long as alpha blending is enabled, it won't show up.
UPDATE:
The constructor for world was loading (but not drawing) a texture. Removing that part of the code lets the coloured square show up. I have deduced that the problem will occur as long as a texture is loaded, regardless of whether it is displayed or not.
You've got glEnable(GL_TEXTURE_2D) in initGL, but I don't see it disabled anywhere.
You know you have to disable texturing if you want to draw an untextured object, right?
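In other words, something along these lines for the quad from the question (just a sketch of the suggested fix):
GL11.glDisable(GL11.GL_TEXTURE_2D); // otherwise the quad is modulated by whatever texture is bound
GL11.glColor4d(1, 0, 0, 1);
GL11.glBegin(GL11.GL_QUADS);
GL11.glVertex2d(0, 0);
GL11.glVertex2d(0, 50);
GL11.glVertex2d(50, 50);
GL11.glVertex2d(50, 0);
GL11.glEnd();
GL11.glEnable(GL11.GL_TEXTURE_2D); // re-enable before drawing textured geometry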

How to translate the camera in GLES2.0?

I want to create a camera moving above a tiled plane. The camera is supposed to move in the XY-plane only and to look straight down all the time. With an orthogonal projection I expect a pseudo-2D renderer.
My problem is, that I don't know how to translate the camera. After some research it seems to me, that there is nothing like a "camera" in OpenGL and I have to translate the whole world. Changing the eye-position and view center coordinates in the Matrix.setLookAtM-function just leads to distorted results.
Translating the whole MVP-Matrix does not work either.
I'm running out of ideas now; do I have to translate every single vertex every frame directly in the vertex buffer? That does not seem plausible to me.
I derived GLSurfaceView and implemented the following functions to setup and update the scene:
public void onSurfaceChanged(GL10 unused, int width, int height) {
GLES20.glViewport(0, 0, width, height);
float ratio = (float) width / height;
// Setup the projection Matrix for an orthogonal view
Matrix.orthoM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
}
public void onDrawFrame(GL10 unused) {
// Draw background color
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
//Setup the camera
float[] camPos = { 0.0f, 0.0f, -3.0f }; //no matter what else I put in here the camera seems to point
float[] lookAt = { 0.0f, 0.0f, 0.0f }; // to the coordinate center and distorts the square
// Set the camera position (View matrix)
Matrix.setLookAtM( vMatrix, 0, camPos[0], camPos[1], camPos[2], lookAt[0], lookAt[1], lookAt[2], 0f, 1f, 0f);
// Calculate the projection and view transformation
Matrix.multiplyMM( mMVPMatrix, 0, projMatrix, 0, vMatrix, 0);
//rotate the viewport
Matrix.setRotateM(mRotationMatrix, 0, getRotationAngle(), 0, 0, -1.0f);
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0);
//I also tried to translate the viewport here
// (and several other places), but I could not find any solution
//draw the plane (actually a simple square right now)
mPlane.draw(mMVPMatrix);
}
Changing the eye-position and view center coordinates in the "LookAt"-function just leads to distorted results.
If you got this from the android tutorial, I think they have a bug in their code. (made a comment about it here)
Try the following fixes:
Use setLookAtM to point the camera where you want it to be.
In the shader, change the gl_Position line
from: " gl_Position = vPosition * uMVPMatrix;"
to: " gl_Position = uMVPMatrix * vPosition;"
I'd think the //rotate the viewport section should be removed as well, as it is not rotating the camera properly. You can change the camera's orientation in the setLookAtM function.
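Put together, moving the camera then just means shifting both the eye and the look-at point in X and Y. A sketch, assuming the shader already uses gl_Position = uMVPMatrix * vPosition, and where camX and camY are hypothetical fields holding the camera position:
// The camera hovers above the XY-plane and looks straight down the negative Z axis.
// A height of 5 sits between the near (3) and far (7) planes of the ortho projection above.
Matrix.setLookAtM(vMatrix, 0,
        camX, camY, 5.0f,   // eye
        camX, camY, 0.0f,   // look-at point, directly below the eye
        0f, 1f, 0f);        // up vector
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, vMatrix, 0);
mPlane.draw(mMVPMatrix);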

Android Getting correct opacity/colors with glBlendFunc blending in OpenGL-ES

In my program I want to draw a square (with varying opacity; let's say red for argument's sake) onto a textured background. However, the square appears the wrong colour depending on the background. With alpha = 1.0f (opaque) the square should appear red, which it does on a black background, but on a white background it appears white (and scales accordingly in between).
In my GLActivity I have set the following flags:
gl.glDisable(GL10.GL_LIGHTING);
gl.glDisable(GL10.GL_CULL_FACE);
gl.glDisable(GL10.GL_DEPTH_BUFFER_BIT);
gl.glDisable(GL10.GL_DEPTH_TEST);
gl.glEnable(GL10.GL_DITHER);
gl.glShadeModel(GL10.GL_SMOOTH);
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE);
Then I draw the background texture:
gl.glClearColor(0.0f,0.0f,0.0f,1.0f);
gl.glBindTexture(GL10.GL_TEXTURE_2D, texturesFPS[1]);
gl.glEnable(GL10.GL_TEXTURE_2D);
gl.glPushMatrix();
gl.glTranslatef(imageOffSetX, 0, 0);
gl.glColor4f(1,1,1,1);
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0,vertexBGBuffer);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBGBuffer);
gl.glDrawElements(GL10.GL_TRIANGLES, indices.length,GL10.GL_UNSIGNED_SHORT, indexBuffer);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glDisable(GL10.GL_TEXTURE_2D);
Then I draw a square onto my background image:
gl.glPushMatrix();
gl.glTranslatef(x, y, 0);
gl.glColor4f(color[0], color[1], color[2], 1.0f); // Set to 1.0f temporarily
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glDrawElements(GL10.GL_TRIANGLES, indices.length, GL10.GL_UNSIGNED_SHORT, indexBuffer);
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glPopMatrix();
I've tried different values in the glBlendFunc but cannot understand how to achieve what I want. Any help is much appreciated :)
My answer may evolve but one thing I know for sure is that you should be using the following blend func...
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
I just found the solution: for the background texture to be drawn correctly I needed this:
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE);
Then before drawing the square I needed to call this:
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
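Put into the drawing code, that sequence looks something like this (a sketch; drawBackground and drawSquare are hypothetical helpers standing in for the two blocks of drawing code shown in the question):
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE);                 // additive blend for the background texture
drawBackground(gl);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA); // standard alpha blend for the square
drawSquare(gl);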
