I'm trying to develop an Android 2D game using OpenGL ES that uses a tiled map, and I heard that it's best to store the tiles in a texture atlas (one large bitmap with multiple tiles) for performance reasons.
Does anyone have some sample code that demonstrates how to draw a tile from a texture atlas in Android OpenGL ES?
public void onDrawFrame(GL10 gl) {
...
}
Well, I figured out how to do it:
public void onDrawFrame(GL10 gl) {
...
int[] crop = new int[4]; // the crop rect is {u, v, width, height}, in texels
crop[0] = tileWidth * tileIndex; // u: left edge of the nth tile (tiles laid out in a single row)
crop[1] = tileHeight; // v: bottom edge of the tile in the bitmap's top-down coordinates
crop[2] = tileWidth; // source width
crop[3] = -tileHeight; // negative height reads the tile bottom-up, so it draws the right way up
// specify the source rectangle
((GL11) gl).glTexParameteriv(GL10.GL_TEXTURE_2D, GL11Ext.GL_TEXTURE_CROP_RECT_OES, crop, 0);
// draw the texture
((GL11Ext) gl).glDrawTexiOES(x, y, 0, tileWidth, tileHeight);
...
}
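If the atlas has more than one row of tiles, the same crop-rect idea generalizes. A minimal sketch, assuming a hypothetical tilesPerRow constant (the number of tile columns in the atlas) and a bitmap uploaded top-down as GLUtils.texImage2D does:
int col = tileIndex % tilesPerRow; // column of the tile within the atlas
int row = tileIndex / tilesPerRow; // row of the tile within the atlas
int[] crop = new int[4];
crop[0] = col * tileWidth; // u: left edge of the tile
crop[1] = (row + 1) * tileHeight; // v: bottom edge of the tile in top-down bitmap coordinates
crop[2] = tileWidth;
crop[3] = -tileHeight; // flip, as above
((GL11) gl).glTexParameteriv(GL10.GL_TEXTURE_2D, GL11Ext.GL_TEXTURE_CROP_RECT_OES, crop, 0);
((GL11Ext) gl).glDrawTexiOES(x, y, 0, tileWidth, tileHeight);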
I'm making a scratchcard mini-game and I want to make the scratchcard texture erasable with another texture. I have a scratchcard texture (colorful) and a mask texture (a white circle). I'm trying to make it so that where the mask texture touches the scratchcard texture it becomes a 'hole' you can see through to the background.
I've tried making it with blending.
This is code I found on Stack Overflow in a different topic, which I tried to modify, but it didn't seem to work.
// draw our destination image
super.draw(batch, parentAlpha);
batch.end();
// remember SpriteBatch's current functions
int srcFunc = batch.getBlendSrcFunc();
int dstFunc = batch.getBlendDstFunc();
// Let's enable blending
batch.enableBlending();
batch.begin();
// blend them
batch.setBlendFunction(GL20.GL_ZERO, GL20.GL_ONE_MINUS_DST_ALPHA);
image.setPosition(Gdx.input.getX() - (image.getWidth() / 2), -Gdx.input.getY() + (1280 * GambleRPG.SCALE_Y) - (image.getHeight() / 2));
image.draw(batch, parentAlpha);
// Reset
batch.end();
batch.begin();
batch.setBlendFunction(srcFunc, dstFunc);
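For what it's worth, the pattern that usually works for scratchcards in libGDX is to do the erase offscreen in a FrameBuffer (com.badlogic.gdx.graphics.glutils.FrameBuffer), so the mask blends against the scratch layer's alpha rather than against the whole scene. A minimal sketch, assuming hypothetical scratchTexture, maskTexture and backgroundTexture textures and touch coordinates touchX/touchY:
// one-time setup: copy the scratch layer into an offscreen buffer
FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGBA8888, width, height, false);
fbo.begin();
batch.begin();
batch.setBlendFunction(GL20.GL_ONE, GL20.GL_ZERO); // plain copy, keep the texture's alpha
batch.draw(scratchTexture, 0, 0, width, height);
batch.end();
fbo.end();
// on every touch: punch a hole where the mask is opaque
fbo.begin();
batch.begin();
batch.setBlendFunction(GL20.GL_ZERO, GL20.GL_ONE_MINUS_SRC_ALPHA); // dst *= (1 - mask alpha)
batch.draw(maskTexture, touchX - maskTexture.getWidth() / 2f, touchY - maskTexture.getHeight() / 2f);
batch.end();
fbo.end();
// each frame: background first, then the scratch layer with normal blending
batch.begin();
batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
batch.draw(backgroundTexture, 0, 0, width, height);
batch.draw(fbo.getColorBufferTexture(), 0, 0, width, height, 0, 0, 1, 1); // these UVs un-flip the FBO texture
batch.end();
Because the holes accumulate in the FrameBuffer, they persist between frames without redrawing every past touch.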
I am quite new to programming, so bear with me here...
I am making a basic 2D game just to practice programming in Android Studio and can't get my sprite to the correct position on the screen. Also, when I draw the sprite it appears stretched and the quality is very poor. Any help is appreciated!
public class MyGdxGame extends ApplicationAdapter {
SpriteBatch batch;
Texture background;
Texture ball;
@Override
public void create () {
batch = new SpriteBatch();
background = new Texture("gamebackground.png");
ball = new Texture("ball2.png");
}
@Override
public void render () {
batch.begin();
batch.draw(background, 0,0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
batch.draw(ball, 0,0, Gdx.graphics.getWidth() / 2, Gdx.graphics.getHeight() / 2);
batch.end();
}
You need to keep the original width/height ratio:
instead of scaling the ball to half the screen size, define your scaling like this:
float scaleFactor = 2.0f;
batch.draw(ball, 0, 0, ball.getWidth() * scaleFactor, ball.getHeight() * scaleFactor);
If your image is "blurry" and you want the individual pixels to stay crisp, try loading the texture like this:
ball = new Texture("ball2.png");
ball.setFilter(TextureFilter.Nearest, TextureFilter.Nearest);
This prevents (default) linear interpolation when scaling the texture.
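For the "correct position on the screen" part, the usual libGDX advice is to draw in a fixed virtual coordinate system through a viewport instead of raw device pixels. A minimal sketch, assuming an 800x480 virtual world (OrthographicCamera and FitViewport come from com.badlogic.gdx.graphics and com.badlogic.gdx.utils.viewport):
OrthographicCamera camera;
FitViewport viewport;

@Override
public void create () {
    camera = new OrthographicCamera();
    viewport = new FitViewport(800, 480, camera); // the world is always 800x480 units
    batch = new SpriteBatch();
    ...
}

@Override
public void resize (int width, int height) {
    viewport.update(width, height, true); // true re-centres the camera
}

@Override
public void render () {
    batch.setProjectionMatrix(camera.combined);
    batch.begin();
    batch.draw(ball, 100, 50); // coordinates are now virtual units, the same on every device
    batch.end();
}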
I'm trying to port a .NET app to Android where I capture each frame from the camera and then modify it according to user settings before displaying it. Doing it in .NET was simple, since I could simply query the camera for the next image and get a bitmap that I could access at will.
One of the many processing options requires the application to obtain the intensity histogram of each captured image and then do some modifications to the captured image before displaying the result (based on user settings). What I'm attempting to do is to capture and modify the camera preview in Android.
I understand that the "best" way to achieve some sort of real time-ish camera processing is by using OpenGL as the preview framework by using a GLES11Ext.GL_TEXTURE_EXTERNAL_OES texture.
I am able to capture the preview and do some processing in my fragment shader like turning the image gray scale, modifying the colors of the fragment, threshold clipping, etc., but to do stronger processing like computing histogram, or applying Fast Fourier Transform, I need (fast) access (read/write) to all the pixels in RGB format in the captured image contained in the texture before displaying it.
I'm using Java with OpenGL ES 2.0 for Android.
My current code for drawing does the following:
private int mTexture; // texture handle created to hold the captured image
...
// Called for every captured frame
public void onDraw()
{
int mPositionHandle;
int mTextureCoordHandle;
GLES20.glUseProgram(mProgram);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTexture);
// TODO:
// Obtain RGB pixels of the texture and manipulate here.
// TODO:
// Put the resulting RGB pixels in a texture for display.
// prepare for drawing
mPositionHandle = GLES20.glGetAttribLocation(mProgram, "position");
GLES20.glEnableVertexAttribArray(mPositionHandle);
GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);
mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
GLES20.glEnableVertexAttribArray(mTextureCoordHandle);
GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);
// draw the texture
GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
}
My vertex and fragment shaders are very simple:
Vertex shader:
attribute vec4 position;
attribute vec2 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main()
{
gl_Position = position;
textureCoordinate = inputTextureCoordinate;
}
Fragment shader (accesses the captured image directly):
/* Shader: Gray Scale*/
#extension GL_OES_EGL_image_external : require
precision mediump float;
varying vec2 textureCoordinate;
uniform samplerExternalOES s_texture;
void main()
{
float clr = dot(texture2D(s_texture, textureCoordinate), vec4(0.299, 0.587, 0.114, 0.0));
gl_FragColor = vec4(clr, clr, clr, 1.0);
}
It would be ideal if I could obtain the width and height of the captured texture, and could read and modify (or write into another texture) the RGB value of every pixel in the capture, e.g. as an array of bytes where each byte represents a color channel, for processing before displaying.
I am starting to learn OpenGL ES and got this project along the way. Any help is deeply appreciated, thank you.
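GLES 2.0 has no direct way to read a texture back, but a common pattern (sketched from memory, so treat it as a starting point) is to render the external texture into a framebuffer object whose colour attachment is an ordinary RGBA texture, then read the result back with glReadPixels for CPU-side work like the histogram or FFT. Assuming previewWidth/previewHeight hold the camera preview size (imports: java.nio.ByteBuffer, java.nio.ByteOrder):
// one-time setup: an FBO backed by a normal RGBA texture
int[] fbo = new int[1], offTex = new int[1];
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glGenTextures(1, offTex, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, offTex[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, previewWidth, previewHeight, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, offTex[0], 0);
// per frame: draw the camera quad into the FBO exactly as in onDraw(), then read back
ByteBuffer pixels = ByteBuffer.allocateDirect(previewWidth * previewHeight * 4).order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, previewWidth, previewHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels); // one byte per channel, RGBA order
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); // back to the default framebuffer for display
Be aware that glReadPixels stalls the pipeline, so it will likely bound your frame rate; the modified pixels can be uploaded to a display texture with glTexSubImage2D.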
I'm drawing some points in OpenGL (JOGL) as follows:
BufferedImage image = loadMyTextureImage();
Texture tex = TextureIO.newTexture(image, false);
tex.setTexParameteri(GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
tex.setTexParameteri(GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
tex.bind();
gl.glColor4f(r,g,b,a);
gl.glBegin(GL.GL_POINTS);
for ( int i = 0; i < numPoints; i++ ) {
// compute x,y,z
gl.glVertex3f(x,y,z);
}
gl.glEnd();
My image is a white image, so I can reuse that same texture and just color it using gl.glColor4f, but I would like to draw an outline around it in a different color. Is there a way to do that?
If you're using the texture to determine the shape of the point, then the obvious way to do the outline would be to add a second texture to draw the outline of the point on top.
The outline texture would also be white, so you could colour it to any colour you like in the same way.
Depending on the alpha-blending mode you use, this can also be used to give a "glowing" edge effect.
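A minimal sketch of that two-pass idea, assuming a hypothetical outlineTex: a white ring image (transparent centre) loaded exactly like the fill texture, drawn in a second pass on top:
// first pass: the existing fill draw (tex.bind(), gl.glColor4f(r,g,b,a), the GL_POINTS loop)
// second pass: the same points again, with the ring texture and the outline colour
outlineTex.bind();
gl.glColor4f(outlineR, outlineG, outlineB, outlineA); // any outline colour you like
gl.glBegin(GL.GL_POINTS);
for ( int i = 0; i < numPoints; i++ ) {
    // compute the same x,y,z as in the first pass
    gl.glVertex3f(x, y, z);
}
gl.glEnd();
Alpha blending has to be on (glEnable(GL.GL_BLEND) with GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA) so the ring's transparent centre lets the first pass show through.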
I'm drawing a texture mapped quad with a texture that has some transparent pixels in it. So I load the texture and then draw it:
gl.glEnable(GL.GL_ALPHA_TEST);
gl.glAlphaFunc(GL.GL_GREATER,0);
gl.glBindTexture(GL.GL_TEXTURE_2D, texture);
gl.glTexImage2D(GL.GL_TEXTURE_2D, 0, GL.GL_RGBA, width, height, 0, GL.GL_RGBA, GL.GL_UNSIGNED_BYTE, buff);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR);
gl.glTexParameteri(GL.GL_TEXTURE_2D, GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR);
gl.glTexEnvi(GL.GL_TEXTURE_ENV, GL.GL_TEXTURE_ENV_MODE, GL.GL_REPLACE);
gl.glColor4i(1,1,1,1);
gl.glBegin(GL.GL_QUADS);
// draw my quad here
gl.glEnd();
When I draw this, the transparent pixels do not show, as expected. However, now I want to apply a fragment shader. In this case I'll test something simple:
void main() {
gl_FragColor = vec4(0.0);
if ( gl_Color.a > 0.0 ) {
gl_FragColor = vec4(1.0,0,0,1.0);
}
}
In this case, ALL of the pixels show up as red, instead of just the non-transparent ones. Can someone explain how to color just the non-transparent pixels using a shader?
thanks,
Jeff
gl_Color is the color output by the vertex shader. Unless your vertex shader did the texture fetching for you, it probably just passes the color attribute straight through to your fragment shader via gl_FrontColor.
If you're not using a vertex shader, and just using fixed-function vertex processing, then it's a certainty that the fragment shader was given only the color. Remember that fragment shaders override all per-fragment glTexEnv operations, including the fetching of the texture.
If you want to test the texture's opacity, then you need to fetch from the texture yourself. That requires a sampler2D uniform and the texture2D function (assuming you're using GLSL version 1.20; in later versions, you want the texture function). It's been a while since I did anything with OpenGL's built-in inputs and outputs, but the shader would look something like this:
#version 120
uniform sampler2D myTexture;
void main()
{
gl_FragColor = vec4(0.0);
vec4 texColor = texture2D(myTexture, gl_TexCoord[0].st); // assuming you're using texture coordinate 0
if (texColor.a > 0.0)
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // only the non-transparent texels turn red
}
You will also have to set the sampler uniform's value to the index of the texture unit you're using; roughly like this:
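A short sketch, assuming the linked program object is called program and the texture is bound to texture unit 0:
// point the sampler uniform at the texture unit the texture is bound to
gl.glUseProgram(program);
int loc = gl.glGetUniformLocation(program, "myTexture");
gl.glUniform1i(loc, 0); // 0 == GL_TEXTURE0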