Android OpenGL ES 2.0: all applications getting error - java

I'm new to the Android SDK and to programming with OpenGL ES 2.0. My problem is that most of the programs do not run on my PC.
I'm using an Android Virtual Device (Nexus 4) with 512 MB RAM, a 64 MB VM heap, 512 MB internal storage, and Android 4.3 (API 18, no SD card).
A sample I'm trying to run is:
package com.example.mynewsample;

//
// Book:      OpenGL(R) ES 2.0 Programming Guide
// Authors:   Aaftab Munshi, Dan Ginsburg, Dave Shreiner
// ISBN-10:   0321502795
// ISBN-13:   9780321502797
// Publisher: Addison-Wesley Professional
// URLs:      http://safari.informit.com/9780321563835
//            http://www.opengles-book.com
//
// Hello_Triangle
//
// This is a simple example that draws a single triangle with
// a minimal vertex/fragment shader. The purpose of this
// example is to demonstrate the basic concepts of
// OpenGL ES 2.0 rendering.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.content.Context;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.util.Log;

public class myTriangleRenderer implements GLSurfaceView.Renderer
{
    ///
    // Constructor
    //
    public myTriangleRenderer(Context context)
    {
        mVertices = ByteBuffer.allocateDirect(mVerticesData.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mVertices.put(mVerticesData).position(0);
    }

    ///
    // Create a shader object, load the shader source, and
    // compile the shader.
    //
    private int LoadShader(int type, String shaderSrc)
    {
        int shader;
        int[] compiled = new int[1];

        // Create the shader object
        shader = GLES20.glCreateShader(type);
        if (shader == 0)
            return 0;

        // Load the shader source
        GLES20.glShaderSource(shader, shaderSrc);

        // Compile the shader
        GLES20.glCompileShader(shader);

        // Check the compile status
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0)
        {
            Log.e(TAG, GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }

    ///
    // Initialize the shader and program object
    //
    public void onSurfaceCreated(GL10 glUnused, EGLConfig config)
    {
        String vShaderStr =
                  "attribute vec4 vPosition;    \n"
                + "void main()                  \n"
                + "{                            \n"
                + "   gl_Position = vPosition;  \n"
                + "}                            \n";

        String fShaderStr =
                  "precision mediump float;                     \n"
                + "void main()                                  \n"
                + "{                                            \n"
                + "  gl_FragColor = vec4 ( 1.0, 0.0, 0.0, 1.0 );\n"
                + "}                                            \n";

        int vertexShader;
        int fragmentShader;
        int programObject;
        int[] linked = new int[1];

        // Load the vertex/fragment shaders
        vertexShader = LoadShader(GLES20.GL_VERTEX_SHADER, vShaderStr);
        fragmentShader = LoadShader(GLES20.GL_FRAGMENT_SHADER, fShaderStr);

        // Create the program object
        programObject = GLES20.glCreateProgram();
        if (programObject == 0)
            return;

        GLES20.glAttachShader(programObject, vertexShader);
        GLES20.glAttachShader(programObject, fragmentShader);

        // Bind vPosition to attribute 0
        GLES20.glBindAttribLocation(programObject, 0, "vPosition");

        // Link the program
        GLES20.glLinkProgram(programObject);

        // Check the link status
        GLES20.glGetProgramiv(programObject, GLES20.GL_LINK_STATUS, linked, 0);
        if (linked[0] == 0)
        {
            Log.e(TAG, "Error linking program:");
            Log.e(TAG, GLES20.glGetProgramInfoLog(programObject));
            GLES20.glDeleteProgram(programObject);
            return;
        }

        // Store the program object
        mProgramObject = programObject;

        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    }

    ///
    // Draw a triangle using the shader pair created in onSurfaceCreated()
    //
    public void onDrawFrame(GL10 glUnused)
    {
        // Set the viewport
        GLES20.glViewport(0, 0, mWidth, mHeight);

        // Clear the color buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        // Use the program object
        GLES20.glUseProgram(mProgramObject);

        // Load the vertex data
        GLES20.glVertexAttribPointer(0, 3, GLES20.GL_FLOAT, false, 0, mVertices);
        GLES20.glEnableVertexAttribArray(0);

        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, 3);
    }

    ///
    // Handle surface changes
    //
    public void onSurfaceChanged(GL10 glUnused, int width, int height)
    {
        mWidth = width;
        mHeight = height;
    }

    // Member variables
    private int mProgramObject;
    private int mWidth;
    private int mHeight;
    private FloatBuffer mVertices;
    private static String TAG = "HelloTriangleRenderer";

    private final float[] mVerticesData =
            { 0.0f, 0.5f, 0.0f, -0.5f, -0.5f, 0.0f, 0.5f, -0.5f, 0.0f };
}
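As a sanity check, the buffer setup in the constructor is plain java.nio and can be exercised off-device. A minimal sketch (class and method names here are mine, not from the sample):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferSketch {
    public static FloatBuffer toDirectFloatBuffer(float[] data) {
        // 4 bytes per float; native byte order is required for OpenGL
        FloatBuffer fb = ByteBuffer.allocateDirect(data.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        fb.put(data).position(0); // rewind so GL reads from the start
        return fb;
    }

    public static void main(String[] args) {
        float[] verts = { 0.0f, 0.5f, 0.0f, -0.5f, -0.5f, 0.0f, 0.5f, -0.5f, 0.0f };
        FloatBuffer fb = toDirectFloatBuffer(verts);
        System.out.println(fb.remaining()); // 9 floats ready to read
        System.out.println(fb.get(1));      // 0.5
    }
}
```

If this prints as expected, the crash is not in the buffer code, which points back at the emulator's GL support.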
I have tried different virtual devices, but each time the app crashes with the "Unfortunately, <app> has stopped" dialog.
I get this with all OpenGL ES 2.0 programs that don't use Canvas; a Canvas-based program runs correctly.

My experience thus far has always been that the Android emulator does not fully support OpenGL ES 2.0, only ES 1.x. By far the easiest approach is to test on a physical device.
However, please check out this question, which suggests it can now be done:
Android OpenGL ES 2.0 emulator

OpenGL ES 2.0 emulation on AVDs actually works pretty well as of Jelly Bean. The critical factor, however, is the underlying OpenGL driver installed on your host development system: it really needs to be a recent Nvidia or AMD driver. Installing Intel's HAXM also makes the emulator run much faster. See the third article here:
http://montgomery1.com/opengl/
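If you do try the emulator route, GPU emulation also has to be switched on for the AVD. One way is to edit the AVD's config.ini (the exact path and keys can vary between SDK versions, so treat this as a starting point to verify):

```ini
; ~/.android/avd/<your_avd>.avd/config.ini
hw.gpu.enabled=yes
```

The same effect can usually be had by ticking "Use Host GPU" in the AVD Manager, or by launching with `emulator -avd <your_avd> -gpu on`.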


Problem with GLSL not working as i intended [closed]

So I was wondering: is this reproducible?
While I was debugging a Java OpenGL project, I found this shader:
#version 420 core

uniform sampler2D texture1;
uniform sampler2D texture2;

in vec2 uv;
out vec4 fragColor;

void main() {
    //fragColor = texture(texture1, uv);
    fragColor = texture(texture2, uv);
}
Looks simple, right? But when I uncomment `fragColor = texture(texture1, uv);` and keep the rest, I get texture1 rendered to the screen. Why? My brain says that's not right; shouldn't it just render texture2, because I overwrite fragColor? Can somebody explain this?
UPDATE 1:
I believe it's a problem with GLSL compilation.
Is it possible to bind a texture to sampler 1 when there is no texture bound to sampler 0?
UPDATE 2:
Creating the texture. In my case it's a texture with 1 sample, so GL_TEXTURE_2D; the source is a .png, so 4 channels; and no interpolation is applied:
texType = samples > 1 ? GL_TEXTURE_2D_MULTISAMPLE : GL_TEXTURE_2D;
int format;
if (channels == 3) {
    format = GL_RGB;
} else if (channels == 4) {
    format = GL_RGBA;
} else {
    throw new AspectGraphicsException("textures can't be initialized with " + channels + " channels");
}
ID = glGenTextures();
glBindTexture(texType, ID);
glTexParameteri(texType, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(texType, GL_TEXTURE_MAG_FILTER, interpolation ? GL_LINEAR : GL_NEAREST);
glTexParameteri(texType, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(texType, GL_TEXTURE_WRAP_T, GL_REPEAT);
if (samples > 1) {
    if (pixels == null) {
        glTexImage2DMultisample(texType, samples, format, width, height, true);
    } else {
        throw new AspectGraphicsException("textures defined with pixels can't be multisampled");
    }
} else {
    if (pixels == null) {
        glTexImage2D(texType, 0, format, width, height, 0, format, GL_UNSIGNED_BYTE, NULL);
    } else {
        glTexImage2D(texType, 0, format, width, height, 0, format, GL_UNSIGNED_BYTE, pixels);
    }
}
glBindTexture(texType, 0);
Binding the texture. texType is just GL_TEXTURE_2D; samplerName is "texture1" or "texture2" (matching the GLSL shader); and the sampler unit is 0 for "texture1" and 1 for "texture2":
glActiveTexture(GL_TEXTURE0 + sampler);
glBindTexture(texType, ID);
shader.uniform1i(samplerName, sampler);
It's most likely that you didn't assign a texture unit to your sampler uniforms, so they were both set to point to GL_TEXTURE0. You can specify it in the shader like so:
#version 420 core
layout(binding=0) uniform sampler2D texture1;
layout(binding=1) uniform sampler2D texture2;
// ...
Then you bind the textures with:
glActiveTexture(GL_TEXTURE0 + 0);
glBindTexture(GL_TEXTURE_2D, your_texture);
glActiveTexture(GL_TEXTURE0 + 1);
glBindTexture(GL_TEXTURE_2D, other_texture);
glDrawArrays(...);
If done this way, you'll get the right result irrespective of which uniforms are left out.
See Binding textures to samplers.

Keras Neural Network output different than Java TensorFlowInferenceInterface output

I have created a neural network in Keras using the InceptionV3 pretrained model:
base_model = applications.inception_v3.InceptionV3(weights='imagenet', include_top=False)
# add a global spatial average pooling layer
x = base_model.output
x = GlobalAveragePooling2D()(x)
# let's add a fully-connected layer
x = Dense(2048, activation='relu')(x)
x = Dropout(0.5)(x)
predictions = Dense(len(labels_list), activation='sigmoid')(x)
I trained the model successfully and now want to predict the following image: https://imgur.com/a/hoNjDfR. The image is therefore resized to 299x299 and normalized (just divided by 255):
def img_to_array(img, data_format='channels_last', dtype='float32'):
    if data_format not in {'channels_first', 'channels_last'}:
        raise ValueError('Unknown data_format: %s' % data_format)
    # Numpy array x has format (height, width, channel)
    # or (channel, height, width)
    # but original PIL image has format (width, height, channel)
    x = np.asarray(img, dtype=dtype)
    if len(x.shape) == 3:
        if data_format == 'channels_first':
            x = x.transpose(2, 0, 1)
    elif len(x.shape) == 2:
        if data_format == 'channels_first':
            x = x.reshape((1, x.shape[0], x.shape[1]))
        else:
            x = x.reshape((x.shape[0], x.shape[1], 1))
    else:
        raise ValueError('Unsupported image shape: %s' % (x.shape,))
    return x

def load_image_as_array(path):
    if pil_image is not None:
        _PIL_INTERPOLATION_METHODS = {
            'nearest': pil_image.NEAREST,
            'bilinear': pil_image.BILINEAR,
            'bicubic': pil_image.BICUBIC,
        }
        # These methods were only introduced in version 3.4.0 (2016).
        if hasattr(pil_image, 'HAMMING'):
            _PIL_INTERPOLATION_METHODS['hamming'] = pil_image.HAMMING
        if hasattr(pil_image, 'BOX'):
            _PIL_INTERPOLATION_METHODS['box'] = pil_image.BOX
        # This method is new in version 1.1.3 (2013).
        if hasattr(pil_image, 'LANCZOS'):
            _PIL_INTERPOLATION_METHODS['lanczos'] = pil_image.LANCZOS
    with open(path, 'rb') as f:
        img = pil_image.open(io.BytesIO(f.read()))
        width_height_tuple = (IMG_HEIGHT, IMG_WIDTH)
        resample = _PIL_INTERPOLATION_METHODS['nearest']
        img = img.resize(width_height_tuple, resample)
    return img_to_array(img, data_format=K.image_data_format())
img_array = load_image_as_array('https://imgur.com/a/hoNjDfR')
img_array = img_array/255
Then I predict it with the trained model in Keras:
predict(img_array.reshape(1,img_array.shape[0],img_array.shape[1],img_array.shape[2]))
The result is the following:
array([[0.02083278, 0.00425783, 0.8858412 , 0.17453966, 0.2628744 ,
0.00428194, 0.2307986 , 0.01038828, 0.07561868, 0.00983179,
0.09568241, 0.03087404, 0.00751176, 0.00651798, 0.03731382,
0.02220723, 0.0187968 , 0.02018479, 0.3416505 , 0.00586909,
0.02030778, 0.01660049, 0.00960067, 0.02457979, 0.9711478 ,
0.00666443, 0.01468313, 0.0035468 , 0.00694743, 0.03057212,
0.00429407, 0.01556832, 0.03173089, 0.01407397, 0.35166138,
0.00734553, 0.0508953 , 0.00336689, 0.0169737 , 0.07512951,
0.00484502, 0.01656419, 0.01643038, 0.02031735, 0.8343202 ,
0.02500874, 0.02459189, 0.01325032, 0.00414564, 0.08371573,
0.00484318]], dtype=float32)
The important point is that it has four values greater than 0.8:
>>> y[y>=0.8]
array([0.9100583 , 0.96635956, 0.91707945, 0.9711707 ], dtype=float32)
Now I have converted my network to a .pb file and imported it into an Android project. I want to predict the same image on Android, so I also resize and normalize the image like I did in Python, using the following code:
// Resize image:
InputStream imageStream = getAssets().open("test3.jpg");
Bitmap bitmap = BitmapFactory.decodeStream(imageStream);
Bitmap resized_image = utils.processBitmap(bitmap,299);
and then normalize by using the following function:
public static float[] normalizeBitmap(Bitmap source, int size) {
    float[] output = new float[size * size * 3];
    int[] intValues = new int[source.getHeight() * source.getWidth()];
    source.getPixels(intValues, 0, source.getWidth(), 0, 0, source.getWidth(), source.getHeight());
    for (int i = 0; i < intValues.length; ++i) {
        final int val = intValues[i];
        output[i * 3] = Color.blue(val) / 255.0f;
        output[i * 3 + 1] = Color.green(val) / 255.0f;
        output[i * 3 + 2] = Color.red(val) / 255.0f;
    }
    return output;
}
But in Java I get different values: none of the four indices has a value greater than 0.8; their values are all between 0.1 and 0.4!
I have checked my code several times, but I don't understand why I don't get the same values for the same image on Android. Any idea or hint?
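One concrete suspect in the posted code: `normalizeBitmap` writes `Color.blue(val)` into the first channel of each pixel, while the Keras pipeline (PIL via `img_to_array`) produces arrays in RGB order, so the network would see swapped red/blue channels. Assuming that is the mismatch, here is an Android-free sketch of RGB-order normalization from packed ARGB ints (android.graphics.Color packs pixels as 0xAARRGGBB, which the bit shifts below reproduce):

```java
public class NormalizeSketch {
    // Convert packed ARGB ints to a flat float array in RGB order, scaled to [0, 1].
    public static float[] normalize(int[] argbPixels) {
        float[] out = new float[argbPixels.length * 3];
        for (int i = 0; i < argbPixels.length; i++) {
            int p = argbPixels[i];
            out[i * 3]     = ((p >> 16) & 0xFF) / 255.0f; // red first, matching PIL
            out[i * 3 + 1] = ((p >> 8)  & 0xFF) / 255.0f; // green
            out[i * 3 + 2] = ( p        & 0xFF) / 255.0f; // blue last
        }
        return out;
    }

    public static void main(String[] args) {
        int pureRed = 0xFFFF0000; // alpha=255, R=255, G=0, B=0
        float[] out = normalize(new int[]{ pureRed });
        System.out.println(out[0] + " " + out[1] + " " + out[2]); // 1.0 0.0 0.0
    }
}
```

If the model was trained on RGB input, feeding it BGR-ordered floats would plausibly depress exactly the confident predictions, which matches the symptom described.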

Send shadow map to shader in OpenGL

I am trying to implement shadow-mapping in my scene, but all I get is zeros in my fragment shader when I call texture() (I've tested it with == 0.0). My question: Am I sending the depth texture to the shader correctly?
Here is my fragment shader code:
bool getShadow() {
    vec4 lightProjPositionScaled = lightProjPosition / lightProjPosition.w;
    vec2 texCoords = lightProjPositionScaled.xy * 0.5 + 0.5; // bias
    return lightProjPositionScaled.z + 0.0005 > texture(shadowMap, texCoords).x;
}
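As an aside, the 0.5/0.5 "bias" in this shader is just the standard NDC-to-texture-space remap after the perspective divide, and that arithmetic can be checked in plain Java (class and method names are illustrative):

```java
public class ShadowCoordSketch {
    // Map clip-space (x, y, w) to shadow-map texture coordinates:
    // perspective divide to NDC in [-1, 1], then remap to [0, 1].
    public static float[] toTexCoords(float x, float y, float w) {
        float ndcX = x / w;
        float ndcY = y / w;
        return new float[]{ ndcX * 0.5f + 0.5f, ndcY * 0.5f + 0.5f };
    }

    public static void main(String[] args) {
        float[] center = toTexCoords(0f, 0f, 1f);
        System.out.println(center[0] + " " + center[1]); // 0.5 0.5 -> middle of the map
        float[] corner = toTexCoords(-2f, -2f, 2f);
        System.out.println(corner[0] + " " + corner[1]); // 0.0 0.0 -> lower-left corner
    }
}
```

So the shader math itself is conventional; the all-zeros result points at the texture setup on the Java side rather than at this function.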
Here is my relevant Java code. Init (edited due to BDL's comment):
gl.glEnable(GL2.GL_TEXTURE_2D);
// generate stuff
IntBuffer ib = IntBuffer.allocate(1);
gl.glGenFramebuffers(1, ib);
frameBuffer = ib.get(0);
ib = IntBuffer.allocate(1);
gl.glGenTextures(1, ib);
shadowMap = ib.get(0);
gl.glBindFramebuffer(GL2.GL_FRAMEBUFFER, frameBuffer);
gl.glBindTexture(GL2.GL_TEXTURE, shadowMap);
gl.glTexImage2D(GL2.GL_TEXTURE_2D, 0, GL2.GL_DEPTH_COMPONENT, 1024, 1024, 0, GL2.GL_DEPTH_COMPONENT, GL2.GL_FLOAT, null);
gl.glDrawBuffer(GL2.GL_NONE);
gl.glReadBuffer(GL2.GL_NONE);
gl.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
// prevents 'shadow acne'
gl.glPolygonOffset(2.5f, 0);
// prevents multiple shadows
gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_WRAP_S, GL2.GL_CLAMP_TO_EDGE);
gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_WRAP_T, GL2.GL_CLAMP_TO_EDGE);
// prevents (or expects!!!) pixel-y textures
gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_MIN_FILTER, GL2.GL_NEAREST);
gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_TEXTURE_MAG_FILTER, GL2.GL_NEAREST);
// store one value in all four components of pixel
gl.glTexParameteri(GL2.GL_TEXTURE_2D, GL2.GL_DEPTH_TEXTURE_MODE, GL2.GL_INTENSITY);
Display (1st pass, for shadows):
// render shadows
gl.glUseProgram(shadowProgram);
gl.glUniformMatrix4fv(lightMatrixLocShadow, 1, false, lightMatrix.getMatrix(), 0); // yep (haha change viewMatrix -> lightMatrix)
gl.glUniformMatrix4fv(projMatrixLocShadow, 1, false, projMatrix.getMatrix(), 0);
gl.glBindFramebuffer(GL2.GL_FRAMEBUFFER, sha.frameBuffer);
gl.glViewport(0, 0, 1024, 1024);
gl.glClear(GL2.GL_DEPTH_BUFFER_BIT);
renderScene(gl, sunMatrix);
gl.glCopyTexImage2D(GL2.GL_TEXTURE_2D, 0, GL2.GL_DEPTH_COMPONENT, 0, 0, 1024, 1024, 0);
gl.glBindFramebuffer(GL2.GL_FRAMEBUFFER, 0);
Display (2nd pass, for rendering the scene):
// render display (regular)
gl.glUseProgram(displayProgram);
gl.glDrawBuffer(GL2.GL_FRONT);
gl.glReadBuffer(GL2.GL_FRONT);
gl.glUniformMatrix4fv(viewMatrixLoc, 1, false, viewMatrix.getMatrix(), 0);
gl.glUniformMatrix4fv(projMatrixLocDisplay, 1, false, projMatrix.getMatrix(), 0);
gl.glUniformMatrix4fv(lightMatrixLocDisplay, 1, false, lightMatrix.getMatrix(), 0);
gl.glUniform4fv(sunPositionLoc, 1, sunWorldPosition, 0); // send sun's position to shader
gl.glUniform1f(sunBrightnessLoc, sunBrightness);
gl.glUniform1i(shadowMapLoc, 0);
gl.glViewport(0, 0, screenWidth, screenHeight);
// day-night cycle
float[] color = SkyManager.getSkyColor(time);
gl.glClearColor(color[0], color[1], color[2], 1);
gl.glClear(GL2.GL_COLOR_BUFFER_BIT | GL2.GL_DEPTH_BUFFER_BIT);
gl.glActiveTexture(GL2.GL_TEXTURE0);
gl.glBindTexture(GL2.GL_TEXTURE_2D, sha.shadowMap);
renderScene(gl, sunMatrix);
Another strange outcome is that only fragments on the z=0 plane relative to the light matrix (the light is rotating, and the plane rotates with it) are lit. All other fragments, behind and in front of the light, are shadowed.
One issue was with the line gl.glBindTexture(GL2.GL_TEXTURE, shadowMap);
I was binding the texture to GL_TEXTURE instead of GL_TEXTURE_2D.

Java: How to capture pen data from a Wacom STU-530 tablet?

I'm trying to capture all the pen data: touch, pressure at a point, coordinates of the touch...
Any suggestions?
SigCtl sigCtl = new SigCtl();
DynamicCapture dc = new DynamicCapture();
int rc = dc.capture(sigCtl, "who", "why", null, null);
if (rc == 0) {
    System.out.println("signature captured successfully");
    String fileName = "sig1.png";
    SigObj sig = sigCtl.signature();
    sig.extraData("AdditionalData", "CaptureImage.java Additional Data");
    int flags = SigObj.outputFilename | SigObj.color32BPP | SigObj.encodeData;
    sig.renderBitmap(fileName, 200, 150, "image/png", 0.5f, 0xff0000, 0xffffff, 0.0f, 0.0f, flags);
}
I managed to solve the problem using the wgssStu library. In this jar there is a PenData class, which has the following methods: getPressure, getX, getY, getRdy, getSw...

OpenGL wireFrame render of a OBJ file

I am writing a program that just needs to read a .obj file and render it in wireframe mode.
I have already read the .obj file (correctly, I believe) that I want to render, but I am having some problems... It is supposed to be in wireframe, but instead... (image below)
Here is the code:
public void render(GL gl) {
    float xMiddle = (m.getXVertexMax() + m.getXVertexMin()) / 2;
    float yMiddle = (m.getYVertexMax() + m.getYVertexMin()) / 2;
    float zMiddle = (m.getZVertexMax() + m.getZVertexMin()) / 2;
    gl.glScalef(1 / (m.getXVertexMax() - m.getXVertexMin()), 1, 1);
    gl.glScalef(1, 1 / (m.getYVertexMax() - m.getYVertexMin()), 1);
    gl.glScalef(1, 1, 1 / (m.getZVertexMax() - m.getZVertexMin()));
    gl.glBegin(GL.GL_TRIANGLES);
    {
        gl.glEnable(GL.GL_DEPTH_TEST);
        gl.glPolygonMode(GL.GL_FRONT_AND_BACK, GL.GL_LINES);
        for (int i = 0; i < m.faces.size(); i++) {
            Vector3f n1 = m.normals.get(m.faces.get(i).getNormalIndices()[0] - 1);
            Vector3f v1 = m.vertices.get(m.faces.get(i).getVertexIndices()[0] - 1);
            gl.glVertex3f(v1.x - xMiddle, v1.y - yMiddle, v1.z - zMiddle);
            gl.glNormal3f(n1.x, n1.y, n1.z);
            Vector3f n2 = m.normals.get(m.faces.get(i).getNormalIndices()[1] - 1);
            Vector3f v2 = m.vertices.get(m.faces.get(i).getVertexIndices()[1] - 1);
            gl.glVertex3f(v2.x - xMiddle, v2.y - yMiddle, v2.z - zMiddle);
            gl.glNormal3f(n2.x, n2.y, n2.z);
            Vector3f n3 = m.normals.get(m.faces.get(i).getNormalIndices()[2] - 1);
            Vector3f v3 = m.vertices.get(m.faces.get(i).getVertexIndices()[2] - 1);
            gl.glVertex3f(v3.x - xMiddle, v3.y - yMiddle, v3.z - zMiddle);
            gl.glNormal3f(n3.x, n3.y, n3.z);
        }
    }
    gl.glEnd();
}
NOTE: Vector3f in the code is a data structure that I made.
I have tried everything I could find, but it still won't render the image in wireframe! :-/
Can anyone give a hand?
gl.glBegin(GL.GL_TRIANGLES);
{
    gl.glEnable(GL.GL_DEPTH_TEST);
    gl.glPolygonMode(GL.GL_FRONT_AND_BACK, GL.GL_LINES);
    ...
Very few GL commands are valid inside a glBegin()/glEnd() block:
Only a subset of GL commands can be used between glBegin and glEnd.
The commands are
glVertex, glColor, glSecondaryColor, glIndex, glNormal, glFogCoord, glTexCoord, glMultiTexCoord, glVertexAttrib, glEvalCoord, glEvalPoint,
glArrayElement, glMaterial, and glEdgeFlag. Also, it is acceptable
to use glCallList or glCallLists to execute display lists that include
only the preceding commands.
If any other GL command is executed between glBegin and glEnd, the error flag is set and the command is ignored.
glEnable() and glPolygonMode() are not on that list, so move them outside your glBegin() block.
Note also that the wireframe polygon mode is GL_LINE, not GL_LINES (GL_LINES is a primitive type for glBegin() and is not a valid argument to glPolygonMode()).
gl.glVertex3f(v1.x - xMiddle, v1.y - yMiddle, v1.z - zMiddle);
gl.glNormal3f(n1.x, n1.y, n1.z);
Wrong way around. You want normal, then vertex:
gl.glNormal3f(n1.x, n1.y, n1.z);
gl.glVertex3f(v1.x - xMiddle, v1.y - yMiddle, v1.z - zMiddle);
glNormal() only sets the current normal; glVertex() is what actually sends a vertex (with that normal attached) down the pipeline.
