I have a mesh generated from data in Renderables. Environment is set. Material is a simple new Material().
... /*init renderable*/
/*set mesh parameters*/
renderable.mesh = new Mesh(false,
        (int) (meshVertexArray.length / SurfaceBuilder.__ELEMENTSPERVERTEX__), /* vertices, not coordinates */
        meshIndexArray.length,
        new VertexAttribute(Usage.Position, 3, "a_position"),
        new VertexAttribute(Usage.Normal, 3, "a_normal"),
        new VertexAttribute(Usage.TextureCoordinates, 2, "a_texCoords"),
        new VertexAttribute(Usage.ColorPacked, 4, "a_color")
);
... /*set vertices*/
The mesh is generated properly, but I can't see the textures, only the gray (shaded) triangles. I did try reading the documentation, but so far I haven't found a way to bind a texture so that it displays properly in libGDX without shaders, and I'm not using shaders yet (I plan to catch up on them after this feature is implemented). Is there a way in libGDX to bind textures to a mesh without writing shaders?
Without seeing your texturing code, maybe try specifying a texture for your material using the following format:
Material mat = new Material();
// set the diffuse texture on the material
mat.set(TextureAttribute.createDiffuse(new Texture("crate.jpg")));
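If you are rendering the Renderable yourself, you do not need to write a shader: ModelBatch supplies a default shader that binds whatever textures the material carries. A minimal sketch, assuming your existing renderable and camera (newer libGDX versions use renderable.meshPart instead of renderable.mesh):
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g3d.Material;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.attributes.TextureAttribute;

// Give the renderable a material that carries the texture.
Texture texture = new Texture(Gdx.files.internal("crate.jpg"));
renderable.material = new Material(TextureAttribute.createDiffuse(texture));

// ModelBatch creates and manages a default shader, so no shader code is needed.
ModelBatch modelBatch = new ModelBatch();

// In your render() method:
modelBatch.begin(camera);
modelBatch.render(renderable);   // the default shader binds the diffuse texture
modelBatch.end();
The default shader only samples the texture if the mesh has texture coordinates and the material has a TextureAttribute, which the mesh setup in the question already provides.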
I am working on an AR application using ARCore and Sceneform. I want to add textures to face landmarks such as the nose, lips, and eyes. How do I create a texture so that I can overlay it on the Augmented Face mesh?
Texture.builder()
        .setSource(this, R.drawable.makeupforlips)
        .setUsage(Texture.Usage.COLOR)
        .build()
        .thenAccept(texture -> faceMeshTexture = texture);
Inside the addOnUpdateListener callback:
for (AugmentedFace face : faceList) {
    if (!faceNodeMap.containsKey(face)) {
        AugmentedFaceNode faceNode = new AugmentedFaceNode(face);
        faceNode.setParent(scene);
        // faceNode.setFaceRegionsRenderable(faceRegionsRenderable);
        faceNode.setFaceMeshTexture(faceMeshTexture);
        faceNodeMap.put(face, faceNode);
    }
}
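For context, a minimal sketch of how that loop is typically registered on the scene (arSceneView here is a hypothetical ArSceneView reference; faceNodeMap and faceMeshTexture are assumed to exist as in the snippet above):
import java.util.Collection;
import com.google.ar.core.AugmentedFace;
import com.google.ar.sceneform.Scene;

// Register the per-frame callback on the Sceneform scene.
Scene scene = arSceneView.getScene();
scene.addOnUpdateListener(frameTime -> {
    Collection<AugmentedFace> faceList =
            arSceneView.getSession().getAllTrackables(AugmentedFace.class);
    // ... the for-loop from above goes here ...
});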
When you're building an ARCore app with Augmented Faces functionality, you need to use the canonical face mesh, which is provided in .fbx, .obj, or .glTF format. Import this canonical mesh into Autodesk Maya and, using the UV Texture Editor, create a UV-mapped texture that can then be painted in Adobe Photoshop or Pixelmator.
What I'm trying to do:
Pull an image from the SD card on the phone using a Java plugin.
Unity passes a texture ID to the plugin.
The plugin uses OpenGL to assign the image to the Unity texture through that ID.
This will (eventually) be used to play a video clip from the phone in Unity; for now, it's just trying to change a texture from outside Unity.
My issue:
When I call the method in the plugin, passing texture.GetNativeTextureID() into it, the texture does not change. I'm currently only using a simple black 50x50 texture for testing, and the original texture is flat white.
I'm worried that I've missed something significant, as this is my first time working with GL calls in Java. Many of the answers to similar problems involve using native C++ instead of Java, but I can't find a concrete answer saying that C++ must be used. I'd like to avoid writing another set of plugins and plugin handlers for C++, but if it's the most efficient (or only) way to get this working, I'll do it, as I'm not unfamiliar with OpenGL and C++.
Code:
The plugin method is called from OnPreRender() in a script attached to the main camera:
if (grabTex) {
    int texPtr = m_VideoTex.GetNativeTextureID();
    Debug.Log("texPtr = " + texPtr);
    m_JVInterface.SetTex(texPtr);
}
m_VideoTex is a basic Texture2D( 50, 50 ) with all pixels set to white, attached to the diffuse shader on the quad in the scene.
The Java plugin code is as follows:
public void SetTexture(Context cont, int _texPointer) {
    if (_texPointer != 0) {
        final BitmapFactory.Options options = new BitmapFactory.Options();
        options.inScaled = false;
        options.inJustDecodeBounds = false;
        final Bitmap bitmap = BitmapFactory.decodeFile("/storage/emulated/0/Pictures/black.jpg", options);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, _texPointer);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
        Log.i("VideoHandler", "Received ID: " + _texPointer);
        bitmap.recycle();
    }
}
This is most likely a problem with the OpenGL context: the plugin call is not running on Unity's rendering thread, so there is no current GL context when you bind and upload the texture. The easiest way around it would be to send the image as raw bytes to Unity and then upload it as a texture inside Unity.
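A minimal sketch of the raw-bytes approach on the Java side (the class and method names are hypothetical; on the C# side the returned array can be loaded with Texture2D.LoadRawTextureData() followed by Apply(), watching out for the channel-order difference between Android's ARGB_8888 and Unity's RGBA32):
import java.nio.ByteBuffer;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class ImagePlugin {
    // Decode the image on the Java side and return raw pixel bytes to Unity,
    // so no OpenGL context is needed in the plugin at all.
    public byte[] GetImageBytes(String path) {
        Bitmap bitmap = BitmapFactory.decodeFile(path);   // e.g. the black.jpg from the question
        ByteBuffer buffer = ByteBuffer.allocate(bitmap.getByteCount());
        bitmap.copyPixelsToBuffer(buffer);                // raw pixels in the bitmap's config (usually ARGB_8888)
        bitmap.recycle();
        return buffer.array();
    }
}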
I created a simple model of a barrel (.zip) in Blender 2.69. Then I created a UV map in Blender and made a UV-mapped texture out of it (it's in the archive, too). Then I imported my texture into Blender, and now the mapping matches:
In Blender the model looks fine so far:
Using the Ogre exporter plugin that I installed via the jMonkeyEngine SDK, I exported the model. The result is my OgreXML file of the barrel (I did not export the material).
Now, I tried to add the barrel to my world like this:
this.barrel = this.assetManager.loadModel("models/barrel/Barrel.mesh.xml");
Material barrelMat = new Material(this.assetManager,
"Common/MatDefs/Light/Lighting.j3md");
barrelMat.setTexture("DiffuseMap",
this.assetManager.loadTexture("models/barrel/Barrel.jpg"));
barrelMat.setBoolean("UseMaterialColors", true);
barrelMat.setColor("Diffuse", ColorRGBA.White);
barrelMat.setColor("Specular", new ColorRGBA(0.3f, 0.1f, 0, 1));
barrelMat.setFloat("Shininess", 4f);
this.barrel.setMaterial(barrelMat);
this.rootNode.attachChild(this.barrel);
The result is this:
Is there something else I have to consider when setting the texture for my UV mapped model?
Often when transferring models from Blender to something like JME, the textures will be upside down. Where you load the texture:
barrelMat.setTexture("DiffuseMap",
        assetManager.loadTexture("models/barrel/Barrel.jpg"));
Instead, use the TextureKey form of the loadTexture() method and pass false for the flipY parameter, since true is the default.
assetManager.loadTexture(new TextureKey("models/barrel/Barrel.jpg", false));
That should fix your issue.
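Applied to the material setup from the question, a sketch of the change looks like this (TextureKey lives in com.jme3.asset):
barrelMat.setTexture("DiffuseMap",
        assetManager.loadTexture(new TextureKey("models/barrel/Barrel.jpg", false)));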
References:
loadTexture() : http://hub.jmonkeyengine.org/javadoc/com/jme3/asset/AssetManager.html#loadTexture(com.jme3.asset.TextureKey)
TextureKey : http://hub.jmonkeyengine.org/javadoc/com/jme3/asset/TextureKey.html#TextureKey(java.lang.String,%20boolean)
I have a Java 3D application, and in this application I load an OBJ file into my scene. How can I assign a texture (a JPG file) to this model?
To be more precise, when I want to assign a texture to a primitive Java 3D object (e.g. a sphere) I use the following:
Sphere sphere = new Sphere(Radius, Primflags, Appearance);
However, when loading and adding an OBJ file I do:
Scene scene = getSceneFromFile("OBJ file");
myBranchGroup = scene.getSceneGroup();
In the second case, I can find no way of assigning the texture. How should I do that?
You would have to use the program you made the OBJ file with, or one where you can load the file: paint it there, then export the file. Then add this code outside any methods:
static TextureLoader loader = new TextureLoader("C:\\Users\\Sawyera\\Desktop\\Paint Layer 1.jpg",
        "RGB", new Container());
static Texture texture = loader.getTexture();
Then
texture.setBoundaryModeS(Texture.WRAP);
texture.setBoundaryModeT(Texture.WRAP);
texture.setBoundaryColor(new Color4f(0.0f, 1.0f, 0.0f, 0.0f));

TextureAttributes texAttr = new TextureAttributes();
texAttr.setTextureMode(TextureAttributes.MODULATE);

Appearance ap = new Appearance();
ap.setTexture(texture);
ap.setTextureAttributes(texAttr);

int primflags = Primitive.GENERATE_NORMALS | Primitive.GENERATE_TEXTURE_COORDS;
ObjectFile objLoader = new ObjectFile(ObjectFile.RESIZE); // renamed so it does not clash with the TextureLoader field above
Then add this before you assign the model to the scene, assuming the 3D model variable is called model:
model.setAppearance(ap);
IIRC you need to get the Shape3D node you want to apply the texture to (calling setAppearance(...)) from your branch group, e.g. by using getChild(index) etc. Note that you might need to iterate recursively through the children, since the branch group you get might actually contain other groups, so you might find the shapes further down the group tree.
Alternatively you should be able to add an AlternateAppearance object to the branch group.
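A minimal sketch of that recursive traversal, assuming the classic javax.media.j3d packages and the Appearance ap built in the previous answer:
import java.util.Enumeration;
import javax.media.j3d.Appearance;
import javax.media.j3d.Group;
import javax.media.j3d.Node;
import javax.media.j3d.Shape3D;

// Walk the scene graph and apply the appearance to every Shape3D found.
static void applyAppearance(Node node, Appearance ap) {
    if (node instanceof Shape3D) {
        ((Shape3D) node).setAppearance(ap);
    } else if (node instanceof Group) {
        Enumeration<?> children = ((Group) node).getAllChildren();
        while (children.hasMoreElements()) {
            applyAppearance((Node) children.nextElement(), ap);
        }
    }
}

// Usage, with the branch group loaded from the OBJ file:
// applyAppearance(scene.getSceneGroup(), ap);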
I am learning OpenGL using LWJGL. I am trying to see the effects of different filters on textures. To load the textures I am using Slick-Util's TextureLoader, and only the first two texture filter options work; the others result in a blank texture. If anyone knows what I am doing wrong, or how to manually apply filters after a texture has been loaded, please say so.
Texture[] textures = new Texture[6];
textures[0] = TextureLoader.getTexture("BMP", new FileInputStream("src/Textures/Glass.bmp"), GL_NEAREST);
textures[1] = TextureLoader.getTexture("BMP", new FileInputStream("src/Textures/Glass.bmp"), GL_LINEAR);
textures[2] = TextureLoader.getTexture("BMP", new FileInputStream("src/Textures/Glass.bmp"), GL_NEAREST_MIPMAP_NEAREST);
textures[3] = TextureLoader.getTexture("BMP", new FileInputStream("src/Textures/Glass.bmp"), GL_LINEAR_MIPMAP_NEAREST);
textures[4] = TextureLoader.getTexture("BMP", new FileInputStream("src/Textures/Glass.bmp"), GL_NEAREST_MIPMAP_LINEAR);
textures[5] = TextureLoader.getTexture("BMP", new FileInputStream("src/Textures/Glass.bmp"), GL_LINEAR_MIPMAP_LINEAR);
The *MIPMAP* filters require a complete mipmap chain, and it's pretty obvious that the texture loader isn't creating the mipmap chain. Mipmaps can be loaded from a file, or generated by OpenGL using glGenerateMipmap().
The usual symptom of an incomplete mipmap chain combined with a mipmap filter is that the texture looks completely white (or black).
For LWJGL, you can use glGenerateMipmap (GL30, OpenGL 3.0) or glGenerateMipmapEXT (GL_EXT_framebuffer_object extension).
To use it, just bind the texture with glBindTexture and call glGenerateMipmap with the correct texture target (the same target you passed to glBindTexture).
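A minimal sketch with Slick-Util, assuming an OpenGL 3.0 context; since the mag filter cannot be a mipmap mode, the filters are set explicitly with glTexParameteri after generating the chain:
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL30.glGenerateMipmap;

import java.io.FileInputStream;
import org.newdawn.slick.opengl.Texture;
import org.newdawn.slick.opengl.TextureLoader;

// Load normally, then build the mipmap chain and pick the filters by hand.
Texture tex = TextureLoader.getTexture("BMP",
        new FileInputStream("src/Textures/Glass.bmp"), GL_LINEAR);
glBindTexture(GL_TEXTURE_2D, tex.getTextureID());
glGenerateMipmap(GL_TEXTURE_2D);  // generates all mipmap levels for the bound texture
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);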