LibGDX - modify pixels after rendering a 3D scene - java

I'm writing a 3D game that is drawn with ASCII art just to achieve a special look. I render my models with a ModelBatch, and now I would like to convert every pixel to an ASCII symbol before drawing the final result to the screen. I already have the code to convert pixels to ASCII, but I have no idea how to get the pixels of the render result.
Code so far (not including the ASCII conversion):
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.VertexAttributes.Usage;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.materials.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.materials.Material;
import com.badlogic.gdx.graphics.g3d.utils.ModelBuilder;
public class MyGame implements ApplicationListener {
private PerspectiveCamera cam;
private ModelBatch batch;
public Model model;
public ModelInstance instance;
@Override
public void create() {
//float w = Gdx.graphics.getWidth();
//float h = Gdx.graphics.getHeight();
cam = new PerspectiveCamera(67, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
cam.position.set(10f, 10f, 10f);
cam.lookAt(0,0,0);
cam.near = 0.1f;
cam.far = 300f;
cam.update();
batch = new ModelBatch();
ModelBuilder modelBuilder = new ModelBuilder();
model = modelBuilder.createBox(5f, 5f, 5f,
new Material(ColorAttribute.createDiffuse(Color.GREEN)),
Usage.Position | Usage.Normal);
instance = new ModelInstance(model);
}
@Override
public void dispose() {
batch.dispose();
model.dispose();
}
@Override
public void render() {
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
batch.begin(cam);
batch.render(instance);
batch.end();
}
@Override
public void resize(int width, int height) {
}
@Override
public void pause() {
}
@Override
public void resume() {
}
}

I think you want to render to a texture, convert the texture to a Pixmap, and then parse the pixmap to draw your ASCII characters to the real screen.
First, to render to a texture (offscreen buffer) in libGDX, use a FrameBuffer (note it has no no-arg constructor; you give it a format and size):
fbo = new FrameBuffer(Pixmap.Format.RGBA8888, width, height, true);
fbo.begin();
// draw your stuff
// ...
fbo.end();
Second, to get bytes that the CPU can use, you can use the [ScreenUtils class](http://libgdx.badlogicgames.com/nightlies/docs/api/com/badlogic/gdx/utils/ScreenUtils.html). You can either get a Pixmap or a raw byte[]. For example, replace the // ... above with:
Pixmap p = ScreenUtils.getFrameBufferPixmap(0, 0, width, height);
Now you can wander over the pixmap checking out pixels (using [Pixmap.getPixel](http://libgdx.badlogicgames.com/nightlies/docs/api/com/badlogic/gdx/graphics/Pixmap.html#getPixel(int, int))).
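Putting this together, here is a minimal sketch of the whole loop, using the batch, cam, and instance fields from your create() above; the CELL size and the asciiFor(...) brightness mapping are placeholders standing in for your own conversion code:
// needs imports: com.badlogic.gdx.graphics.Pixmap,
// com.badlogic.gdx.graphics.glutils.FrameBuffer, com.badlogic.gdx.utils.ScreenUtils
FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGBA8888,
        Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), true);
fbo.begin();
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
batch.begin(cam);
batch.render(instance);
batch.end();
// Read the pixels back while the FBO is still bound.
Pixmap p = ScreenUtils.getFrameBufferPixmap(0, 0,
        Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
fbo.end();
// Walk the pixmap one character cell at a time.
for (int y = 0; y < p.getHeight(); y += CELL) {
    for (int x = 0; x < p.getWidth(); x += CELL) {
        Color c = new Color(p.getPixel(x, y));
        char symbol = asciiFor((c.r + c.g + c.b) / 3f); // your mapping
        // draw symbol at (x, y) with a BitmapFont and a SpriteBatch
    }
}
p.dispose();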

Related

libGDX font being drawn out of the viewport

I'm having problems drawing some text on the screen: depending on the viewport size, the text may end up outside the display even though I'm calculating the coordinates as the center of the screen. Here's the code; it is just a slightly modified version of the default project created by the libGDX generator:
package net.iberdroid.libgdxtestfonts;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.BitmapFont;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.utils.viewport.FitViewport;
import com.badlogic.gdx.utils.viewport.Viewport;
public class LibGdxTestFonts extends ApplicationAdapter {
SpriteBatch batch;
Texture img;
BitmapFont defaultFont;
private OrthographicCamera camera;
private Viewport viewport;
private float textY;
private float textX;
@Override
public void create() {
camera = new OrthographicCamera();
viewport = new FitViewport(
640,
480,
camera);
camera.setToOrtho(false);
batch = new SpriteBatch();
img = new Texture("badlogic.jpg");
defaultFont = new BitmapFont();
textX = viewport.getWorldWidth() / 2;
textY = viewport.getWorldHeight() / 2;
}
@Override
public void render() {
Gdx.gl.glClearColor(0, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.begin();
batch.setProjectionMatrix(camera.combined);
defaultFont.setColor(Color.WHITE);
Gdx.app.log("render", String.format("TextX: %f TextY: %f", textX, textY));
defaultFont.draw(batch, "HELLO WORLD!", textX, textY);
batch.end();
}
@Override
public void dispose() {
batch.dispose();
img.dispose();
}
}
As far as I understand, that should draw a HELLO WORLD! text starting around the center of the screen. And it actually does with that viewport size. Now, if you try a bigger viewport, say 800x600, the text moves to the right and up; with even higher values there comes a point at which the text leaves the screen boundaries through the top-right corner.
The same happens in the opposite direction: the smaller the viewport you try, the further from the center and the closer to the bottom-left corner the text appears, until it eventually gets out of bounds too.
So either I am failing to grasp something here, or the BitmapFont.draw method seems to be ignoring the viewport size and using some other size that I cannot figure out.
If someone can spot it, please let me know.
Thanks a lot in advance!
P.S. I've tried also with a Hiero generated font and had the same issue.
Adding this seems to solve the problem with the position. I still have a problem with the scale, but that's a different issue.
@Override public void resize (int width, int height) {
viewport.update(width, height, true);
}
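The third argument (true) tells the viewport to re-center the camera on its world after resizing, which is why the position snaps back. If you also want the string centered exactly instead of starting at the midpoint, here is a small sketch using GlyphLayout to measure it first (assuming the 640x480 FitViewport above):
// import com.badlogic.gdx.graphics.g2d.GlyphLayout;
GlyphLayout layout = new GlyphLayout(defaultFont, "HELLO WORLD!");
float textX = (viewport.getWorldWidth() - layout.width) / 2f;
float textY = (viewport.getWorldHeight() + layout.height) / 2f; // draw() positions the top of the text
defaultFont.draw(batch, layout, textX, textY);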

libgdx sprite-sheet based animation scaling

I am quite new to libgdx and android programming in general. I am having problems rendering a sprite-sheet-based animation, and getting it to be the same size on different screen sizes.
If I run the following code on my Note 4, the animation is quite small; on the Zenfone 2 instead it's quite big; and lastly on my laptop it is just so small it can barely be seen.
I really don't understand why this happens, and how to make it the same on the two phones. I thought that using an orthographic camera with ingame units and a viewport would do the job, but I might be doing something wrong because it doesn't.
I am following the book "libgdx cross-platform game development cookbook".
I would hugely appreciate any help on how to properly use in game units to get the game to be the same on different screen sizes, so that a 512x512 px image isn't tiny on the note4 and huge on the Zenfone (each frame of my animation is 512px squared).
And as far as the pc goes, I just have no clue what is going on, I would really appreciate any explanation on why that happens!
Thank you all!
package com.mygdxGame;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Animation;
import com.badlogic.gdx.graphics.g2d.Animation.PlayMode;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureAtlas;
import com.badlogic.gdx.graphics.g2d.TextureAtlas.AtlasRegion;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
import com.badlogic.gdx.utils.Array;
import com.badlogic.gdx.utils.viewport.FitViewport;
import com.badlogic.gdx.utils.viewport.Viewport;
import java.util.Comparator;
public class MyGdxGame extends ApplicationAdapter {
private static final float WORLD_TO_SCREEN = 1.0f / 100.0f;
private static final float SCENE_WIDTH = 12.80f;
private static final float SCENE_HEIGHT = 7.20f;
private static final float FRAME_DURATION = 1.0f / 20.0f;
private TextureAtlas techmanAtlas;
private Animation techmanRun;
private float animationTime;
private OrthographicCamera camera;
public Viewport viewport;
public SpriteBatch batch;
@Override
public void create(){
camera = new OrthographicCamera();
viewport = new FitViewport(Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), camera);
batch = new SpriteBatch();
animationTime = 0.0f;
techmanAtlas = new TextureAtlas(Gdx.files.internal("TechMan.atlas"));
Array<TextureAtlas.AtlasRegion> techmanRegions = new Array<TextureAtlas.AtlasRegion>(techmanAtlas.getRegions());
techmanRegions.sort(new RegionComparator());
techmanRun = new Animation(FRAME_DURATION, techmanRegions, PlayMode.LOOP);
camera.position.set(SCENE_WIDTH * 0.5f, SCENE_HEIGHT * 0.5f, 0.0f);
}
@Override
public void dispose(){
batch.dispose();
techmanAtlas.dispose();
}
@Override
public void render(){
Gdx.gl.glClearColor(1, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
animationTime += Gdx.graphics.getDeltaTime();
camera.update();
batch.setProjectionMatrix(camera.combined);
batch.begin();
TextureRegion techmanFrame = techmanRun.getKeyFrame(animationTime);
int width = techmanFrame.getRegionWidth();
int height = techmanFrame.getRegionHeight();
float originX = width * 0.5f;
float originY = height * 0.5f;
batch.draw(techmanFrame,
1.0f - originX, 3.70f - originY,
originX, originY,
width, height, //width, height
WORLD_TO_SCREEN, WORLD_TO_SCREEN,
0.0f);
batch.draw(techmanRun.getKeyFrame(animationTime), 100.0f, 275.0f);
batch.end();
}
@Override
public void resize(int width, int height){
viewport.update(width, height, false);
}
private static class RegionComparator implements Comparator<AtlasRegion> {
@Override
public int compare(AtlasRegion region1, AtlasRegion region2){
return region1.name.compareTo(region2.name);
}
}
}
That's not the proper way to use a viewport:
viewport = new FitViewport(Gdx.graphics.getWidth(),Gdx.graphics.getHeight(), camera);
You should initialize the viewport with constant values, 800x600 for example, and use them when writing your code:
viewport = new FitViewport(WORLD_WIDTH,WORLD_HEIGHT, camera);
The image will be stretched/resized automatically when rendering; you don't need to do that yourself:
batch.draw(techmanFrame, x, y, width, height);
PS: you don't need to call batch.setProjectionMatrix(camera.combined); every frame inside render(). Once is enough, unless the camera changes.
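For instance, a minimal sketch using the SCENE_WIDTH/SCENE_HEIGHT constants already defined in your class as the constant world size (one world unit then corresponds to 100 px of source art through WORLD_TO_SCREEN):
camera = new OrthographicCamera();
viewport = new FitViewport(SCENE_WIDTH, SCENE_HEIGHT, camera);
// ... later, in render(): sizes are in world units, so a 512 px frame
// becomes 5.12 x 5.12 world units on every device.
batch.draw(techmanFrame, 1.0f, 3.7f,
        techmanFrame.getRegionWidth() * WORLD_TO_SCREEN,
        techmanFrame.getRegionHeight() * WORLD_TO_SCREEN);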
Documentation:
https://github.com/libgdx/libgdx/wiki/Viewports
https://github.com/libgdx/libgdx/wiki/2D-Animation

importing an object with extension md2 in libgdx

I'm new to Java game programming. I want to import an MD2 object, so I used this tutorial: http://code.google.com/p/libgdx-users/wiki/MD2_Keyframe_Animation
But the problem is that I can't instantiate the class KeyframedModelViewer. This is my code:
package com.ELISA.ELISAgame.Screens;
import com.ELISA.ELISAgame.ELISA;
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Screen;
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.assets.loaders.ModelLoader;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Material;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
/*import com.badlogic.gdx.graphics.g3d.attributes.TextureAttribute;*/
import com.badlogic.gdx.graphics.g3d.loader.ObjLoader;
import com.badlogic.gdx.graphics.g3d.utils.CameraInputController;
public class Game implements ApplicationListener, Screen {
ELISA game;
PerspectiveCamera cam;
CameraInputController camController;
ModelBatch modelBatch;
ModelLoader loader;
AssetManager assets;
Model model;
Material material;
ModelInstance instance;
public Game(ELISA game) {
this.game = game;
}
public void create() {
}
@Override
public void render(float delta) {
camController.update();
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(),
Gdx.graphics.getHeight());
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
modelBatch.begin(cam);
modelBatch.render(instance);
modelBatch.end();
}
@Override
public void resize(int width, int height) {
}
@Override
public void show() {
//new JoglApplication(new KeyframedModelViewer("data/antigene.md2", "data/antigene.png"), "KeframedModel Viewer", 800, 480, false);
modelBatch = new ModelBatch();
cam = new PerspectiveCamera(67, Gdx.graphics.getWidth(),Gdx.graphics.getHeight());
cam.position.set(0f, 6f, 11.5f);
cam.lookAt(0, 0, 0);
cam.near = 0.8f;
cam.far = 300f;
cam.update();
loader = new ObjLoader();
//model = loader.loadModel(Gdx.files.internal("data/labo.obj"));
instance = new ModelInstance(model);
Material material = new Material("material", new TextureAttribute(texture, 0, "s_tex"));
model.setMaterial(material);
camController = new CameraInputController(cam);
Gdx.input.setInputProcessor(camController);
}
@Override
public void hide() {
}
@Override
public void pause() {
}
@Override
public void resume() {
}
@Override
public void dispose() {
modelBatch.dispose();
model.dispose();
}
}
That wiki, "libgdx-users", is highly outdated. MD2 is not supported anymore; this is how 3D animations are handled now:
https://github.com/libgdx/libgdx/wiki/3D-animations-and-skinning
"When using fbx-conv https://github.com/libgdx/fbx-conv to convert your model from FBX to G3DB/G3DJ, animations are automatically converted along with it. Just like FBX, G3DB/G3DJ files can contain multiple animations in a single file along with the other model data. Animations applied to nodes which are not present in the source FBX file, will not be converted. So make sure to select all the nodes and animations when exporting to FBX."

LIBGDX 3D not working

So I am just starting off with the 3D part of LIBGDX. I downloaded the latest nightly build, and when I followed the tutorial I got a NullPointerException on one line (which I mark in the code). The line with the NullPointerException is this:
modelBatch.render(instance);
All the values for this line are there. The instance has all its properties, and pretty much everything else in my code does too. Any ideas why I am getting a NullPointerException? Thanks in advance.
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.VertexAttributes.Usage;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.materials.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.materials.Material;
import com.badlogic.gdx.graphics.g3d.utils.ModelBuilder;
public class threeDTest implements ApplicationListener {
public PerspectiveCamera camera;
public ModelBatch modelBatch;
public Model model;
public ModelInstance instance;
@Override
public void create() {
camera = new PerspectiveCamera(67, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
camera.position.set(10f, 10f, 10f);
camera.lookAt(0, 0, 0);
camera.near = 0.1f;
camera.far = 300f;
camera.update();
ModelBuilder modelBuilder = new ModelBuilder();
model = modelBuilder.createBox(5f, 5f, 5f,
new Material(ColorAttribute.createDiffuse(Color.GREEN)),
Usage.Position | Usage.Normal);
instance = new ModelInstance(model);
}
@Override
public void resize(int width, int height) {
}
@Override
public void render() {
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
modelBatch.begin(camera);//Begin Rendering
modelBatch.render(instance); // <-- NullPointerException on this line
modelBatch.end();//End Rendering
}
@Override
public void pause() {
}
@Override
public void resume() {
}
@Override
public void dispose() {
model.dispose();
}
}
Looking at your code, you never construct (and never dispose of) the modelBatch instance. Therefore modelBatch will be null, causing the NPE you got.
Add the following line in your create method:
modelBatch = new ModelBatch();
and the following line in your dispose method:
modelBatch.dispose();
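In context, the two lifecycle methods then look like this (only the modelBatch lines are new; everything else stays as in your code):
@Override
public void create() {
    modelBatch = new ModelBatch();
    // ... camera and model setup as before ...
}
@Override
public void dispose() {
    modelBatch.dispose();
    model.dispose();
}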

How to draw a BitmapFont in LibGDX?

I'm seriously betting that I did something effing stupid and just can't seem to notice it.
package com.me.mygdxgame;
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.Texture.TextureFilter;
import com.badlogic.gdx.graphics.g2d.BitmapFont;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
public class Locked implements ApplicationListener
{
private OrthographicCamera camera;
private SpriteBatch batch;
private Texture texture;
private Sprite sprite;
private BitmapFont font;
private CharSequence str = "Hello World!";
private float width;
private float height;
@Override
public void create()
{
width = Gdx.graphics.getWidth();
height = Gdx.graphics.getHeight();
camera = new OrthographicCamera(1, height / width);
batch = new SpriteBatch();
texture = new Texture(Gdx.files.internal("data/libgdx.png"));
texture.setFilter(TextureFilter.Linear, TextureFilter.Linear);
TextureRegion region = new TextureRegion(texture, 0, 0, 512, 275);
sprite = new Sprite(region);
sprite.setSize(0.9f, 0.9f * sprite.getHeight() / sprite.getWidth());
sprite.setOrigin(sprite.getWidth() / 2, sprite.getHeight() / 2);
sprite.setPosition(-sprite.getWidth() / 2, -sprite.getHeight() / 2);
font = new BitmapFont(Gdx.files.internal("data/digib.fnt"),
Gdx.files.internal("data/digib.png"), false);
}
@Override
public void dispose()
{
batch.dispose();
texture.dispose();
}
@Override
public void render()
{
Gdx.gl.glClearColor(1, 1, 1, 1);
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
batch.setProjectionMatrix(camera.combined);
batch.begin();
font.setColor(0.0f, 0.0f, 0.0f, 1.0f);
//sprite.draw(batch);
font.draw(batch, str, width*0.5f, height*0.5f);
batch.end();
}
@Override
public void resize(int width, int height)
{
}
@Override
public void pause()
{
}
@Override
public void resume()
{
}
}
The project was generated with the template tool they provide, gdx-setup-ui.jar.
As you can see in the code, I didn't bother to get rid of the default code (just some simple draw code to render the libGDX logo).
So, with the cleanly generated project, I followed this guide here
http://code.google.com/p/libgdx-users/wiki/addingText2D
and finally arriving with the provided code above.
The problem is, why won't the !##$ing text show!? I changed the position so many times and still no luck :\
Did I miss something?
FYI: The fonts are fine, I dropped them into another game and it works.
Try changing the projection matrix like this:
Matrix4 normalProjection = new Matrix4().setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
batch.setProjectionMatrix(normalProjection);
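With that pixel-based projection, the width*0.5f and height*0.5f you pass to font.draw(...) then land in the middle of the window, as intended.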
All I do is
spriteBatch = new SpriteBatch();
font = new BitmapFont(Gdx.files.internal("data/nameOfFont.fnt"),
Gdx.files.internal("data/nameOfFont.png"), false);
and in render method
spriteBatch.begin();
font.setColor(1.0f, 1.0f, 1.0f, 1.0f);
font.draw(spriteBatch, "some string", 25, 160);
spriteBatch.end();
You can read something more about it on my blog: http://algorhymes.wordpress.com/2012/11/17/javalibgdx-fonts/
Personally I'm not a big fan of converting all the fonts to .fnt format. If you need different sizes for a certain font you have to spend a lot of time (and app space) to make all the conversions.
You can just use the FreeType Extension and load straight from a .ttf
FreeTypeFontGenerator generator = new FreeTypeFontGenerator(fontFile);
BitmapFont font15 = generator.generateFont(15);
BitmapFont font22 = generator.generateFont(22);
generator.dispose();
More info here
Rendering is done in the same way as explained by watis.
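Note that in recent libGDX versions generateFont(int) was replaced by a parameter object; a minimal sketch of the current API, with the .ttf path as a placeholder:
FreeTypeFontGenerator generator = new FreeTypeFontGenerator(Gdx.files.internal("data/myfont.ttf"));
FreeTypeFontGenerator.FreeTypeFontParameter parameter = new FreeTypeFontGenerator.FreeTypeFontParameter();
parameter.size = 22;
BitmapFont font22 = generator.generateFont(parameter);
generator.dispose(); // safe once the fonts are generated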
Create a .fnt file using Hiero, which is provided on the libGDX website.
Set the font size to 150; it will create a .fnt file and a .png file.
Copy both files into your assets folder.
Now declare the font:
BitmapFont font;
Then, in the create method:
font = new BitmapFont(Gdx.files.internal("data/100.fnt"), false); // 100 is the font name; you can give your font any name
In render:
font.setScale(.2f);
font.draw(batch, "whatever you want to write", x, y);
This will work smoothly.
The main problem with your code is that you have created the camera with
viewportWidth = 1 &
viewportHeight = height / width
and you are drawing the font at width*0.5f & height*0.5f, which is far outside the camera's view.
Either change the camera initialization to
camera = new OrthographicCamera(width, height);
....
or change the draw font statement to
font.setScale(1,height/width);
font.draw(batch, str, 0.5f, height/width*0.5f);
Did you try giving the position manually, like this? I hope this will work:
batch.setProjectionMatrix(camera.combined);
batch.enableBlending();
batch.begin();
font.draw(batch, yourString, 100,100);
batch.end();
