I am quite new to libgdx and android programming in general. I am having problems rendering a sprite-sheet-based animation, and getting it to be the same size on different screen sizes.
If I run the following code on my Note 4, the animation is quite small; on the Zenfone 2 it's quite big; and on my laptop it is just so small it can barely be seen.
I really don't understand why this happens, or how to make it the same on the two phones. I thought that using an orthographic camera with in-game units and a viewport would do the job, but I must be doing something wrong, because it doesn't.
I am following the book "Libgdx Cross-platform Game Development Cookbook".
I would hugely appreciate any help on how to properly use in-game units to get the game to look the same on different screen sizes, so that a 512x512 px image isn't tiny on the Note 4 and huge on the Zenfone 2 (each frame of my animation is 512 px square).
And as far as the PC goes, I just have no clue what is going on; I would really appreciate any explanation of why that happens!
Thank you all!
package com.mygdxGame;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Animation;
import com.badlogic.gdx.graphics.g2d.Animation.PlayMode;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureAtlas;
import com.badlogic.gdx.graphics.g2d.TextureAtlas.AtlasRegion;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
import com.badlogic.gdx.utils.Array;
import com.badlogic.gdx.utils.viewport.FitViewport;
import com.badlogic.gdx.utils.viewport.Viewport;
import java.util.Comparator;
public class MyGdxGame extends ApplicationAdapter {
private static final float WORLD_TO_SCREEN = 1.0f / 100.0f;
private static final float SCENE_WIDTH = 12.80f;
private static final float SCENE_HEIGHT = 7.20f;
private static final float FRAME_DURATION = 1.0f / 20.0f;
private TextureAtlas techmanAtlas;
private Animation techmanRun;
private float animationTime;
private OrthographicCamera camera;
public Viewport viewport;
public SpriteBatch batch;
@Override
public void create(){
camera = new OrthographicCamera();
viewport = new FitViewport(Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), camera);
batch = new SpriteBatch();
animationTime = 0.0f;
techmanAtlas = new TextureAtlas(Gdx.files.internal("TechMan.atlas"));
Array<TextureAtlas.AtlasRegion> techmanRegions = new Array<TextureAtlas.AtlasRegion>(techmanAtlas.getRegions());
techmanRegions.sort(new RegionComparator());
techmanRun = new Animation(FRAME_DURATION, techmanRegions, PlayMode.LOOP);
camera.position.set(SCENE_WIDTH * 0.5f, SCENE_HEIGHT * 0.5f, 0.0f);
}
@Override
public void dispose(){
batch.dispose();
techmanAtlas.dispose();
}
@Override
public void render(){
Gdx.gl.glClearColor(1, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
animationTime += Gdx.graphics.getDeltaTime();
camera.update();
batch.setProjectionMatrix(camera.combined);
batch.begin();
TextureRegion techmanFrame = techmanRun.getKeyFrame(animationTime);
int width = techmanFrame.getRegionWidth();
int height = techmanFrame.getRegionHeight();
float originX = width * 0.5f;
float originY = height * 0.5f;
batch.draw(techmanFrame,
1.0f - originX, 3.70f - originY,
originX, originY,
width, height, //width, height
WORLD_TO_SCREEN, WORLD_TO_SCREEN,
0.0f);
batch.draw(techmanRun.getKeyFrame(animationTime), 100.0f, 275.0f);
batch.end();
}
@Override
public void resize(int width, int height){
viewport.update(width, height, false);
}
private static class RegionComparator implements Comparator<AtlasRegion> {
@Override
public int compare(AtlasRegion region1, AtlasRegion region2){
return region1.name.compareTo(region2.name);
}
}
}
That's not the proper way to use a viewport:
viewport = new FitViewport(Gdx.graphics.getWidth(),Gdx.graphics.getHeight(), camera);
You should initialize the viewport with constant values (800x600, for example) and use them when writing your code:
viewport = new FitViewport(WORLD_WIDTH,WORLD_HEIGHT, camera);
The image will be stretched/resized automatically when rendering; you don't need to do that yourself:
batch.draw(techmanFrame, x, y, width, height);
PS: Do not call batch.setProjectionMatrix(camera.combined); inside the render() method. You only need to do that once.
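Putting it together, a minimal sketch, reusing the 12.80 x 7.20 world size from your own constants (the exact virtual size is your choice):
private static final float WORLD_WIDTH = 12.80f;   // virtual units, not pixels
private static final float WORLD_HEIGHT = 7.20f;

@Override
public void create() {
    camera = new OrthographicCamera();
    // Fixed world size: FitViewport letterboxes as needed, so the scene
    // occupies the same proportion of every screen (Note 4, Zenfone 2, desktop).
    viewport = new FitViewport(WORLD_WIDTH, WORLD_HEIGHT, camera);
    camera.position.set(WORLD_WIDTH * 0.5f, WORLD_HEIGHT * 0.5f, 0.0f);
}

@Override
public void resize(int width, int height) {
    viewport.update(width, height);
}

// In render(), draw in world units: a 512 px frame drawn
// 5.12 x 5.12 world units (512 * WORLD_TO_SCREEN) looks identical everywhere.
batch.draw(techmanFrame, x, y, 5.12f, 5.12f);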
Documentation:
https://github.com/libgdx/libgdx/wiki/Viewports
https://github.com/libgdx/libgdx/wiki/2D-Animation
Related
I'm having problems drawing some text on the screen, and I found out that depending on the viewport size, the text may end up off the display, despite calculating the coordinates as the center of the screen. Here's the code; it is just a slightly modified version of the default project created by the libGDX project generator:
package net.iberdroid.libgdxtestfonts;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.BitmapFont;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.utils.viewport.FitViewport;
import com.badlogic.gdx.utils.viewport.Viewport;
public class LibGdxTestFonts extends ApplicationAdapter {
SpriteBatch batch;
Texture img;
BitmapFont defaultFont;
private OrthographicCamera camera;
private Viewport viewport;
private float textY;
private float textX;
@Override
public void create() {
camera = new OrthographicCamera();
viewport = new FitViewport(
640,
480,
camera);
camera.setToOrtho(false);
batch = new SpriteBatch();
img = new Texture("badlogic.jpg");
defaultFont = new BitmapFont();
textX = viewport.getWorldWidth() / 2;
textY = viewport.getWorldHeight() / 2;
}
@Override
public void render() {
Gdx.gl.glClearColor(0, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
batch.begin();
batch.setProjectionMatrix(camera.combined);
defaultFont.setColor(Color.WHITE);
Gdx.app.log("render", String.format("TextX: %f TextY: %f", textX, textY));
defaultFont.draw(batch, "HELLO WORLD!", textX, textY);
batch.end();
}
@Override
public void dispose() {
batch.dispose();
img.dispose();
}
}
As far as I understand, that should draw a HELLO WORLD! text starting around the center of the screen, and it actually does with that viewport size. Now, if you try a bigger viewport, say 800x600, the text moves toward the top right; with even higher values there comes a point at which the text leaves the screen past the top-right corner.
The same happens in the opposite direction: the smaller the viewport, the further from the center and the closer to the bottom-left corner the text appears, until it eventually goes off screen too.
So either I am failing to grasp something here, or the BitmapFont.draw method seems to be ignoring the viewport size and using some other one that I cannot figure out.
If someone can work it out, please let me know.
Thanks a lot in advance!
P.S. I've also tried with a Hiero-generated font and had the same issue.
Adding this seems to solve the problem with the position, though I still have a problem with the scale, but that's a different issue.
@Override public void resize(int width, int height) {
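// the 'true' argument centers the camera on the world, which is what fixes the position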
viewport.update(width, height, true);
}
I am attempting to learn more about displaying and interacting with graphics and GUI elements. As part of this, I have been following along with a Tower Defense Tutorial that uses LWJGL and Slick-Util: https://www.youtube.com/watch?v=rfR09erJu7U&list=PLFUqwj4q1Zr8GHs6bO4d6gxMGUh_2pcNg
Extending some things out a bit on my own, I am trying to draw some basic untextured shapes alongside textured shapes. I can get a 2D line to draw; however, it only appears when drawn after certain textured shapes, and not after others. I'm wondering what I'm misunderstanding that causes this divergent behavior.
Here is my main class, LineTest.java:
package main;
import static helpers.Artist.BeginSession;
import static helpers.Artist.DrawQuadTex;
import static helpers.Artist.QuickLoad;
import static helpers.Artist.drawLine;
import org.lwjgl.opengl.Display;
import org.newdawn.slick.opengl.Texture;
import helpers.Artist;
public class LineTest {
public LineTest() {
BeginSession();
Texture bg = QuickLoad("bg");
//Texture bg = QuickLoad("exitbutton");
while(!Display.isCloseRequested()) {
DrawQuadTex(bg,0,0,Artist.WIDTH,Artist.HEIGHT);
drawLine(0,0,800,800);
Display.update();
Display.sync(60);
}
Display.destroy();
}
public static void main(String[] args) {
new LineTest();
}
}
And my helper class, Artist.java:
package helpers;
import static org.lwjgl.opengl.GL11.GL_BLEND;
import static org.lwjgl.opengl.GL11.GL_LINES;
import static org.lwjgl.opengl.GL11.GL_MODELVIEW;
import static org.lwjgl.opengl.GL11.GL_ONE_MINUS_SRC_ALPHA;
import static org.lwjgl.opengl.GL11.GL_PROJECTION;
import static org.lwjgl.opengl.GL11.GL_QUADS;
import static org.lwjgl.opengl.GL11.GL_SRC_ALPHA;
import static org.lwjgl.opengl.GL11.GL_TEXTURE_2D;
import static org.lwjgl.opengl.GL11.glBegin;
import static org.lwjgl.opengl.GL11.glBlendFunc;
import static org.lwjgl.opengl.GL11.glClearColor;
import static org.lwjgl.opengl.GL11.glClearDepth;
import static org.lwjgl.opengl.GL11.glColor4f;
import static org.lwjgl.opengl.GL11.glEnable;
import static org.lwjgl.opengl.GL11.glEnd;
import static org.lwjgl.opengl.GL11.glLoadIdentity;
import static org.lwjgl.opengl.GL11.glMatrixMode;
import static org.lwjgl.opengl.GL11.glOrtho;
import static org.lwjgl.opengl.GL11.glTexCoord2f;
import static org.lwjgl.opengl.GL11.glTranslatef;
import static org.lwjgl.opengl.GL11.glVertex2f;
import java.io.IOException;
import java.io.InputStream;
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.newdawn.slick.opengl.Texture;
import org.newdawn.slick.opengl.TextureLoader;
import org.newdawn.slick.util.ResourceLoader;
public class Artist {
public static final int WIDTH = 640, HEIGHT = 480;
public static void BeginSession() {
Display.setTitle("Line Test");
try {
Display.setDisplayMode(new DisplayMode(WIDTH, HEIGHT));
Display.create();
} catch (LWJGLException e) {
e.printStackTrace();
}
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, WIDTH, HEIGHT, 0, 1, -1);
glMatrixMode(GL_MODELVIEW);
glEnable(GL_TEXTURE_2D);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClearDepth(1);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}
public static void drawLine(int x1, int y1, int x2, int y2) {
glColor4f(0.0f, 1.0f, 0.2f, 1.0f);
glBegin(GL_LINES);
glVertex2f((float) x1, (float) y1);
glVertex2f((float) x2, (float) y2);
glEnd();
glColor4f(1f,1f,1f,1f);
}
public static void DrawQuadTex(Texture tex, float x, float y, float width, float height) {
tex.bind();
glTranslatef(x, y, 0);
glBegin(GL_QUADS);
glTexCoord2f(0, 0);
glVertex2f(0, 0);
glTexCoord2f(1, 0);
glVertex2f(width, 0);
glTexCoord2f(1, 1);
glVertex2f(width, height);
glTexCoord2f(0,1);
glVertex2f(0, height);
glEnd();
glLoadIdentity();
}
public static Texture LoadTexture(String path, String fileType) {
Texture tex = null;
InputStream in = ResourceLoader.getResourceAsStream(path);
try {
tex = TextureLoader.getTexture(fileType, in);
in.close();
} catch (IOException e) {
e.printStackTrace();
}
return tex;
}
public static Texture QuickLoad(String name) {
Texture tex = null;
tex = LoadTexture(name + ".png", "PNG");
return tex;
}
}
The important part of my problem is within the main class, right here:
Texture bg = QuickLoad("bg");
//Texture bg = QuickLoad("exitbutton");
while(!Display.isCloseRequested()) {
DrawQuadTex(bg,0,0,Artist.WIDTH,Artist.HEIGHT);
drawLine(0,0,800,800);
When my Texture bg is getting my 'bg' graphic (a solid black PNG file), the drawLine function doesn't seem to actually draw a line. However, if I change my Texture bg to get my 'exitbutton' graphic (a blue square with "Exit" written in it, still a PNG file) the drawLine function does create a visible line.
This imgur album contains the output for both: Texture bg = QuickLoad("bg"); & Texture bg = QuickLoad("exitbutton");
http://imgur.com/a/OVOzD
If necessary, I can also upload both PNG files that I am using, but I currently cannot include more than 2 links in my question. bg.png is a mono-black 64x64 PNG; exitbutton.png is 512x512.
Just looking to understand what is causing this. Thank you!
The problem is most likely that you enable texturing in the BeginSession() method, and then keep it enabled for the whole rendering:
glEnable(GL_TEXTURE_2D);
This means that your lines will be textured. Now, you may say: "But... I'm not specifying texture coordinates for the lines!" That does not matter. The immediate mode rendering API in OpenGL is state based, so whichever texture coordinates were specified last are the ones used. Since you call DrawQuadTex() before drawLine(), the last texture coordinates specified in DrawQuadTex() will be your current texture coordinates when drawing the line:
glTexCoord2f(0,1);
So the color of the line will be the texel at position (0, 1) of the currently bound texture, modulated with the color specified for the line:
glColor4f(0.0f, 1.0f, 0.2f, 1.0f);
If the two colors multiplied together result in black, e.g. because the given texel is black, the result will be a black line on black background. Which looks a lot like not rendering anything at all.
The cleanest solution is to enable texturing only while drawing primitives that you want to texture, and disable it afterwards. For example, in the DrawQuadTex() method:
public static void DrawQuadTex(Texture tex, float x, float y, float width, float height) {
tex.bind();
glEnable(GL_TEXTURE_2D);
...
glDisable(GL_TEXTURE_2D);
}
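Applied to the full method from Artist.java (you will also need to static-import glDisable), it would look like this:
public static void DrawQuadTex(Texture tex, float x, float y, float width, float height) {
    tex.bind();
    glEnable(GL_TEXTURE_2D);   // texture only this quad
    glTranslatef(x, y, 0);
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0);
    glVertex2f(0, 0);
    glTexCoord2f(1, 0);
    glVertex2f(width, 0);
    glTexCoord2f(1, 1);
    glVertex2f(width, height);
    glTexCoord2f(0, 1);
    glVertex2f(0, height);
    glEnd();
    glLoadIdentity();
    glDisable(GL_TEXTURE_2D);  // lines drawn afterwards stay untextured
}
With that change, the glEnable(GL_TEXTURE_2D) call can be dropped from BeginSession().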
A few hints on how to track down these types of problems, or avoid them altogether:
Set the clear color to something other than black during development; then you can easily tell whether you're rendering black geometry or nothing at all (a one-line example follows these hints).
Use shader based rendering. The fixed function pipeline may look easier at first, but it's really not. With shaders, the resulting color is exactly what you implement, not some magic combination of values based on a bunch of fixed state.
While you're at it, you may also want to avoid using immediate mode rendering. While not directly causing this problem, it's just as obsolete as using the fixed function pipeline.
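For the first hint, the change is a single line in BeginSession(); any color you would never render works, magenta for example:
glClearColor(1.0f, 0.0f, 1.0f, 1.0f); // black geometry now stands out against magenta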
I am creating a game and trying to set up a splash screen.
Whenever I render the sprite that I want to tween using the sprite.draw method, which looks like this:
@Override
public void render(float delta)
{
Gdx.gl20.glClearColor(0.2F, 0.5F, 1F, 1F);
Gdx.gl20.glClear(GL20.GL_COLOR_BUFFER_BIT);
tm.update(delta);
cam.update();
sb.setProjectionMatrix(cam.combined);
sb.begin();
Assets.splash_spr_bg.draw(sb);
sb.end();
}
The tweening works great, except I can only see 1/4 of the picture on my screen; it is completely out of position.
And whenever I try to use this code to render the sprite, using the SpriteBatch to draw it, which looks like this:
@Override
public void render(float delta)
{
Gdx.gl20.glClearColor(0.2F, 0.5F, 1F, 1F);
Gdx.gl20.glClear(GL20.GL_COLOR_BUFFER_BIT);
tm.update(delta);
cam.update();
sb.setProjectionMatrix(cam.combined);
sb.begin();
sb.draw(Assets.splash_spr_bg, 0, 0);
sb.end();
}
I can see the background fine: great quality, correct size and position and all that. However, the tweening doesn't work at all; nothing happens.
Why does this not work? How can I fix it?
Here is some other code.
Initialization of the TweenHandler class:
package com.heavenapps.jumpdodge.handlers;
import com.badlogic.gdx.graphics.g2d.Sprite;
import aurelienribon.tweenengine.TweenAccessor;
public class TweenHandler implements TweenAccessor<Sprite>
{
public static final int ALPHA = 1;
@Override
public int getValues(Sprite target, int tweenType, float[] returnValues)
{
switch(tweenType)
{
case ALPHA:
returnValues[0] = target.getColor().a;
return 1;
default:
return 0;
}
}
@Override
public void setValues(Sprite target, int tweenType, float[] newValues)
{
switch(tweenType)
{
case ALPHA:
target.setColor(1, 1, 1, newValues[0]);
break;
}
}
}
Initialization of the sprite/texture:
package com.heavenapps.jumpdodge.handlers;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.Texture.TextureFilter;
import com.badlogic.gdx.graphics.g2d.Sprite;
public class Assets
{
public static Texture splash_tex_bg;
public static Sprite splash_spr_bg;
public static void init()
{
// Splash Screen
splash_tex_bg = new Texture(Gdx.files.internal("Splash Screen/Background.png"));
splash_tex_bg.setFilter(TextureFilter.Linear, TextureFilter.Linear);
splash_spr_bg = new Sprite(splash_tex_bg);
splash_spr_bg.setOrigin(splash_spr_bg.getWidth() / 2, splash_spr_bg.getHeight() / 2);
splash_spr_bg.setPosition(Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
splash_spr_bg.setColor(1, 1, 1, 0);
}
}
Usage of the tweening:
public void fadeSplashScreen()
{
Tween.to(Assets.splash_spr_bg, TweenHandler.ALPHA, 2F).target(1).ease(TweenEquations.easeInBounce).start(tm);
}
You need to set the sprite's position if you want to draw it using your first method. (sprite.draw(spriteBatch)). And you need to use your first method if you want to use the sprite's color, which is being controlled by the tween.
It looks like you did give the background sprite an initial position, but you put it off screen. So change
splash_spr_bg.setPosition(Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
to
splash_spr_bg.setPosition(0, 0);
The reason the second method you showed (spriteBatch.draw(sprite, x, y)) draws it in the correct position is that this method of drawing ignores the fact that it's a sprite and just draws the texture region owned by the sprite in whatever position you give it. And this method doesn't fade the sprite because it's ignoring the fact that it is a sprite (which has a color).
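In short, a minimal sketch of the combined fix, assuming the rest of the setup stays as posted:
// In Assets.init(): start the splash background on screen instead of off the top-right corner.
splash_spr_bg.setPosition(0, 0);

// In render(): draw through the Sprite so its tweened color (the alpha) is actually applied.
sb.begin();
Assets.splash_spr_bg.draw(sb);
sb.end();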
I'm writing a 3D game that is drawn with ASCII art, just to achieve a special look. I render my models with a ModelBatch, and now I would like to convert every pixel to an ASCII symbol before drawing the final result to the screen. I already have the code to convert pixels to ASCII, but I have no idea how to get the pixels of the render result.
Code so far (not including the ASCII conversion):
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.VertexAttributes.Usage;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.materials.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.materials.Material;
import com.badlogic.gdx.graphics.g3d.utils.ModelBuilder;
public class MyGame implements ApplicationListener {
private PerspectiveCamera cam;
private ModelBatch batch;
public Model model;
public ModelInstance instance;
@Override
public void create() {
//float w = Gdx.graphics.getWidth();
//float h = Gdx.graphics.getHeight();
cam = new PerspectiveCamera(67, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
cam.position.set(10f, 10f, 10f);
cam.lookAt(0,0,0);
cam.near = 0.1f;
cam.far = 300f;
cam.update();
batch = new ModelBatch();
ModelBuilder modelBuilder = new ModelBuilder();
model = modelBuilder.createBox(5f, 5f, 5f,
new Material(ColorAttribute.createDiffuse(Color.GREEN)),
Usage.Position | Usage.Normal);
instance = new ModelInstance(model);
}
@Override
public void dispose() {
batch.dispose();
model.dispose();
}
@Override
public void render() {
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
batch.begin(cam);
batch.render(instance);
batch.end();
}
@Override
public void resize(int width, int height) {
}
@Override
public void pause() {
}
@Override
public void resume() {
}
}
I think you want to render to a texture, convert the texture to a Pixmap, and then parse the pixmap to draw your ASCII characters to the real screen.
First, to render to a texture (offscreen buffer) in libGDX, use a FrameBuffer:
fbo = new FrameBuffer(Pixmap.Format.RGBA8888, width, height, true); // format, size and a depth attachment are required
fbo.begin();
// draw your stuff
// ...
fbo.end();
Second, to get bytes that the CPU can use, you can use the ScreenUtils class (http://libgdx.badlogicgames.com/nightlies/docs/api/com/badlogic/gdx/utils/ScreenUtils.html). You can either get a Pixmap or a raw byte[]. For example, replace the // ... above with:
Pixmap p = ScreenUtils.getFrameBufferPixmap(0, 0, width, height);
Now you can wander over the pixmap checking out pixels, using Pixmap.getPixel (http://libgdx.badlogicgames.com/nightlies/docs/api/com/badlogic/gdx/graphics/Pixmap.html#getPixel(int,%20int)).
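Putting the pieces together, a rough sketch of how the render pass could look; the toAscii() helper is hypothetical and stands in for your existing pixel-to-character code:
// Created once (e.g. a field initialized in create()), matching the screen size;
// the final 'true' requests a depth attachment for the 3D scene.
FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGBA8888,
        Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), true);

// Then, in render():
fbo.begin();
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
batch.begin(cam);
batch.render(instance);
batch.end();
// Read the pixels back while the FBO is still bound.
Pixmap pixels = ScreenUtils.getFrameBufferPixmap(0, 0, fbo.getWidth(), fbo.getHeight());
fbo.end();

// Walk the pixmap and feed every pixel to your converter.
for (int y = 0; y < pixels.getHeight(); y++) {
    for (int x = 0; x < pixels.getWidth(); x++) {
        int rgba = pixels.getPixel(x, y); // packed RGBA8888
        char c = toAscii(rgba);           // hypothetical: your pixel-to-ASCII conversion
        // ... draw c at the matching screen cell, e.g. with a BitmapFont ...
    }
}
pixels.dispose(); // Pixmaps wrap native memory and must be freed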
I'm seriously betting that I did something effing stupid and just can't seem to notice it.
package com.me.mygdxgame;
import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.Texture.TextureFilter;
import com.badlogic.gdx.graphics.g2d.BitmapFont;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
public class Locked implements ApplicationListener
{
private OrthographicCamera camera;
private SpriteBatch batch;
private Texture texture;
private Sprite sprite;
private BitmapFont font;
private CharSequence str = "Hello World!";
private float width;
private float height;
@Override
public void create()
{
width = Gdx.graphics.getWidth();
height = Gdx.graphics.getHeight();
camera = new OrthographicCamera(1, height / width);
batch = new SpriteBatch();
texture = new Texture(Gdx.files.internal("data/libgdx.png"));
texture.setFilter(TextureFilter.Linear, TextureFilter.Linear);
TextureRegion region = new TextureRegion(texture, 0, 0, 512, 275);
sprite = new Sprite(region);
sprite.setSize(0.9f, 0.9f * sprite.getHeight() / sprite.getWidth());
sprite.setOrigin(sprite.getWidth() / 2, sprite.getHeight() / 2);
sprite.setPosition(-sprite.getWidth() / 2, -sprite.getHeight() / 2);
font = new BitmapFont(Gdx.files.internal("data/digib.fnt"),
Gdx.files.internal("data/digib.png"), false);
}
@Override
public void dispose()
{
batch.dispose();
texture.dispose();
}
@Override
public void render()
{
Gdx.gl.glClearColor(1, 1, 1, 1);
Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
batch.setProjectionMatrix(camera.combined);
batch.begin();
font.setColor(0.0f, 0.0f, 0.0f, 1.0f);
//sprite.draw(batch);
font.draw(batch, str, width*0.5f, height*0.5f);
batch.end();
}
@Override
public void resize(int width, int height)
{
}
@Override
public void pause()
{
}
@Override
public void resume()
{
}
}
The project was generated with the template tool they provide, gdx-setup-ui.jar.
As you can see in the code, I didn't bother to get rid of the default code (just some simple draw calls that render the libGDX logo).
So, with the cleanly generated project, I followed this guide:
http://code.google.com/p/libgdx-users/wiki/addingText2D
and finally arrived at the code provided above.
The problem is, why won't the !##$ing text show!? I changed the position so many times and still no luck :\
Did I miss something?
FYI: The fonts are fine, I dropped them into another game and it works.
Try changing the projection matrix like this:
Matrix4 normalProjection = new Matrix4().setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
batch.setProjectionMatrix(normalProjection);
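That projection makes the batch work in screen pixels, so the coordinates handed to font.draw become actual pixel positions. A quick sketch based on the question's fields:
Matrix4 normalProjection = new Matrix4().setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
batch.setProjectionMatrix(normalProjection);
batch.begin();
font.setColor(0.0f, 0.0f, 0.0f, 1.0f);
font.draw(batch, str, width * 0.5f, height * 0.5f); // now roughly mid-screen, in pixels
batch.end();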
All I do is
spriteBatch = new SpriteBatch();
font = new BitmapFont(Gdx.files.internal("data/nameOfFont.fnt"),
Gdx.files.internal("data/nameOfFont.png"), false);
and in the render method:
spriteBatch.begin();
font.setColor(1.0f, 1.0f, 1.0f, 1.0f);
font.draw(spriteBatch, "some string", 25, 160);
spriteBatch.end();
You can read something more about it on my blog: http://algorhymes.wordpress.com/2012/11/17/javalibgdx-fonts/
Personally I'm not a big fan of converting all the fonts to .fnt format. If you need different sizes for a certain font you have to spend a lot of time (and app space) to make all the conversions.
You can just use the FreeType extension and load straight from a .ttf:
FreeTypeFontGenerator generator = new FreeTypeFontGenerator(fontFile);
BitmapFont font15 = generator.generateFont(15);
BitmapFont font22 = generator.generateFont(22);
generator.dispose();
More info here
Rendering is done in the same way as explained by watis.
Create a .fnt file using Hiero, which is provided on the libGDX website.
Set the font size to 150; it will create a .fnt file and a .png file.
Copy both files into your assets folder.
Now declare the font:
BitmapFont font;
Now, in the create method:
font = new BitmapFont(Gdx.files.internal("data/100.fnt"), false); // 100 is the font name; you can give your font any name
In render:
font.setScale(.2f);
font.draw(batch, "whatever you want to write", x, y);
This will work smoothly.
The main problem with your code is that you have created the camera with viewportWidth = 1 and viewportHeight = height/width, but you are drawing the font at width*0.5f and height*0.5f, which is far outside the camera's view.
Either change the camera initialization to
camera = new OrthographicCamera(width, height);
....
or change the font draw statement to
font.setScale(1,height/width);
font.draw(batch, str, 0.5f, height/width*0.5f);
Did you try giving the position manually, like this? I hope this will work:
batch.setProjectionMatrix(camera.combined);
batch.enableBlending();
batch.begin();
font.draw(batch, yourString, 100,100);
batch.end();