LibGDX SpriteBatch bug - Java

However, I have a weird issue: when drawing, it seems the outside 1px of an image is stretched to fit a rectangle, while the inside is only stretched to an extent. I was drawing 48x48 tiles, but drew a 500x500 tile to show the issue. [The 500x500 tile draws fine.]
The worst part is that it seems to choose when to stretch and when not to, and also what to stretch. I'm sorry, this is hard to explain, but I have attached an image that I hope does a better job.
It could just be me misunderstanding how to use draw() with SpriteBatch.
Edit: The tile is 48x48, not 64x64; I've just been working all day.

This is because you are not rendering "pixel perfect", which means your image does not line up with the pixel grid of your monitor. A quick fix might be to set a linear filter for your textures: by default they use nearest filtering, so a screen pixel simply inherits the closest color it can get. A linear filter interpolates colors and makes that line "look" thinner.
texture.setFilter(Texture.TextureFilter.Linear, Texture.TextureFilter.Linear);
If you are using TexturePacker you can do this in one go by altering its settings.
texturePackerSetting.filterMin = Texture.TextureFilter.Linear;
texturePackerSetting.filterMag = Texture.TextureFilter.Linear;
Or you could edit the atlas file itself by changing the filter parameter to:
filter: Linear,Linear
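For reference, that filter line lives in the page header at the top of the .atlas file; the file name and size here are just illustrative:
tiles.png
size: 1024,1024
format: RGBA8888
filter: Linear,Linear
repeat: none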
This obviously costs more processing power, since more calculations are needed for each pixel drawn to the screen, but I would not worry about it until your drawing becomes a bottleneck.
Another solution is to draw pixel perfect, which means you need to set your viewport to the size of the device (Gdx.graphics.getWidth(), Gdx.graphics.getHeight()), in other words a ScreenViewport, and draw your textures at the exact sizes you want them. Of course, this means a screen with more pixels sees more of your game world than a screen with fewer pixels, and the more pixels a device has, the smaller your textures will look. Another drawback is that you have to forget about zooming, or draw sprites for each zoom level so they line up with the pixel grid of the device again.
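A minimal sketch of that pixel-perfect setup, assuming batch and tileTexture already exist:
OrthographicCamera camera = new OrthographicCamera();
ScreenViewport viewport = new ScreenViewport(camera);
viewport.update(Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), true);
// Draw each 48x48 tile at exactly 48x48 world units and at integer
// positions, so texels map 1:1 onto screen pixels.
batch.setProjectionMatrix(camera.combined);
batch.begin();
batch.draw(tileTexture, 96, 48, 48, 48);
batch.end();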

Related

Drawing many textured quads OpenGL ES 2.0

I'm programming an Android app that draws a grid which you can move around in. The grid consists of about 2000 to 5000 quads, each with a different texture. I defined 4 vertices and use an index buffer to draw each quad. Before drawing, I position it using a model matrix. As you can move in my scene, I use view frustum culling, which improves performance in some situations. Unfortunately, there may be cases where I need to draw all of the quads, so I want to ask how to prevent slow drawing.
I can't use a texture atlas as all of the textures are pretty big (from 256x256 to 1024x1024). I think calling glDrawElements() for each quad is what slows me down, but I don't know how to change that.
Another idea I had would be to draw the scene to a texture and just bind this texture to a single quad to create an illusion of the scene being drawn. As the user gets closer I could redraw it for better resolution. Could this work?
I look forward to any kind of help.
I can't use a texture atlas as all of the textures are pretty big (from 256x256 to 1024x1024).
You can fit 64 256x256 textures into a single 2048x2048 atlas, which is a huge amount, so you should definitely atlas. Even getting four 1024x1024 textures onto a 2048x2048 sheet is worth doing; it can quarter your draw call count.
And as WLGfx says in the comments to your question, you should batch up any quads that use the same texture (with atlasing there will be a lot more of these).
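A rough sketch of that grouping on top of GLES20; Quad, visibleQuads and bindBuffersFor are placeholders for your own geometry handling:
// Group the quads that survived culling by the texture (atlas page) they
// use, then issue one glDrawElements per group instead of one per quad.
Map<Integer, List<Quad>> byTexture = new HashMap<>();
for (Quad q : visibleQuads) {
    byTexture.computeIfAbsent(q.textureId, k -> new ArrayList<>()).add(q);
}
for (Map.Entry<Integer, List<Quad>> group : byTexture.entrySet()) {
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, group.getKey());
    bindBuffersFor(group.getValue()); // one VBO/IBO holding all quads of this group
    GLES20.glDrawElements(GLES20.GL_TRIANGLES,
            6 * group.getValue().size(), GLES20.GL_UNSIGNED_SHORT, 0);
}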
I think this would be enough, but you might still have a pretty high draw call count in your fully zoomed-out view. After implementing atlasing and batching, if performance is still a problem, you could create a separate asset set of thumbnail textures at, say, quarter resolution (so a 256x256 becomes 64x64). This thumbnail set would fit onto just a handful of 2048x2048 atlas sheets, and you could switch to it when zoomed out far enough.
Another idea I had would be to draw the scene to a texture and just bind this texture to a single quad to create an illusion of the scene being drawn. As the user gets closer I could redraw it for better resolution. Could this work?
This could work as long as your scene is very static; if the quads are moving or changing every frame, it might not help. Also, there might be a noticeable framerate hitch whenever you have to do the full redraw.

Pixelated texture filtering distorted

I've created an isometric tile-based game in LibGDX. The textures I'm using are 64x64 and packed using TexturePacker into a TextureAtlas. They are then drawn onto the screen. However, while moving around, the pixelated edges of the 64x64 textures flicker and are distorted, which can be seen in the images below. I have tried all filters available in TexturePacker; below you can see the results of the Linear and Nearest filters. Apart from the flickering, the linear filter adds a black outline to the textures. I would be fine with this if it weren't for the flickering when the camera moves around.
How the tile should appear:
Linear filtering (You can clearly see the black lines distorting):
Nearest filtering (Harder to see, but the pixelated lines are not straight):
The easiest place to spot it is on the top and bottom of the brown cube. The distortion happens in different places depending on camera movement (this is what causes the flickering).
Anyone know what causes this, or has a possible solution? I'm not sure if any code snippets are needed.
It is also worth mentioning that the camera is set to windowWidth/ppm and windowHeight/ppm (ppm = 64), and the textures are drawn onto a batch whose projection matrix is set to camera.combined.
Edit: Somehow it's better when reducing the window height from 800 to 710 (nearest):
Turn on the premultiplyAlpha option in TexturePacker and call setBlendFunction(GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA) on the SpriteBatch. This should get rid of the flickering black fringing. Basically, with linear filtering, when the sprite's edges don't exactly line up with the pixels on the screen, the color of a screen pixel is linearly sampled from an image pixel on the edge of your sprite and an image pixel in the invisible black space (RGBA = 0,0,0,0) next to it, so the edges can appear darker and more transparent than intended. Premultiplying the alpha cures this problem by changing the order of operations of the interpolation. Detailed explanation here and here.
Also, use a filterMin of MipMapLinearNearest or MipMapLinearLinear to make sure you aren't getting minification artifacts. (The first performs better; the second looks better at certain zoom levels and should be used if your camera zooms in and out.)
And finally, filterMag should be Linear.
Nearest filtering will always produce uneven artifacts if the sprites are not drawn at exactly 1X, 2X, 3X, etc. of their original size, because there will be certain rows and columns of the screen where a pixel in the image is drawn twice.
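Putting the above together, a sketch of both sides; the packer settings belong in your build step, and batch is your existing SpriteBatch:
// Build step: premultiply alpha and request mipmapped minification.
TexturePacker.Settings settings = new TexturePacker.Settings();
settings.premultiplyAlpha = true;
settings.filterMin = Texture.TextureFilter.MipMapLinearLinear;
settings.filterMag = Texture.TextureFilter.Linear;
// Runtime: switch the SpriteBatch to premultiplied-alpha blending.
batch.setBlendFunction(GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA);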

Rotate sprite on sprite pixel level, not screen pixel level, in LibGDX

I'm making a pixelated game, and I'm trying to rotate a sprite. However, I'm not achieving the sort of rotating effect I'm aiming for.
Currently, my sprite looks like this when it rotates:
As you can see, it rotates relatively smoothly; the 'big pixels' themselves rotate smoothly. However, this isn't the rotating effect I'm looking for. Instead, this is how I want it to rotate:
However, preferably in a way that doesn't distort the pixels as much. You can see the difference. I want the actual 'big pixels' to rotate, not the 'screen pixels'.
I think the issue might lie in how I scale the pixels up. What I'm doing is zooming the camera in, sort of moving it closer. What I instead want to do is render like normal and then just scale up the screen pixels. That way I'd automatically get the rotation effect I want. I don't know how to do that, though.
This is how I currently 'scale up the pixels':
camera = new OrthographicCamera();
camera.setToOrtho(false, 1280 / 4, 720 / 4);
The game's resolution is 1280x720, so the way I make the pixels bigger is simply by zooming in 4x.
However, what I instead want to do is render like normal, then stretch the screen 4x.
Any help on how I could do this would be greatly appreciated.
Have a look at this post; it describes a solution to your problem: first render to a small frame buffer with nearest-neighbor interpolation, then render that buffer to the screen. It may not be the most efficient way, but it definitely achieves this behavior.
Good luck!
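In LibGDX, the low-res-then-upscale approach looks roughly like this; sizes match the 1280x720 / 4x case above, and batch is assumed to use a 1280x720 screen projection:
// create():
FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGBA8888, 1280 / 4, 720 / 4, false);
// render():
fbo.begin();
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
// ... draw the scene normally at 320x180, rotations included ...
fbo.end();
Texture sceneTex = fbo.getColorBufferTexture();
sceneTex.setFilter(Texture.TextureFilter.Nearest, Texture.TextureFilter.Nearest);
batch.begin();
// FBO textures come out upside down, so flip Y while stretching 4x.
batch.draw(sceneTex, 0, 0, 1280, 720,
        0, 0, sceneTex.getWidth(), sceneTex.getHeight(), false, true);
batch.end();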

LWJGL Camera stretches and shrinks shapes

I've been looking around and I couldn't find an answer to this. What I have done is create a cube/box, and the camera squashes and stretches it depending on where I am looking. This all seems to resolve itself when the screen is perfectly square, but when I'm using 16:9 it stretches and squashes the shapes. Is it possible to change this?
16:9:
And this is 500px x 500px:
As a side question would it be possible to change the color of background "sky"?
OpenGL uses a cube [-1,1]^3 to represent the frustum in normalized device coordinates. The viewport transform stretches this in the x and y directions to [0,width] and [0,height]. So to get the correct output aspect ratio, you have to take the viewport dimensions into account when transforming the vertices into clip space. Usually, this is part of the projection matrix. The old fixed-function gluPerspective() function has a parameter to directly create a frustum for a given aspect ratio. As you do not show any code, it is hard to say what exactly you should change, but it should be quite easy, as it boils down to a simple scale operation along x and y.
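To illustrate, this is the aspect-corrected frustum that gluPerspective() builds, laid out column-major as OpenGL expects; the concrete values are just examples:
float fovY = 60f, zNear = 0.1f, zFar = 100f;
float aspect = 1600f / 900f; // viewport width / height, i.e. 16:9
float f = (float) (1.0 / Math.tan(Math.toRadians(fovY / 2.0)));
float[] projection = {
    f / aspect, 0f, 0f, 0f,   // dividing by aspect is exactly the x scale
    0f, f, 0f, 0f,            // that keeps a cube a cube at 16:9
    0f, 0f, (zFar + zNear) / (zNear - zFar), -1f,
    0f, 0f, 2f * zFar * zNear / (zNear - zFar), 0f,
};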
To the side question: that color is defined by the value the color buffer is set to when clearing it. You can set that color via glClearColor().
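For example, for a light blue sky:
GL11.glClearColor(0.5f, 0.8f, 1.0f, 1.0f); // RGBA, each in [0,1]
GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);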

AndEngine - artifacts while scrolling map (TextureOptions related)

I'm developing a 2D side-scrolling Android game using AndEngine.
I have a problem with tile quality.
If I use the DEFAULT texture option for the texture containing my tiles, it doesn't look perfect; the contours are NOT smooth, etc.:
The DEFAULT texture option uses these OpenGL parameters:
new TextureOptions(GL10.GL_NEAREST, GL10.GL_NEAREST, GL10.GL_CLAMP_TO_EDGE, GL10.GL_CLAMP_TO_EDGE, GL10.GL_MODULATE, true);
But lately I realized that if I use these parameters (similar to the BILINEAR parameters, except for the last one):
new TextureOptions(GL10.GL_LINEAR, GL10.GL_LINEAR, GL10.GL_CLAMP_TO_EDGE, GL10.GL_CLAMP_TO_EDGE, GL10.GL_MODULATE, true)
the graphics look smooth (I would say perfect; check the image below).
Everything would be perfect, but while the camera moves (the camera chases the player), contours of those sprites become visible, as for example in this screenshot:
I have been trying different OpenGL parameters, but with no luck. I would be grateful for some help. With the DEFAULT texture option this problem doesn't exist, but the quality is bad. Thanks.
PS: I have tried casting to integer inside my camera's setCenter method, but with no luck; some people said it should help, but it didn't.
This occurs because the function that smooths the textures samples pixels that are outside of the pictures on the texture atlas. These are black by default, so the pixels on the edges are poisoned by the black area outside.
I temporarily fixed the issue by extending the picture by 1px on all sides and putting there a copy of the adjacent 1px line from the picture. Then I set my TextureRegion to contain only the middle of the picture, with the padding outside the region. The result is probably not perfect, but the lines are no longer noticeable.
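The extrusion itself can be done offline with plain Java before packing the atlas; a sketch, not AndEngine-specific:
// Copy src into a (w+2)x(h+2) image, clamping reads to the source edges,
// so the 1px border duplicates the outermost pixel rows/columns.
static BufferedImage extrude(BufferedImage src) {
    int w = src.getWidth(), h = src.getHeight();
    BufferedImage out = new BufferedImage(w + 2, h + 2, BufferedImage.TYPE_INT_ARGB);
    for (int y = 0; y < h + 2; y++) {
        for (int x = 0; x < w + 2; x++) {
            int sx = Math.min(Math.max(x - 1, 0), w - 1);
            int sy = Math.min(Math.max(y - 1, 0), h - 1);
            out.setRGB(x, y, src.getRGB(sx, sy));
        }
    }
    return out;
}
// The TextureRegion then covers only the inner w x h area, so linear
// sampling at the edges reads the duplicated border instead of black.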
I have seen someone on the AndEngine forums say that the problem is fixed in the newest version, so you may try updating.
