I draw lots of dots on a Swing JPanel.
I made the panel zoomable. Zooming in works fine, but I have a problem when zooming out: once it is zoomed out to the limit, I am drawing one dot per pixel.
I want to zoom out even further; how can I draw that with Graphics2D?
The dots on the panel are also editable with the mouse. Since they are drawn at less than one pixel each, how can I get their actual drawn position?
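For reference, a minimal sketch of the kind of setup described, assuming the zoom is applied as a scale on the Graphics2D and undone when picking (class and field names are made up):

import java.awt.*;
import java.awt.geom.*;
import javax.swing.*;

class DotPanel extends JPanel {
    double zoom = 1.0;                                   // shared by painting and picking
    java.util.List<Point2D> dots = new java.util.ArrayList<>();

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        Graphics2D g2 = (Graphics2D) g;
        g2.scale(zoom, zoom);                            // when zoom < 1, many dots collapse into one pixel
        for (Point2D p : dots) {
            g2.fill(new Rectangle2D.Double(p.getX(), p.getY(), 1, 1));
        }
    }

    // Undo the zoom so a mouse position can be matched against the stored dots.
    Point2D toModel(Point mouse) {
        return new Point2D.Double(mouse.x / zoom, mouse.y / zoom);
    }
}

With this kind of inverse mapping, the drawn screen position of a dot is simply (x * zoom, y * zoom), even when several dots share the same pixel.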
Related
However, I have a weird issue when drawing: it seems the outside 1px of an image is stretched to fit a rectangle, but the inside is only stretched to an extent. I was drawing to 48x48 tiles, but drew a 500x500 tile to show the issue. [The 500x500 draws fine.]
The worst part is that it seems to choose when to stretch and when not to, and also what to stretch. I'm sorry, this is hard to explain, but I have attached an image that I hope does a better job.
It could just be a misunderstanding of how to use draw with SpriteBatch.
Edit: the tile is 48x48, not 64x64; I've just been working all day.
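For what it's worth, a hedged sketch of the SpriteBatch call presumably being used here; region, x, and y are assumptions, and the 48x48 size comes from the question:

batch.begin();
batch.draw(region, x, y, 48f, 48f);   // stretches the whole region to a 48x48 rectangle
batch.end();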
This is because you are not rendering "pixel perfect" which means your image does not line up with the pixel grid of your monitor. A quick fix might be to set a linear filter for your textures, since by default it uses nearest and thus a pixel on the screen will inherit the closest color it can get. A linear filter will interpolate colors and make that line "look" thinner.
texture.setFilter(Texture.TextureFilter.Linear, Texture.TextureFilter.Linear);
If you are using TexturePacker you can do this in one go by altering its settings.
texturePackerSetting.filterMin = Texture.TextureFilter.Linear;
texturePackerSetting.filterMag = Texture.TextureFilter.Linear;
Or you could edit the atlas file itself by changing the filter parameter to:
filter: Linear,Linear
This obviously costs more processing power, since more calculations are needed for each pixel you draw to the screen, but I would not worry about that until your drawing starts to become a bottleneck.
Another solution is to draw pixel perfect, which means you set your viewport to the size of the device (Gdx.graphics.getWidth(), Gdx.graphics.getHeight()), in other words a ScreenViewport, and draw your textures at the exact sizes you want them. Of course, this means a screen with more pixels sees more of your game world than a screen with fewer pixels, and the more pixels a device has, the smaller your textures will look. Another drawback is that you have to forget about any zooming, or draw sprites for each level of zoom so they line up with the pixel grid of the device again.
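A minimal sketch of that pixel-perfect alternative, assuming a plain SpriteBatch and a 48x48 region (tileRegion, x, and y are made-up names): with a ScreenViewport one world unit equals one screen pixel, so a texture drawn at its native size lines up with the pixel grid.

ScreenViewport viewport = new ScreenViewport();          // 1 world unit == 1 screen pixel
viewport.update(Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), true);

batch.setProjectionMatrix(viewport.getCamera().combined);
batch.begin();
batch.draw(tileRegion, x, y, 48f, 48f);                  // drawn at its exact pixel size
batch.end();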
I've created an isometric tile-based game in LibGDX. The textures I'm using are 64x64 and packed using TexturePacker into a TextureAtlas. They are then drawn onto the screen. However, while moving around, the pixelated edges of the 64x64 textures flicker and are distorted, which can be seen in the images below. I have used all the filters available in TexturePacker; below you can see the results of the Linear and Nearest filters. Apart from the flickering, the linear filter adds a black outline to the textures. I would be fine with this if it wasn't for the flickering when the camera moves around.
How the tile should appear:
Linear filtering (You can clearly see the black lines distorting):
Nearest filtering (Harder to see, but the pixelated lines are not straight):
The easiest place to spot it is on the top and bottom of the brown cube. The distortion happens on different places depending on camera movement (this causes flickering).
Does anyone know what causes this, or have a possible solution? I'm not sure if any code snippets are needed.
It is also worth mentioning that the camera is set to windowHeight/ppm (ppm = 64) and windowWidth/ppm, then the textures are drawn onto a batch that has its projection matrix set to camera.combined.
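For context, a hedged sketch of the setup described in that sentence (variable names are assumptions): the camera viewport is the window size divided by ppm, and the batch renders through camera.combined.

float ppm = 64f;
OrthographicCamera camera = new OrthographicCamera();
camera.setToOrtho(false, Gdx.graphics.getWidth() / ppm, Gdx.graphics.getHeight() / ppm);
camera.update();

batch.setProjectionMatrix(camera.combined);
batch.begin();
batch.draw(tileRegion, tileX, tileY, 1f, 1f);   // one 64x64 texture per world unit
batch.end();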
Edit: Somehow it's better when reducing the window height from 800 to 710 (nearest):
Turn on the premultiplyAlpha option in TexturePacker and call setBlendFunction(GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA) on the SpriteBatch. This should get rid of the flickering black fringing. Basically, with linear filtering, when the sprite's edges don't exactly line up with the pixels on the screen, the color of a screen pixel is linearly sampled from an image pixel on the edge of your sprite and an image pixel in the invisible black space (RGBA = 0000) next to it, so the edges can appear darker and more transparent than intended. Pre-multiplying the alpha cures this problem by changing the order of operations of the interpolation. Detailed explanations here and here.
Also, use filterMin of MipMapLinearNearest or MipMapLinearLinear to make sure you aren't getting minifying artifacts. (The first one performs better and the second one looks better at certain zoom levels and should be used if your camera zooms in and out.)
And finally, filterMag should be Linear.
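A sketch of how those settings might be combined, assuming TexturePacker's programmatic Settings class and a standard SpriteBatch (variable names are assumptions):

TexturePacker.Settings settings = new TexturePacker.Settings();
settings.premultiplyAlpha = true;
settings.filterMin = Texture.TextureFilter.MipMapLinearLinear;   // or MipMapLinearNearest
settings.filterMag = Texture.TextureFilter.Linear;

// At render time, switch the batch to premultiplied-alpha blending:
batch.setBlendFunction(GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA);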
Nearest filtering will always produce uneven artifacts if the sprites are not drawn at exactly 1X, 2X, 3X, etc. of their original size, because there will be certain rows and columns of the screen where a pixel in the image is drawn twice.
I'm working on a simple game for desktop and Android using the LibGDX framework. The game spawns some balls that move from one side of the screen to the opposite one. I want to show white circles over a black background, but I also want to draw some white text. My problem is that I want to draw the intersections between balls, and between balls and text, in black. I want something like this:
Another of my problems is that I'm using ShapeRenderer to draw the circles and they are too pixelated.
What is the best way to render circles in libGDX (I'll have to render between 1 and 100)? Is it possible to get the effect shown in the image?
What I would like is to give circles a fill color with a gradient that starts from the middle and then as it moves out to the edges becomes progressively more transparent, giving a blur effect. What is the simplest way of doing this?
Try setting an appropriately defined java.awt.RadialGradientPaint (using Colors with alpha), then render your circles using that. You may need to translate the graphics coordinate system to get the gradient centered in the circle. (http://docs.oracle.com/javase/7/docs/api/java/awt/RadialGradientPaint.html)
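A minimal sketch of that idea in plain Java2D (method and parameter names are made up): the gradient runs from opaque white at the center to fully transparent white at the edge, which gives the soft falloff.

void drawSoftCircle(Graphics2D g2, float cx, float cy, float radius) {
    float[] fractions = {0f, 1f};
    Color[] colors = {new Color(255, 255, 255, 255),   // solid white center
                      new Color(255, 255, 255, 0)};    // fully transparent edge
    RadialGradientPaint paint =
            new RadialGradientPaint(new Point2D.Float(cx, cy), radius, fractions, colors);
    g2.setPaint(paint);
    g2.fill(new Ellipse2D.Float(cx - radius, cy - radius, radius * 2, radius * 2));
}

(Color, RadialGradientPaint, Point2D and Ellipse2D live in java.awt and java.awt.geom.)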
Or just make an image in a graphics program and simply draw the image.
I came across this problem in a game I'm writing but have reproduced the issue on a separate jar that consists only of a frame, a panel, and a mouse motion listener.
The issue is that I draw a rectangle - for example at x:512, y:384 (48x48).
Using a mouse motion listener on the frame, the reported Y coordinate is always off by around 25 pixels. So while the coordinates of the rectangle should be x:512, y:384, the mouse motion listener reports x:512, y:409.
I could theoretically just add the difference to the mouse Y, but I need to understand why this is happening.
Full code for the three example classes
it always reports the Y axis around 25 pixels off
Maybe you added the MouseListener to the frame instead of the panel.
You draw your Rectangles on the panel, but the panel is located (on my OS) 30 pixels down from the top of the frame so your coordinates don't match. The X value should also be out by the width of the frame border.
Try adding the listener to the panel.
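A small sketch of that fix, assuming the rectangle is painted by a custom JPanel subclass (Board here matches the panel mentioned in the other answer; the rest of the names are assumptions): with the listener on the panel, the event coordinates share the panel's origin with the painted rectangle.

JPanel board = new Board();                       // the component that paints the rectangle
board.addMouseMotionListener(new MouseMotionAdapter() {
    @Override
    public void mouseMoved(MouseEvent e) {
        // e.getX()/e.getY() are now relative to the panel, not the frame,
        // so they line up with the coordinates used in paintComponent().
        System.out.println(e.getPoint());
    }
});
frame.add(board);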
The origins of the Frame and the Board are different: the origin of the Board is at (0, 25) of the Frame.
Not a good image, though. First, the screenshot won't show the cursor, so I drew it in. Second, I didn't put my cursor exactly on (0, 25), just near it.