I want to create an effect in OpenGL that works a bit like Windows Aero, but in a 3D environment, to make frosted glass look more realistic. Even after hours of searching I couldn't find a solution.
I'd recommend starting by blurring your glass (here I'm assuming your glass texture is an RGBA texture, i.e., it's partially translucent) with a Gaussian filter. This has to be done in a shader, since you want whatever is behind the glass to be blurred as well. Then blend a white color with the resulting texture. To make it look better, have the white be "strongest" at the edges of the texture and get progressively weaker towards the middle; linear interpolation helps with this.
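For illustration, here is a minimal fragment shader sketch of that idea, held in a Java string since the surrounding threads are Java. It assumes the scene behind the glass has already been rendered to a texture; the uniform/varying names, the 3x3 kernel, and the 0.4 tint strength are my own placeholders, not from the original post:

public final class FrostedGlassShader {
    // Minimal sketch: blur the scene texture behind the glass with a
    // 3x3 Gaussian kernel, then lerp towards white near the quad edges.
    public static final String FRAG =
          "#ifdef GL_ES\n"
        + "precision mediump float;\n"
        + "#endif\n"
        + "uniform sampler2D u_texture;  // scene rendered behind the glass\n"
        + "uniform vec2 u_texelSize;     // 1.0 / texture resolution\n"
        + "varying vec2 v_texCoords;     // 0..1 across the glass quad\n"
        + "void main() {\n"
        + "  vec4 blurred = vec4(0.0);\n"
        + "  for (int y = -1; y <= 1; y++) {\n"
        + "    for (int x = -1; x <= 1; x++) {\n"
        + "      // 3x3 Gaussian weights 1-2-1 / 2-4-2 / 1-2-1, sum 16\n"
        + "      float w = (x == 0 ? 2.0 : 1.0) * (y == 0 ? 2.0 : 1.0) / 16.0;\n"
        + "      blurred += w * texture2D(u_texture,\n"
        + "          v_texCoords + vec2(x, y) * u_texelSize);\n"
        + "    }\n"
        + "  }\n"
        + "  // 0 at the quad center, 1 at its border\n"
        + "  vec2 d = abs(v_texCoords - 0.5) * 2.0;\n"
        + "  float edge = max(d.x, d.y);\n"
        + "  // Linear interpolation towards white, strongest at the edges\n"
        + "  gl_FragColor = mix(blurred, vec4(1.0), edge * 0.4);\n"
        + "}\n";
}

A real Aero-style blur would use a larger, two-pass separable Gaussian; the 3x3 kernel here only keeps the sketch short.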
However, I have a weird issue: when drawing, it seems the outside 1px of an image is stretched to fit a rectangle, but the inside is only stretched to an extent. I was drawing 48x48 tiles, but drew a 500x500 tile to show the issue [500x500 draws fine].
The worst part is that it seems to choose when to stretch and when not to, and also what to stretch. I'm sorry, this is hard to explain, but I have attached an image that I hope does a better job.
It could just be that I'm misunderstanding how to draw with SpriteBatch.
Edit: the tile is 48x48, not 64x64; I've just been working all day.
This is because you are not rendering "pixel perfect", which means your image does not line up with the pixel grid of your monitor. A quick fix might be to set a linear filter for your textures: by default the filter is nearest, so a pixel on the screen inherits the closest color it can get. A linear filter interpolates colors and makes that line "look" thinner.
texture.setFilter(Texture.TextureFilter.Linear, Texture.TextureFilter.Linear);
If you are using TexturePacker you can do this in one go by altering its settings.
texturePackerSetting.filterMin = Texture.TextureFilter.Linear;
texturePackerSetting.filterMag = Texture.TextureFilter.Linear;
Or you could edit the atlas file itself by changing the filter parameter to:
filter: Linear,Linear
This obviously costs more computation, since more calculations are needed for each pixel you draw to the screen, but I would not worry about this until drawing actually becomes a bottleneck.
Another solution is to draw pixel perfect, which means you set your viewport to the size of the device (Gdx.graphics.getWidth(), Gdx.graphics.getHeight()), in other words a ScreenViewport, and draw your textures at the exact sizes you want them. Of course this means a screen with more pixels sees more of your game world than a screen with fewer pixels, and the more pixels a device has, the smaller your textures will look. Another drawback is that you have to forget about zooming, or else draw sprites for each zoom level so they line up with the pixel grid of the device again.
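For reference, a minimal ScreenViewport setup sketch of that pixel-perfect approach; the class name and the 48x48 tile.png texture are illustrative, not from the original post:

// Minimal sketch: a ScreenViewport maps one world unit to one screen
// pixel, so textures drawn at their native size line up with the grid.
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.utils.viewport.ScreenViewport;

public class PixelPerfectDemo extends ApplicationAdapter {
    private SpriteBatch batch;
    private ScreenViewport viewport;
    private Texture tile; // illustrative 48x48 tile texture

    @Override
    public void create() {
        batch = new SpriteBatch();
        viewport = new ScreenViewport(); // 1 world unit == 1 screen pixel
        tile = new Texture("tile.png");
    }

    @Override
    public void resize(int width, int height) {
        viewport.update(width, height, true);
    }

    @Override
    public void render() {
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        viewport.apply();
        batch.setProjectionMatrix(viewport.getCamera().combined);
        batch.begin();
        batch.draw(tile, 0, 0, 48, 48); // exact texture size, no scaling
        batch.end();
    }

    @Override
    public void dispose() {
        batch.dispose();
        tile.dispose();
    }
}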
I am about to create a little 2D desktop game with libGDX and I want it to have that retro pixelated look you know from games like "Flappy Bird". To achieve that effect, I thought of the following:
1. Create a game window (e.g. 640x480)
2. Create a framebuffer half that size (i.e. 320x240)
3. Render everything to the framebuffer
4. Get the texture from the framebuffer
5. Draw the texture to the screen with SpriteBatch, scaling it up 2x and using TextureFilter.Nearest
I know I could scale each sprite individually with SpriteBatch.draw(), but I thought rendering everything at its original resolution and just scaling up the final composition might be easier.
So would the above technique be an appropriate way of getting that pixelated look?
What you have in mind sounds like a perfectly fine approach. The downside is that it does involve an additional data copy, but on the other hand your original rendering is for only 1/4 of the pixels, which saves you quite a bit of rendering overhead.
In plain OpenGL, you could use glBlitFramebuffer() for step 5. This requires OpenGL 3.0 or higher. It's essentially the same operation as drawing a textured quad, but it's a single call, and the underlying implementation could potentially be more efficient.
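Here is a minimal libGDX sketch of steps 2-5, assuming a 640x480 window; the class and field names are illustrative, and the negative draw height compensates for the framebuffer texture being stored upside down:

// Minimal sketch: render the scene into a half-resolution framebuffer,
// then draw its color texture back upscaled 2x with nearest-neighbor
// filtering for the pixelated look.
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Pixmap;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.FrameBuffer;
import com.badlogic.gdx.math.Matrix4;

public class PixelatedRenderer {
    // Created once, not per frame: both own GPU resources.
    private final FrameBuffer fbo =
            new FrameBuffer(Pixmap.Format.RGBA8888, 320, 240, false);
    private final SpriteBatch batch = new SpriteBatch();
    private final Matrix4 smallProj = new Matrix4().setToOrtho2D(0, 0, 320, 240);
    private final Matrix4 screenProj = new Matrix4().setToOrtho2D(0, 0, 640, 480);

    public void render() {
        // Steps 2-3: render everything into the small framebuffer.
        fbo.begin();
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.setProjectionMatrix(smallProj);
        batch.begin();
        // ... draw the game world here in 320x240 coordinates ...
        batch.end();
        fbo.end();

        // Steps 4-5: fetch the texture and draw it upscaled 2x.
        Texture tex = fbo.getColorBufferTexture();
        tex.setFilter(Texture.TextureFilter.Nearest,
                      Texture.TextureFilter.Nearest);
        batch.setProjectionMatrix(screenProj);
        batch.begin();
        // Negative height flips the texture, since FBO color textures
        // are stored upside down relative to screen coordinates.
        batch.draw(tex, 0, 480, 640, -480);
        batch.end();
    }
}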
I'm quite new to libGDX and OpenGL, but I managed to make a simple liquid simulation using the Box2D API. See this link (this is someone else's animation):
Physics Liquid
Currently I render the liquid particles as circles just like in the first image, but I want to make it look more natural like on the third one.
The answer might be to use a distance field; I tried this approach, but without success. I'm drawing each particle as a texture using the SpriteBatch class, but that can be changed. I made a texture (from a procedural Pixmap) that represents each particle as a filled circle, with the alpha channel decreasing away from the center, so the effect is similar to the second picture.
Now I need to apply a threshold filter to the alpha channel, something like "draw only pixels with alpha > 0.5". This is a post-processing step, because what matters is the alpha value of a pixel after all particles have been drawn. It might or might not be done with shaders (ShaderProgram), but after some research I still have no clue how to do this. Thanks for ANY help.
EDIT: this example explains the method, but it's implemented in ActionScript.
This can easily be done using shaders, and the funny thing is that you don't need to write them yourself.
"draw only pixels with alpha > 0.5" is also used while rendering distance field fonts (fonts which look good even when scaled up).
Follow this link : https://github.com/libgdx/libgdx/wiki/Distance-field-fonts
Directly jump to the last step.
You should find exactly what you want.
Hope this helps.
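In case you do want to write the shader yourself, here is a minimal alpha-threshold sketch wrapped in a libGDX ShaderProgram; the vertex shader just mirrors SpriteBatch's default attribute names, and the 0.5 cutoff is the value from your question:

// Minimal sketch: a fragment shader that discards pixels whose
// accumulated alpha is below a threshold, turning soft overlapping
// particle blobs into a solid liquid silhouette.
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public final class ThresholdShader {
    static final String VERT =
          "attribute vec4 a_position;\n"
        + "attribute vec4 a_color;\n"
        + "attribute vec2 a_texCoord0;\n"
        + "uniform mat4 u_projTrans;\n"
        + "varying vec4 v_color;\n"
        + "varying vec2 v_texCoords;\n"
        + "void main() {\n"
        + "  v_color = a_color;\n"
        + "  v_texCoords = a_texCoord0;\n"
        + "  gl_Position = u_projTrans * a_position;\n"
        + "}\n";

    static final String FRAG =
          "#ifdef GL_ES\n"
        + "precision mediump float;\n"
        + "#endif\n"
        + "uniform sampler2D u_texture;\n"
        + "varying vec4 v_color;\n"
        + "varying vec2 v_texCoords;\n"
        + "void main() {\n"
        + "  vec4 c = v_color * texture2D(u_texture, v_texCoords);\n"
        + "  if (c.a < 0.5) discard;   // keep only alpha > 0.5\n"
        + "  gl_FragColor = vec4(c.rgb, 1.0);\n"
        + "}\n";

    public static ShaderProgram create() {
        ShaderProgram shader = new ShaderProgram(VERT, FRAG);
        if (!shader.isCompiled()) {
            throw new IllegalStateException(shader.getLog());
        }
        return shader;
    }
}

Since this is a post-processing step, render all the particles into a framebuffer first, then draw the framebuffer's color texture with batch.setShader(ThresholdShader.create()) applied, so the cutoff sees the accumulated alpha rather than each particle individually.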
I have two pixel arrays, foreground and lighting. When I draw a white radial gradient (a blurry circle) onto the lighting array, I want it to make the foreground visible, much like a torch in Terraria/Starbound. However, I also want to be able to mix different colors of lighting, rather than being stuck with black-to-white.
So how do I manipulate the pixel arrays so that they 'multiply' (I believe that's what it's called)? Or is there an easier way using RGBA, which I have not been able to get to work (flickering black on white, or the image getting ever more posterized)?
So far most responses regarding alpha channels/opacity have used Java libraries, which I want to refrain from using for this project.
Any help much appreciated!
If you want to know how the blending modes (such as "multiply") work in image editing programs at the pixel-value level, read this: How does Photoshop blend two images together?
Note that if you want to make the image lighter, you need a mode like "screen", because "multiply" makes it darker.
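For a concrete sketch of the per-channel math on plain int pixel arrays (no libraries, as requested), assuming pixels packed as 0xAARRGGBB:

// Minimal sketch: "multiply" and "screen" blend modes on packed
// 0xAARRGGBB pixels. Multiply darkens (white is neutral); screen
// lightens (black is neutral).
public final class Blend {

    // Multiply one 0..255 channel pair: a * b / 255.
    static int mul(int a, int b) {
        return a * b / 255;
    }

    // Screen one 0..255 channel pair: 255 - (255-a)(255-b)/255.
    static int scr(int a, int b) {
        return 255 - (255 - a) * (255 - b) / 255;
    }

    public static int multiply(int fg, int light) {
        int r = mul((fg >> 16) & 0xFF, (light >> 16) & 0xFF);
        int g = mul((fg >> 8) & 0xFF, (light >> 8) & 0xFF);
        int b = mul(fg & 0xFF, light & 0xFF);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    public static int screen(int fg, int light) {
        int r = scr((fg >> 16) & 0xFF, (light >> 16) & 0xFF);
        int g = scr((fg >> 8) & 0xFF, (light >> 8) & 0xFF);
        int b = scr(fg & 0xFF, light & 0xFF);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }
}

For the torch effect, multiplying each foreground pixel by the corresponding lighting pixel gives the classic result: white light reveals the foreground, black hides it, and colored light tints it.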
I have a small block engine, similar to a very early version of Minecraft, using LWJGL. I now want to actually implement lighting. I understand how it works; I'm just confused as to how I'm supposed to render it. Am I supposed to change the "brightness" of the texture to simulate brightly lit terrain? I'm asking how to actually change the light value of a quad; maybe there are some tutorials out there? I want it block by block, no smooth lighting. I have figured out that blocks need to have a light value, and for each block next to a lit one, you decrease that value a little until it reaches "black".
(At least) two options, assuming a fixed-function renderer:
Use a single texture (your basic texture atlas, with the default GL_MODULATE texenv) and set a per-vertex gray-scale color to darken the texture in proportion to the light level. With GL_MODULATE the texture RGB channels are multiplied by the vertex color RGB channels, so a vertex color of RGB(255,255,255) is fully lit, RGB(0,0,0) is pure black, and RGB(128,128,128) is somewhere in the middle (see the sketch after this list).
Use two textures (appearance atlas and lightmap atlas) and multitexture. The light level is set for a given face by supplying texture coordinates that select the appropriate lightmap square. If you animate the lightmap you can get a day/night cycle "for free" without having to iterate over the entire volume fixing up vertex colors like in #1.
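A minimal sketch of option 1 in legacy immediate-mode LWJGL; it assumes the atlas texture is already bound with GL_TEXTURE_2D enabled, and GL_MODULATE needs no extra setup since it is the default texture environment (the method and its parameters are illustrative):

// Minimal sketch: darken a textured block face by its light level
// using a per-vertex gray-scale color; with the default GL_MODULATE
// texenv, the texture RGB is multiplied by the vertex color RGB.
import static org.lwjgl.opengl.GL11.*;

public final class LitQuad {

    /** Draws one block face at (x, y) with a 0..15 light value. */
    public static void drawFace(float x, float y, float size, int light) {
        float brightness = light / 15.0f; // 0 = black, 1 = fully lit
        glColor3f(brightness, brightness, brightness);

        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(x, y);
        glTexCoord2f(1, 0); glVertex2f(x + size, y);
        glTexCoord2f(1, 1); glVertex2f(x + size, y + size);
        glTexCoord2f(0, 1); glVertex2f(x, y + size);
        glEnd();
    }
}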