I am using Box2D and OpenGL. I found that (at a frame rate of 60) when I apply quick changes in direction to a fast-moving object, the rendering seems to jump or perhaps skip frames.
(I am only operating in 2D.) I want to push the physics to the very edge (lots of objects moving simultaneously, possibly breaking shapes down with welds, etc.).
If I speed up the display.sync() from 60 to 180, it is much cleaner.
What is an ideal frame rate?
Are there any other ways to keep the rendering clean?
With speed and only basic drawing being the priority, are there better libraries?
Such as Slick2D?
Sometimes the problem isn't in the renderer, but rather in the fact that the time step of your simulation is causing problems and making it look like your frame rate is off. I noticed similar problems in a program of mine using OpenGL and Box2D, and fixing my timestep helped smooth things out significantly.
Really good article here.
Building on MtRoad's answer, here is a possible implementation.
public class GameStateRunning {
    private final int TICKS_PER_SECOND = 30;
    // Use floating-point division; 1000 / 30 in integer arithmetic would drop the
    // fraction and make the simulation drift slightly fast.
    private final double timePerTick = 1000.0 / TICKS_PER_SECOND;
    private final int MAX_FRAMESKIP = 5;
    private double next_game_tick = System.currentTimeMillis();
    private int loops;
    private double extrapolation;

    public void update() {
        loops = 0;
        // Catch up on pending ticks, but never more than MAX_FRAMESKIP in one pass.
        while (System.currentTimeMillis() > next_game_tick && loops < MAX_FRAMESKIP) {
            // YOUR GAME UPDATE CODE GOES HERE
            next_game_tick += timePerTick;
            loops++;
        }
        // If we fell too far behind, resynchronize instead of spiralling into catch-up.
        if (next_game_tick < System.currentTimeMillis()) {
            next_game_tick = System.currentTimeMillis();
        }
        // Fraction of the current tick that has already elapsed (0..1), for smooth rendering.
        extrapolation = 1 - (next_game_tick - System.currentTimeMillis()) / timePerTick;
    }

    public void render() {
        // YOUR GAME RENDER CODE GOES HERE
    }
}
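The extrapolation value is what lets render() smooth over the gap between fixed ticks. A minimal sketch of using it, assuming each entity keeps its previous and current physics positions (entity, previousX and currentX are hypothetical names, not part of the code above):

// Inside render(): blend between the previous and current physics state so movement
// looks smooth even though the simulation only ticks TICKS_PER_SECOND times per second.
double drawX = entity.previousX + (entity.currentX - entity.previousX) * extrapolation;
double drawY = entity.previousY + (entity.currentY - entity.previousY) * extrapolation;
// draw the entity's sprite at (drawX, drawY)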
Slick2D probably won't be faster than OpenGL, as Slick2D itself uses OpenGL.
Related
I am currently developing a 2D game using Swing components.
Each time I run my game it stutters randomly at some points. This is my game loop code:
public class FixedTSGameLoop implements Runnable
{
private MapPanel _gamePanel;
public FixedTSGameLoop(MapPanel panel)
{
this._gamePanel = panel;
}
@Override
public void run()
{
long lastTime = System.nanoTime(), now;
double amountOfTicks = 60.0;
double amountOfRenders = 120.0;
double nsTick = 1000000000 / amountOfTicks;
double nsRender = 1000000000 / amountOfRenders;
double deltaTick = 0;
double deltaRender = 0;
while (this._gamePanel.isRunning())
{
now = System.nanoTime();
deltaTick += (now - lastTime) / nsTick;
deltaRender += (now - lastTime) / nsRender;
lastTime = now;
while (deltaTick >= 1)
{
tick();
deltaTick--;
}
while (deltaRender >= 1)
{
render();
deltaRender--;
}
}
}
private void tick()
{
/**
* Logic goes here:
*/
this._gamePanel.setLogic();
}
private void render()
{
/**
* Rendering the map panel
*/
this._gamePanel.repaint();
}
}
I have tried multiple times to omit certain code parts, thinking that they cause lag, but I have found nothing that caused it particularly, so I think the problem lies within my game loop mechanism.
Thank you for your help!
Your game loop must contain a "Thread.sleep" in order to sleep the amount of time needed to respect your target FPS.
The main loop is supposed to contain 1 tick() and 1 render().
Your current implementation is flooding the paint manager, slowdowns will appear when the underlying buffers will be full and when the garbage collector will do its job.
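A minimal sketch of that shape, using the names from your class; the 60 FPS target is an assumed value, not taken from your code:

// One tick and one render per iteration, then sleep away whatever is left of the frame
// so the paint manager is not flooded with repaint requests.
final long targetFrameNanos = 1_000_000_000L / 60;   // assumed 60 FPS target
while (this._gamePanel.isRunning()) {
    long frameStart = System.nanoTime();
    tick();
    render();
    long sleepMillis = (targetFrameNanos - (System.nanoTime() - frameStart)) / 1_000_000L;
    if (sleepMillis > 0) {
        try {
            Thread.sleep(sleepMillis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}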
While it's good that you've split your rendering and logic into two different methods, the problem is that they exist in the same thread.
What needs to happen to reduce the lag is to have them in separate threads. The render thread would request a snapshot of the current state from the logic thread (to prevent concurrent modification) and render that snapshot.
Right now, if one render takes too long, or if one logic step takes too long, the other check is going to have to wait for it to finish before it can begin working.
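A rough sketch of that split, assuming a hypothetical immutable GameState snapshot class; snapshotCurrentState, renderSnapshot and pause are placeholder helpers, not existing APIs:

// The logic thread owns the mutable world and publishes an immutable snapshot after
// every tick; the render thread only ever reads the latest published snapshot.
private volatile GameState latestSnapshot;

private final Thread logicThread = new Thread(() -> {
    while (_gamePanel.isRunning()) {
        tick();
        latestSnapshot = snapshotCurrentState();   // copy the world into an immutable GameState
        pause(1000 / 60);                          // placeholder wrapper around Thread.sleep
    }
});

private final Thread renderThread = new Thread(() -> {
    while (_gamePanel.isRunning()) {
        GameState snapshot = latestSnapshot;
        if (snapshot != null) {
            renderSnapshot(snapshot);              // paint strictly from the snapshot
        }
        pause(1000 / 120);                         // placeholder wrapper around Thread.sleep
    }
});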
I have recently started developing my own game engine for fun. I implemented a method to load Blender .OBJ models from files, and I am successfully rendering them. However, when doing a stress test I ran into an unusual predicament with my delta-time based FPS system. While running at 1500 FPS, it takes me 4 seconds to look from one end of a wall of models to the other. If I cap the FPS at 120, however, it only takes me 0.84 seconds to look from one end of the wall to the other. As I explored further, it would seem that, in fact, the game movement speed decreases as FPS increases.
Here is my Timer class:
class Timer
{
long lastFrame;
int fps;
long lastFPS;
int delta;
int fpsToReturn;
public Timer()
{
lastFPS = getTime();
}
public long getTime()
{
return (Sys.getTime() * 1000) / Sys.getTimerResolution();
}
public int getDelta()
{
long time = getTime();
delta = (int) (time - lastFrame);
lastFrame = time;
return delta;
}
public void updateFPS()
{
if (getTime() - lastFPS > 1000)
{
fpsToReturn = fps;
fps = 0;
lastFPS += 1000;
}
fps++;
}
public int getFPS()
{
return fpsToReturn;
}
}
And of course movement is just something along the lines of:
camera.rotX += (0.5 * timer.getDelta());
Does anyone have any ideas? Is this how delta time is supposed to work? When running at 16 FPS, getDelta() returns around 65; at 120 FPS it returns around 8-9; with FPS uncapped it always returns 0 or 1, if that makes a difference or helps you spot what is wrong with the uncapped case. I really appreciate any help, thank you guys.
Solved my own question, and glad I learned in the process.
My issue, which I later discovered when I was implementing angular movement, was that I was calling getDelta() and getFPS() every time I needed a value, more than once per frame, which was throwing off the delta variable. I solved the issue by using static variables, one for FPS and one for delta, and updating each of them at the end of every frame.
Pseudo Code:
public static double globalDelta;
while(nextFrame) //Loops once per frame
{
updateGame();
globalDelta = calculateDelta();
}
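A slightly more concrete version of that idea, reusing the Timer class above; the cached fields and the 0.5f rate are my own additions, not taken from the original code:

// Sampled exactly once, at the end of every frame.
public static int globalDelta;
public static int globalFPS;

// End-of-frame bookkeeping:
globalDelta = timer.getDelta();
timer.updateFPS();
globalFPS = timer.getFPS();

// Movement code reads the cached value as often as it likes without skewing it:
camera.rotX += 0.5f * globalDelta;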
I am trying to create a game in which objects spawn on a timer then start moving. I would like to keep the frame rate independent of the game speed, so I have tried to implement delta time:
lastTime = thisTime;
thisTime = (int) SystemClock.currentThreadTimeMillis();
deltaTime = thisTime - lastTime;
for (int i = 0;i < objectList.size();i++) {
objectList.get(i).updatePosition(deltaTime);
objectList.get(i).display(bitmapCanvas);
}
And my updatePosition method:
public void updatePosition(float deltaTime) {
this.y += 10/1000 * deltaTime;
}
From my understanding, the rate (10 in this case) should be divided by 1000 since deltaTime is in milliseconds, but I have tried it with various other values as well, and the objects just spawn on the screen and do not move. I had no issues making them move before trying to implement interpolation.
If it helps any, I was logging the delta time and it was usually around 20/30. Is this normal?
I have looked at probably 5 or 6 questions on GameDev and a couple on Stack Overflow but I think I must be misunderstanding something as I cannot get this to work.
So this doesn't appear unanswered, Jon Skeet was (obviously) correct:
The calculation of 10/1000 is being performed (at compile time) in integer arithmetic, so it's 0... Try 10/1000f, or just 0.01f.
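Applied to the updatePosition method above, the fix is just to force floating-point arithmetic:

public void updatePosition(float deltaTime) {
    // 10 / 1000f is evaluated as a float (0.01f) instead of truncating to 0.
    this.y += 10 / 1000f * deltaTime;
}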
I'm making a game in which I have to move little squares on an n-by-n grid, and they have to transition smoothly. This is the swap method I made, which should be able to paint my transition on screen, but for some reason it is not doing it. I tried a much simpler version of my code in a simple project to move a square back and forth, and it worked like a charm, so I'm really not sure why this isn't repainting. This is just a bit of my code, so if there are doubts about anything else in my code, ask away.
Thanks in advance. (:
public void swap( int y, int x ) {
long time = System.currentTimeMillis();
int counter = 0;
swapNum = tiles[y][x];
rect = (Rectangle) rectangles[y][x].clone();
while(counter < rect.height) {
if(System.currentTimeMillis() - time > 5) {
rect.translate(this.y-y, this.x-x);
time = System.currentTimeMillis();
counter++;
repaint();
}
}
swapNum = 0;
rect = new Rectangle();
int temporary = tiles[this.y][this.x];
tiles[this.y][this.x] = tiles[y][x];
tiles[y][x] = temporary;
this.x = x;
this.y = y;
}
If this block of code is running on the Event/Dispatch thread, which is used for drawing to the screen, then this will block the screen from updating.
Instead of doing the entire animation in one loop, consider designing an update method to do the animation, that will be called once every 15-30 milliseconds, and update the rectangle's position accordingly. Even better for smooth graphics is to draw to an image buffer and then have the actual draw method paint that buffer to the screen (double buffering).
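For instance, a javax.swing.Timer can drive that per-frame update on the EDT instead of the busy-wait loop. In this sketch, counter and rect are assumed to be instance fields, and stepX, stepY and finishSwap are placeholder names for your per-frame offset and the end-of-animation bookkeeping:

// Fires roughly every 15 ms on the EDT; each firing moves the rectangle one step
// and requests a repaint, so painting is never starved by a loop.
javax.swing.Timer animator = new javax.swing.Timer(15, null);
animator.addActionListener(e -> {
    if (counter < rect.height) {
        rect.translate(stepX, stepY);
        counter++;
        repaint();
    } else {
        ((javax.swing.Timer) e.getSource()).stop();
        finishSwap();   // swap the tiles array entries once the animation has finished
    }
});
animator.start();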
Java3D has animations built-in, so it may be worth a look.
I'm working in J2ME, I have my gameloop doing the following:
public void run() {
Graphics g = this.getGraphics();
while (running) {
long diff = System.currentTimeMillis() - lastLoop;
lastLoop = System.currentTimeMillis();
input();
this.level.doLogic();
render(g, diff);
try {
Thread.sleep(10);
} catch (InterruptedException e) {
stop(e);
}
}
}
So it's just a basic game loop. The doLogic() function calls all the logic functions of the characters in the scene, and render(g, diff) calls the animChar function of every character in the scene. Following this, the animChar function in the Character class sets everything up on screen like this:
protected void animChar(long diff) {
this.checkGravity();
this.move((int) ((diff * this.dx) / 1000), (int) ((diff * this.dy) / 1000));
if (this.acumFrame > this.framerate) {
this.nextFrame();
this.acumFrame = 0;
} else {
this.acumFrame += diff;
}
}
This ensures that everything moves according to the time the machine takes to go from cycle to cycle (remember it's a phone, not a gaming rig). I'm sure it's not the most efficient way to achieve this behavior, so I'm totally open to criticism of my programming skills in the comments, but here is my problem: when I make a character jump, I set his dy to a negative value, say -200, and set the boolean jumping to true. That makes the character go up, and then I have a function called checkGravity() that ensures that everything that goes up has to come down. checkGravity also checks for the character being over platforms, so I will strip it down a little for the sake of your time:
public void checkGravity() {
if (this.jumping) {
this.jumpSpeed += 10;
if (this.jumpSpeed > 0) {
this.jumping = false;
this.falling = true;
}
this.dy = this.jumpSpeed;
}
if (this.falling) {
this.jumpSpeed += 10;
if (this.jumpSpeed > 200) this.jumpSpeed = 200;
this.dy = this.jumpSpeed;
if (this.collidesWithPlatform()) {
this.falling = false;
this.standing = true;
this.jumping = false;
this.jumpSpeed = 0;
this.dy = this.jumpSpeed;
}
}
}
So, the problem is that this function updates dy regardless of diff, making the characters fly like Superman on slow machines, and I have no idea how to apply the diff factor so that when a character is jumping, his speed decreases in proportion to the game speed. Can anyone help me fix this issue, or give me pointers on how to do a 2D jump in J2ME the right way?
Shouldn't you be adjusting the jumpSpeed based on the elapsed time? That is, perhaps the speed changes by -75/sec, so your diff should be a weight for the amount of change applied to the jumpSpeed.
So pass in diff to checkGrav and do something like... jumpSpeed += (diff * (rate_per_second)) / 1000;
(assuming diff in milliseconds)
(Ideally, this would make it just like real gravity :D)
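Applied to the checkGravity method above, that could look roughly like this; the GRAVITY_PER_SECOND value of 600 is a placeholder for whatever feels right in your game, and animChar would then call this.checkGravity(diff) instead of this.checkGravity():

public void checkGravity(long diff) {
    // Placeholder constant: how much jumpSpeed changes per second of real time.
    final int GRAVITY_PER_SECOND = 600;
    int gravityStep = (int) ((diff * GRAVITY_PER_SECOND) / 1000);

    if (this.jumping) {
        this.jumpSpeed += gravityStep;   // decelerate by elapsed time, not per call
        if (this.jumpSpeed > 0) {
            this.jumping = false;
            this.falling = true;
        }
        this.dy = this.jumpSpeed;
    }
    if (this.falling) {
        this.jumpSpeed += gravityStep;
        if (this.jumpSpeed > 200) this.jumpSpeed = 200;
        this.dy = this.jumpSpeed;
        if (this.collidesWithPlatform()) {
            this.falling = false;
            this.standing = true;
            this.jumping = false;
            this.jumpSpeed = 0;
            this.dy = this.jumpSpeed;
        }
    }
}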
Why not just scale all constants by diff?
By the way, I'm embarrassed to say this, but I worked on a commercial game where gravity was twice as strong on characters going down as going up. For some reason, people preferred this.
This seems to be more of a question about game design than about the math of a jump. It is a common problem that a game running on different processors will execute faster on one and slower on another (thus changing the entire speed of the game). I'm not sure what common practice is in games, but whenever I made home-brewed 2D games (they were fun to make) I would have the concept of a game tick. On faster machines
long diff = System.currentTimeMillis() - lastLoop;
lastLoop = System.currentTimeMillis();
would be lower. A wait time would be derived from the diff so that the game would run at the same speed on most machines. I would also have the render method in a separate thread so that the game speed isn't dependent on the graphics.
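One way to derive that wait inside the existing run() loop, as a rough sketch; the 50 ms tick length is an assumed value:

// Fixed game-tick: do one pass of work, then sleep off whatever part of the tick
// the work did not use, so the pace is the same on fast and slow phones.
final long TICK_MS = 50;                 // assumed tick length (20 ticks per second)
long start = System.currentTimeMillis();
input();
this.level.doLogic();
render(g, TICK_MS);                      // with a fixed tick, diff is effectively constant
long work = System.currentTimeMillis() - start;
long wait = TICK_MS - work;
if (wait > 0) {
    try {
        Thread.sleep(wait);
    } catch (InterruptedException e) {
        stop(e);
    }
}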
I can give you a formula like this (I use it everywhere). Here x is the parameter; it starts at zero and ends at the length of the jump.
If you want something to jump to some height (h) over some length (l), then the function of the jump will look like this (and it can never really look different):
In words: take x minus half the jump length, square it, multiply by 4 and by the jump height, divide by the length squared, negate the whole thing, and add the jump height at the very end.
y = -(x - l/2) * (x - l/2) * 4 * h / (l * l) + h
And if you want the jumping object to land on something, then for every new x you can check whether it is approximately standing on a platform; if it is, don't just make it stop, set its Y position exactly equal to the Y of the platform.
If you're using something like Flash or another base that has an inverted y axis, multiply the function output by -1.
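In code, sampling that curve each frame might look like the sketch below; jumpLength, jumpHeight, the per-frame step and the start coordinates are assumed values:

// Parabolic jump: y = -4*h*(x - l/2)^2 / l^2 + h, with x running from 0 to l.
double jumpLength = 100;   // assumed horizontal length of the jump (l)
double jumpHeight = 40;    // assumed peak height of the jump (h)
for (double x = 0; x <= jumpLength; x += 2) {   // per-frame step is arbitrary here
    double y = -(x - jumpLength / 2) * (x - jumpLength / 2) * 4 * jumpHeight
               / (jumpLength * jumpLength) + jumpHeight;
    // On a y-down screen, draw the object at (startX + x, startY - y); stop early
    // and snap its Y to the platform's Y if that point lands on a platform.
}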