I have recently started developing my own game engine for fun. I implemented a method to load Blender .OBJ models from files, and I am successfully rendering them. However, during a stress test I ran into an unusual predicament with my delta-time based FPS system. While running at 1500 FPS, it takes me 4 seconds to look from one end of a wall of models to the other. If I cap the FPS at 120, however, it only takes me 0.84 seconds. As I explored further, it turned out that the in-game movement speed actually decreases as FPS increases.
Here is my Timer class:
import org.lwjgl.Sys;

class Timer
{
    long lastFrame;     // timestamp of the previous frame, in ms
    int fps;            // frames counted during the current second
    long lastFPS;       // timestamp of the last FPS rollover, in ms
    int delta;
    int fpsToReturn;

    public Timer()
    {
        lastFPS = getTime();
    }

    /** Current time in milliseconds, derived from LWJGL's hi-res timer. */
    public long getTime()
    {
        return (Sys.getTime() * 1000) / Sys.getTimerResolution();
    }

    /** Milliseconds elapsed since the last call to this method. */
    public int getDelta()
    {
        long time = getTime();
        delta = (int) (time - lastFrame);
        lastFrame = time;
        return delta;
    }

    /** Counts frames and rolls the FPS value over once per second. */
    public void updateFPS()
    {
        if (getTime() - lastFPS > 1000)
        {
            fpsToReturn = fps;
            fps = 0;
            lastFPS += 1000;
        }
        fps++;
    }

    public int getFPS()
    {
        return fpsToReturn;
    }
}
And of course movement is just something along the lines of:
camera.rotX += (0.5 * timer.getDelta());
Does anyone have any ideas? Is this how delta time is supposed to work? When running at 16 FPS, getDelta() returns around 65; at 120 FPS it returns around 8-9; with uncapped FPS it always returns 0 or 1, if that makes a difference or helps you spot what is wrong. I really appreciate any help, thank you.
I solved my own question, and I'm glad I learned in the process.
My issue, which I later discovered while implementing angular movement, was that I was calling getDelta() and getFPS() every time I needed them, more than once per frame, which was throwing off the delta value. I solved the issue by using one static variable for FPS and one for delta, and updating each at the end of every frame.
Pseudo Code:
public static double globalDelta;

while(nextFrame) // loops once per frame
{
    updateGame();
    globalDelta = calculateDelta();
}
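A slightly fuller Java sketch of the same idea, assuming a hypothetical GameClock holder class (the names GameClock, tickFrame() and globalDelta are placeholders for illustration, not real engine API):

public class GameClock {
    public static int globalDelta;                  // ms elapsed during the previous frame
    private static long lastFrame = System.nanoTime();

    /** Call exactly once, at the end of each frame; everything else just reads globalDelta. */
    public static void tickFrame() {
        long now = System.nanoTime();
        globalDelta = (int) ((now - lastFrame) / 1_000_000L);
        lastFrame = now;
    }
}

The main loop then calls updateGame(), render(), and finally GameClock.tickFrame(), so every system that reads globalDelta within the same frame sees the same number.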
Related
I am currently developing a 2D game using Swing components.
Each time I run my game it stutters randomly at some points. This is my game loop code:
public class FixedTSGameLoop implements Runnable
{
    private MapPanel _gamePanel;

    public FixedTSGameLoop(MapPanel panel)
    {
        this._gamePanel = panel;
    }

    @Override
    public void run()
    {
        long lastTime = System.nanoTime(), now;
        double amountOfTicks = 60.0;
        double amountOfRenders = 120.0;
        double nsTick = 1000000000 / amountOfTicks;
        double nsRender = 1000000000 / amountOfRenders;
        double deltaTick = 0;
        double deltaRender = 0;

        while (this._gamePanel.isRunning())
        {
            now = System.nanoTime();
            deltaTick += (now - lastTime) / nsTick;
            deltaRender += (now - lastTime) / nsRender;
            lastTime = now;

            while (deltaTick >= 1)
            {
                tick();
                deltaTick--;
            }

            while (deltaRender >= 1)
            {
                render();
                deltaRender--;
            }
        }
    }

    private void tick()
    {
        /**
         * Logic goes here:
         */
        this._gamePanel.setLogic();
    }

    private void render()
    {
        /**
         * Rendering the map panel
         */
        this._gamePanel.repaint();
    }
}
I have tried omitting various parts of the code, thinking they might be causing the lag, but I found no single culprit, so I think the problem lies within my game loop mechanism.
Thank you for your help!
Your game loop must contain a Thread.sleep in order to sleep for the amount of time needed to respect your target FPS.
The main loop is supposed to contain one tick() and one render() per pass.
Your current implementation is flooding the paint manager; slowdowns will appear when the underlying buffers fill up and when the garbage collector kicks in.
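A minimal sketch of that capped loop, assuming a 60 FPS target (tick() and render() are the methods from the question; the rest is illustrative, not the original code):

@Override
public void run() {
    final double targetFps = 60.0;
    final long frameNanos = (long) (1_000_000_000L / targetFps);

    while (this._gamePanel.isRunning()) {
        long frameStart = System.nanoTime();

        tick();    // exactly one logic step per pass
        render();  // exactly one repaint request per pass

        // Sleep off whatever is left of this frame's time budget.
        long sleepMillis = (frameNanos - (System.nanoTime() - frameStart)) / 1_000_000L;
        if (sleepMillis > 0) {
            try {
                Thread.sleep(sleepMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }
}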
While it's good that you've split your render and logic into two different methods, the problem is that they exist in the same thread.
What needs to happen to reduce the lag is to have them in separate threads. The render thread would request a snapshot of the current state from the logic thread (to prevent concurrent modification) and render that snapshot.
Right now, if one render takes too long, or if one logic step takes too long, the other check is going to have to wait for it to finish before it can begin working.
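A rough sketch of that split, assuming a hypothetical immutable GameSnapshot type that you would define for your own game (none of this is Swing or libGDX API; the drawing code would still read the latest published snapshot before calling repaint()):

import java.util.concurrent.atomic.AtomicReference;

public class SplitLoop {
    // Latest immutable copy of the game state, published by the logic thread.
    private final AtomicReference<GameSnapshot> latest = new AtomicReference<>();
    private volatile boolean running = true;

    public void start() {
        Thread logic = new Thread(() -> {
            while (running) {
                latest.set(stepLogic());           // publish a fresh snapshot
                sleepQuietly(1000 / 60);           // ~60 ticks per second
            }
        }, "logic");

        Thread render = new Thread(() -> {
            while (running) {
                GameSnapshot snap = latest.get();  // read only, never modified here
                if (snap != null) {
                    draw(snap);                    // e.g. hand it to the panel, then repaint()
                }
                sleepQuietly(1000 / 120);          // ~120 frames per second
            }
        }, "render");

        logic.start();
        render.start();
    }

    // Placeholders for your own game code.
    private GameSnapshot stepLogic() { return new GameSnapshot(); }
    private void draw(GameSnapshot snap) { }

    private static void sleepQuietly(long ms) {
        try { Thread.sleep(ms); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }

    static final class GameSnapshot { /* immutable copy of whatever render needs */ }
}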
Basically, to create a timer I just do float timer += 1 * deltaTime, which adds 1 every second, but I'm having a problem using it.
if(timer < 2){
//do something
}
I want the if statement to stop running the code once the timer reaches 2 seconds, but I cannot do if(timer != 2f) because the check runs so fast that it will never land exactly on 2 seconds. This means I have to use the condition timer < 2f, which is not exact and always gives me inaccurate results.
Instead of using a hand-made timer, why not use a Task and a boolean?
Pseudo code below:
boolean stop = false;

Timer.schedule(new Task() {
    @Override
    public void run() {
        stop = true;
    }
}, 2f);
Then in your render method:
render() {
    if (!stop) {
        // Freefall
    }
}
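One detail when turning that pseudo code into real Java: a local boolean cannot be reassigned from inside the anonymous Task, so stop has to be a field (or an AtomicBoolean). Assuming libGDX's com.badlogic.gdx.utils.Timer, that might look like:

private boolean stop = false; // a field, so the anonymous Task may reassign it

private void scheduleStop() {
    Timer.schedule(new Timer.Task() {
        @Override
        public void run() {
            stop = true; // fields are not subject to the effectively-final rule
        }
    }, 2f);
}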
If I understand your question correctly, you want your code to process for X seconds and no more (where X can be a floating point value such as 12.556 seconds).
I'm going to propose a single-threaded alternative where you have a custom Timer class that manages your freefall logic as shown below. It watches the progress and if it sees something greater than the duration, it "lies" to the internal logic so that your logic is only executed for the amount of time specified (within a configurable margin of error since we're playing with floats).
The example below demonstrates this sort of thing without using libGDX (just in case folks using other libraries are interested), but it would be trivial to swap out my simulated getDeltaTime() method with Gdx.graphics.getDeltaTime().
package tech.otter.timing;

/**
 * Created by john on 11/20/16.
 */
public class TimingExample {

    public static void main(String... args) {
        boolean complete = false;
        Timer t = new Timer(15f) {
            // You could implement your logic here.
            @Override
            void action(float delta) {
                System.out.println(progress);
            }
        };

        while(!complete) {
            complete = t.update(getDeltaTime());
        }

        assert t.progress <= t.duration; // progress is capped, so it can land exactly on duration
        assert t.progress + t.errorMargin > t.duration;
    }

    /**
     * Simulates processing time by returning 0-0.1 seconds (i.e. 0-100 ms).
     * @return The number of seconds that have allegedly elapsed since the last call.
     */
    public static float getDeltaTime() {
        return (float)(Math.random() / 10);
    }

    abstract static class Timer {
        private float duration;
        protected float progress;
        private float errorMargin;

        public Timer(float duration) {
            this(duration, 0.0001f);
        }

        public Timer(float duration, float errorMargin) {
            this.duration = duration;
            this.errorMargin = errorMargin;
            this.progress = 0f;
        }

        /**
         * Update the timer based on how long since the last call.
         * @param delta The amount of time since the last call.
         * @return Whether the timer's progress has met the duration.
         */
        public boolean update(float delta) {
            // This if-statement "caps" the delta so that we will never exceed the duration.
            if(progress + delta > duration) {
                delta = duration - progress;
            }
            progress += delta;
            action(delta);
            // Return "true" if the progress is equal to the duration (+/- a small margin, just in case, since we use floats).
            return progress + errorMargin > duration && progress - errorMargin < duration;
        }

        /**
         * Override this method with your game logic.
         * You should not call it directly.
         * @param delta The amount of time that has elapsed since the timer was last updated.
         */
        abstract void action(float delta);
    }
}
Consider this Game Loop:
double nextTime = (double)System.nanoTime() / 1000000000.0;

while(runFlag)
{
    double currTime = (double)System.nanoTime() / 1000000000.0;
    if(currTime >= nextTime)
    {
        nextTime += delta;
        update();
        draw();
    }
    else
    {
        int sleepTime = (int)(1000.0 * (nextTime - currTime));
        if(sleepTime > 0)
        {
            try
            {
                Thread.sleep(sleepTime);
            }
            catch(InterruptedException e)
            {
            }
        }
    }
}
How should I calculate an appropriate value for delta (the time that has to pass before the next update and render)? I have seen it calculated in different ways, but I am still not sure what exactly is going on. This is a fixed time step loop, so for instance, if I wanted to calculate delta for a constant FPS of 30 or 60, what would I set it to and why? I am unable to grasp some of the explanations I have come across on the Internet. Thanks.
If you want to produce a particular frame per second rate, consider the following:
You want to produce a new frame every 1/frames-per-second seconds (about 16.7 ms for 60 FPS, or about 33.3 ms for 30 FPS).
It may take you p seconds to produce a frame (processing)
The "free time" between frame productions, delta, would be 1/frames-per-second - p
You might use the following pseudo-code:
frameTimeMilliseconds = 1000 / frames-per-second

Loop(condition) {
    startFrameTime = system current time in milliseconds
    do frame production
    endFrameTime = system current time in milliseconds
    sleepTime = frameTimeMilliseconds - (endFrameTime - startFrameTime)
    sleep for sleepTime milliseconds
}
You may want to handle the condition where the actual frame production time is longer than the required frame production time (i.e. where sleepTime is <= 0)
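A hedged Java version of that pseudo code (update() and draw() are placeholders for your frame production, not code from the question), with the sleepTime <= 0 case handled by simply skipping the sleep:

final int framesPerSecond = 60;                       // or 30, etc.
final long frameTimeMillis = 1000L / framesPerSecond;

while (running) {
    long startFrameTime = System.currentTimeMillis();

    update();   // frame production
    draw();

    long sleepTime = frameTimeMillis - (System.currentTimeMillis() - startFrameTime);
    if (sleepTime > 0) {            // frame finished early: sleep off the remainder
        try {
            Thread.sleep(sleepTime);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
    // if sleepTime <= 0 the frame overran its budget, so start the next one immediately
}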
I am using Box2D and OpenGL. I found that (at a 60 FPS frame rate) when I apply quick changes of direction to a fast-moving object, the rendering seems to jump or perhaps skip frames.
(I am only operating in 2D.) I want to push the physics to the very edge (lots of objects moving simultaneously, possibly breaking shapes down with welds, etc.).
If I speed up Display.sync() from 60 to 180, it is much cleaner.
What is an ideal frame rate?
Are there any other ways to keep the rendering clean?
With speed and only basic drawing being the priority, are there better libraries?
Such as Slick2D?
Sometimes the problem isn't in the renderer, but rather in the fact that the time step in your simulation is causing problems and making it look like your frame rate is off. I noticed similar problems in a program of mine using OpenGL and Box2D, and fixing my timestep helped smooth things out significantly.
Really good article here.
Expanding on MtRoad's answer, here is a possible implementation:
public class GameStateRunning {

    private final int TICKS_PER_SECOND = 30;
    // Use a double literal here; 1000 / TICKS_PER_SECOND would be truncated by integer division.
    private final double timePerTick = 1000.0 / TICKS_PER_SECOND;
    private final int MAX_FRAMESKIP = 5;

    private double next_game_tick = System.currentTimeMillis();
    private int loops;
    private double extrapolation;

    public void update() {
        loops = 0;
        while (System.currentTimeMillis() > next_game_tick && loops < MAX_FRAMESKIP) {
            // YOUR GAME UPDATE CODE GOES HERE
            next_game_tick += timePerTick;
            loops++;
        }
        if (next_game_tick < System.currentTimeMillis()) {
            next_game_tick = System.currentTimeMillis();
        }
        extrapolation = 1 - (next_game_tick - System.currentTimeMillis()) / timePerTick;
    }

    public void render() {
        // YOUR GAME RENDER CODE GOES HERE
    }
}
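The extrapolation value computed above is meant to be consumed by the render code; a hypothetical sketch of how it might be used (positionX/velocityX are placeholder names, with velocity expressed in units per tick):

public void render() {
    // Draw each entity slightly ahead of its last ticked position,
    // proportional to how far we are into the current tick.
    double drawX = positionX + velocityX * extrapolation;
    double drawY = positionY + velocityY * extrapolation;
    // ... hand drawX / drawY to your drawing code here
}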
Slick2d probably won't be faster than OpenGL, as Slick2d uses OpenGL.
I am trying to create a game in which objects spawn on a timer and then start moving. I would like to keep the game speed independent of the frame rate, so I have tried to implement delta time:
lastTime = thisTime;
thisTime = (int) SystemClock.currentThreadTimeMillis();
deltaTime = thisTime - lastTime;

for (int i = 0; i < objectList.size(); i++) {
    objectList.get(i).updatePosition(deltaTime);
    objectList.get(i).display(bitmapCanvas);
}
And my updatePosition method:
public void updatePosition(float deltaTime) {
    this.y += 10/1000 * deltaTime;
}
From my understanding, the rate (10 in this case) should be divided by 1000 since deltaTime is in milliseconds, but I have tried it with various other values as well, and the objects just spawn on the screen and do not move. I had no issues making them move before trying to implement interpolation.
If it helps any, I was logging the delta time and it was usually around 20 to 30. Is this normal?
I have looked at probably 5 or 6 questions on GameDev and a couple on Stack Overflow but I think I must be misunderstanding something as I cannot get this to work.
So this doesn't appear unanswered: Jon Skeet was (obviously) correct:
The calculation of 10/1000 is being performed (at compile time) in integer arithmetic, so it's 0... Try 10/1000f, or just 0.01f.
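Applied to the updatePosition method from the question, that fix would look something like this:

public void updatePosition(float deltaTime) {
    // 10f / 1000f is float division; the original 10/1000 is integer division and equals 0.
    this.y += 10f / 1000f * deltaTime;   // equivalently: this.y += 0.01f * deltaTime;
}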