Is there any proper way to create and display a timer in java using only java sdk and libraries from libGDX?
I want my timer to start from zero and to be displayed in a format like minutes : seconds.
I am currently using workarounds like adding the delta time in the render method, casting totalTime / 60 to an integer for the minutes, and using totalTime - minutes * 60 for the seconds.
Before this I tried adding the delta to seconds and having an if in the render method that sets seconds to 0 and increments minutes when seconds >= 59.9f. Both methods are inefficient and inaccurate: my timer skips seconds as it approaches the end of a minute, or it shows 60 seconds before incrementing the minutes and resetting the seconds, something like 1:59, 1:60 and then 2:00.
I have searched the documentation and all over the internet and have not found a way to properly display time (only workarounds with huge flaws). How can you do this in a professional manner?
I don't think AAA games do it this way... Has anyone even considered that people might need something like a countdown timer in Java or in libGDX (one that works without displaying 1:60, at least)?
It's kind of hard to follow exactly what you tried, so I'm not sure where you were losing accuracy, but maybe it had to do with where you put the integer cast in your calculation.
Generally a clock shows complete seconds, so round total time down to an integer. Minutes and seconds should be recalculated each frame. They serve no purpose except to display a human-readable version of totalTime, so there's no need to keep them in member variables. totalTime should be left unrounded so it stays accurate.
//Member variable:
float totalTime = 5 * 60; //starting at 5 minutes

void render() {
    float deltaTime = Gdx.graphics.getDeltaTime(); //You might prefer getRawDeltaTime()
    totalTime -= deltaTime; //if counting down
    int minutes = ((int)totalTime) / 60;
    int seconds = ((int)totalTime) % 60;
    //...
}
Edit 6 years later:
To avoid rounding error from adding up a bunch of small floating point delta times, I would instead store the system clock start time and recalculate the elapsed time each frame, like this:
// Reset this to System.currentTimeMillis() when starting over.
private long startTime = System.currentTimeMillis();

void render() {
    long totalTime = (System.currentTimeMillis() - startTime) / 1000; // elapsed whole seconds
    int minutes = (int)(totalTime / 60);
    int seconds = (int)(totalTime % 60);
    //...
}
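To get the minutes : seconds display the question asks for, the computed values can be passed to String.format. A minimal self-contained sketch (the class and method names are illustrative); the %02d padding is what keeps a value like 2 minutes 5 seconds from printing as 2:5:

```java
public class TimerFormat {
    // Formats a whole-second count as m:ss, e.g. 125 -> "2:05".
    static String format(long totalSeconds) {
        long minutes = totalSeconds / 60;
        long seconds = totalSeconds % 60;
        // %02d pads the seconds to two digits with a leading zero
        return String.format("%d:%02d", minutes, seconds);
    }

    public static void main(String[] args) {
        System.out.println(format(119)); // 1:59
        System.out.println(format(120)); // 2:00
    }
}
```

Because the seconds value is floored before formatting, the display can never show 60 seconds.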
For my NoThree game, I print the timer as follows:
gameTime += delta;
int minutes = (int)(gameTime / 60f);
int seconds = (int)(gameTime % 60f); // flooring, not rounding, so it never shows 60
labelTime.setText(String.format("%dm%02ds", minutes, seconds));
I think it works pretty nicely.
Consider this Game Loop:
double nextTime = (double)System.nanoTime() / 1000000000.0;
while (runFlag)
{
    double currTime = (double)System.nanoTime() / 1000000000.0;
    if (currTime >= nextTime)
    {
        nextTime += delta;
        update();
        draw();
    }
    else
    {
        int sleepTime = (int)(1000.0 * (nextTime - currTime));
        if (sleepTime > 0)
        {
            try
            {
                Thread.sleep(sleepTime);
            }
            catch (InterruptedException e)
            {
            }
        }
    }
}
How should I calculate an appropriate value of delta (the time that has to pass before the next update and render)? I have seen it calculated in different ways, but I am still not sure what exactly is going on. This is a fixed-time-step loop, so for instance, if I wanted to calculate delta for a constant FPS of 30 or 60, what would I have to set it to, and why? I am unable to grasp some of the explanations I have come across on the internet. Thanks.
If you want to produce a particular frame per second rate, consider the following:
You want to produce a new frame every 1/frames-per-second seconds.
It may take you p seconds to produce a frame (processing)
The "free time" between frame productions, delta, would be 1/frames-per-second - p
You might use the following pseudo-code:
frameTimeMilliseconds = 1000 / frames-per-second
Loop(condition) {
    startFrameTime = system current time in milliseconds
    do frame production
    endFrameTime = system current time in milliseconds
    sleepTime = frameTimeMilliseconds - (endFrameTime - startFrameTime)
    sleep for sleepTime milliseconds
}
You may want to handle the condition where the actual frame production time is longer than the required frame production time (i.e. where sleepTime is <= 0)
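For a concrete answer to the 30-or-60-FPS part: in a fixed-time-step loop, delta is simply 1/fps seconds (about 0.0333 s for 30 FPS, about 0.0167 s for 60 FPS). A small runnable sketch of the arithmetic from the pseudo-code above, with illustrative names:

```java
public class FrameBudget {
    // delta in seconds: the fixed time step between updates
    static double delta(int fps) {
        return 1.0 / fps;
    }

    // frame budget in whole milliseconds, as in the pseudo-code above
    static long frameTimeMillis(int fps) {
        return 1000L / fps;
    }

    // time left to sleep after spending elapsedMillis producing a frame,
    // clamped to zero for the over-budget case mentioned above
    static long sleepTime(int fps, long elapsedMillis) {
        return Math.max(0, frameTimeMillis(fps) - elapsedMillis);
    }

    public static void main(String[] args) {
        System.out.println(frameTimeMillis(60)); // 16
        System.out.println(sleepTime(30, 12));   // 33 - 12 = 21
        System.out.println(sleepTime(30, 50));   // over budget, so 0
    }
}
```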
I am trying to create a game in which objects spawn on a timer then start moving. I would like to keep the frame rate independent of the game speed, so I have tried to implement delta time:
lastTime = thisTime;
thisTime = (int) SystemClock.currentThreadTimeMillis();
deltaTime = thisTime - lastTime;

for (int i = 0; i < objectList.size(); i++) {
    objectList.get(i).updatePosition(deltaTime);
    objectList.get(i).display(bitmapCanvas);
}
And my updatePosition method:
public void updatePosition(float deltaTime) {
    this.y += 10/1000 * deltaTime;
}
From my understanding, the rate (10 in this case) should be divided by 1000 since deltaTime is in milliseconds, but I have tried it with various other values as well, and the objects just spawn on the screen and do not move. I had no issues making them move before trying to implement interpolation.
If it helps any, I was logging the delta time and it was usually around 20 to 30. Is this normal?
I have looked at probably 5 or 6 questions on GameDev and a couple on Stack Overflow but I think I must be misunderstanding something as I cannot get this to work.
So this doesn't appear unanswered: Jon Skeet was (obviously) correct:
The calculation of 10/1000 is being performed (at compile time) in integer arithmetic, so it's 0... Try 10/1000f, or just 0.01f.
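A quick demonstration of that compile-time integer arithmetic, side by side with the float-literal fix:

```java
public class IntDivDemo {
    public static void main(String[] args) {
        float wrong = 10 / 1000;   // both operands are ints, so this divides to 0 before the assignment
        float right = 10 / 1000f;  // the f suffix forces floating-point division
        System.out.println(wrong); // 0.0
        System.out.println(right); // 0.01
    }
}
```

With the integer version, every position update adds 0 * deltaTime, which is exactly why the objects spawn but never move.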
I've written a program with moving animated sprites, based on user input, that all takes place in a while(true) loop. Up until now, the only way I've been timing the loop is with a sleep(20) at the end of each pass. I have noticed this affects the physics of my sprites negatively, because my formula for calculating gravity is velocity = velocity + GRAVITY * elapsedTime, and since the loop isn't running at a consistent rate, the effect of gravity is not consistent either. Is there a way to keep the loop running consistently, or a better way of timing it so it actually runs on a schedule?
First, determine how long you want your frames to last. If you want 30 fps, then you'll have
final long frameDuration = 1000 / 30;
Next, when you start rendering, store the time, like so:
final long then = System.currentTimeMillis();
render(); // surely it's more than this but you get the idea
final long now = System.currentTimeMillis();
final long actualDuration = now - then;
final long sleepDuration = frameDuration - actualDuration;
if (sleepDuration > 0) {
    sleep(sleepDuration);
} else {
    throw new FrameTooLongException();
}
velocity = velocity + GRAVITY * Elapsed-Time
This should work regardless of whether your frame rate is constant or not. The trick is to measure elapsed time accurately & consistently. Measure elapsed time from some point in your loop to the same point in the next loop.
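One way to measure elapsed time that consistently is System.nanoTime(), sampled at the same point on every pass. A minimal sketch, where the GRAVITY value and the sleep standing in for a frame of work are both illustrative:

```java
public class GravityLoop {
    static final float GRAVITY = 9.8f; // illustrative units per second squared

    // One step of the formula from the question
    static float step(float velocity, float elapsedSeconds) {
        return velocity + GRAVITY * elapsedSeconds;
    }

    public static void main(String[] args) throws InterruptedException {
        float velocity = 0f;
        long last = System.nanoTime();
        for (int frame = 0; frame < 5; frame++) {
            Thread.sleep(20); // stand-in for a frame of work of varying length
            long now = System.nanoTime();
            float elapsedSeconds = (now - last) / 1_000_000_000f;
            last = now;
            velocity = step(velocity, elapsedSeconds);
        }
        System.out.println(velocity); // around 1.0 after roughly 0.1 s of real time
    }
}
```

Because each step uses the real measured interval, the gravity effect stays correct even when individual frames run long.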
I have a timer based on System.nanoTime() and I want to slow it down. I have this:
elapsed = System.nanoTime();
seconds = ((elapsed - now) / 1.0E9F);
where now is System.nanoTime() called earlier in the class.
In order to slow it down I tried:
if (slow) {
    elapsed = System.nanoTime();
    seconds = ((elapsed - now) / 1.0E9F) * 0.5F;
} else {
    // the first code block
}
But then the seconds already on the clock get cut in half too: while slow is false the screen shows 10 seconds, and the moment slow becomes true it shows 5 seconds, which is not what I want. The multiplier does slow the timer down from that point on, which is what I want, but it also halves the time that has already elapsed. How can I slow the timer down without cutting the existing time in half?
Add to seconds rather than recalculating it from scratch each time through the loop:
elapsed = System.nanoTime();
if (slow) {
    seconds += ((elapsed - now) / 1.0E9F) * .5F;
} else {
    seconds += ((elapsed - now) / 1.0E9F);
}
now = elapsed; // so next time you just get the incremental difference
You can keep a member variable containing the multiplier to apply. Always calculate the correct number of seconds, and multiply it when you use the timer:
float multiple = 1.0f;
//...
seconds = (elapsed - now) / 1.0E9F;
//...
// setting timer:
somefunc(seconds * multiple);

public void setSlow(boolean slow) {
    if (slow)
        multiple = 0.5f;
    else
        multiple = 1.0f;
}
I'm writing a game, and I've noticed my FPS algorithm doesn't work correctly (when it has to calculate more, it sleeps longer...). So the question is very simple: how do I calculate the sleep time needed to maintain a correct FPS?
I know how long it took to update the game for one frame, in microseconds, and of course the FPS I want to reach.
I'm searching like crazy for a simple example, but I can't find one...
The code may be in Java, C++ or pseudo-code.
The time you should spend on rendering one frame is 1/FPS seconds (if you're aiming for, say 10 FPS, you should spend 1/10 = 0.1 seconds on each frame). So if it took X seconds to render, you should "sleep" for 1/FPS - X seconds.
Translating that to for instance milliseconds, you get
ms_to_sleep = 1000 / FPS - elapsed_ms;
If, for some reason, it took more than 1/FPS seconds to render the frame, you'll get a negative sleep time, in which case you obviously just skip the sleeping.
The number of microseconds per frame is 1000000 / frames_per_second. If you know that you've spent elapsed_microseconds calculating, then the time that you need to sleep is:
(1000000 / frames_per_second) - elapsed_microseconds
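The same calculation in runnable Java, in microseconds (names here are illustrative):

```java
public class SleepMicros {
    // Microseconds to sleep after spending elapsedMicros producing a frame
    static long sleepMicros(int framesPerSecond, long elapsedMicros) {
        long budget = 1_000_000L / framesPerSecond;
        return Math.max(0, budget - elapsedMicros); // never a negative sleep
    }

    public static void main(String[] args) {
        System.out.println(sleepMicros(60, 4_000));  // 16666 - 4000 = 12666
        System.out.println(sleepMicros(60, 20_000)); // over budget, so 0
    }
}
```

The clamp to zero handles the slow-frame case from the previous answer in the same expression.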
Try...
static const int NUM_FPS_SAMPLES = 64;
float fpsSamples[NUM_FPS_SAMPLES];
int currentSample = 0;

float CalcFPS(float dt) // dt: last frame's duration in seconds
{
    fpsSamples[currentSample++ % NUM_FPS_SAMPLES] = 1.0f / dt;
    float fps = 0;
    for (int i = 0; i < NUM_FPS_SAMPLES; i++)
        fps += fpsSamples[i];
    fps /= NUM_FPS_SAMPLES;
    return fps;
}
... as per http://www.gamedev.net/community/forums/topic.asp?topic_id=510019
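The same rolling-average idea ported to Java as a sketch, with two small changes: the sample index is advanced on each call, and the sum is divided by the number of samples actually collected, which avoids artificially low readings during the first 64 frames while the buffer is still partly empty:

```java
public class FpsCounter {
    private static final int NUM_FPS_SAMPLES = 64;
    private final float[] fpsSamples = new float[NUM_FPS_SAMPLES];
    private int currentSample = 0;

    // dt is the last frame's duration in seconds
    public float calcFps(float dt) {
        fpsSamples[currentSample++ % NUM_FPS_SAMPLES] = 1.0f / dt;
        float fps = 0f;
        for (float sample : fpsSamples) {
            fps += sample;
        }
        // Divide by the samples collected so far, capped at the buffer size
        return fps / Math.min(currentSample, NUM_FPS_SAMPLES);
    }

    public static void main(String[] args) {
        FpsCounter counter = new FpsCounter();
        System.out.println(counter.calcFps(1f / 60f)); // ~60 after one sample
        System.out.println(counter.calcFps(1f / 30f)); // ~45: average of 60 and 30
    }
}
```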
Take a look at this article about different FPS handling methods.
UPDATE: use this link from Web Archive as original website disappeared: https://web.archive.org/web/20161202094710/http://dewitters.koonsolo.com/gameloop.html