# Programming quiz of the day

Your game starts doing something weird after 12.9 days of continuous running. What's the likely cause?

Hint: assume your game is running at 30fps
Hint 2: Assume you have an IEEE-754 seconds counter

You hadn't handled an edge case, and after 12.9 days the conditions finally arose that triggered it.

I’m amazed that you have a system that can run a game for 12.9 days without rebooting.
Must not be running Windows 10; if you were, it would have force-installed at least 10 updates in that time, triggering 2 reboots.


Just about any software and computer will start having problems after running that long. As every IT person has repeated over and over again: "have you tried turning it off, then turning it back on again?"


Memory cap? That’s approximately 2^25 frames at 30fps.

mm, might be something to do with the max value of a float for Time.time, or an int for Time.frameCount possibly?

don't have my math head on this early


Several of you are really close, so why exactly 12.9 days?

I was recently working with an Arduino and came across a similar issue after it had been running for more than 12.9 days, and had to make a workaround for it.

I believe the issue comes down to an unsigned long exceeding its maximum value.
12.9 days in milliseconds is the maximum amount an unsigned long can hold before it rolls back over to zero.

I think this is correct?

Close, but it’s more about rounding.

Obviously if it’s been running for 12.9 days that is a victory.

Of course, if we have to try, I would guess that 12.9 days, which is 1,114,560 seconds, probably hits the point where the floating-point uncertainty grows larger than the time between updates. log10(1114560 × 30) = 7.5, and the precision is about 7.2 decimal digits according to Wikipedia, so that seems pretty close. I think the Wikipedia uncertainty is measured to the midpoint, so if you divide the log's input by 2 you get 7.22, which matches the stated precision.

Bottom line: I suspect there is a time counter that adds 1/30 each frame. At 12.9 days, that total time can't increase anymore, because adding 1/30 gives back the same result.

Best answer yet. Of course, precision in decimal terms is variable, but in binary it's fixed. Try seeing how large a number of seconds leads to more than a 1/30 (0.0333) change when you toggle the last bit in the mantissa: http://www.h-schmidt.net/FloatConverter/IEEE754.html

From that website, I would have guessed the issue would have appeared at 12.1 days. Or even earlier, at 6 days. At 6 days, the float spacing is 0.0625, so adding 1/30 would still move the timer, just inaccurately (each add rounds up to a full 0.0625 step). At 12.1 days, that spacing doubles to 0.125, meaning that no matter how many times you add 1/30, the value won't increase at all.

So, this is pretty old now, what do you say the answer is @ben?

So a 32-bit float has a fixed number of significant figures it can represent: 24 binary digits (23 explicitly stored, plus one implicit), or about 7.2 decimal digits. This can be visualised in the graph below…

12.9 days is about 1.1 million seconds, or about 10^6 on the horizontal axis of that graph. You can see this corresponds to about 10^-1 = 0.1 on the blue line for a 32-bit float. That means that once our float is around this size, we can only keep track of changes in the order of 0.1s, which is around the frame time of 1/30 = 0.033 seconds.

So, adding just 0.033 to the timer will have no effect on its value; it will be rounded out by the limits of floating-point precision.

Clear as mud?

Just trying to figure out how that’s any different than my answer, but…