DeltaTime question

It makes sense that DeltaTime is used to make the game frame independent, but I don’t understand how 1 unit per second = 10fps * 0.1s or 100fps * 0.01s, since multiplying frames/second by seconds/frame just gives 1 with no units of time attached. It might be that the object will move 1 unit per second, but the math is kind of confusing.
The Unreal docs state the following: static double DeltaTime = 1 / 30.0 (in seconds). I know that 30fps is a typical animation play rate for movies, so that may be the reason for that value.
So, couldn’t delta time be used to make the “update rate” a constant 30fps? The math might look like the following: 30 FPS = ActualFPS * DeltaTime, but I am just guessing because the documentation is vague on the specifics.

Delta time is not simply 1/fps. It is the time between frames: if the frame rate is constant, then it will be, for example, 1/60. However, if frames are being dropped, the delta will be the actual time between the two frames.

Now normally it will be 1 divided by the frame rate you’ve set. You don’t have to scale anything if you want 30 instead of 60, for example, as this can be set in project settings.
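For instance (a hedged sketch; exact menu locations vary by engine version), you can cap the rate with the t.MaxFPS console variable, or with the fixed/smoothed frame-rate options under Project Settings > Engine > General Settings:

```
t.MaxFPS 30
```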

Don’t try to do anything complicated with this; just trust that the engine knows how to do it correctly.

All you need to do is use delta time when scaling things like turn rate or velocity, to ensure that someone running at 25fps moves the same distance per second as someone at 60fps.
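A minimal sketch of what that looks like in C++ (the actor class AMyActor and the speed value are hypothetical, not from this thread):

```cpp
// Move the actor at a steady 100 units/second regardless of frame rate.
void AMyActor::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    const float SpeedUnitsPerSecond = 100.0f; // hypothetical speed

    // At 25fps, DeltaTime ≈ 0.04s   -> ~4 units this frame;    25 frames -> 100 units/s.
    // At 60fps, DeltaTime ≈ 0.0167s -> ~1.67 units this frame; 60 frames -> 100 units/s.
    AddActorWorldOffset(FVector(SpeedUnitsPerSecond * DeltaTime, 0.0f, 0.0f));
}
```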

It looks like we are in agreement: delta time cannot be just the inverse of fps, because if you multiply the actual fps by its inverse the answer would always be 1, which would be kind of useless; that is the reason I brought it up. Delta time would have to result in the creation of a constant update rate independent of fps (such as when it is called in the tick function). So, instead of calling it frames per second, it may be better to call delta time updates per second.

I am not really sure what you mean in the second paragraph, but I would guess that the update rate is a fixed value, probably 30 updates per second, which would make sense because that rate would be based on the standard 30fps video playback rate.

You may ask: if the update rate is 30 updates per second, why do we need such a high game frame rate? I think the reason is that when you move the player (camera view) around, lower frame rates cause more visual distortion; but if the camera were stationary, then 30fps would not be that big of a deal. So that’s why I assume the update rate is 30. Also, as long as the fps does not drop below 30, the update rate will remain valid.

As a general rule, delta * fps does usually give 1, assuming that the fps is constant. It’s not always the case, but it mostly is. However, you’d never rely on this to determine time; treat it as approximate only.

As for frames per second, the human eye needs updates around 25 times a second to be tricked into seeing smooth movement. Anything less causes issues and sometimes motion sickness. This is why the PAL TV system on old CRT sets ran at about 25fps and NTSC at about 30fps; the rates were also linked to the frequency of mains electricity (50Hz and 60Hz respectively).

It is also why VR apps require much higher frame rates, typically 90fps or higher, which is part of the reason you see apps with lower-poly graphics: at 90fps you have about 11ms to draw a single frame and do everything else. Targeting 120fps, which is increasingly common, leaves around 8.3ms, and now there are 144Hz displays enabling even faster updates.
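The frame budget is just the reciprocal of the target rate:

```
frame budget (ms) = 1000 / fps
 90fps -> 1000 / 90  ≈ 11.1 ms
120fps -> 1000 / 120 ≈  8.3 ms
144fps -> 1000 / 144 ≈  6.9 ms
```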

So, with all that, why does it matter? The smoother it is, the more it resembles the analogue world (i.e. outside), but a higher frame rate also affects competitive gameplay, giving players an advantage in some games. Personally I target 60fps, and 90 for VR work, as that is enough; it’s also the reason that games rendering 4K on consoles reduce detail to achieve the 60 and 120Hz frame rates of modern televisions.

I hope this all makes sense.

Yes, I understand what you are saying and for the most part I would agree, but…

If delta time is the inverse of fps, then multiplying delta time by fps cancels the units out: x (frames/s) * 1/x (s/frame) = 1, which would mean the update happens only once, or possibly one update per second.

Yes, I agree the video playback rate can be a little lower than 30fps; I came up with 30 updates per second because (as I said in a previous post) Unreal defines delta time as 1/30 seconds, which I would assume means an update rate of 30 times per second. But that guess is based on the value Unreal provided.

Yes, I understand that higher frame rates mean more updates. (But that is the screen update rate; and yes, the objects on the screen will also be updated at that rate, but their animations or transforms don’t have to be. The animations/transforms can update at a much lower rate, such as 30 times per second.) Of course, it would make sense for the camera update to match the actual fps, for obvious reasons.
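For what it’s worth, if you really did want a fixed update rate decoupled from the frame rate, the usual Unreal approach is a looping timer rather than Tick; a hedged sketch (AMyActor and UpdateTransform are hypothetical names):

```cpp
// Call UpdateTransform 30 times per second, independent of the rendered fps.
void AMyActor::BeginPlay()
{
    Super::BeginPlay();

    FTimerHandle UpdateTimer; // usually stored as a member so it can be cleared later
    GetWorldTimerManager().SetTimer(
        UpdateTimer, this, &AMyActor::UpdateTransform,
        1.0f / 30.0f, /*bLoop=*/true);
}
```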

But whatever the frame-independent rate is, it would have to be a lot lower than the typical fps value to accommodate lower fps values.

Question:
Let’s say we have an fps of 100 or 200 or whatever, and you are trying to move an object using a transform function that uses delta time to create a frame-independent rate; how many times will the object be updated/transformed per second? My guess would be 30 times a second. Is that right? If not, what would it be?

It will be updated according to the maximum number of frames the display can handle, so if it were 200, then 200 times, and delta time would be 0.005s.

Just to be clear, and you seem to be missing this bit: delta time is variable. It is the duration between frame updates and may or may not be 1/fps, because if a frame or two drops, the duration becomes 3/fps instead.

UE attempts to generate frames at a given fps, but it does not always happen, and so the tick may not run every 1/fps seconds.

This is really important and why delta time is used to adjust the velocity.

So, delta time is the time between two rendered frames, and that may be 1/fps, but it may not; it is variable. On computer A the fps may be 15 and on another 144, so using delta to adjust the max velocity ensures that, regardless of frame rate, the velocity is consistent between devices.

Another example: if unadjusted, you move the actor 10 units per tick. If one device updates at 15fps, then it moves 150 units per second; in the case of 144fps, it moves 1440. Using delta time is what prevents this.
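A hedged sketch of both cases, assuming steady frame rates and the same hypothetical Tick as above:

```cpp
// Unadjusted: 10 units every tick, so distance per second depends on fps.
AddActorWorldOffset(FVector(10.0f, 0.0f, 0.0f));
// 15fps  -> 10 * 15  = 150  units/second
// 144fps -> 10 * 144 = 1440 units/second

// Adjusted: 10 units per second, scaled by the measured DeltaTime.
AddActorWorldOffset(FVector(10.0f * DeltaTime, 0.0f, 0.0f));
// 15fps  -> 10 * (1/15)  per frame * 15 frames  = 10 units/second
// 144fps -> 10 * (1/144) per frame * 144 frames = 10 units/second
```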

I am actually not missing what you are saying, and I never said delta time was a constant. I know delta time is a variable, and I know it will change when the fps changes; it will change in such a way as to provide a constant update rate based on the current fps. Are we in agreement?

The only time I brought up a constant was in regard to the frame-independent rate that delta time creates, Unreal’s definition (not mine) of DeltaTime = 1/30, and the math given in the lecture: fps * 1/fps = 1 (a constant). If this equation is right, then it is incomplete; it would need to be something like fps (f/s) * 1/fps (s/f) * X (seconds), so that time in seconds is accounted for, because the first and second terms eliminate the seconds from the equation.
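For what it’s worth, the units do work out once a speed term is included; a worked version of that dimensional analysis (X here stands for a hypothetical speed):

```
speed           = X units/s
move per frame  = X (units/s) * DeltaTime (s/frame)            = X * DeltaTime units/frame
move per second = X * DeltaTime (units/frame) * fps (frames/s) ≈ X units/s
```

Since DeltaTime * fps ≈ 1 when the frame rate is steady, the seconds cancel and only the speed’s units survive.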

But it seems we have different points of view on this, so let’s focus on the question I asked at the end of my last post…

…If we use delta time to create a frame-independent rate inside of a tick function, what would that rate be?

Note: Unreal’s documentation describes delta time as follows:
(Syntax: `static double DeltaTime = 1 / 30.0;` Remarks: Holds current delta time in seconds.)

I would assume 1/30 has something to do with delta time’s role in creating an update rate of 30 times per second (as I stated in a previous post). But I am not sure; what do you think?

Ok, so I know from experience that tick will run at the given frame rate of the device. My monitor has a 60Hz refresh rate, so tick runs 60 times a second with a delta of 16.67ms; my Quest 2 is set to 120Hz, so it runs at 120Hz and therefore the delta time is 8.33ms.

You can actually test this by adding a debug message or log line and outputting the delta time in tick.
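Something like this (a sketch; the message key, colour, and class name are arbitrary):

```cpp
void AMyActor::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    // Log the measured delta each frame; on a 60Hz display it tends to
    // hover around 0.0167s, but it varies from frame to frame.
    UE_LOG(LogTemp, Log, TEXT("DeltaTime: %f s (~%.1f fps)"), DeltaTime, 1.0f / DeltaTime);

    if (GEngine)
    {
        GEngine->AddOnScreenDebugMessage(
            /*Key=*/1, /*TimeToDisplay=*/0.0f, FColor::Green,
            FString::Printf(TEXT("DeltaTime: %f s"), DeltaTime));
    }
}
```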

I don’t need to do that because I already know that Tick will run at a rate based on the fps.

In my post I was actually talking about things directly affected by DeltaTime, such as the code inside of the tick function that DeltaTime is applied to. Without delta time those internal values would change and execute proportionally to fps; but as soon as you apply delta time, those values become frame independent. They will still be executed based on fps, but their data will now update/change to a new value based on some fixed rate independent of fps (frame independent).

You do agree that DeltaTime is used to provide some sort of frame-independent rate?

If so, what would this frame-independent rate be? 30 times a second… possibly?

I do agree. The frame-independent rate depends on machine performance, on any frame-rate cap specified in the project, and on the maximum the display is capable of. The fps can be fixed but may not be achievable by the hardware, and therefore it can still vary based on the hardware it runs on, even if you try to fix it to, say, 30fps.

You can therefore not assume 30fps is achieved, even if fixed at that, because the hardware itself may not be able to reach it. This also means that the tick interval will still potentially vary, which is why it is important to use the delta time and not rely on anything else to ensure consistent, frame-rate-independent behaviour.


Good, we came to an agreement. I guess it doesn’t matter what the exact rate is; I was just curious if anyone knew what it was, and since the Unreal documentation states that delta time = 1/30, I assumed it would be updating 30 frames per second. Anyway, I enjoyed our discussion; it really made me think.


Privacy & Terms