DeltaTime as speed

Okay, so in the teacher's example we set the current yaw to the lerp between the current and the target yaw, with a speed of DeltaTime * 1:

CurrentYaw = FMath::Lerp(CurrentYaw, TargetYaw, DeltaTime * 1.f);
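
For context, that line runs inside Tick, so DeltaTime is whatever frame time the engine passes in; something like this (a sketch only, the class and member names are my assumption, not the actual course code):

// Sketch: a minimal actor Tick override showing where DeltaTime comes from.
// ADoor, CurrentYaw and TargetYaw are assumed names, not the course's code.
void ADoor::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    // DeltaTime is the time the last frame took, passed in by the engine.
    CurrentYaw = FMath::Lerp(CurrentYaw, TargetYaw, DeltaTime * 1.f);
    SetActorRotation(FRotator(0.f, CurrentYaw, 0.f));
}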

Now reading that, I don’t understand why this results in frame-rate independent speed.

I get 120 fps in the editor, which means a frame time of 8.33 ms (repeating). I capped my FPS to 30, which means a frame time of 33.33 ms (repeating).

Which means that the code might as well be

CurrentYaw = FMath::Lerp(CurrentYaw, TargetYaw, 8.33f);
CurrentYaw = FMath::Lerp(CurrentYaw, TargetYaw, 33.33f);

How could those two lines reproduce the same result?

I thought it might be that DeltaTime is something else and I had misunderstood it, so I began printing it with:

UE_LOG(LogTemp, Warning, TEXT("Deltatime is %f"), DeltaTime);

Results were:

120 fps == Deltatime is 0.008334
30 fps == Deltatime is 0.033333

So DeltaTime is evidently in seconds rather than milliseconds, which explains being off by a few decimal places, but the initial conundrum stands: how do two different numbers produce the same result?

Then I thought it might have something to do with Lerp itself. I tried finding the actual code for Lerp, but Visual Studio basically hung trying to locate it. The Unreal documentation for it isn't particularly helpful either:

“Performs a linear interpolation between two values, Alpha ranges from 0-1”
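
For reference, linear interpolation itself is just a weighted blend between the two values. A minimal standalone sketch of what a lerp like FMath::Lerp computes (not the actual templated engine source):

// Minimal sketch of plain linear interpolation (not the engine's FMath::Lerp source).
// Alpha = 0 returns A, Alpha = 1 returns B, and values in between blend linearly.
float MyLerp(float A, float B, float Alpha)
{
    return A + Alpha * (B - A);
}

So with Alpha = DeltaTime * 1.f, each call only moves CurrentYaw a small fraction of the remaining distance towards TargetYaw.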

So yeah. How does this actually work?

At 120 fps, Tick is going to be called 4x as often within the same time period, so each call needs a value that is 4x smaller.

0.03333 / 4 = 0.008333
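
A quick standalone check (plain C++, no engine headers, the test values are just my own): apply the same lerp once per frame for one simulated second at both frame rates and compare where the yaw ends up.

// Standalone check: run the same lerp once per frame for one simulated second
// at two different frame rates and compare the result. Test values are arbitrary.
#include <cstdio>

static float Lerp(float A, float B, float Alpha)
{
    return A + Alpha * (B - A);
}

static float YawAfterOneSecond(int FramesPerSecond)
{
    const float DeltaTime = 1.f / FramesPerSecond;
    float CurrentYaw = 0.f;
    const float TargetYaw = 90.f;

    // Tick runs FramesPerSecond times over one simulated second.
    for (int Frame = 0; Frame < FramesPerSecond; ++Frame)
    {
        CurrentYaw = Lerp(CurrentYaw, TargetYaw, DeltaTime * 1.f);
    }
    return CurrentYaw;
}

int main()
{
    printf("Yaw after 1s at 120 fps: %.2f\n", YawAfterOneSecond(120)); // roughly 57.0
    printf("Yaw after 1s at  30 fps: %.2f\n", YawAfterOneSecond(30));  // roughly 57.4
    return 0;
}

Both runs land around 57 degrees: close, though not exactly identical, because using DeltaTime as the lerp alpha is only approximately frame-rate independent.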

Okay, that’s actually way simpler than I thought it would be.

Thank you for the quick response @DanM
