Is the speed really 25 m/s? (Not m/s² — that would be an acceleration.) Also check the Inspector, because a value set there overrides the one in your code.
Your console logs values of around 0.40 m per frame. Now do the maths: assuming a frame rate of 100 FPS, a frame lasts 1/100 = 0.01 s, and 25 m/s * 0.01 s = 0.25 m. If that frame rate were correct, you would see about 0.25 m per frame, not 0.40 m. The logged 0.40 m per frame instead points to a frame time of about 0.40 m / 25 m/s = 0.016 s, i.e. roughly 60 FPS.
The problem is this: you cannot rely on the FPS display, because it has little to do with the actual frame rate. This is a known issue in Unity. The display tells you whether your game is generally performant, but not what the frame rate really is.
Your game is fairly small and the displayed FPS looks fine, so you don't need to know the exact frame rate of each frame. If you want more information anyway, see the links below.
Log Time.deltaTime along with Time.frameCount into your console. Then do the maths again.
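A minimal sketch of such a logger, assuming you attach it to any active GameObject in your scene (the class name FrameTimeLogger is mine; Time.deltaTime and Time.frameCount are standard UnityEngine properties):

```csharp
using UnityEngine;

// Hypothetical helper component: logs the real duration of each frame.
public class FrameTimeLogger : MonoBehaviour
{
    void Update()
    {
        // Time.deltaTime is the duration of the last frame in seconds;
        // Time.frameCount is the number of frames rendered since startup.
        Debug.Log($"frame {Time.frameCount}: deltaTime = {Time.deltaTime:F4} s");
    }
}
```

Averaging the logged deltaTime values over a few hundred frames gives you the actual frame rate (1 / average deltaTime), which you can compare against the on-screen display.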
See also: