In the course a couple of examples have used percent values, e.g. ‘10/100’, as opposed to just using ‘.1’ in the first instance. Is this a stylistic teaching choice, or are there other benefits (readability?) I might be missing? My natural inclination is just to use a value like ‘.1’ at the source. I suppose it doesn’t really matter as long as you’re consistent?
I would guess it’s just personal; 10/100 just feels kinda “natural” to me, since full HP is 100%.
If you’re working with sliders for health bars, either you need to change the max value of the slider to your max HP number on Start (for example, if your maxHP is 500, you would need to set slider.maxValue to 500 for the health bar to read correctly),
or you need to normalize to a value between 0 and 1, which is currentHealth / maxHealth (multiplying that by 100 gives you a percentage instead).
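A minimal sketch of both approaches, assuming a UI Slider assigned in the Inspector; the names healthSlider, maxHealth, and currentHealth are my own placeholders, not anything from the course:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class HealthBar : MonoBehaviour
{
    [SerializeField] private Slider healthSlider;
    [SerializeField] private float maxHealth = 500f;
    private float currentHealth;

    void Start()
    {
        currentHealth = maxHealth;
        // Approach 1: stretch the slider's range to the raw HP numbers.
        healthSlider.maxValue = maxHealth;
        healthSlider.value = currentHealth;
    }

    public void TakeDamage(float amount)
    {
        currentHealth = Mathf.Max(0f, currentHealth - amount);
        // Approach 2: leave the slider at its default 0..1 range and
        // feed it a normalized value instead:
        //     healthSlider.value = currentHealth / maxHealth;
        // Here the raw value works because maxValue was set in Start.
        healthSlider.value = currentHealth;
    }
}
```

Either way works; the normalized version just means the slider never has to know your actual HP numbers.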
But that’s just as far as I know; maybe someone else will come along with a better answer.
I think in this case, it was just whichever seemed convenient at the time. 10/100 == 1/10 == .1, so the values are interchangeable.
In my own work in the electronics industry, I deal with even more interchangeable notations… On a drawing, it’s not uncommon for one reference to be .1µF, another to be 100nF, and still another to be 100,000pF, while the notation on the physical product might read 104 (they are all the same value). After 15 years building circuit boards, I don’t even notice anymore; I just make all the conversions without thinking about them. The same thing happens over time in the programming business: some things are just naturally converted in one’s head as needed.
Fun fact:
x = 1/10f;
y = .1f;
the compiler will automatically fold the constant expression, so x is compiled as x = .1f; in the finished code.
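You can check that from C# itself; this is a plain console sketch, nothing Unity-specific:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Both expressions are constant-folded at compile time
        // to the same float bit pattern.
        float x = 1 / 10f;
        float y = .1f;
        Console.WriteLine(x == y); // prints True
    }
}
```

So whichever form you write in source, the compiled result is identical; it really is just a readability choice.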
This topic was automatically closed 20 days after the last reply. New replies are no longer allowed.