Even though I’m a mathematician, I have to look at some things carefully to convince myself they’re right. It turns out this code isn’t, exactly, though without collecting statistics over many game plays the problems would probably go unnoticed.
So, with some pencil and paper work (I used a pen since it was handy, but still…) I see that if the random roll falls between 0 and the very first relative chance, nothing will be dropped and null will be returned. Only if the character had a 100% drop chance would this be obvious, though, and even then only after many trials.
The solution is to put the if(chanceTotal… line BEFORE the chanceTotal +=… line.
A second, less serious issue is that the random roll has a minute probability of equaling totalChance, so the > sign should be >=. This bug would almost surely never be noticed in practice. (Unity’s Random.Range returns min <= value < max when min and max are both integers, but if either argument is a float it returns min <= value <= max.)
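For concreteness, here is a minimal sketch of a relative-chance pick written with the comparison done before the running total is advanced and with an inclusive test, so both of the boundary cases above are handled. The Drop type, field names, and method name are my own assumptions, not the original code.

```csharp
using UnityEngine;

// Hypothetical sketch, not the original code: type and member names are assumed.
[System.Serializable]
public class Drop
{
    public GameObject prefab;
    public float chance;   // relative chance (a weight, not a percentage)
}

public static class DropPicker
{
    // Returns one entry with probability proportional to its relative chance.
    public static Drop Pick(Drop[] drops, float totalChance)
    {
        // With float arguments Random.Range is inclusive at BOTH ends,
        // so roll can (very rarely) equal totalChance exactly.
        float roll = Random.Range(0f, totalChance);

        float chanceTotal = 0f;
        foreach (Drop drop in drops)
        {
            // Test this entry's slice of [0, totalChance] before advancing the
            // running total, and use <= so a roll equal to totalChance still
            // lands on the last entry.
            if (roll <= chanceTotal + drop.chance)
                return drop;
            chanceTotal += drop.chance;
        }
        // Safety net for floating-point drift or an overstated totalChance.
        return drops.Length > 0 ? drops[drops.Length - 1] : null;
    }
}
```

The equivalent arrangement, adding to chanceTotal first and then testing roll <= chanceTotal, works just as well; the point is only that a roll of 0 must land on the first entry and a roll of totalChance on the last.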
Another issue, one that affects the probabilities and thus might not be noticed immediately, is this:
C# (I had to look it up to be sure) is like C and C++ in that a function called in the second part of the three-part for statement (the loop condition) is re-evaluated on every iteration. So here the random count is effectively re-rolled each time around the loop, and the loop terminates the first time the index is greater than or equal to that fresh result. It will therefore tend to terminate early, on average, and large numbers of drops will occur less often than they would if the count were rolled just once. Of course, you might want that, but in my opinion it would be better to make that happen explicitly rather than by accident!
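To make the difference concrete, here is a minimal sketch of the two patterns, assuming the function being called in the loop condition is something like Random.Range; the class, field, and method names are my own, not the original code.

```csharp
using UnityEngine;

public class DropSpawner : MonoBehaviour
{
    public int maxDrops = 4;        // assumed parameter for this sketch
    public GameObject dropPrefab;   // assumed prefab reference

    // Accidental version: Random.Range is re-evaluated every time the loop
    // condition is checked, so the count is effectively re-rolled each pass
    // and the loop tends to stop earlier than a single roll would dictate.
    void SpawnDropsBiased()
    {
        for (int i = 0; i < Random.Range(1, maxDrops + 1); i++)
        {
            Instantiate(dropPrefab, transform.position, Quaternion.identity);
        }
    }

    // Explicit version: roll the count once, then loop over that fixed value,
    // so every count from 1 to maxDrops is equally likely.
    void SpawnDropsUniform()
    {
        int dropCount = Random.Range(1, maxDrops + 1); // int overload: max is exclusive
        for (int i = 0; i < dropCount; i++)
        {
            Instantiate(dropPrefab, transform.position, Quaternion.identity);
        }
    }
}
```

For example, with maxDrops = 4 the single-roll version produces four drops 25% of the time, while the re-rolled condition produces four drops only about 9% of the time (3/4 × 2/4 × 1/4).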