Doing some more research. https://stackoverflow.com/questions/905355/c-string-length
If you assume each character fits in one char (1 byte), then length() will return the correct character count, because it counts the chars stored in the string. So 5 characters equals 5 bytes, and vice versa.
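A minimal sketch of that (my own example, not from the linked question), assuming a plain ASCII string where every character really is one byte:

```cpp
#include <iostream>
#include <string>

int main() {
    std::string s = "hello";
    // length() (same as size()) counts the char objects stored in the
    // string -- for a plain ASCII string that is also the character count.
    std::cout << s.length() << " characters, "
              << s.length() << " bytes\n";   // prints "5 characters, 5 bytes"
}
```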
Now the signed/unsigned mismatch is something totally different. An int is signed, meaning it can hold values from
-2147483648 to 2147483647, or -(2^31) to +(2^31) - 1. It's minus 1 on the positive side because zero takes up one of the non-negative slots. The most significant bit (the one farthest to the left) tells the system whether the number is positive or negative. Therefore, one of the 32 bits is used up as the sign bit, leaving only 31 bits to store the value itself.
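You can see those limits directly, assuming a platform where int is 32 bits (a quick sketch, not part of the original question):

```cpp
#include <iostream>
#include <limits>

int main() {
    // On a typical platform int is 32 bits: 1 sign bit + 31 value bits.
    std::cout << std::numeric_limits<int>::min() << '\n';   // -2147483648
    std::cout << std::numeric_limits<int>::max() << '\n';   //  2147483647
}
```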
length() returns an unsigned type (std::string::size_type, i.e. size_t); on a 32-bit build that's a 32-bit unsigned int, meaning it can store values from 0 to 4294967295, or 0 to (2^32) - 1. The mismatch warning the compiler displayed was telling you: hey, I need all 32 bits to store my result from length(), but a signed int only gives me 31 value bits. When you cast it to an int32, you said: I know length() could need 32 bits, but I know the result of length() will never get bigger than 31 bits, so it's okay to use an int32.
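Here's a minimal sketch of both cases, assuming a compiler and warning flags that complain about this conversion (e.g. -Wsign-conversion or MSVC's conversion warnings):

```cpp
#include <cstdint>
#include <iostream>
#include <string>

int main() {
    std::string s = "hello";

    // length() returns std::string::size_type, an unsigned type.
    // Assigning it straight into a signed int is what triggers the
    // signed/unsigned (or narrowing) warning on some compilers.
    int warned = s.length();

    // The explicit cast says: I know this value will never exceed
    // 31 bits, so storing it in a signed 32-bit int is fine.
    auto ok = static_cast<std::int32_t>(s.length());

    std::cout << warned << ' ' << ok << '\n';   // prints "5 5"
}
```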
Now, this is all from memory, so while I believe what I said is accurate, I did not fact-check it.