Lost the Thread a bit here

So, I’ve got up to the last-but-one chapter of S2, and I thought I was following along - but whilst wrestling with adding difficulty levels in, the whole “using … = …” thing floored me.
The problem is that I can see that “using int32 = int” means that everywhere you write “int32” in the file, the compiler reads it as “int”.
So far so good…? But the point was then made that this is to ensure you’re using the same width of int across different machines - surely this is the opposite? By using “int32”, the compiler translates it to “int” and then uses whatever the default int width is on that machine?
Where am I going wrong here? Any help greatly appreciated.
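For reference, here’s a minimal sketch (not the course code) of what the compiler actually sees with an alias like this:

```cpp
#include <iostream>

using int32 = int; // from here on, "int32" is just another spelling of "int"

int main()
{
    int32 Guesses = 5;                         // identical to: int Guesses = 5;
    std::cout << sizeof(Guesses) << std::endl; // prints whatever sizeof(int) is on this machine
    return 0;
}
```

So on its own, the alias really does fall back to whatever width plain “int” has on the current platform.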

Yes, I too am kind of lost. It’s said that the underlying width of int can change depending on the platform. Why then would you define your type alias to point to that changing type? Wouldn’t you do it the other way around and specify that your “int” type should be “int32” (I thought C++ had an int32 type by default) to ensure that your int is 32 bits on all platforms?


Yeah, I stopped here and started looking around to figure out why this is done.

The only thing I could think of is this:

Suppose you build the project for one platform where plain int happens to be 32 bits wide. Ok, the alias just points at int and everything works. Now you want to build it for some other platform where int is a different width. Well, you just change what int32 equals at the top of each file (see the sketch below), and you’re done porting.
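Something along these lines is what I mean - the PLATFORM_B macro and the choice of long are made up purely for illustration, not taken from the course:

```cpp
// Hypothetical porting sketch: the alias is chosen once per platform,
// and the rest of the code never changes.
#ifdef PLATFORM_B
using int32 = long;   // pretend long is the 32-bit type on this platform
#else
using int32 = int;    // plain int is 32 bits here
#endif

int32 Score = 0;      // identical source on every platform
```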

I don’t know if that made any sense.

If that is the purpose, why not have it all in one place, maybe an alias header file Alias.h, which you include across the project?

We are only changing the name - creating an alias for “int” - which works exactly as you understand it. At this point we are not actually changing the memory allocated at all. @brentco is right in practice: “int” is 4 bytes on the platforms this course targets (the same as “int32” in Unreal), although the C++ standard itself doesn’t guarantee that. I think Ben just wants us to get used to seeing/using “int32” now before we hop over to Unreal. Unreal will still compile plain “int”, but it’s considered poor style to use it there.
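If you want to convince yourself the alias changes nothing about the type itself, a couple of static_asserts (not from the course code) make the point:

```cpp
#include <type_traits>

using int32 = int;

// The alias is literally the same type, not a new one.
static_assert(std::is_same<int32, int>::value, "int32 is just another name for int");

// So its size is whatever int's size is - typically 4 bytes on the platforms
// the course targets, though the standard only guarantees at least 16 bits.
static_assert(sizeof(int32) == sizeof(int), "no change in storage either");
```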

From what I was able to piece together from the MSDN Library, if we did want a genuinely 32-bit type in VS, it looks like we would use “__int32” (plain “int32” isn’t a built-in type; the portable standard equivalent is “int32_t” from <cstdint>). We would still have to create a type alias of “using int32 = __int32;” at this point for porting over to Unreal (sketched below). Once we got to Unreal, we could just delete the alias, and all the “int32” would then be a real type.
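Something like this, for example - __int32 is MSVC-specific and std::int32_t is the standard fixed-width type; neither line comes from the course code:

```cpp
#include <cstdint>

#ifdef _MSC_VER
using int32 = __int32;        // Microsoft's built-in 32-bit integer type
#else
using int32 = std::int32_t;   // standard fixed-width type, exactly 32 bits
#endif

// Once the project moves into Unreal, this alias can simply be deleted and
// Unreal's own int32 takes over, with no other code changes needed.
```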

@Bast - Good question. I am still trying to wrap my head around if and when we can use header files for anything other than classes.

Edit: Just realized this question was a month old. @Andrew_Milburn - Did you get past this?


I skipped past it :slight_smile:

[quote=“PhycoBionic, post:4, topic:24922”]
Once we got to Unreal, we could just delete the alias. All the “int32” would now be real “types”
[/quote]

I guess that makes sense - thanks!

https://msdn.microsoft.com/en-us/library/dn467695.aspx

@PhycoBionic & @Bast - You effectively could create an Alias.h file whose sole purpose is to contain all the aliases in your entire project, but this is considered bad form. It means that whenever you come across a term you’re not sure of, you have to cross-reference it against that one file. When your projects start getting huge, that gets more and more difficult to do.

Best practice when it comes to aliases is to keep the aliases used by a single class in that class’s header file (or at the top of your .cpp file if it doesn’t have a header), as in the sketch below. This way, if an alias turns up in the code, you immediately know where to look.
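For example, a header along these lines keeps the aliases right next to the class that uses them (the class and alias names here are placeholders, loosely modelled on the course project):

```cpp
#pragma once

#include <string>

// Aliases sit at the top of the header for the one class that uses them,
// so anyone reading the class can see immediately what they stand for.
using FText = std::string;
using int32 = int;

class FBullCowGame
{
public:
    int32 GetMaxTries() const;
    bool IsIsogram(FText Word) const;
};
```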
