Mistake in the "Primitive Types and Expressions" summary?

Hi everybody,

I just wanted to ask if I am missing something here. In Mosh's C# Part 1 course, the summary of primitive types and expressions has the following line:

> The easy way to memorize Int* types is by remembering the number of bytes each type uses. For example, a "short" takes 2 bytes. We have 8 bits in each byte. So a "short" is **16 bytes**, hence the underlying .NET type is Int16.

I made the part bold where I am a bit confused. Shouldn't it be 16 bits? 2 bytes equal 16 bits (2 × 8 = 16)?

Either way, the easy way to memorize the types still works, but since 16 bytes would be a bit much, I am curious where my misunderstanding is. :woozy_face:

Cheers!

Yep, a short is 16 bits.
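
If you want to verify it yourself, here is a minimal sketch (assuming a plain C# console project; `sizeof` on the predefined value types is allowed in safe code):

```csharp
using System;

class ShortSizeCheck
{
    static void Main()
    {
        // sizeof gives the size in bytes for predefined value types
        Console.WriteLine(sizeof(short));       // 2  -> bytes
        Console.WriteLine(sizeof(short) * 8);   // 16 -> bits, matching Int16

        // "short" is just the C# alias for System.Int16
        Console.WriteLine(typeof(short) == typeof(Int16)); // True
    }
}
```

So the sentence in the summary should read "a short is 16 bits", and the underlying type Int16 is named after the bit count.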


Yep, I second that; confusion confirmed.
Indeed, 2 bytes × 8 bits = 16 bits.
