Little did I know, however, that Unreal Engine 2 stores its integers in big-endian byte order, while C# reads them as little-endian (strictly speaking, .NET follows the host architecture's byte order, which on x86 is little-endian). So when I tried to read in the values, I got some very strange results indeed.
I hadn't really needed to think about little-endian versus big-endian since college, where it was a purely theoretical problem, but now I had to deal with it in the field. How exciting!
[Diagram illustrating the difference between little-endian and big-endian byte order.]
This diagram helped refresh my memory about how the individual bytes were stored in the different endian systems.
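In short, for the 32-bit value 0x000001C2 discussed below, the two layouts look like this (lowest memory address on the left):

```
Big endian:     00 00 01 C2   (most significant byte first)
Little endian:  C2 01 00 00   (least significant byte first)
```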
For example, a value arriving over the network in big-endian byte order, like the 0x000001C2 pictured above, would be misread on a little-endian machine as 0xC2010000, an absurdly large value (and in fact a negative one, once interpreted as a signed 32-bit integer).
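To make that concrete, here's a minimal sketch of the misread in C#, assuming a typical little-endian host (BitConverter uses whatever byte order the machine has):

```csharp
using System;

class EndianDemo
{
    static void Main()
    {
        // The four bytes of 0x000001C2 as they arrive over the wire (big endian).
        byte[] wireBytes = { 0x00, 0x00, 0x01, 0xC2 };

        // BitConverter follows the host's byte order (see BitConverter.IsLittleEndian),
        // so a little-endian machine treats the first byte as the least significant.
        int misread = BitConverter.ToInt32(wireBytes, 0);

        Console.WriteLine("0x{0:X8}", misread); // prints 0xC2010000 on a little-endian host
    }
}
```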
Once the cause of the confusion was confirmed, I made short work of converting my big endian number to a little endian number.
```csharp
/// <summary>
/// Reverses the byte order of a given signed 16-bit integer.
/// </summary>
public static void ReverseByteOrder(ref short value)
{
    // Swap the low and high bytes.
    value = (short)((value & 0x00FFU) << 8 |
                    (value & 0xFF00U) >> 8);
}

/// <summary>
/// Reverses the byte order of a given signed 32-bit integer.
/// </summary>
public static void ReverseByteOrder(ref int value)
{
    // Mask out each byte and move it to its mirrored position.
    value = (int)((value & 0x000000FFU) << 24 |
                  (value & 0x0000FF00U) << 8 |
                  (value & 0x00FF0000U) >> 8 |
                  (value & 0xFF000000U) >> 24);
}
```
Calling either of these functions on a signed 16-bit or 32-bit integer reverses its byte order; since the swap is symmetric, the same call converts from big endian to little endian and back.
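A hypothetical usage example, assuming the functions above are in scope:

```csharp
int value = unchecked((int)0xC2010000); // the misread value from earlier
ReverseByteOrder(ref value);
Console.WriteLine("0x{0:X8}", value);   // prints 0x000001C2
```

Worth noting: the framework also ships System.Net.IPAddress.NetworkToHostOrder (and its HostToNetworkOrder counterpart), which performs the same swap for short, int, and long values if you'd rather not roll your own.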
Hope this helps!