In C#:
var buffer = new byte[] {71, 20, 0, 0, 9, 0, 0, 0};
var g = (ulong) ((uint) (buffer[0] | buffer[1] << 8 | buffer[2] << 16 | buffer[3] << 24) |
(long) (buffer[4] | buffer[5] << 8 | buffer[6] << 16 | buffer[7] << 24) << 32);
In C++:
#define byte unsigned char
#define uint unsigned int
#define ulong unsigned long long
byte buffer[8] = {71, 20, 0, 0, 9, 0, 0, 0};
ulong g = (ulong) ((uint) (buffer[0] | buffer[1] << 8 | buffer[2] << 16 | buffer[3] << 24) |
(long) (buffer[4] | buffer[5] << 8 | buffer[6] << 16 | buffer[7] << 24) << 32);
C# outputs 38654710855, C++ outputs 5199.
Why? I have been scratching my head on this for hours...
Edit: C# has the correct output.
Thanks for the help everyone :) Jack Aidley's answer was the first, so I will mark it as the accepted answer. The other answers were also correct, but I can't accept multiple answers :\
The cause: in C++, long is not guaranteed to be 64 bits wide; on LLP64 platforms such as 64-bit Windows (and on any 32-bit target) it is only 32. C#'s long is, though: it is always 64 bits. So the (long)(...) << 32 in the C++ version shifts a 32-bit type by its full width, which is undefined behavior in C++. On x86 the shift count is typically masked, so 9 << 32 leaves 9 unshifted, and 5191 | 9 = 5199; in C#, 9 is correctly shifted into the high dword, giving 9 * 2^32 + 5191 = 38654710855.
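A portable rewrite uses the fixed-width types from <cstdint> instead of the macros, so every shift is done on a 64-bit operand. A minimal sketch (the loop is an assumption standing in for the hand-unrolled expression; it produces the same value):

#include <cstdint>
#include <iostream>

int main() {
    unsigned char buffer[8] = {71, 20, 0, 0, 9, 0, 0, 0};

    // Assemble the little-endian bytes into a 64-bit value. Casting each
    // byte to uint64_t before shifting keeps every shift well defined,
    // whatever the platform's int/long widths happen to be.
    std::uint64_t g = 0;
    for (int i = 0; i < 8; ++i)
        g |= static_cast<std::uint64_t>(buffer[i]) << (8 * i);

    std::cout << g << '\n';  // prints 38654710855, matching the C# output
}

Equivalently, casting the high dword to std::uint64_t instead of long before the << 32 fixes the original one-liner.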