Big or little endian, which is best?
A better question to ask is:
How should one display data on the screen when viewing individual bytes (such as in a hex viewer)? Should it be left-to-right (LTR) or right-to-left (RTL)?
Assume you have a 16-bit number such as 513 (0x0201) and the text "abcd".
When viewing data LTR, big endian feels tempting, but it puts the least significant byte of the number at the highest address.
| Byte address | 0–1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Byte value | 0x0201* | "a" | "b" | "c" | "d" |
| Bit value | 0b0000001000000001* | "a" | "b" | "c" | "d" |
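To make the layout concrete, here is a minimal C sketch (the buffer layout and names are my own illustration, not part of the question) that copies the 16-bit value 513 and the text "abcd" into one buffer and dumps it byte by byte in address order, i.e. LTR. The order of the two value bytes in the output depends on the host's endianness.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    uint16_t value = 513;        /* 0x0201 */
    unsigned char buf[6];

    memcpy(buf, &value, 2);      /* addresses 0-1: the number, in host byte order */
    memcpy(buf + 2, "abcd", 4);  /* addresses 2-5: the text */

    for (int i = 0; i < 6; i++)
        printf("address %d: 0x%02X\n", i, buf[i]);

    /* Little-endian host: 0x01 0x02 'a' 'b' 'c' 'd'
       Big-endian host:    0x02 0x01 'a' 'b' 'c' 'd' */
    return 0;
}
```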
When viewing data RTL, little endian feels natural: the least significant byte has the lowest address.
| Byte address | 5 | 4 | 3 | 2 | 1 | 0 |
|---|---|---|---|---|---|---|
| Byte value | "d" | "c" | "b" | "a" | 0x02 | 0x01 |
| Bit value | "d" | "c" | "b" | "a" | 0b00000010 | 0b00000001 |
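A quick way to see which byte of a number ends up at the lowest address on your own machine is a check like the following (a sketch for illustration, not a portability recommendation):

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    uint16_t value = 0x0201;
    unsigned char lowest;

    memcpy(&lowest, &value, 1);  /* the byte at the lowest address */

    if (lowest == 0x01)
        printf("little endian: least significant byte at the lowest address\n");
    else
        printf("big endian: most significant byte at the lowest address\n");
    return 0;
}
```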
Of course the text then comes out in reverse order, which might seem like a problem for developers of file formats, but hardly for developers of CPUs.
So:
- Anyone focusing on the character order in a text will prefer LTR and big-endian.
- Anyone focusing on the bit order within a number will prefer RTL and little-endian.

At the core of the problem is that:
- Individual characters in a text are counted LTR.
- Individual bits in a number are counted RTL.
*Endianness undetermined: the value is shown as a whole, so the table does not commit to the order of its two bytes.