Endianness
Common meaning
Practically, endianness usually refers to the in-memory or on-disk order of the parts in a larger structure.
Disk and memory are typically managed as a series of individual bytes, so when you store anything larger than a byte, you need to be consistent about how you're putting things in there.
At a low enough level this is always something to think about,
which is why programmers try to abstract it away as soon as possible, if they can,
to the point that some of this is handled by programming languages themselves, and has been since almost the start of programming languages.
So while memory and disk work in byte units, a lot of operations support larger-than-byte units. So you can declare an int32 or float64 and the language will worry about operations on it.
...during execution, that is. During execution, byte order only needs to be consistent to be correct: as long as execution reads a value the same way it writes it, it doesn't actually matter what that order is.
Yet once you write it out with the intent to load it in another program, it is you as the programmer who needs to be consistent about this.
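To make that concrete, here's a small Python sketch (using the standard struct module) of what happens when the writer and the reader disagree on byte order:

```python
import struct

value = 0x11223344  # 287454020

# Writer stores the integer little-endian ('<'), reader assumes big-endian ('>').
stored = struct.pack('<I', value)          # b'\x44\x33\x22\x11'
misread = struct.unpack('>I', stored)[0]   # interprets the same bytes backwards

print(hex(value))    # 0x11223344
print(hex(misread))  # 0x44332211 -- same bytes, very different number
```

Same four bytes in memory, two different numbers, purely because the two sides didn't agree on the order.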
Endianness usually refers to byte order, because that is the most common variant programmers will run into.
In theory, it can also refer to bit order, but in-CPU that is usually handled for us, and it only occasionally pops up when we have to speak more directly to other hardware (e.g. around serial communication), or at least prepare data for its fairly direct consumption.
In theory, it can also refer to the ordering of larger-than-byte units - then often called word order - but this is arguably even less common, as it can become a mixed case.
So endianness can refer to bit order, or even word order, but when it does, this is usually explicitly mentioned, or understood from context.
In code that manipulates the thing it itself creates in memory, we typically don't have to care - certainly not for built-in types, as the language specs and/or compiler will implicitly do so consistently (and for custom types this is handled in their definition).
It matters when you
- have multiple views on the same data, or
- store or communicate data (file, network, RPC, whatnot). Such data is often seen as an ordered stream of bytes, and communicating it between machines potentially mixes endianness, so one side might end up interpreting it differently.
So to be interoperable, you should pick one order and ensure you are following it consistently. In other words, endianness is an important detail in serialization.
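For example, in Python you can make that choice explicit with int.to_bytes() / int.from_bytes(), so the serialized bytes mean the same thing on every machine:

```python
value = 287454020  # 0x11223344

# Pick an order once (here: big-endian) and use it on both ends.
wire = value.to_bytes(4, byteorder='big')        # b'\x11\x22\x33\x44'
decoded = int.from_bytes(wire, byteorder='big')  # same result on any machine

assert decoded == value
```

The point is not which order you pick, but that both sides name it explicitly instead of relying on whatever the host happens to do natively.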
Little-endian: lowest value first, increasing significance
- Little-endian architectures store the least significant part first (in the lowest memory location).
- Includes x86/x86-64, and most ARM configurations in practice.
- In byte architectures, little-endian is also known as LSB, referring to the Least Significant Byte coming first.
Examples: consider a 32-bit integer
- 12345 would, shown in hexadecimal, be 0x39 0x30 0x00 0x00
- 287454020 would be 0x44 0x33 0x22 0x11
Big-endian: highest value first, decreasing significance
- Big-endian architectures store the most significant part first (in the lowest memory location).
- Includes the Motorola 68000 line of processors (e.g. pre-Intel Macintosh), PowerPC G5 a.k.a. PowerPC 970
- In byte architectures, big-endian is also known as MSB, referring to the Most Significant Byte coming first.
Examples: consider a 32-bit integer
- 12345 would, shown in hexadecimal, be 0x00 0x00 0x30 0x39
- 287454020 would be 0x11 0x22 0x33 0x44
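The layouts in both example lists can be checked with Python's struct module, where '<' forces little-endian and '>' forces big-endian:

```python
import struct

# Little-endian: least significant byte first
assert struct.pack('<I', 12345) == b'\x39\x30\x00\x00'
assert struct.pack('<I', 287454020) == b'\x44\x33\x22\x11'

# Big-endian: most significant byte first
assert struct.pack('>I', 12345) == b'\x00\x00\x30\x39'
assert struct.pack('>I', 287454020) == b'\x11\x22\x33\x44'
```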
Network byte order is a term used by various RFCs, and refers to big-endian (with a few exceptions?[1])
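Python exposes this both via socket.htonl() ("host to network long") and via struct's '!' format prefix, which is just an alias for big-endian:

```python
import socket
import struct

value = 0x11223344

# '!' (network order) and '>' (big-endian) produce identical bytes.
assert struct.pack('!I', value) == struct.pack('>I', value)

# htonl() reorders a host-order integer into network (big-endian) order;
# packing its result in native order ('=') gives the big-endian byte
# sequence, regardless of which endianness the host actually uses.
assert struct.pack('=I', socket.htonl(value)) == struct.pack('!I', value)
```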
Regardless of endianness, the memory address of a multi-byte variable refers to its lowest-in-memory byte.
This can be a helpful mental aid (because without a reference point, big and little are somewhat arbitrary terms): big-endian starts with the big (most significant) end at that address, little-endian starts with the small end.
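You can peek at this from Python with the standard ctypes module: bytes() of a ctypes integer reads its memory starting at the variable's address, so the first byte you see is the lowest-addressed one:

```python
import ctypes
import sys

var = ctypes.c_uint32(0x11223344)
raw = bytes(var)  # the variable's memory, starting at its (lowest) address

# On a little-endian host the first byte is 0x44 (the small end);
# on a big-endian host it would be 0x11 (the big end).
assert raw == (0x11223344).to_bytes(4, sys.byteorder)
print(sys.byteorder, raw.hex())
```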
Less usual cases
Some architecture designs (e.g. ARM and Intel Itanium) are bi-endian -- they allow handling of both endiannesses.
In some cases, this is handled by transparent hardware conversion, meaning that the non-native order is a smidgen slower than the native order as the (very simple, but still present) conversion has to happen.
In rarer cases, the hardware will have real, equal support for both, to avoid that minor speed hit.
A separate issue comes in with hardware. For example, when Apple added PCI slots to its computers, most of the world was x86 and already did everything related to PCI in little-endian. Then we found out that most graphics cards would not actually work in Macs - turns out Apple basically did big-endian to match their PowerPC CPUs and avoid some extra work. Good engineering choice, but it ended up meaning that the only cards that would work were those that were aware of this and were implemented to support Macs. (From what I can see: this did not violate PCI specs at all, but meant a few more endianness mismatches needed to be patched up. It just happened to primarily trip up graphics cards because they happen to have the most assumptions about the platform, also due to history. Apple could have done more to smooth that over but wasn't all that interested.)
Mixed endianness is not a strictly defined or agreed-on term.
When used, it usually describes architectures which deal with differently-sized units in memory addressing. For example, storing a 32-bit int 0x11223344 in a 16-bit-word architecture could lead to 0x11 0x22 0x33 0x44 -- or 0x33 0x44 0x11 0x22 (verify), depending on architecture details.
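As a sketch of one such layout -- the second ordering mentioned above, i.e. big-endian bytes within each 16-bit word but the least significant word stored first (an illustration of one possible word-addressed design, not a claim about any specific architecture):

```python
import struct

def pack_word_swapped(value: int) -> bytes:
    """Split a 32-bit value into two 16-bit words (each stored big-endian),
    then store the least significant word first."""
    high, low = (value >> 16) & 0xFFFF, value & 0xFFFF
    return struct.pack('>H', low) + struct.pack('>H', high)

assert pack_word_swapped(0x11223344) == b'\x33\x44\x11\x22'
```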
Lack of endianness can also be said to exist.
That is, endianness is usually settled by the platform you are compiling against.
And, say, 8-bit platforms like the AVR line, which really have no registers larger than a byte, will have no byte endianness, and typically few or no opcodes handling data larger than a byte.
On this platform, larger-sized variables may be present, but come from the compiler adding the extra work (with only the smaller opcodes), which also means it's the compiler's choice how to lay things out in memory. To continue the AVR example: avr-gcc does little-endian variables. (AVRs may technically be called mixed-endian, in that there are a few low-level things happening in big-endian, and a few register-related things that are fairly little-endian. But neither of these is likely to affect your programming much unless you work at assembly level)
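If you ever need to know what your compiler or interpreter chose, most environments let you ask. In Python:

```python
import struct
import sys

# sys.byteorder reports the native byte order the interpreter runs on.
print(sys.byteorder)  # 'little' on x86/x86-64 and most ARM setups

# The same fact, observed directly: pack 1 in native order ('=') and
# look at byte 0 -- it holds the 1 only on a little-endian host.
first = struct.pack('=I', 1)[0]
assert first == (1 if sys.byteorder == 'little' else 0)
```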