> And, especially what most people call big-endian, which is a bastardized mixed-endian mess where the most significant byte is numbered zero while the least significant bit is likewise numbered zero.
In the 1980s at AT&T Bell Labs, I had to program 3B20 computers to process the phone network's data. 3B20s used the weird byte order 1324 (maybe it was 2413), and I had to tweak the network protocols to start packets with a BOM (byte order mark) (as the various switches that sent data didn't define endianness), then swap bytes accordingly.
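(For the curious, here's a minimal sketch in modern Python of how that kind of BOM scheme can work. The 4-byte mark and the helper names are made up, not the actual 3B20 protocol: the sender writes a known 32-bit constant as its first native word, and the receiver derives the byte permutation from however those four bytes arrive.)

```python
# Made-up 32-bit byte order mark: the sender stores the word 0x01020304
# in its native order and transmits those four bytes first.
BOM_BYTES = {0x01, 0x02, 0x03, 0x04}

def detect_permutation(received_bom: bytes) -> list[int]:
    """Derive the sender's byte permutation from the received BOM.

    perm[j] is the offset, within each received 4-byte word, where the
    j-th most significant byte of the original word ended up.
    """
    if set(received_bom) != BOM_BYTES:
        raise ValueError("not a recognizable BOM")
    return [received_bom.index(j + 1) for j in range(4)]

def unswap(payload: bytes, perm: list[int]) -> bytes:
    """Put every 4-byte word of the payload back into canonical order.
    (Sketch assumes the payload length is a multiple of 4.)"""
    out = bytearray()
    for off in range(0, len(payload), 4):
        word = payload[off:off + 4]
        out.extend(word[perm[j]] for j in range(4))
    return bytes(out)

# A sender with "2413" ordering would transmit the BOM as 02 04 01 03:
perm = detect_permutation(bytes([0x02, 0x04, 0x01, 0x03]))
assert unswap(bytes([0x22, 0x44, 0x11, 0x33]), perm) == bytes([0x11, 0x22, 0x33, 0x44])
```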
While I have no personal experience with the 3B2 series, its documentation[1] clearly illustrates the GP's complaint: starting from the most significant binary digit, bit numbers decrease while byte addresses increase.
As for networking, Ethernet is particularly fun: least significant bit first, most significant byte first for multi-byte fields, with a 32-bit CRC calculated for a frame of length k by treating bit n of the frame as the coefficient of the (k - 1 - n)th order term of a (k - 1)th order polynomial, and sending the coefficients of the resulting 31st order polynomial highest-order coefficient first.
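That LSB-first bit order is also why virtually every software CRC-32 works with the bit-reversed polynomial rather than the one in the standard. A quick Python sketch (just an illustration, checked against zlib's implementation, which follows the same Ethernet conventions):

```python
import zlib

# Ethernet's generator polynomial is 0x04C11DB7; since bits go out least
# significant first, software uses its bit-reversed (reflected) form
# and shifts right instead of left.
POLY_REFLECTED = 0xEDB88320

def crc32_bitwise(data: bytes) -> int:
    """Bit-at-a-time CRC-32 with the Ethernet FCS conventions:
    all-ones initial value, LSB-first bit order, final complement."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte                    # each byte enters low bit first
        for _ in range(8):
            crc = (crc >> 1) ^ (POLY_REFLECTED if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

# 0xCBF43926 is the standard CRC-32 check value for "123456789".
assert crc32_bitwise(b"123456789") == zlib.crc32(b"123456789") == 0xCBF43926
```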
I was in charge of the firmware for a modem. I had written the V.42 error correction, and we contracted out the addition of the MNP error correction protocol. The two protocols use the same CRC.
The Indian subcontractor (relevant only because of their cultural emphasis on book learning) found my CRC function, decided it didn't quite look like the academic version they were expecting, and added code to swap it around before using it for MNP, thus making it wrong.
When I pointed out it was wrong, they claimed they had tested it. By having one of our modems talk to another one of our modems, so of course both ends agreed on the same wrong CRC. Sheesh.
This is an excellent lesson for data transport protocols and file formats.
> I had to tweak the network protocols to start packets with a BOM (byte order mark) (as the various switches that sent data didn't define endianness), then swap bytes accordingly.
(A similar thing happened to me with the Python switch from 2 to 3. All strings became Unicode text, and it was too difficult to add the b sigil in front of every string literal in a large codebase, so I simply ensured that at the very few places where data was transported to or from files, all the strings were properly converted to what the internal process expected.)
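(Sketched in Python, with hypothetical helpers and an assumed encoding, that boundary approach looks something like this: decode once on the way in, encode once on the way out, and everything in between keeps working with str.)

```python
# Hypothetical boundary helpers; "latin-1" stands in for whatever
# encoding the legacy files actually used.
ENCODING = "latin-1"

def read_records(path):
    """Decode bytes to str at the moment data enters the process."""
    with open(path, "rb") as f:
        return [line.decode(ENCODING) for line in f]

def write_records(path, records):
    """Encode str back to bytes only when data leaves the process."""
    with open(path, "wb") as f:
        for rec in records:
            f.write(rec.encode(ENCODING))
```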
But, as many other commenters have rightly noted, big-endian CPUs are going the way of CPUs with 18-bit bytes that use ones-complement arithmetic, so unless you have a real need to run your program on a dinosaur, you can safely forget about CPU endianness issues.
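(The flip side of that advice: keep byte order explicit wherever data crosses a process boundary, and host endianness truly stops mattering. In Python's struct module, for instance, a leading '>' or '<' pins the layout regardless of the CPU; the header format below is invented for illustration.)

```python
import struct

# Invented 6-byte header: 16-bit message type plus 32-bit length,
# always big-endian on the wire no matter what the host CPU does.
HEADER = ">HI"

def pack_header(msg_type: int, length: int) -> bytes:
    return struct.pack(HEADER, msg_type, length)

def unpack_header(raw: bytes) -> tuple[int, int]:
    return struct.unpack(HEADER, raw[:struct.calcsize(HEADER)])
```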
> In the 1980s at AT&T Bell Labs, I had to program 3B20 computers to process the phone network's data. 3B20s used the weird byte order 1324 (maybe it was 2413), and I had to tweak the network protocols to start packets with a BOM (byte order mark) (as the various switches that sent data didn't define endianness), then swap bytes accordingly.
Lesson learned: never ignore endianness issues.