Gigabytes vs gibibytes
Historically, computer designers and programmers have referred to kilobytes, megabytes and gigabytes without meaning their strictly correct scientific definitions.
The reasoning behind this is that computers work in base 2, so convenient round figures are rare, and the existing prefix names were borrowed because they were close to the actual binary values.
For example:
a "kilobyte" officially is 1,000 bytes, but 2^10 is 1,024
a "megabyte" officially is 1,000,000 bytes, but 2^20 is 1,048,576
a "gigabyte" officially is 1,000,000,000 bytes, but 2^30 is 1,073,741,824
a "terabyte" officially is 1,000,000,000,000 bytes, but 2^40 is 1,099,511,627,776
The origins of these scientific prefixes are Greek:
"kilo" is derived from "khilioi", meaning 1,000
"mega" is derived from "megas", meaning "great"
"giga" is derived from "gigas", meaning "giant"
"tera" is derived from "teras", meaning "monster"
There was a big fuss when hard disk manufacturers claimed their products' capacities were N "megabytes" but operating systems reported them as apparently having less.
It was in the disk manufacturers' interest to quote the biggest numbers for sales purposes, so they used the official scientific definition of "mega".
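As a concrete illustration (the 500 GB figure is just an example, not any particular product), here is the gap an operating system using binary units would show:

    advertised = 500 * 10**9          # "500 GB" in the decimal, marketing sense
    reported = advertised / 2**30     # the same byte count in binary gigabytes
    print(f"500 GB = {reported:.1f} GiB")   # about 465.7 GiB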
Memory is, as far as I am aware, still advertised as N megabytes or gigabytes despite meaning the base 2 definition.
As you can see from the above figures, the larger the unit, the larger the deviation gets - a gigabyte vs a gibibyte could be a difference of over 73,000,000 bytes.
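The relative error compounds by 2.4% per prefix step, which a short sketch makes plain:

    for n, prefix in enumerate(["kilo", "mega", "giga", "tera"], start=1):
        decimal = 10 ** (3 * n)
        binary = 2 ** (10 * n)
        excess = binary - decimal
        print(f"{prefix}: {excess:,} bytes over ({excess / decimal:.1%})")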
In recent years some people have been trying to relearn the nomenclature used with storage capacity, though so far I have not seen much evidence of it being widely adopted - maybe when the US and the UK eventually move away from the imperial measurement system we will see a change.
The abbreviations and names for the units have been subtly altered to indicate the difference between base 2 and base 10 values:
kB = kilobyte = 1,000
KiB = kibibyte = 1,024
MB = megabyte = 1,000,000
MiB = mebibyte = 1,048,576
GB = gigabyte = 1,000,000,000
GiB = gibibyte = 1,073,741,824
TB = terabyte = 1,000,000,000,000
TiB = tebibyte = 1,099,511,627,776
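To show both conventions in action, here is a minimal byte-count formatter (format_bytes and its interface are my own sketch, not a standard library function):

    def format_bytes(n, binary=True):
        # Render a raw byte count using IEC units (KiB, MiB, ...) when
        # binary=True, or SI units (kB, MB, ...) when binary=False.
        step = 1024 if binary else 1000
        units = ["KiB", "MiB", "GiB", "TiB"] if binary else ["kB", "MB", "GB", "TB"]
        if n < step:
            return f"{n} B"
        value = float(n)
        for unit in units:
            value /= step
            if value < step or unit == units[-1]:
                return f"{value:.2f} {unit}"

    print(format_bytes(123_456_789))                # 117.74 MiB
    print(format_bytes(123_456_789, binary=False))  # 123.46 MB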
I try to refer to "KiB", "MiB" and "GiB" where appropriate (and when I remember!), although when talking I will still abbreviate them to "K", "megs" and "gigs" because I would just feel foolish talking about "kibs, mibs and gibs".
The "bit" and the "byte" are the basic unambiguous definitions.
A "nibble" (4 bits or half a byte) I have seen references to in documents, but never used in practice.
The size of a "word" unfortunately changes depending on the platform being discussed, so is also not a good way to standardise (most commonly 16 or 32 bits, but have been used up to 60 bits).
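For what it's worth, nibbles do map neatly onto hexadecimal, since each hex digit covers exactly 4 bits; a tiny sketch:

    value = 0xAB                 # one byte
    high = (value >> 4) & 0xF    # high nibble: 0xA
    low = value & 0xF            # low nibble: 0xB
    print(hex(high), hex(low))   # 0xa 0xb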
When discussing bitrates (e.g. an audio file's encoding bitrate or a theoretical network speed) we use the base 10 definitions, so:
28.8kbps = 28.8kbit/s = 28,800 bits per second
10Mbps = 10Mbit/s = 10,000,000 bits per second
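This is where binary file sizes and decimal line speeds collide; a rough sketch of the arithmetic (ignoring any protocol overhead):

    # Time to transfer a 1 MiB file at 28.8 kbit/s:
    file_bits = 2**20 * 8        # 1 MiB expressed in bits
    rate_bps = 28.8 * 1000       # 28.8 kbit/s in bits per second
    print(f"{file_bits / rate_bps:.0f} seconds")   # roughly 291 seconds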
Care should be taken when using anything larger than a byte, as confusion can arise.
15 bits, 128 bit/s, 72Kbit/s, 10Mbit/s, 128 bytes and 45 byte/s are all unambiguous, but what about:
48K - is that 48kB (48,000) or 48KiB (49,152)? What about 48k?
88kBps - is that 88 kilobytes per second or 88 kibibytes per second? (capitalised 'B' would imply bytes rather than bits)
12MBps - is that 12 megabytes per second or 12 mebibytes per second? (same with 12MB/s)
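To put a number on that last ambiguity, here is how far the two readings of 88kBps drift apart over a single hour:

    # 88 kB/s vs 88 KiB/s, accumulated over one hour:
    decimal = 88 * 1000 * 3600
    binary = 88 * 1024 * 3600
    print(f"{decimal:,} vs {binary:,} bytes ({binary - decimal:,} apart)")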