One BYTE is currently defined as 8 BITs (BInary digiTs).
1 B = 8 b
(Big "B" is for Bytes and little "b" is for bits.)
Some data protocols use a different number of bits to define a character (like the letter "A"): most systems today use 8 bits, but some older systems used 5 or 7 bits. Historically the definition of a byte has varied, which is why it is worth saying that a BYTE is currently defined as 8 bits. Also, one NIBBLE is half a byte, which is 4 bits.
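To make the byte/nibble relationship concrete, here is a small Python sketch (the variable names are just illustrative) that splits one byte into its high and low nibbles with a shift and a mask:

```python
# One byte holds 8 bits; a nibble is half a byte (4 bits).
value = 0xA7                 # one byte: binary 1010 0111

high_nibble = value >> 4     # top 4 bits:    1010 -> 0xA
low_nibble = value & 0x0F    # bottom 4 bits: 0111 -> 0x7

print(hex(high_nibble), hex(low_nibble))  # 0xa 0x7
```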
KB -- Kilobyte (official definition now means 1,000 bytes per NIST and IEC):
K = Kilo = 1,000
1 KB = 1,000 bytes
1,000 bytes = 1,000 x (8 bits per byte) = 8,000 bits
KiB -- Kibibyte (new term to avoid confusion with a Kilobyte):
2^10 = 1,024
1 KiB = 1,024 bytes
1,024 bytes = 1,024 x (8 bits per byte) = 8,192 bits
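As a quick sanity check, the arithmetic above can be sketched in Python (the function names here are just illustrative, not a standard API):

```python
BITS_PER_BYTE = 8

def kb_to_bits(kb: float) -> float:
    """Decimal kilobytes (SI/NIST/IEC): 1 KB = 1,000 bytes."""
    return kb * 1_000 * BITS_PER_BYTE

def kib_to_bits(kib: float) -> float:
    """Binary kibibytes (IEC): 1 KiB = 1,024 bytes."""
    return kib * 1_024 * BITS_PER_BYTE

print(kb_to_bits(1))   # 8000 bits
print(kib_to_bits(1))  # 8192 bits
```

Note the gap between the two definitions grows with each prefix: at the kilo level it is only 2.4%, but by the tera level (10^12 vs 2^40) it is nearly 10%.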
In practice, there is still a lot of confusion between IT professionals and manufacturers of data storage devices, since few have adopted the newer kibibyte terminology. You must determine from context, and/or do the math yourself, which meaning is being used.
Sources (replace the [dot] text with an actual period "."):
* Read the history of the mathematical inaccuracies of the original so-called "Kilobyte" at the NIST and IEC websites (these are organizations that deal with standards).
www [dot] physics.nist [dot] gov/cuu/Units/binary.html
www [dot] iec [dot] ch/si/binary.htm
io9.gizmodo [dot] com/is-a-kilobit-1-000-or-1-024-bits-a-mathematical-debat-1694610423
* Numerical breakdowns:
www [dot] computerhope [dot] com/jargon/b/bit.htm
www [dot] computerhope [dot] com/jargon/b/byte.htm
* Historical examples of a byte not always being 8 bits:
encyclopedia2.thefreedictionary [dot] com/byte [See FOLDOC's definition.]