There is no record of when binary was first used. Since the ancient Mesopotamians worked in sexagesimal (base 60), it is plausible they were aware of other bases, including base 2, but no evidence of a binary encoding survives from that era. It wasn't until India introduced the positional notation we use today (the Hindu-Arabic system) that there was a method of writing numbers in any base in a common, consistent format. By the time Babbage designed the earliest mechanical computers in the nineteenth century, binary notation was already well established as a way of encoding data for processing by a machine.
Computers process everything in binary. You can think of a computer processor as a collection of miniature switches: each one is either on or off. Most computers treat a switch that is on as 1 and a switch that is off as 0. When a switch is on it drives an electrical current down a wire; when it is off it does not. The wires that are on at any given moment are routed to various components in your computer, each causing a very small, specific function to be performed. All of these wires turning on and off rapidly allow your computer to do everything it does, including letting you ask questions on WikiAnswers such as "Why do we need binary?"
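As an illustration only (a minimal sketch in Python, not how any real CPU is wired), you can model a byte as eight on/off switches using bitwise operators:

```python
# A minimal sketch: model one byte as eight on/off "switches"
# using Python's bitwise operators. Real hardware is far more
# complex; this only illustrates the on = 1 / off = 0 idea.

byte = 0b00000000          # all eight switches start "off"

byte |= 1 << 3             # turn switch 3 "on" (set bit 3 to 1)
byte |= 1 << 0             # turn switch 0 "on" (set bit 0 to 1)
byte &= ~(1 << 3)          # turn switch 3 back "off" (clear bit 3)

for position in range(8):
    state = "on" if byte & (1 << position) else "off"
    print(f"switch {position}: {state}")
```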
Binary codes are the native language of the digital computer. We don't need to know binary in order to use a computer, nor do we need it to program a computer using a high-level language. But if we want to communicate with the machine in its own language, or program at a low level, then it is vital that we understand binary and its related notations, including hexadecimal and octal notation.
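To see how those related notations line up, here is a short sketch using Python's built-in base-conversion functions (the value 42 is just an arbitrary example):

```python
# The same value written in each of the common low-level notations.
value = 0b101010           # binary literal for decimal 42

print(bin(value))          # 0b101010  (binary)
print(oct(value))          # 0o52      (octal)
print(hex(value))          # 0x2a      (hexadecimal)
print(int("2a", 16))       # 42        (parse hexadecimal back to decimal)
```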
Who invented binary? No single person did, but the modern binary number system was formally described by Gottfried Wilhelm Leibniz in 1703.
356 in binary is 101100100.
Decimal 30 = binary 11110. Its binary-coded decimal (BCD) form, however, is 0011 0000, because BCD encodes each decimal digit (here 3 and 0) in its own four-bit group.
Decimal 14 is 1110₂ in binary, 16₈ in octal, and 0E₁₆ in hexadecimal.
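All of the worked conversions above are easy to check in Python; format() gives the digits without the 0b/0o/0x prefixes:

```python
# Verify the worked examples with Python's base-conversion helpers.
print(format(356, "b"))    # 101100100
print(format(30, "b"))     # 11110
print(format(14, "b"))     # 1110
print(format(14, "o"))     # 16
print(format(14, "X"))     # E

# BCD encodes each decimal digit separately in four bits:
bcd_30 = " ".join(format(int(d), "04b") for d in str(30))
print(bcd_30)              # 0011 0000
```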
A weighted binary code is one in which a weight is assigned to each bit position in the code word; the encoded value is the sum of the weights at the positions that hold a 1.
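For example, in the common 8421 weighting (plain BCD) the positions carry the weights 8, 4, 2, and 1. A small sketch of the decoding rule, with an illustrative helper function (decode_weighted is a name chosen here, not a standard API):

```python
# Decode a weighted binary code: the value is the sum of each
# position's weight wherever the code word holds a 1.
def decode_weighted(bits, weights):
    return sum(w for b, w in zip(bits, weights) if b == 1)

# 8421 code (plain BCD weighting): 0110 -> 4 + 2 = 6
print(decode_weighted([0, 1, 1, 0], [8, 4, 2, 1]))   # 6

# 2421 code, another weighted code for decimal digits: 1100 -> 2 + 4 = 6
print(decode_weighted([1, 1, 0, 0], [2, 4, 2, 1]))   # 6
```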