A bit. A logical true or false. A switch on or off. An answer of yes or no.
A bit
Each 1 or 0 used in the representation of computer data can be used to present information to you so that you may learn it, especially the syntax and forms of the English language. Then, when you post questions to global forums, where computers also interpret those 1's and 0's to present them to people, your question will be understood by the experts whose assistance you so plainly require.
0 and 1
Everything a computer does originates in binary code, which is 1's and 0's (on or off). These are called bits, and they group into bytes of eight bits, like 10101100. A computer can work with anything you are able to put into it; what matters is whether it knows where to put the data and what to do with it. That is what applications and device software are for.
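As a rough sketch of that (Python here, and the bit pattern is just an example):

    # 8 bits make one byte; Python can parse the bit string directly.
    byte = int("10101100", 2)
    print(byte)                 # 172 in decimal
    print(format(byte, "08b"))  # back to the bit string: 10101100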
decode
Truth table for adding two bits:

    a b | out carry
    0 0 |  0    0
    0 1 |  1    0
    1 0 |  1    0
    1 1 |  0    1

It may be noted that out can be produced with an XOR gate and carry with an AND gate. For a two-bit adder you need twice the hardware. If you can add in decimal with pencil and paper, perhaps binary will be easier to understand.
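If it helps, here is that half adder as a quick Python sketch (the function name is mine, just to mirror the table):

    # Half adder: out is the XOR of the inputs, carry is the AND.
    def half_add(a, b):
        return a ^ b, a & b  # (out, carry)

    for a in (0, 1):
        for b in (0, 1):
            out, carry = half_add(a, b)
            print(a, b, out, carry)  # reproduces the truth table above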
Yes, 1 bit can represent either on ("1") or off ("0").
Computers understand binary, where 0 is "off" and 1 is "on."
Computers only understand binary, where 0 is "off" and 1 is "on."
The "1's and 0's" are referred to as binary. Binary is actually the only language a computer can understand. Everything else has to be translated to binary for it to understand it. 1 is conidered closed and 0 is open.
Because binary (0 or 1) is the only format that the computer can understand. A transistor is either off or on. There is no other state.
Binary is computer or machine language. The best way to explain it is to show an example:

    128 64 32 16  8  4  2  1
      1  1  1  1  1  1  1  1  = 255
      1  0  1  0  1  0  1  0  = 170
      0  0  0  0  0  0  1  0  = 2

There are 10 types of people that understand binary. Get it?
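You can check those rows with a quick Python sketch (purely illustrative):

    # Each column is worth a power of two: 128, 64, 32, 16, 8, 4, 2, 1.
    def to_decimal(bits):
        value = 0
        for bit in bits:
            value = value * 2 + bit  # shift left one place, add the new bit
        return value

    print(to_decimal([1, 1, 1, 1, 1, 1, 1, 1]))  # 255
    print(to_decimal([1, 0, 1, 0, 1, 0, 1, 0]))  # 170
    print(to_decimal([0, 0, 0, 0, 0, 0, 1, 0]))  # 2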
A computer doesn't actually understand any language; it just processes binary numbers.
Humans understand natural numbers (1, 2, 3, etc.), but computers only understand binary (0 and 1): 0 as "off" and 1 as "on."
A computer is a very simple machine that can only understand 1's and 0's. We just put simple building blocks together to make it quite large. We must convert everything we want a computer to do into 1's and 0's; for text, by convention, we use what's called ASCII to do this.
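For instance, a small Python sketch of that ASCII step (the text is just an example):

    # Turn text into ASCII codes, then into 8-bit binary strings.
    text = "Hi"
    codes = [ord(ch) for ch in text]          # [72, 105]
    bits = [format(c, "08b") for c in codes]  # ['01001000', '01101001']
    print(codes, bits)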
Because internally it uses circuits, it can understand ON/OFF (1 or 0) for an open or closed circuit.
0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1, this kind.
Computers have zero IQ. A computer can only sense "high voltage" or "low voltage", or you can say, on and off. Computers use '0' for low voltage and '1' for high voltage. By using combinations of '0' and '1', all numbers and characters are represented. For example, if you have to write 'A', it is represented by the ASCII code assigned to it and then converted to binary.
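Tracing that 'A' example in Python (illustrative only):

    code = ord("A")             # ASCII assigns 'A' the code 65
    print(code)                 # 65
    print(format(code, "08b"))  # 01000001, the high/low voltage pattern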