It is because computers are electronic. In simple terms, 1 and 0 are used to represent data, mirroring how electricity can either be flowing or not, like a light switch being on or off.
Read the Turing paper.
Computers read binary code, which is made up of 1s and 0s. Programming sometimes uses binary code directly and sometimes does not. That's what they have in common.
A computer's binary code is made up of 0s and 1s.
"Forthtillion" doesn't appear in any of my reference works. If you mean quadrillion, that's 15 zeros. If you made it up, you can have as many zeros as you want.
The number 6 in binary is 110, which means it is made up of 1×4 + 1×2 + 0×1 = 6.
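The place-value breakdown above can be checked with a short Python sketch (the variable names here are just illustrative):

```python
# Convert 6 to its binary string and rebuild it from powers of two.
n = 6
bits = bin(n)[2:]  # strip the "0b" prefix -> "110"

# Each bit multiplies a power of two, rightmost bit first:
# 1*4 + 1*2 + 0*1 = 6
total = sum(int(b) * 2**i for i, b in enumerate(reversed(bits)))

print(bits)   # -> 110
print(total)  # -> 6
```

The same pattern works for any non-negative integer, since `bin()` and the power-of-two sum are exact inverses of each other.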