
Complex electronic circuits are wishy-washy things. Economically priced components (resistors and the like) are often as much as 10% under or over the value marked on them, and they change value when their temperature changes. That, plus the fact that a radio, a blow-dryer, a light dimmer, or another electronic circuit nearby can radiate 'noise' into an electronic circuit, means that the voltage or current at any point in the circuit is seldom exactly what you expect, and it can change from one moment to the next.

In a computer, one section needs to send numbers to another section by means of electrical signals on a wire. Simply put, if decimal numbers were used inside the computer, then the signal on the wire could be any one of 10 different things, and the receiver would need to pick the correct one out of 10 and get it right virtually every time. That would place a lot of very expensive requirements for stability on the components, the temperature control, the power supply, and the noise shielding. The only places that could afford computers would be the military and a few university laboratories, and every time somebody wanted to use the computer, it would have to be calibrated and tested first.
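To put a rough number on that argument, here is a minimal sketch (the 0–5 V swing is an assumption for illustration, not from the answer): with 10 levels packed onto one wire, the receiver's worst-case noise margin is half the spacing between adjacent levels, and it shrinks fast as the level count grows.

```python
# Illustrative sketch: compare the worst-case noise margin per symbol
# when a 0-5 V wire must carry 10 distinct levels vs. only 2.

SWING = 5.0  # assumed full voltage swing on the wire, in volts


def noise_margin(levels: int, swing: float = SWING) -> float:
    """Largest noise the receiver can tolerate before it reads the
    wrong level: half the spacing between adjacent nominal levels."""
    spacing = swing / (levels - 1)
    return spacing / 2


print(f"decimal (10 levels): +/-{noise_margin(10):.3f} V")  # +/-0.278 V
print(f"binary  ( 2 levels): +/-{noise_margin(2):.3f} V")   # +/-2.500 V
```

So a decimal wire would have to hold every signal within about a quarter of a volt, while a binary wire can drift by a couple of volts and still be read correctly.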

This was exactly the situation with the earliest ones ... the "analog computers".

The development of microscopic solid-state electronic devices (called "transistors"), and the technology to fabricate hundreds, thousands, or millions of them in the size of a postage stamp, finally made it possible to transform the way electronics handles numbers. The trick is to do it "digitally" ... meaning all in binary numbers ... and that makes it possible to build enormously complex number crunchers that fit in the palm of a hand. They're cheap enough and reliable enough now that you and millions of other people can afford smartphones, DVD players, satellite TV and radio, a thing the size of a pack of gum that you carry in your pocket and that holds 20,000 songs, and the computer you're looking at right now.

The difference, and the reason for using the binary system, is that now the receiver doesn't need to recognize the right voltage out of ten different ones on the wire. It only has to recognize two of them ... high or low. Components can heat up, batteries or power supplies can 'wander' around, noise can pour in from the outside, and the voltage level of the signals on the wire can spike and sag and drift all over the place. But as long as the receiver knows exactly when to look at the signal, and can tell the difference between "is it high?" and "is it low?" at that instant, the digital job gets done.
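The idea above can be sketched in a few lines of Python (the 0 V / 5 V levels, the 2.5 V threshold, and the noise amplitude are all assumptions chosen for illustration): as long as the noise stays smaller than the gap between the threshold and each level, a receiver that only asks "high or low?" recovers every bit.

```python
# Minimal sketch of a noisy binary "wire": the transmitter puts each bit
# on the wire as a voltage plus random noise, and the receiver just
# compares against a single threshold at each sampling instant.

import random

HIGH, LOW = 5.0, 0.0   # assumed nominal signal levels (volts)
THRESHOLD = 2.5        # receiver's decision point: above = 1, below = 0


def transmit(bits, noise=1.5, rng=random.Random(42)):
    """Drive each bit onto the wire, corrupted by up to +/-noise volts."""
    return [(HIGH if b else LOW) + rng.uniform(-noise, noise) for b in bits]


def receive(voltages):
    """At each sampling instant, just ask: is it high or is it low?"""
    return [1 if v > THRESHOLD else 0 for v in voltages]


bits = [1, 0, 1, 1, 0, 0, 1, 0]
assert receive(transmit(bits)) == bits  # +/-1.5 V of noise: still decoded
```

Because the noise (up to 1.5 V here) never pushes a signal across the 2.5 V threshold, the round trip succeeds every time; that tolerance is exactly what a ten-level scheme would give up.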

That's why.


Wiki User

12y ago

Q: Why do you use the binary number system in a computer?