Q: Why is 1024 treated as 1000 in the computer world?

Because computers are binary, meaning they represent all numbers using only two digits (0 and 1), the closest they can get to a nice round number near 1000 is 1024, which is 2 raised to the 10th power (2^10). Written in binary, 1024 is 10000000000. Computers can of course represent the base-10 number 1000, but it isn't as neat: in binary, 1000 is written 1111101000. That is why sizes in computing are traditionally grouped in units of 1024 (a "kilobyte" of 1024 bytes) rather than 1000.
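As a quick illustration, here is a minimal Python 3 sketch (Python is just an assumption here; any language with binary formatting would do) that prints both numbers in binary and confirms that 1024 is 2^10:

    # Compare 1000 and 1024 in binary notation.
    for n in (1000, 1024):
        print(f"{n} in binary is {n:b}")

    # Prints:
    # 1000 in binary is 1111101000
    # 1024 in binary is 10000000000

    # 1024 is an exact power of two, which is why it is the
    # "round number" closest to 1000 in the binary world.
    print(2 ** 10)  # 1024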

Wiki User ∙ 15y ago

