The decimal number 1024 is 10000000000 in binary: a 1 followed by ten zeros, because 1024 = 2^10.
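A quick sanity check of that conversion in Python:

```python
# 1024 in decimal is 10000000000 in binary: a 1 followed by ten zeros.
print(bin(1024))        # '0b10000000000'
print(2 ** 10 == 1024)  # True: 1024 is the tenth power of 2
```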
"G" usually refers to the term "Giga". In decimal, one giga-something is 1000 mega-somethings, a mega is 1000 kilos, and a kilo is a thousand units. So a Giga-something would be a billion of the original somethings (or a thousand-million if you're European). The trick here is that if you're talking in computer terms, then we wouldn't use the number 1000, but 1024. This is because 1024 is a power of 2 (two to the power of ten). In that case, a "giga" is 1024 "megas", which is 1024 "kilos", which in turn is 1024 single units. So a Gig in normal decimal would be 109 In Binary, it would be 230
Advantage of binary over decimal: information can be recorded and stored in any dichotomous (two-state) variable: magnetised or not magnetised (most magnetic media), pit or no pit (optical media such as CDs and DVDs). For decimal it would be necessary to store ten different levels of magnetisation or depths of pits, and it is not easy to make such a system error-free. Advantage of decimal over binary: fewer "digits" are required. Every ten binary digits (1024 values) can be replaced by just a shade more than three decimal digits (1000 values), so the decimal form needs fewer than a third as many digits.
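The digit-count comparison is easy to check. This sketch counts the binary and decimal digits of an arbitrary large number (2^100 is just an example value):

```python
# Compare how many digits the same number needs in binary vs decimal.
n = 2 ** 100
binary_digits = len(bin(n)) - 2   # strip the '0b' prefix
decimal_digits = len(str(n))

print(binary_digits)   # 101
print(decimal_digits)  # 31 -- a bit less than a third of 101
```

The ratio tends toward log10(2) ≈ 0.301, which is why ten binary digits correspond to slightly more than three decimal digits.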
1024 x 5 = 5,120
You can divide 1024 by 2 an infinite number of times. The quotients will keep getting smaller and smaller but will never equal 0 (zero).
No, 1000 KB is not equal to 1 GB. In data storage, 1 kilobyte (KB) is equal to 1024 bytes, 1 megabyte (MB) is equal to 1024 KB, and 1 gigabyte (GB) is equal to 1024 MB. Therefore, 1000 KB is actually equal to 0.9765625 MB, which is not the same as 1 GB.
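The conversions above can be verified directly. A minimal sketch using the binary (1024-based) definitions of the units:

```python
# Binary storage units: each step up is a factor of 1024.
KB = 1024        # bytes
MB = 1024 * KB
GB = 1024 * MB

print(1000 * KB / MB)  # 0.9765625 -- 1000 KB is just under 1 MB
print(GB / KB)         # 1048576.0 -- 1 GB is over a million KB, nowhere near 1000
```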