A minimum of 17 bits.
Each letter of the alphabet, whether uppercase or lowercase, can be represented with 7 bits.
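A quick way to check this claim is to confirm that every ASCII letter's code point is below 2^7 = 128, so each fits in 7 bits. A minimal Python sketch:

```python
# Verify that every ASCII letter's code fits in 7 bits (value < 2**7 = 128).
import string

for letter in string.ascii_letters:  # a-z and A-Z
    code = ord(letter)
    assert code < 128, f"{letter} needs more than 7 bits"

# The largest letter code is 'z' at 122, comfortably under 128.
print(max(ord(c) for c in string.ascii_letters))  # -> 122
```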
It takes 10 bits.
28 bits
Qubits and bits cannot be described in terms of one another.
When you convert this decimal number to binary, you get 111001001, which has 9 digits, so 9 bits are required to represent it in the normal case. To convert decimals to binary, visit http://acc6.its.brooklyn.cuny.edu/~gurwitz/core5/nav2tool.html
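You can also do the conversion and bit count in code. A minimal Python sketch, assuming the decimal number in question is 457 (since 457 in binary is 111001001):

```python
# Convert an assumed decimal value to binary and count its bits.
n = 457
binary = bin(n)[2:]           # strip the '0b' prefix
print(binary)                 # -> 111001001
print(len(binary))            # -> 9 bits needed
print(n.bit_length())         # -> 9, same answer via int.bit_length()
```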
2
There are 1,000 millions in one billion :)
200,000 billion
1,000
One Billion!
You can't, sorry.