I assume you mean a binary representation of a number.
The "least significant bit" (usually the one to the far right but in some languages it has another placement) is "ones"
the next most significant bit are the twos
The third most significant bit are the fours
etc.
So if your number is 37:
there is one 32 (the sixth bit from the right)
no 16s (the fifth bit)
no 8s (the fourth bit)
one 4 (the third bit)
no 2s (the second bit)
one 1 (the least significant bit)
If we fill an 8-bit "word" we get:
0010 0101
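If you want to verify this mechanically, here is a minimal Python sketch (the 8-bit width is just the word size chosen above):

```python
# Format 37 as a zero-padded 8-bit binary string, then confirm
# the place values add back up to 37.
n = 37
bits = format(n, "08b")
print(bits)                 # -> 00100101

# Sum the powers of two wherever a 1 appears: 32 + 4 + 1 = 37
total = sum(2 ** i for i, b in enumerate(reversed(bits)) if b == "1")
print(total)                # -> 37
```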
28 bits
As quoted from Google Books, "Word size refers to the number of bits that a microprocessor can manipulate at one time."
There are 16 distinct values (0 through 15) that can be represented by 4 bits.
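As a quick illustration, the 16 patterns can be listed directly in Python:

```python
# Enumerate every pattern a 4-bit field can hold: 0000 through 1111.
for i in range(16):
    print(i, format(i, "04b"))
```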
The mantissa holds the bits that represent the significant digits of the number. Increasing the number of bytes allotted to the mantissa increases its number of bits, and so increases the precision with which a number can be held, i.e. it increases the accuracy of the stored value.
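To see the effect of mantissa width, here is a small Python sketch that round-trips a value through a 32-bit float (Python's own floats are 64-bit doubles, so the narrower mantissa's lost precision shows up on unpacking):

```python
import struct

# Pack 0.1 into a 32-bit float and unpack it again; the 32-bit
# mantissa cannot hold 0.1 as closely as the 64-bit one.
x = 0.1
as_float32 = struct.unpack("f", struct.pack("f", x))[0]
print(x)             # 0.1
print(as_float32)    # 0.10000000149011612
```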
If the 8 bits represent a signed number, the range is usually -128 to +127. This is -2^7 to 2^7 - 1.
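The same formula works for any width; a short sketch (the function name is just for illustration):

```python
# Two's-complement range of an n-bit signed integer:
# -2^(n-1) .. 2^(n-1) - 1
def signed_range(n):
    return -(2 ** (n - 1)), 2 ** (n - 1) - 1

print(signed_range(8))    # -> (-128, 127)
print(signed_range(16))   # -> (-32768, 32767)
```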
1 byte = 8 bits.
8 bits
4 bits
9 bits
A CPU's word size is the largest number of bits the CPU can process in one operation.
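One rough way to see this from a program is to check the pointer width, which usually matches the word size on common platforms (a proxy, since "word size" is used loosely):

```python
import struct

# Size of a native pointer in bytes, times 8 for bits.
print(struct.calcsize("P") * 8)   # -> 64 on a typical modern machine
```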
4. 1 bit for 2 values, 2 bits for 4, 3 bits for 8, 4 bits for 16.
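The general rule is that n distinct values need ceil(log2(n)) bits, as this sketch shows:

```python
import math

# Bits needed to distinguish n distinct values: ceil(log2(n)).
for n in (2, 4, 8, 16, 100):
    print(n, math.ceil(math.log2(n)))   # -> 1, 2, 3, 4, 7
```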
4 bits
17 bits would allow a value up to 131071, that is, 2^17 - 1.
The highest number you can count up to using 10 bits is 1023 (2^10 - 1) in binary.
There are 4 numbers, such as 192.168.1.254. Each number can have a value 0-255, which is 8 bits (00-FF hex). 8 x 4 = 32. The 32 bits are stored contiguously with no separator bits; the dots appear only in the human-readable "dotted decimal" notation.
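A short Python sketch makes the packing concrete, using the standard library's ipaddress module:

```python
import ipaddress

# Parse dotted decimal into the underlying 32-bit value.
addr = ipaddress.IPv4Address("192.168.1.254")
print(int(addr))                  # the address as one integer
print(format(int(addr), "032b"))  # -> 11000000101010000000000111111110
```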