I assume you mean a binary representation of a number.
The "least significant bit" (usually the one to the far right but in some languages it has another placement) is "ones"
the next most significant bit are the twos
The third most significant bit are the fours
etc.
So if your number is 37:
there is one 32 (the sixth bit, counting from the right)
no 16s (the fifth bit)
no 8s (the fourth bit)
one 4 (the third bit)
no 2s (the second bit)
one 1 (the least significant bit)
If we fill an 8-bit "word" we get:
0010 0101
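Here is a minimal Python sketch (the language and the exact output formatting are my own choices, not part of the original answer) that derives the same breakdown and the padded 8-bit form:

```python
n = 37

# Walk the bits from most significant to least significant within an 8-bit word,
# reporting the place value of every bit that is set.
for position in range(7, -1, -1):
    place_value = 1 << position      # 128, 64, 32, ..., 1
    if (n >> position) & 1:
        print(f"one {place_value}")
    else:
        print(f"no {place_value}s")

# Pad to a full 8-bit "word".
print(format(n, "08b"))              # 00100101
```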
As quoted from Google Books, "Word size refers to the number of bits that a microprocessor can manipulate at one time."
There are 16 distinct values (2^4) that can be represented by 4 bits.
The mantissa holds the bits which represent the significant digits of the number. Increasing the number of bytes for the mantissa increases the number of bits available, and so increases the precision with which a number can be stored.
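A small Python sketch (my own illustration, assuming IEEE 754 single and double precision) of that effect: round-tripping the same value through a 32-bit float, with its 23-bit mantissa, keeps fewer accurate digits than a 64-bit float with its 52-bit mantissa:

```python
import struct

x = 0.1  # not exactly representable in binary floating point

# Round-trip the value through a 32-bit float and a 64-bit float
# to compare how many digits survive each mantissa size.
as_float32 = struct.unpack("f", struct.pack("f", x))[0]
as_float64 = struct.unpack("d", struct.pack("d", x))[0]

print(f"32-bit: {as_float32:.17f}")  # 0.10000000149011612
print(f"64-bit: {as_float64:.17f}")  # 0.10000000000000001
```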
If the 8 bits represent a signed number, the range is usually -128 to +127. This is -2^7 to 2^7 - 1.
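A brief Python sketch (my own illustration of the two's-complement rule, not from the original answer) computing that range for any bit width:

```python
def signed_range(bits):
    """Return (min, max) for a two's-complement signed number of the given width."""
    return -(2 ** (bits - 1)), 2 ** (bits - 1) - 1

print(signed_range(8))   # (-128, 127)
print(signed_range(16))  # (-32768, 32767)
```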