8 bits if unsigned, 9 bits if signed
How many bits are needed to represent decimal values ranging from 0 to 12,500?
8
As many as required for the accuracy needed
1 byte = 8 bits
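For the 0-to-12,500 question above, the usual rule is that an unsigned range 0 to N needs ceil(log2(N + 1)) bits, which works out to 14 here (2^13 = 8,192 is too small, 2^14 = 16,384 is enough). A minimal sketch of that calculation, in Python purely for illustration (bits_needed is just a hypothetical helper name):

```python
def bits_needed(max_value: int) -> int:
    """Minimum bits to hold every unsigned value from 0 to max_value."""
    # 0..max_value is max_value + 1 distinct values, so we need
    # ceil(log2(max_value + 1)) bits; int.bit_length() gives this exactly.
    return max(1, max_value.bit_length())

print(bits_needed(12_500))  # 14, since 2**13 = 8192 < 12501 <= 16384 = 2**14
print(bits_needed(255))     # 8, the familiar one-byte range 0..255
```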
Assuming it is an unsigned int (i.e., no negatives), it would be 11111111111, which is 2047. Another way to think about it: 11 bits can represent 2048 different values, and since the count starts at 0, the largest value is 2048 - 1 = 2047.
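To make that concrete, here is a minimal sketch (again Python, for illustration only; max_unsigned is a hypothetical helper name) that computes the largest unsigned value for a given bit count:

```python
def max_unsigned(num_bits: int) -> int:
    """Largest value an unsigned integer of num_bits bits can hold."""
    # num_bits bits give 2**num_bits distinct patterns; counting from 0,
    # the largest value is 2**num_bits - 1 (all bits set to 1).
    return (1 << num_bits) - 1

print(bin(max_unsigned(11)))  # 0b11111111111 (eleven 1s)
print(max_unsigned(11))       # 2047, i.e. 2048 - 1
print(max_unsigned(8))        # 255, the top of the one-byte range
```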