Human beings have ten fingers, so they learned to count in tens; decimal numbers are based on that. Following the introduction of zero into our system of writing numbers, we have decimal numbers in which the value of a digit increases by a factor of ten for each position moved to the left (and decreases by a factor of ten for each position moved to the right).
There is no intrinsic mathematical importance to base ten. After all, most electronic devices use binary (or systems derived from binary, such as hexadecimal). If humans had had six fingers (including thumbs), we could have been using a system based on 6.
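To see that base ten is just one choice among many, here is a small Python sketch (the helper name `to_base` is illustrative, not from any library) that writes the same integer as a digit string in any base from 2 to 16:

```python
def to_base(n, base):
    """Write a non-negative integer n as a digit string in the given base."""
    digits = "0123456789abcdef"  # enough symbols for bases up to 16
    if n == 0:
        return "0"
    out = []
    while n > 0:
        n, r = divmod(n, base)   # peel off the least significant digit
        out.append(digits[r])
    return "".join(reversed(out))

print(to_base(10, 6))    # "14": one six plus four units
print(to_base(10, 2))    # "1010"
print(to_base(255, 16))  # "ff"
```

The same quantity ten is written "10" in decimal, "14" in a six-fingered system, and "1010" in binary; only the notation changes, not the number.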
In fact, in terms of natural measurements (river lengths, mountain heights, etc.), the most common base is e (approximately 2.71828), the base of natural logarithms. But, e being a transcendental number, a system based on it would be a nightmare to learn!
Whole numbers are a proper subset of decimal numbers: all whole numbers are decimal numbers, but not all decimal numbers are whole numbers.
A decimal number can contain any number of digits.
All real numbers can be represented in the decimal system. Complex numbers can be represented by a pair of numbers in the decimal system.
Decimal numbers that can be expressed as fractions are rational, but decimal numbers that cannot be expressed as fractions are irrational.
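The fraction test above can be demonstrated with Python's standard `fractions` module: a terminating decimal converts exactly to a ratio of integers, which is what makes it rational (this is a sketch of the idea, not a general irrationality test).

```python
from fractions import Fraction

# A terminating decimal is exactly a fraction: 0.75 = 75/100 = 3/4.
f = Fraction("0.75")
print(f)  # 3/4

# A repeating decimal such as 0.333... corresponds to 1/3; no finite
# decimal string equals it exactly, but it is still rational because
# it is a ratio of two integers.
third = Fraction(1, 3)
print(third)  # 1/3
```

A number like the square root of 2, by contrast, cannot be written as any ratio of integers, so its decimal expansion neither terminates nor repeats.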