A decimal number is simply a representation in which the place value of each digit is ten times that of the digit to its right. A decimal representation does not require a decimal point, and adding zeros after the decimal point is wrong because they imply a degree of accuracy (significant figures) for which there is no justification.
So 20 cents, as a decimal, is simply 20 cents.
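As a sketch of the rule the answers below all follow, here is a minimal Python example (the helper name cents_to_dollars is my own, not from the original answers) that converts cents to dollars by dividing by 100; decimal.Decimal is used so amounts like 1.77 cents come out exact rather than as binary floating-point approximations.

```python
from decimal import Decimal

def cents_to_dollars(cents):
    """Convert a cent amount to its dollar value by dividing by 100."""
    return Decimal(str(cents)) / 100

print(cents_to_dollars(20))    # 0.2
print(cents_to_dollars(85))    # 0.85
print(cents_to_dollars(1.77))  # 0.0177
```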
As a decimal, 85 cents is 0.85; as a fraction it is 17/20.
1.77 cents in decimal form is 0.0177
72 cents is 0.72
1.3 cents is already in decimal form: 1.3 cents (0.013 as a fraction of a dollar).
25 cents is 0.25
$120.20
0.2 or 0.20
$0.02
3 cents in decimal form is 0.03.
You would write 11 cents as $0.11 or $.11 in decimal form.
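Several of these answers write the value with a dollar sign; as a hedged sketch, a Python f-string with two decimal places produces that conventional money format (the 11-cent figure is taken from the answer above):

```python
cents = 11
print(f"${cents / 100:.2f}")  # $0.11
```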
50 and a half cents as a decimal is 0.505
16.5 cents in decimal form is 0.165
Eighteen cents would be 0.18 as a decimal.
9.5 cents in decimal form is $0.095.
Well, isn't that just a happy little question! One nickel is worth 5 cents, which is 5/100 of a dollar. So, if we write that as a decimal, it would be 0.05. Just a tiny little piece of the dollar, but oh so important in its own way!
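As a final worked check (a minimal sketch, not part of the original answers), the same divide-by-100 rule reproduces the nickel value exactly:

```python
from decimal import Decimal

print(Decimal("5") / 100)  # 0.05, since a nickel is five hundredths of a dollar
```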