45 times 10 cents is $4.50. This can be calculated by multiplying 45 by 0.10, which is 10 cents written as a fraction of a dollar: 45 × 0.10 = 4.50. Equivalently, every 10 dimes make one dollar, so 45 dimes make four dollars and 50 cents.
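As a quick sanity check, here is a small Python sketch computing the same total two ways, once in dollars and once in whole cents:

    # 45 dimes, computed in dollars and, more safely, in whole cents.
    print(45 * 0.10)   # 4.5, i.e. $4.50
    print(45 * 10)     # 450 cents, which is $4.50 with no floating-point rounding risk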
You can do this with 45 pennies (45 cents), one quarter (25 cents), 2 dimes (20 cents), and 2 nickels (10 cents): 45 + 25 + 20 + 10 = 100 cents, which is exactly one dollar.
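A minimal Python sketch of the same tally, keeping everything in whole cents (the coin values are the standard US ones):

    # Values of US coins, in cents.
    coin_value = {"penny": 1, "nickel": 5, "dime": 10, "quarter": 25}

    # 45 pennies, 1 quarter, 2 dimes, 2 nickels.
    counts = {"penny": 45, "quarter": 1, "dime": 2, "nickel": 2}

    total = sum(coin_value[name] * n for name, n in counts.items())
    print(total)  # 100 cents = exactly one dollar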
45 miles x 55.5 cents/mile = 2,497.5 cents, or about $24.98.
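For money arithmetic like this, Python's decimal module avoids binary floating-point surprises. A sketch using the mileage rate quoted above:

    from decimal import Decimal, ROUND_HALF_UP

    miles = Decimal("45")
    rate_cents_per_mile = Decimal("55.5")

    total_cents = miles * rate_cents_per_mile   # 2497.5 cents, exactly
    dollars = (total_cents / 100).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
    print(total_cents, dollars)                 # 2497.5 24.98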
0.450 is 10% of 4.50. In money terms, 10% of $4.50 is 45 cents.
450 ÷ 10 = 45
To figure out problems like this, break them into pieces: 9 nickels is 9 × 5 = 45 cents. 3 quarters is 3 × 25 = 75 cents. 45 cents + 75 cents = $1.20. A dime is 10 cents, and 120 cents ÷ 10 cents = 12 dimes.
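The same break-it-into-pieces method, sketched in Python with everything in integer cents:

    nickels, quarters = 9, 3
    total_cents = nickels * 5 + quarters * 25   # 45 + 75 = 120 cents
    dimes = total_cents // 10                   # a dime is 10 cents
    print(total_cents, dimes)                   # 120 12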
10 times 98 of anything is 980 of them.
A quarter = 25 cents. A dime = 10 cents, total so far 35 cents. 2 nickels = 10 cents, total so far 45 cents. 3 pennies = 3 cents, total 48 cents.
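That running tally translates directly into a short loop; a sketch in Python:

    coins = [("quarter", 25), ("dime", 10), ("nickel", 5), ("nickel", 5),
             ("penny", 1), ("penny", 1), ("penny", 1)]
    total = 0
    for name, value in coins:
        total += value
        print(f"after the {name}: {total} cents")  # ends at 48 cents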
To determine how many times larger 45 is than 4.5, divide 45 by 4.5: 45 ÷ 4.5 = 10. Therefore, 45 is 10 times larger than 4.5.
Forty-five cents: 10 + 10 + 10 + 10 + 5 = 45.
Oh, dude, totally! You can make 45 cents using 5 coins if you have a quarter (25 cents) and four nickels (5 cents each), or four dimes (10 cents each) and one nickel. That's like basic math, man. So yeah, you can totally make 45 cents with 5 coins.
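To confirm those are the only two ways, a short brute-force sketch in Python (assuming the usual US coin values of 1, 5, 10, and 25 cents):

    from itertools import combinations_with_replacement

    values = [1, 5, 10, 25]  # penny, nickel, dime, quarter, in cents
    ways = [c for c in combinations_with_replacement(values, 5) if sum(c) == 45]
    print(ways)  # [(5, 5, 5, 5, 25), (5, 10, 10, 10, 10)]

So a quarter plus four nickels, or a nickel plus four dimes, are the only 5-coin combinations that work.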
$1 = 100 cents → 45 cents/$1 = 45 cents/100 cents = 45/100 = 9/20
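Python's fractions module performs the same reduction automatically; a one-line sketch:

    from fractions import Fraction

    print(Fraction(45, 100))  # prints 9/20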
A quarter is worth 25 cents, and two dimes are worth 20 cents (10 cents each). Therefore, the total amount is 25 cents + 20 cents = 45 cents.