Best Answer

A decimal number is simply a way of representing a number so that the place value of each digit is ten times that of the digit to its right. A decimal representation does not require a decimal point, so the required decimal representation is 33100, exactly as in the question.
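To illustrate the place-value claim above, here is a short sketch (not part of the original answer) that decomposes 33100 digit by digit, showing that each position is worth ten times the one to its right:

```python
# Decompose 33100 into place values: each position is worth
# ten times the position immediately to its right.
n = 33100
digits = str(n)
parts = [int(d) * 10 ** (len(digits) - 1 - i) for i, d in enumerate(digits)]
print(parts)       # place-value contributions of each digit
print(sum(parts))  # recombining them recovers the original number
```

The contributions come out as [30000, 3000, 100, 0, 0], which sum back to 33100.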

Wiki User

βˆ™ 7y ago
More answers

Wiki User

βˆ™ 7y ago

33/100 = 0.33
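The division above can be checked directly; a minimal sketch using Python's standard `fractions` module (my addition, not part of the original answer):

```python
from fractions import Fraction

# The fraction 33/100 converted to decimal form.
f = Fraction(33, 100)
print(float(f))   # decimal value of 33/100
print(33 / 100)   # plain division gives the same result
```

Both lines print 0.33, confirming the answer.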


Q: How do you write 33100 as a decimal?