20 + 4 + 0.3 + 0.05 + 0.007
20 + 4 + 3/10 + 5/100 + 7/1000
If you are using division to write a fraction as a decimal, how do you know when to stop dividing?
Using a decimal, of course. All you need do is move the decimal point two places to the right.
The answer depends on the form with which you are more comfortable. The main disadvantage of using decimal fractions is that they may result in an accumulation of rounding errors.
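A minimal Python sketch (values chosen purely for illustration) of the rounding-error accumulation mentioned above: summing 1/3 thirty times gives exactly 10 with fractions, but rounding each term to two decimal places drifts noticeably.

```python
from fractions import Fraction

# Exact arithmetic: thirty copies of 1/3 sum to exactly 10.
exact = sum([Fraction(1, 3)] * 30)

# Decimal approximation: each term rounded to 0.33 before summing,
# so the rounding error of every term accumulates.
rounded = sum([round(1 / 3, 2)] * 30)

print(exact)    # 10
print(rounded)  # about 9.9 -- off by roughly 0.1
```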
This is actually a question in my Digital Circuits text. Are they kidding? Is there a way to tell that a terminating decimal will have an endless binary equivalent?
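There is a way to tell. A fraction has a terminating binary expansion exactly when its denominator, in lowest terms, is a power of 2; otherwise the binary form repeats forever. A Python sketch of the check (function name is mine):

```python
from fractions import Fraction

def terminates_in_binary(decimal_string: str) -> bool:
    """A decimal terminates in binary iff, in lowest terms,
    its denominator is a power of 2."""
    d = Fraction(decimal_string).denominator  # Fraction reduces automatically
    return d & (d - 1) == 0                   # bit-trick power-of-two test

print(terminates_in_binary("0.5"))   # 1/2  -> True  (0.1 in binary)
print(terminates_in_binary("0.1"))   # 1/10 -> False (repeats forever)
```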
* Express 1.73 as a fraction (173/100). * Multiply the denominator by 519 and simplify the fraction?
It is called a [decimal] fraction.
6.03 IS expressed using decimal form!
To convert a fraction to a decimal, you divide the numerator by the denominator - on a calculator, or using long division.
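The stopping rule asked about above can be made precise: stop when the remainder becomes 0 (the decimal terminates) or when a remainder you have already seen comes up again (from that point the digits repeat forever). A Python sketch of long division that tracks remainders (function name is mine; repeating digits are shown in parentheses):

```python
def divide(num: int, den: int) -> str:
    """Long division of num/den. Stops at remainder 0 (terminating)
    or at the first repeated remainder (repeating decimal)."""
    whole, rem = divmod(num, den)
    if rem == 0:
        return str(whole)
    digits, seen = [], {}          # seen maps remainder -> digit position
    while rem and rem not in seen:
        seen[rem] = len(digits)
        rem *= 10
        digit, rem = divmod(rem, den)
        digits.append(str(digit))
    if rem:                        # a remainder repeated: mark the cycle
        start = seen[rem]
        return f"{whole}." + "".join(digits[:start]) \
               + "(" + "".join(digits[start:]) + ")"
    return f"{whole}." + "".join(digits)

print(divide(1, 4))    # "0.25"    -- remainder hit 0, so we stopped
print(divide(1, 3))    # "0.(3)"   -- remainder 1 repeated immediately
print(divide(7, 12))   # "0.58(3)" -- repeats after two digits
```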
There is no such example. If you cannot use an exact fraction then there will not be an exact decimal that you can use instead. And, if you are using an approximate decimal, you could use an approximate fraction instead.
0.8 = 8 × 0.1 = 8 × 1/10
That's what I'm here to find out.
7410
If I understand this right, you want to know how to convert a fraction into a decimal with the aid of a calculator. This is usually done by pressing the fraction button on your calculator which changes it from a fraction to a decimal and vice versa.
Using ordinary long division, divide the fraction's numerator by its denominator.
0.9028