Fractions (the rational numbers) form a countable infinity, while the decimals (the real numbers, which include the irrational numbers) are uncountable. It isn't quite right to say that one is simply "more" than the other in the everyday sense; rather, the fractions can be arranged in a list so that they can be counted (even though you'd never get to the end), while the real numbers cannot be listed that way: Cantor's diagonal argument shows that any proposed list must miss some of them.
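To see what "arranging all fractions so they can be counted" looks like, here is a small sketch of the standard diagonal enumeration of positive fractions (walking the numerator/denominator grid one diagonal at a time). The function name and the cutoff parameter are illustrative, not from the answer above:

```python
def enumerate_fractions(diagonals):
    """List positive fractions (numerator, denominator) by walking diagonals:
    1/1, then 1/2, 2/1, then 1/3, 2/2, 3/1, and so on.
    Every fraction appears at some finite position, which is what
    'countable' means (duplicates like 2/2 can be skipped if desired)."""
    seen = []
    for total in range(2, diagonals + 2):  # numerator + denominator = total
        for num in range(1, total):
            den = total - num
            seen.append((num, den))
    return seen

print(enumerate_fractions(3))
# → [(1, 1), (1, 2), (2, 1), (1, 3), (2, 2), (3, 1)]
```

Because every fraction shows up after finitely many steps, the list "counts" them all, even though it never ends.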
Carl Sagan said something like: the number 1 and a googolplex are the same distance away from infinity.
I think decimals are better than fractions, because it is harder to work out percentages from fractions than from decimals.
To compare decimals: line the numbers up by place value, then compare the highest-order digits. If they are equal, compare the next digit, and so on. Thus, 23.5 is greater than 11.4 (because the tens digit is greater), 123.88 is greater than 25.82 (because the second number has no hundreds digit, so you can take it to be zero), and 115.28 is greater than 113.99 (the first two digits are equal, so you compare the third digit).

To compare fractions: use a calculator to convert them to decimals, then compare. Alternatively, convert them to a common denominator and compare the numerators.
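The common-denominator method above can be sketched as cross-multiplication: a/b compared with c/d comes down to comparing a·d with c·b (assuming positive denominators). The function below is a hypothetical illustration, not a standard library routine:

```python
def compare_fractions(a, b, c, d):
    """Compare a/b with c/d by cross-multiplying, which is the same as
    putting both over the common denominator b*d and comparing numerators.
    Assumes b and d are positive."""
    left, right = a * d, c * b
    if left > right:
        return ">"
    if left < right:
        return "<"
    return "="

print(compare_fractions(3, 4, 5, 8))  # 3/4 vs 5/8: 24 > 20, so "→ >"
print(compare_fractions(1, 2, 2, 4))  # 1/2 vs 2/4 are equal: "→ ="
```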
When you convert the fractions into decimals, you get 0.8 and 0.6666... for 4/5 and 2/3, respectively. Since 0.8 > 0.6666..., 4/5 is greater than 2/3.
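This worked example can be checked with Python's built-in `fractions.Fraction` type, which compares fractions exactly rather than through rounded decimals:

```python
from fractions import Fraction

# 4/5 converts to exactly 0.8; 2/3 is the repeating decimal 0.666...
print(float(Fraction(4, 5)))            # → 0.8
print(Fraction(4, 5) > Fraction(2, 3))  # → True
```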
Numbers greater than 0.7 are any real numbers that lie to the right of 0.7 on the number line. This includes decimals such as 0.8, 0.9, and 1.0, fractions like 3/4 and 7/8, and integers like 1, 2, 3, and so on. In interval notation, this set can be written as (0.7, ∞), where the parenthesis indicates that 0.7 itself is not included and the infinity symbol indicates that the interval extends without bound.
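Membership in the open interval (0.7, ∞) is just a strict comparison; a minimal sketch (the function name is illustrative):

```python
def in_interval_above_0_7(x):
    """True when x lies in the open interval (0.7, ∞):
    strictly greater than 0.7, with no upper bound."""
    return x > 0.7

for value in [0.7, 0.8, 3/4, 1]:
    print(value, in_interval_above_0_7(value))
# 0.7 itself prints False, since the endpoint is excluded.
```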