There is no single exact figure; it depends on the hardware and the algorithm. A typical modern desktop computer can compute the first 1 million digits of pi in a few seconds using an efficient algorithm such as Chudnovsky's, and a supercomputer can do it much faster still. A Mac is an ordinary computer in this respect and performs comparably to other desktops of similar specification.
Hello my friend, sorry for taking so long with this question. When pi was calculated to 2,037 digits in 1949, it took about 70 hours on the ENIAC computer. 70 hours.
The first person to calculate the mathematical constant pi was the ancient Greek mathematician Archimedes.
The Greek mathematician Archimedes
No, the value of pi was not first calculated by Baudhayana. It is debated who the first person was to calculate it. However, Archimedes is believed to be the first to calculate it using polygons, while Ptolemy was the first to assign it a value directly close to the one used today.
How accurate do you want it to be? In case you don't know: you can't express pi exactly as a fraction or as a square root. You can, however, approximate it as closely as you want with decimals; that is, you can make the error as small as you like, but never zero.
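To illustrate "as accurate as you want," here is a minimal sketch in Python that approximates pi to any requested number of decimal places using Machin's formula, pi = 16·arctan(1/5) − 4·arctan(1/239), with the standard-library `decimal` module. The function name `machin_pi` and the choice of guard digits are my own; record-setting computations use far faster methods (e.g. the Chudnovsky series), so this is only a demonstration of arbitrary-precision approximation.

```python
from decimal import Decimal, getcontext

def machin_pi(digits):
    """Approximate pi to `digits` decimal places via Machin's formula:
    pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    getcontext().prec = digits + 10            # extra guard digits
    eps = Decimal(10) ** -(digits + 10)

    def arctan_inv(x):
        # arctan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ...
        power = Decimal(1) / x                 # 1/x^(2n+1)
        total = power
        xx = x * x
        n = 1
        while power > eps:
            power /= xx
            term = power / (2 * n + 1)
            total = total - term if n % 2 else total + term
            n += 1
        return total

    pi = 16 * arctan_inv(5) - 4 * arctan_inv(239)
    getcontext().prec = digits + 1             # trim the guard digits
    return +pi                                 # unary plus applies rounding

print(machin_pi(50))
```

Asking for more digits simply means more terms of the series and a higher working precision; the error shrinks as fast as you like but, as the answer above says, it never reaches zero.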