It is 20 microseconds, exactly as in the question. There is no requirement to express it as a fraction of some unspecified unit such as a second, a day, or a millennium. A decimal number is simply a way of representing a number such that the place value of each digit is ten times that of the digit to its right. A decimal representation does not require a decimal point.
36.4 microseconds is just fine.
0.20
Expressed as a decimal, 10/20 is equal to 0.5.
It is: 1/20 = 0.05
20 hundredths = 0.20
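As a quick check on the conversions above, here is a minimal Python sketch (not part of the original answers) that turns each fraction into its decimal form:

```python
from fractions import Fraction

# Fractions mentioned in the answers above, mapped to their decimal values.
examples = {
    "10/20": Fraction(10, 20),
    "1/20": Fraction(1, 20),
    "20/100": Fraction(20, 100),  # "20 hundredths"
}

for label, frac in examples.items():
    print(f"{label} = {float(frac)}")
# 10/20 = 0.5
# 1/20 = 0.05
# 20/100 = 0.2
```

Using `Fraction` keeps the ratio exact before the final conversion to a float for display.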