A billion.
Well, a Hz (hertz) is a unit of frequency. It's defined as the number of complete cycles a pattern oscillates through in one second. So 12 Hz means a cycle oscillating 12 times per second.
116160 feet per hour
80 miles per hour
5.5 feet per minute.
100 mph = about 146.67 feet per second
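A quick sketch of the mph-to-ft/s arithmetic behind that answer, using the standard factors of 5280 feet per mile and 3600 seconds per hour:

```python
def mph_to_feet_per_second(mph: float) -> float:
    """Convert miles per hour to feet per second (1 mi = 5280 ft, 1 h = 3600 s)."""
    return mph * 5280 / 3600

print(mph_to_feet_per_second(100))  # ~146.67
```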
1 GHz is equal to 1,000,000,000 Hz.
2,400,000,000 Hz
1 GHz = 1 × 10^9 Hz. To convert GHz to Hz, multiply by 10^9. To convert Hz to GHz, divide by 10^9.
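As a minimal sketch, the two directions of that conversion are just a multiply and a divide by 10^9:

```python
def ghz_to_hz(ghz: float) -> float:
    """Multiply by 10^9 to go from gigahertz to hertz."""
    return ghz * 1e9

def hz_to_ghz(hz: float) -> float:
    """Divide by 10^9 to go from hertz to gigahertz."""
    return hz / 1e9

print(ghz_to_hz(2.4))          # 2400000000.0
print(hz_to_ghz(900_000_000))  # 0.9
```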
1.45 GHz is equivalent to 1450 MHz.
900,000,000 Hz, or 900,000 kHz, or 900 MHz, or 0.9 GHz, or 0.0009 THz, or 0.0000009 PHz, and beyond!
As an absolute value, 2.4 GHz is bigger than 2.26 GHz.
This question can't be answered directly, simply because Apple Macs have shipped with many different clock speeds across their models. On average they range from about 1 GHz to 4 GHz.
1 gigahertz (GHz) = 10^9 Hz. So you do 500 GHz × (10^9 Hz / 1 GHz). The GHz cancel and you are left with 5 × 10^11 Hz.
The radiation used in a microwave oven to cook food has a frequency of 2,450,000,000 Hz (2.45 GHz). In the communications world, the label "microwave" is attached to signals in the range of 3,000,000,000 Hz to 300,000,000,000 Hz (3 - 300 GHz).
No. Hz is the base unit. MHz is "megahertz," and mega means 1,000,000. GHz is "gigahertz," and giga means 1,000,000,000.
OK, 63 GHz is far beyond the clock speed of any existing processor; even supercomputers get their performance from many processors working in parallel, not from higher clock rates. 1 Hz = 1 cycle per second; 1 kHz = 1,000 cycles per second; 1 MHz = 1,000,000 cp/s (cycles per second); 1 GHz = 1,000,000,000 cp/s. Therefore 63 GHz = 63,000,000,000 cycles per second.
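The prefix ladder above can be sketched as a simple lookup table, since each prefix is just a power-of-ten multiplier on cycles per second:

```python
# Each SI prefix scales the base unit (Hz) by a power of ten.
PREFIX_TO_HZ = {
    "Hz": 1,
    "kHz": 1_000,
    "MHz": 1_000_000,
    "GHz": 1_000_000_000,
}

def cycles_per_second(value: float, unit: str) -> float:
    """Convert a value in the given unit to plain cycles per second (Hz)."""
    return value * PREFIX_TO_HZ[unit]

print(cycles_per_second(63, "GHz"))  # 63000000000
```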
One GHz represents 1 billion cycles per second. The speed of a microprocessor is called its clock speed. Each computer instruction requires a fixed number of cycles, so the clock speed determines how many instructions per second the microprocessor can execute.
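That arithmetic can be sketched in a few lines. The cycles-per-instruction figure below is an illustrative assumption, not a spec for any real CPU:

```python
def instructions_per_second(clock_hz: float, cycles_per_instruction: float) -> float:
    """Instructions executed per second = clock cycles per second / cycles each instruction needs."""
    return clock_hz / cycles_per_instruction

# A hypothetical 1 GHz CPU where each instruction takes 4 clock cycles:
print(instructions_per_second(1e9, 4))  # 250000000.0
```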