Voltage is the PUSH on electrons. Seriously, 120 volts is simply half the push of 240 volts.
Simply put, 240 volts is twice as strong as 120 volts.
Divide 240 by 2 and, voila, you get 120!
The difference is 240 minus 137, which equals 103.
50% of 240 is 120. 75% of 120 is 90.
240 ÷ 2 = 120
200 percent of 120 is the same as 2 times 120, which is 240.
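For what it's worth, the arithmetic above can be checked with a few lines of Python (purely illustrative):

```python
high, low = 240, 120

print(high / 2 == low)   # 240 / 2 = 120 -> True
print(low * 2 == high)   # 200% of 120 is 2 x 120 = 240 -> True
print(0.50 * high)       # 50% of 240 -> 120.0
print(0.75 * low)        # 75% of 120 -> 90.0
```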
One has an element designed to work on 120 volts; the other has an element designed to work on 240 volts.
The main difference between electrical appliances operating at 120 volts and 240 volts is the amount of power they can handle. Appliances operating at 240 volts can handle more power and are often more efficient, but they require a different type of outlet and wiring compared to appliances operating at 120 volts.
The standard voltage conversion ratio from 240 volts to 120 volts is 2:1.
It is simply a product of standardization.
No, it is not possible to use 240 volts with a 120 volt supply directly. You would need a transformer to step up the voltage from 120 volts to 240 volts. Attempting to use 240 volts with a 120 volt supply without a transformer can damage equipment and pose a safety hazard.
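As a sketch of the idea, an ideal step-up transformer scales voltage by its turns ratio; the 1:2 ratio below is illustrative, not a specification for any particular transformer:

```python
def secondary_voltage(primary_volts: float, primary_turns: int, secondary_turns: int) -> float:
    """Ideal transformer: V_secondary = V_primary * (N_secondary / N_primary)."""
    return primary_volts * secondary_turns / primary_turns

# A 1:2 step-up transformer takes a 120 V supply to 240 V.
print(secondary_voltage(120, primary_turns=1, secondary_turns=2))  # 240.0
```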
Seeing that the question comes from North America, the most common duplex receptacle has a potential of 110 to 120 volts to ground. The second most common potential difference is 220 to 240 volts. These voltages come from the 120/240-volt system that local utility companies commonly supply to homes.
Appliances operating at 240 volts do not consume less power than those operating at 120 volts; for the same power output they draw less current, because power is the product of voltage and current. The lower current is what permits smaller conductors and reduces resistive losses in the wiring.
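A minimal sketch of that trade-off, assuming a purely resistive load and an illustrative 0.1-ohm wiring resistance:

```python
def current_amps(power_watts: float, volts: float) -> float:
    """I = P / V for a resistive load."""
    return power_watts / volts

POWER = 1800      # watts; illustrative load
WIRE_OHMS = 0.1   # assumed wiring resistance, for illustration only

for volts in (120, 240):
    amps = current_amps(POWER, volts)
    loss = amps ** 2 * WIRE_OHMS  # resistive loss scales with I^2 * R
    print(f"{volts} V: {amps:.1f} A, ~{loss:.1f} W lost in the wiring")
```

Same 1800 watts delivered either way, but the 240-volt case draws half the current and wastes a quarter of the power in the wiring.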
The maximum voltage rating for a 120/240 VAC breaker is 240 volts.
In the USA it is usually 120/208 or 120/240 volts. In Europe I think it's 220 volts.
Anywhere between 220 and 240 volts is a nominal figure in the same voltage range. It is brought about by the power company, which has a responsibility to keep voltage within a certain 10% range. A fixed-wattage load simply draws proportionally more current at the lower voltage. e.g. Wattage load of 2400. Amps = watts/volts. 2400/240 V = 10 amps. 2400/220 V = 10.9 amps.
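The same arithmetic as a short Python sketch (using the numbers from the example above):

```python
WATTS = 2400  # fixed-wattage load from the example

for volts in (240, 220):
    print(f"{WATTS} W / {volts} V = {WATTS / volts:.1f} A")
# 2400 W / 240 V = 10.0 A
# 2400 W / 220 V = 10.9 A
```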
If you measure the voltage between the hot wire (480 V) and ground with a meter, the reading should be close to 480 volts. Ground is typically considered to be at 0 volts potential, so the full 480-volt difference appears between the hot wire and ground.
240 watts at 120 volts requires 2 amperes: power = voltage × current, so current = 240 W ÷ 120 V = 2 A.
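Or, rearranged in code (a minimal check of the arithmetic):

```python
power_watts = 240
volts = 120

amps = power_watts / volts  # current = power / voltage
print(amps)  # 2.0
```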