Q: How many volts in a 500 watt work lamp?

Best Answer

This can be easily calculated with the following formula:

Power = Volts * Amps

To rearrange this to answer your question:

amps = power / volts

amps = 500 / 120 ≈ 4.17 amps (assuming a 120 volt supply).

Or here is an online calculator:

http://www.sengpielaudio.com/calculator-ohm.htm
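
As a quick sketch of the same arithmetic in Python (the 120 volt supply is an assumption, since the question only gives the wattage):

```python
# Sketch: rearranging P = V * I to find the current drawn by a 500 W lamp.
# The 120 V supply is an assumption; the question only states the wattage.
power_watts = 500.0
supply_volts = 120.0

current_amps = power_watts / supply_volts
print(round(current_amps, 2))  # 4.17
```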

More answers

UK construction site - 110 V

UK domestic - 230 V

UK industrial - 400 V (three-phase)

Continue Learning about Engineering

In order for a 30 volt 90 watt lamp to work properly on a 120 volt supply the required series resistor in ohms is?

A 30 volt 90 watt lamp has 3 amps going through it (90 W / 30 V = 3 A). By Kirchhoff's current law, the series resistor carries the same 3 amps. The voltage across the resistor is 120 - 30 = 90 volts, so the resistance is 90 V / 3 A = 30 ohms. (By the way, the resistor must be rated to dissipate 90 V x 3 A = 270 watts. That is a lot of power for a resistor.)
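
A small Python sketch of that calculation, using the ratings given in the question:

```python
# Sketch: series resistor needed to run a 30 V, 90 W lamp from a 120 V supply.
lamp_volts = 30.0
lamp_watts = 90.0
supply_volts = 120.0

lamp_amps = lamp_watts / lamp_volts          # 3 A through the lamp (and resistor)
resistor_volts = supply_volts - lamp_volts   # 90 V dropped across the resistor
resistance = resistor_volts / lamp_amps      # 30 ohms
resistor_watts = resistor_volts * lamp_amps  # 270 W dissipated in the resistor

print(resistance, resistor_watts)  # 30.0 270.0
```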


Can a lamp work only on AC?

It depends. Lamps can certainly be built to work on AC, DC, or both. But some lamps, particularly those with electronics in them, such as a dimmable lamp or a fluorescent lamp, may only work with one type of supply.


Will 1500 watts work on 220 volts?

Using the formula I = W/E, the current of the circuit will be about 6.8 amps. As long as the wire is at least #14 and is protected by a 15 amp two pole breaker, there will be no problem. Just make sure that the specifications on the 1500 watt device clearly show that the voltage range is 220 - 240 volts. It certainly can, but it would depend on the fuse rating and the existing load on the circuit.


How do you solve for current using volts and watts in a problem?

Amperes measure the rate of flow of electricity in a conductor. Volts measure electrical pressure. Watts measure the amount of energy or work that can be done by amperes and volts.

Relationship: Work = Pressure x Flow, or Watts = Volts x Amperes. When you know two of the variables you can calculate the third.

Formulas (sometimes remembered as the West Virginia formula, W = VA):

Watts = Volts x Amps
Volts = Watts / Amps
Amps = Watts / Volts

Refer to the link below for more information.
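
As an illustrative sketch, the three rearrangements in Python (the function names are made up for this example):

```python
# Sketch of the W = V * A "magic triangle": given any two values, compute the third.
def watts_from(volts: float, amps: float) -> float:
    return volts * amps

def volts_from(watts: float, amps: float) -> float:
    return watts / amps

def amps_from(watts: float, volts: float) -> float:
    return watts / volts

print(watts_from(volts=120, amps=5))    # 600 W
print(volts_from(watts=600, amps=5))    # 120 V
print(amps_from(watts=600, volts=120))  # 5 A
```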


Is it OK to use 1 nine volt power pack with 4 small 12 volt fans?

No, 12 volts means 12 volts: anything less will not run the fans properly, and anything more can damage them.

Related questions

One watt is equal to how many volts?

The watt is a unit of power, the rate at which work is done: 1 watt is equal to 1 joule per second. Volts measure electrical pressure, which is a different quantity, so there is no fixed number of volts in a watt.


Can a 34 watt T8 bulb be put in a fixture designed with a 40 watt T12 ballast?

The 34 watt lamp will work in a 40 watt ballast.


Will a 4500 watt element work at all or just trip a 20amp breaker?

A 4500 watt element draws about 18.8 amps at 240 volts and about 20.5 amps at 220 volts (I = W/E), so on a 20 amp breaker it will only hold at the higher voltage. If the current exceeds 20 amps, the breaker will trip.
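
A rough sketch of that check in Python, using the voltages discussed above:

```python
# Sketch: will a 4500 W element trip a 20 A breaker at a given supply voltage?
def trips_breaker(load_watts: float, supply_volts: float, breaker_amps: float) -> bool:
    current = load_watts / supply_volts
    return current > breaker_amps

print(trips_breaker(4500, 240, 20))  # False: ~18.8 A holds on a 20 A breaker
print(trips_breaker(4500, 220, 20))  # True:  ~20.5 A exceeds 20 A
```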


What happens when a 60 watt light bulb receives 123 volts?

It should work fine. Because the filament is essentially a fixed resistance, it will draw slightly more current than it would at 120 volts, run a little brighter, and its life will be slightly shortened.


How many amps does a 250 watt heat lamp draw?

It depends on the supply voltage: 250 W / 120 V is about 2.1 amps, and 250 W / 230 V is about 1.1 amps. They are a brilliant invention and keep chilly guinea pigs warm in winter!


How do you work out watts for solar power?

You need the voltage and the current (amps) the panel can supply, then use the magic triangle formula: Watts = Amps x Volts. For example, a 12 volt panel supplying 400 mA works out to 0.4 x 12 = 4.8 watts.


How many volts are equal to one watt?

The voltage delivering 1 watt depends on how many amps are present. We use watts to measure power (P), and amps (I) times volts (E) equals watts. We sometimes see the equation P = I x E written to express this relationship. Let's look at a couple of instances. If we have 1 amp times 1 volt, we'll get 1 watt. But 1/2 amp times 2 volts also equals 1 watt. Likewise, 10 amps times 0.1 volts equals 1 watt. Or 0.001 amps times 1,000 volts (1 milliamp times 1 kilovolt) equals 1 watt. As you can see, it is a combination of voltage and current that gives us wattage (power), and any voltage you can imagine can be used to get one watt of power when you have the correct current (amperage).
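
As a small illustration of that point, a Python sketch checking a few volt/amp pairs that all come out to 1 watt:

```python
# Sketch: several different amp/volt combinations all deliver 1 watt (P = I * E).
pairs = [(1.0, 1.0), (0.5, 2.0), (10.0, 0.1), (0.001, 1000.0)]  # (amps, volts)
for amps, volts in pairs:
    print(f"{amps} A x {volts} V = {amps * volts} W")
```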


Can you use a 110 volt lamp bought in the US in a European country that uses 220 volts?

No, a lamp must be run on the voltage it was designed for in order to work properly.


Can you use a 70 watt high pressure sodium bulb in a 100 watt ballast?

No. The bulb has to match the ballast wattage exactly, and you can't interchange different lamp types (like metal halide) either. The ballast is specific to that wattage and lamp type. The bulb will either burn out quickly or just not work properly at all.


How do you work out how many amps a forty watt bulb uses?

Divide the wattage by the supply voltage: amps = watts / volts. On a 120 volt supply a 40 watt bulb draws 40 / 120, or about 0.33 amps; on a 230 volt supply it draws about 0.17 amps.


Who created light bulb watts?

Thomas Edison is credited with the first practical vacuum-sealed filament light bulb. The watt is named after James Watt; it measures the rate at which an electrical circuit does work, and it can be calculated as volts multiplied by amps.