I'm assuming this is standard residential single phase. Simple calculation as noted below:
Watts / Volts = Amps
So: 200 Watts / 120 Volts = 1.666~ Amps
If you needed to calculate for a 220 volt run with the same 200 Watts
200 Watts / 220 Volts = 0.909~ Amps
Remember the 80% load rule per circuit breaker: a 15 amp breaker should only be loaded to 12 amps or less. Watts / Volts = Amps is just a rearrangement of Amps x Volts = Watts.
15 Amps X .8 (80%) = 12 Amps max per circuit (for a 15 amp breaker/fuse)
So 12 Amps x 120 Volts = 1440 Watts max for a 15 amp circuit (typical 14 gauge wire)
***************************************************************
For a 20 amp circuit with 12 gauge wire (a smaller gauge number means a larger diameter wire):
20 Amps X .8 (80%) = 16 Amps max per circuit
So 16 Amps x 120 Volts = 1920 Watts max for a 20 amp circuit (typical 12 gauge wire)
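As a minimal sketch of the arithmetic above (the breaker sizes, voltages, and 200-watt load are just the figures used in this answer, and the 80% factor is the continuous-load rule of thumb mentioned above):

```python
def amps(watts: float, volts: float) -> float:
    """Current drawn by a load: Amps = Watts / Volts."""
    return watts / volts

def max_continuous_watts(breaker_amps: float, volts: float, derate: float = 0.8) -> float:
    """Maximum continuous load on a circuit, applying the 80% rule."""
    return breaker_amps * derate * volts

print(f"200 W at 120 V -> {amps(200, 120):.3f} A")                        # ~1.667 A
print(f"200 W at 220 V -> {amps(200, 220):.3f} A")                        # ~0.909 A
print(f"15 A breaker at 120 V -> {max_continuous_watts(15, 120):.0f} W")  # 1440 W
print(f"20 A breaker at 120 V -> {max_continuous_watts(20, 120):.0f} W")  # 1920 W
```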
Since P = V²/R, you can easily rearrange this equation and find the answer for yourself. Bear in mind that this gives you the 'hot' resistance; the resistance will be very much lower when the lamp is cold.
To find the amperage, use the formula: Amps = Watts / Volts. For a 200-watt light bulb at 120 volts, the amperage would be 1.67 amps.
Power = V² / Resistance
Resistance = V² / Power = (120)² / 100 = 14,400 / 100 = 144 ohms
About 144 ohms. This assumes the bulb is rated at 120 volts. You might find it rated anywhere from 110v to 130v, which in normal usage means nothing. But it may mean a different resistance.
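A short sketch of the same calculation (the 100-watt, 120-volt figures come from the worked example above; as noted, the cold resistance of a real filament measures much lower):

```python
def hot_resistance(volts: float, watts: float) -> float:
    """Operating ('hot') resistance of a bulb: R = V**2 / P."""
    return volts ** 2 / watts

print(f"100 W bulb at 120 V -> {hot_resistance(120, 100):.0f} ohms")  # 144 ohms
```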
A 65-watt light bulb operating at 120 volts draws approximately 0.54 amps of current. You can calculate this by dividing the wattage (65 watts) by the voltage (120 volts) to get the amperage.
It depends on the wattage of the LED light and the voltage applied. For example, a 24-volt LED globe light that pulls 8 watts draws about 0.33 amps. A 120-volt LED bulb that draws 12 watts pulls 0.1 amps, and the same 12-watt bulb at 240 volts would draw 0.05 amps. It really comes down to the watts and the voltage applied; a typical figure is around 0.1 amps.
Each 32-watt bulb in a 48-inch fluorescent light typically draws around 0.27 amps. Therefore, a two-bulb setup would draw approximately 0.54 amps in total.
To calculate the amperage, use the formula: Amps = Watts / Volts. In this case, 50 watts / 12 volts = 4.17 amps. So, a 50 watt 12V light will draw approximately 4.17 amps of current.
A 65 watt light bulb draws approximately 0.54 amps when used on a standard 120-volt circuit (power formula: Amps = Watts / Volts).