Best Answer

I'm assuming this is standard residential single phase. Simple calculation as noted below:

Watts / Volts = Amps

So: 200 Watts / 120 Volts = 1.666~ Amps

If you needed to calculate for a 220 volt run with the same 200 Watts

200 Watts / 220 Volts = 0.909~ Amps

Remember the 80% continuous-load rule per circuit breaker: a 15 amp breaker should only be loaded to 12 amps or less. Watts / Volts = Amps is just a rearrangement of Amps x Volts = Watts.

15 Amps X .8 (80%) = 12 Amps max per circuit (for a 15 amp breaker/fuse)

So 12 Amps x 120 Volts = 1440 Watts max for a 15 amp circuit (typical 14 gauge wire)

***************************************************************

For a 20 amp circuit with 12 gauge wire (a lower gauge number means a larger diameter wire):

20 Amps X .8 (80%) = 16 Amps max per circuit

So 16 Amps x 120 Volts = 1920 Watts max for a 20 amp circuit (typical 12 gauge wire)
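The arithmetic in this answer can be sketched in a few lines of Python; the function names are my own, and the 80% continuous-load factor and 120 V supply are the assumptions stated above:

```python
def amps(watts, volts):
    """Current drawn by a resistive load: Amps = Watts / Volts."""
    return watts / volts

def max_watts(breaker_amps, volts, load_factor=0.8):
    """Usable wattage on a circuit after the 80% continuous-load rule."""
    return breaker_amps * load_factor * volts

print(round(amps(200, 120), 3))   # 200 W bulb at 120 V -> 1.667 A
print(max_watts(15, 120))         # 15 A breaker -> 1440.0 W
print(max_watts(20, 120))         # 20 A breaker -> 1920.0 W
```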

Wiki User, 14y ago
More answers

Wiki User, 12y ago

Since P = V²/R, you can easily rearrange this equation (R = V²/P) and find the answer for yourself. Bear in mind that this will give you the 'hot' resistance; the resistance will be very much lower when the lamp is cold.
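As a quick check of that rearrangement (R = V²/P), here is the hot resistance for the 200 W, 120 V bulb in the question (the function name is mine):

```python
def hot_resistance(volts, watts):
    """Rearranging P = V**2 / R gives R = V**2 / P, the filament's hot resistance."""
    return volts ** 2 / watts

print(hot_resistance(120, 200))  # -> 72.0 ohms
```

The cold resistance of a tungsten filament is considerably lower, so this figure only applies to the bulb at operating temperature.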

AnswerBot, 5mo ago

To find the amperage, use the formula: Amps = Watts / Volts. For a 200-watt light bulb at 120 volts, the amperage would be 1.67 amps.

Wiki User, 14y ago

Power = V² / Resistance

Resistance = V² / Power = (120)² / 200 = 14,400/200 = 72 ohms

Wiki User, 13y ago

P = I × E and I = E/R, so P = E²/R.

So R = E²/P = (120 V)² / 200 W = 72 ohms

Wiki User, 13y ago

About 72 ohms. This assumes the bulb is rated at 120 volts. You might find it rated anywhere from 110 V to 130 V, which in normal usage means nothing, but it may mean a slightly different resistance.

Anonymous, 4y ago

1.666

Q: How many amps does a 200 watt light bulb draw at 120 volts?
Related questions

How many amps does a 65 watt light bulb draw?

A 65-watt light bulb operating at 120 volts draws approximately 0.54 amps of current. You can calculate this by dividing the wattage (65 watts) by the voltage (120 volts) to get the amperage.


How many amps will a 60 watt bulb draw from 120 volts?

Watts = Volts x Amps x Power Factor. An incandescent light bulb is a resistive load, so PF = 1. Amps = 60 W / 120 V = 0.5 A, i.e. half an amp.


How many amps does a led light draw?

It depends on the wattage of the LED light and the voltage applied. For example, a 24 volt LED globe light that pulls 8 watts draws about 0.33 amps, while a 120 volt LED bulb rated at 12 watts draws 0.1 amps; the same 12 watt bulb at 240 volts would draw 0.05 amps. It really comes down to the watts and the voltage applied; a rough average is about 0.1 amps.


What draw in amps is a standard brake light?

About 1.75 amps for a 12 V 21 W bulb (21 W / 12 V = 1.75 A).


How many amps does a two-bulb 32 watt 48-inch fluorescent light draw?

Each 32-watt bulb in a 48-inch fluorescent light typically draws around 0.27 amps. Therefore, a two-bulb setup would draw approximately 0.54 amps in total.


Do a 12 volt 50 watt bulb and a 230 volt 50 watt bulb draw the same current?

No, they do not draw the same current. The current drawn by an electrical device is determined by the power (Watts) and voltage (Volts) using the formula: Current (amps) = Power (Watts) / Voltage (Volts). So, the 12 volt 50 watt bulb will draw higher current compared to the 230 volt 50 watt bulb.
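That comparison can be sketched directly, assuming each lamp runs at its rated voltage:

```python
def current(watts, volts):
    """Amps = Watts / Volts for a lamp at its rated voltage."""
    return watts / volts

low_v = current(50, 12)     # 12 V, 50 W bulb
high_v = current(50, 230)   # 230 V, 50 W bulb

print(round(low_v, 2))   # -> 4.17 A
print(round(high_v, 2))  # -> 0.22 A
```

Same power, but the lower-voltage bulb draws roughly nineteen times the current, which is why low-voltage lighting needs heavier wiring for the same wattage.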


How many amps in a 50 watt 12v light?

To calculate the amperage, use the formula: Amps = Watts / Volts. In this case, 50 watts / 12 volts = 4.17 amps. So, a 50 watt 12V light will draw approximately 4.17 amps of current.


How many amps does a 65 watt light bulb draw?

A 65 watt light bulb draws approximately 0.54 amps on a standard 120-volt circuit (power formula: Amps = Watts / Volts).


How many ampere is the 63 auto light bulb?

A #63 automotive bulb draws about 0.63 amps at 7 volts.


How many amps does a 120v 50w halogen light bulb use?

To find amps, divide the watts by the volts: 50 W / 120 V ≈ 0.42 amps. (The same 50 W halogen bulb rated for 12 V would draw about 4.2 amps.)


How much current flows through a 60 watt bulb connected to 120 volts?

Using the electrical power law, the current (measured in amps) equals the power (measured in watts) divided by the potential difference (measured in volts). So a light bulb designed to use 60 watts of power when supplied with 120 volts must draw 60 W / 120 V, which is a current of 0.5 amps. The same answer can be expressed a few different ways: 500 milliamps, 500 mA, or "half an amp".


How much amperage a 194 series bulb draws?

A 194 series bulb typically draws around 0.25 amps of current. This may vary slightly depending on the specific bulb manufacturer and design.