I'm assuming this is standard residential single phase. Simple calculation as noted below:
Watts / Volts = Amps
So: 200 Watts / 120 Volts = 1.666~ Amps
If you needed to calculate for a 220 volt run with the same 200 Watts
200 Watts / 220 Volts = 0.909~ Amps
Remember the 80% continuous-load rule per circuit breaker: a 15 amp breaker should only be loaded to 12 amps or less. Using Watts / Volts = Amps is the same relationship as Amps x Volts = Watts.
15 Amps X .8 (80%) = 12 Amps max per circuit (for a 15 amp breaker/fuse)
So 12 Amps x 120 Volts = 1440 Watts max for a 15 amp circuit (typical 14 gauge wire)
***************************************************************
If a 20 amp circuit, use 12 gauge wire (a lower gauge number = larger diameter wire).
20 Amps X .8 (80%) = 16 Amps max per circuit
So 16 Amps x 120 Volts = 1920 Watts max for a 20 amp circuit (typical 12 gauge wire)
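The breaker math above can be sketched in a few lines of Python (the function name is my own, just for illustration):

```python
def max_continuous_watts(breaker_amps, volts=120.0, derate=0.8):
    """Max continuous load in watts: breaker rating x 80% rule x line voltage."""
    return breaker_amps * derate * volts

print(max_continuous_watts(15))  # 15 A breaker -> 1440.0 W
print(max_continuous_watts(20))  # 20 A breaker -> 1920.0 W
```

This reproduces the two answers worked out by hand: 1440 W on a 15 amp circuit and 1920 W on a 20 amp circuit.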
Since P = V²/R, you can easily manipulate this equation and find the answer for yourself. Bear in mind that this will give you the 'hot' resistance; the resistance will be very much lower when the lamp is cold.
Power = V² / Resistance
Resistance = V² / Power = (120)² / 100 = 14,400 / 100 = 144 ohms
P=I*E and I=E/R so P=E^2/R.
So R=(E^2)/P and R=(120v)^2/200w = 72 ohms
About 144 ohms. This assumes the bulb is rated at 120 volts. You might find it rated anywhere from 110v to 130v, which in normal usage means nothing. But it may mean a different resistance.
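The R = V²/P rearrangement used in the answers above is easy to check with a quick sketch (function name is mine):

```python
def hot_resistance(volts, watts):
    """Filament resistance at operating temperature: R = V^2 / P.
    The cold resistance of an incandescent filament is much lower."""
    return volts ** 2 / watts

print(hot_resistance(120, 100))  # 100 W, 120 V bulb -> 144.0 ohms
print(hot_resistance(120, 200))  # 200 W, 120 V bulb -> 72.0 ohms
```

Both figures match the worked answers: 144 ohms for a 100 W bulb and 72 ohms for a 200 W bulb, hot.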
1.666
Depends on the size of the LED light and the voltage applied. For example, an LED 24 volt globe light rated at 8 watts draws 0.333 amps. An LED 120 volt light bulb rated at 12 watts draws 0.1 amps. The same wattage at 240 volts will draw 0.05 amps. It really depends on the watts and the voltage applied. An average would be about 0.1 amps.
The 194 bulb is ~3.8 watts, at 14 volts they draw 0.271 amps.
draw 0.104 amps
A 65 Watt incandescent light bulb should draw 65W/120V = 541.67mA
They will use the same amount of power. A 100 watt bulb will use 100 watts. If a bulb is rated at 100 watts and specified as a 120 volt bulb, then at 120 volts it will draw 0.83 amps. Volts times amps equals watts. If a bulb is rated at 100 watts and specified as 12 volts, then at 12 volts it will draw 8.3 amps.
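The same Watts / Volts = Amps division runs through all of these answers; a tiny sketch (helper name is mine) covers both bulbs in the answer above:

```python
def amps(watts, volts):
    """Current drawn by a resistive load: I = P / V."""
    return watts / volts

print(round(amps(100, 120), 2))  # 100 W at 120 V -> 0.83 A
print(round(amps(100, 12), 1))   # 100 W at 12 V  -> 8.3 A
```

Same power, one tenth the voltage, ten times the current.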
Watts = Volts x Amps x Power Factor. An incandescent light bulb is a resistive load so PF = 1. ANSWER: = 1/2 Amp
About 1.75 amps for a 12 V 21 W bulb (21 W / 12 V = 1.75 A)
A 50 watt bulb designed to run on 12 volts takes 4.17 amps. A 50 watt bulb designed to run on 230 volts takes 0.217 amps.
A 10 watt bulb is defined by the voltage supply and the resulting current. So to make the math simple, suppose you have a 10 watt incandescent bulb designed to work at 20 volts. That means it will draw 1/2 amps. Watts = Volts x Amps. The resistance of the bulb is then Volts / Amps so in this case the resistance of the bulb would be 40 ohms. So our mythical bulb has a resistance of 40 ohms with 20 volts across the bulb in our example. Now if we put two of these bulbs in series with the same 20 volts we now have a total resistance of 80 ohms supplied by 20 volts and the circuit will draw 1/4 amp. This lower current will cause the bulbs to be dimmer.
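The series-bulb reasoning above can be followed step by step in code; the same "mythical" 10 W, 20 V bulb is assumed:

```python
volts = 20.0
bulb_watts = 10.0

# R = V^2 / P for one bulb at its rated voltage
bulb_r = volts ** 2 / bulb_watts        # 40.0 ohms

one_bulb_amps = volts / bulb_r          # 0.5 A through a single bulb
two_in_series = volts / (2 * bulb_r)    # 80 ohms total -> 0.25 A

print(bulb_r, one_bulb_amps, two_in_series)  # 40.0 0.5 0.25
```

Halving the current this way is why the two series bulbs run dimmer (ignoring that a cooler filament also has lower resistance, which complicates real-world numbers).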
0.63 ampere draw @ 7 volts
If the light is operating at 110 volts, and P = I x E, then I = 300/110, or about 2.7 amps.
At what voltage? Until you tell me the voltage I can't give you an answer. To find out Amps you need to divide the Watts by the Volts. At 120V you have 0.4 amps. At 12V you have 4 amps.