To answer this question, the system voltage across the bulb must be known.
Appliances at home have a designated voltage, e.g. 220 V or 110 V. When the voltage drops, an appliance tries to run at its designated power as usual. To keep the power the same, the current increases (P = VI). This increase in current can burn out the most delicate part of the appliance if the low voltage persists.

A drop in supply voltage results in a drop in the power of appliances. For fixed-resistance devices, power scales with the square of the voltage (P = V²/R), so a 10% drop in voltage results in approximately a 19% drop in power (0.9² = 0.81).
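A minimal sketch of the two behaviors described above, assuming an idealized constant-power appliance and an idealized fixed-resistance load (the function names and example values are illustrative, not from the original answer):

```python
# Sketch: how a supply-voltage drop affects two idealized load types.
# All names and example values are illustrative assumptions.

def constant_power_current(rated_power_w: float, supply_v: float) -> float:
    """Current drawn by a load that regulates to its rated power (I = P / V)."""
    return rated_power_w / supply_v

def fixed_resistance_power(rated_power_w: float, rated_v: float, supply_v: float) -> float:
    """Power dissipated by a fixed resistance R = V_rated^2 / P_rated at a new voltage."""
    resistance = rated_v ** 2 / rated_power_w
    return supply_v ** 2 / resistance

# A 1000 W constant-power appliance on a sagging 220 V supply:
print(constant_power_current(1000, 220))   # ~4.55 A at nominal voltage
print(constant_power_current(1000, 198))   # ~5.05 A at a 10% sag -> more heat in the wiring

# A fixed-resistance load (e.g. an incandescent bulb) on the same sag:
print(fixed_resistance_power(1000, 220, 198))  # ~810 W, i.e. a ~19% drop in power
```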
10000 watts / 220 volts = 45.4545 amperes
Total supply voltage = 220 V.

Rated power of the first lamp = 100 W, so the current through it is I = P/V = 100/220 ≈ 0.45 A. The resistance of the first lamp is R = V/I = 220/0.45 ≈ 488 ohms (rounding to avoid fractions).

Rated power of the second lamp = 60 W. The voltage is the same, so the current through it is 60/220 ≈ 0.27 A, and its resistance is 220/0.27 ≈ 814 ohms (again rounding).

Power drawn by the first lamp = I₁²R = (0.45)² × 488 ≈ 98.8 W (≈ 100 W). Power drawn by the second lamp = I₂²R = (0.27)² × 814 ≈ 59 W (≈ 60 W).

So the 100-watt bulb will glow brighter, as it draws more power.
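A short sketch that reproduces the arithmetic above, using the values from the answer itself (without the answer's rounding of the currents, the resistances come out near 484 and 807 ohms, and the powers recover the ratings exactly):

```python
# Verify the two-lamp comparison above: current, resistance, and power
# for lamps rated 100 W and 60 W on a 220 V supply at rated voltage.

SUPPLY_V = 220.0

for rated_w in (100.0, 60.0):
    current = rated_w / SUPPLY_V          # I = P / V
    resistance = SUPPLY_V / current       # R = V / I
    power = current ** 2 * resistance     # P = I^2 * R, recovers the rating
    print(f"{rated_w:.0f} W lamp: I = {current:.2f} A, "
          f"R = {resistance:.0f} ohms, P = {power:.0f} W")

# The 100 W lamp draws more power at rated voltage, so it glows brighter.
```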
220 volts 60 Hz
When the peak voltage is 311 V, the RMS voltage is 220 V (V_RMS = V_peak / √2, i.e. 311 / √2 ≈ 220; equivalently, 220 × √2 ≈ 311).
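A one-liner sketch of the conversion, assuming a pure sine wave:

```python
import math

# Peak-to-RMS conversion for a sinusoidal voltage: V_rms = V_peak / sqrt(2).
peak_v = 311.0
rms_v = peak_v / math.sqrt(2)
print(round(rms_v))               # 220
print(round(220 * math.sqrt(2)))  # 311, the round trip back to peak
```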
The appliance will burn out. E.g., if it is a 60 watt light bulb on double its rated voltage, it will burn at about 240 watts (four times its rating, since power scales with the square of the voltage) for as long as the filament can take it, and that won't be long.
The 220 volt bulb, on 220 volt AC current.
The current through a 220 volt, 150 watt bulb is I = W/E = 150/220 ≈ 0.68 amps. The resistance of that bulb is R = E/I = 220/0.68 ≈ 324 ohms. The wattage used by the 220 volt bulb when 110 volts is applied is W = E²/R = 110²/324 ≈ 37 watts. Half the voltage with the same resistance will quarter the wattage output.
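A minimal sketch of that derating calculation, using the 220 V / 150 W values from the answer above:

```python
# Power drawn by a fixed-resistance bulb at a lower-than-rated voltage.

rated_v, rated_w = 220.0, 150.0
applied_v = 110.0

current = rated_w / rated_v            # I = W / E  -> ~0.68 A
resistance = rated_v / current         # R = E / I  -> ~323 ohms
power = applied_v ** 2 / resistance    # W = E^2 / R -> ~37.5 W

print(f"I = {current:.2f} A, R = {resistance:.0f} ohms, P = {power:.1f} W")
# Half the voltage into the same resistance gives one quarter of the power.
```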
A 100 watt 220 volt light bulb (or anything consuming 100 watts on 220 volts) draws 100/220, or .45 Amps. It will also have about 220²/100, or 484 ohms resistance. A 60 watt 220 volt light bulb (or anything consuming 60 watts on 220 volts) draws 60/220, or .27 Amps. It will also have about 220²/60, or 807 ohms resistance.
No, it will burn out.
Yes, a 220 volt light bulb will run on a 120 volt circuit, but at about 1/4 of the wattage that the light bulb is rated at. A 100 watt light bulb on 220 volts would be equal to a 25 watt light bulb on a 120 volt system.
If the voltage supplied to each lamp is its operating voltage, both lamps will have relatively the same output in brightness. If the 60 watt 110 volt lamp is used on a 220 volt supply, it will glow very brightly and then the lamp's filament will burn open. If the 60 watt 220 volt lamp is used on a 110 volt supply, the lamp will glow dimly at about a quarter of its rated power, but it will last for a very long time before the filament burns open.
Not at all: Power = Voltage × Current. Example: 220 V × 5 A = 1100 watts.
Yes. The 125 V is the rating of the bulb, which indicates the voltage it was manufactured to withstand. The voltage it is actually operated at does not have to be precisely 125 V; as a matter of fact, the voltage coming from your wall socket can vary from as low as 100 VAC all the way to above 125 VAC. So screw that bulb in and light up your world!
The circuit voltage or the resistance of the individual bulb is needed to answer this question. Divide the total power (400 W) by the supply voltage to get the current; for example, 400 W / 220 V ≈ 1.8 A.
Actual voltage would be 240 V. 4 AWG copper is capable of carrying 50 A. At 200 ft with a 50 A load, the voltage drop would be about 6 V, which is within the acceptable 3% voltage drop for a branch circuit.
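A quick sketch of that voltage-drop arithmetic, assuming roughly 0.25 ohms per 1000 ft for 4 AWG copper (an assumed table value, not from the original answer); with this figure the drop comes out near 5 to 6 V, in line with the answer above:

```python
# Estimate voltage drop on a single-phase run: V_drop = I * R_roundtrip.
# OHMS_PER_1000FT is an assumed table value for 4 AWG copper (~0.25).

OHMS_PER_1000FT = 0.25   # assumed resistance of 4 AWG copper, ohms per 1000 ft
run_ft = 200.0           # one-way length of the run
load_a = 50.0            # load current
supply_v = 240.0

roundtrip_ohms = 2 * run_ft / 1000.0 * OHMS_PER_1000FT  # out and back
v_drop = load_a * roundtrip_ohms
print(f"Drop: {v_drop:.1f} V ({100 * v_drop / supply_v:.1f}% of {supply_v:.0f} V)")
# ~5 V, i.e. ~2.1%, within a 3% branch-circuit guideline.
```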
If you had a 60 watt incandescent bulb, it would draw about 1/2 amp. That means the resistance of the bulb filament would be about 220 ohms. Now if you applied 12 volts DC across 220 ohms, you would draw about 0.05 amps. This would not be enough to heat the filament and create any useful light. Remember, Ohm's Law says Volts = Amps × Ohms.
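A minimal sketch of that comparison, taking the ~220 ohm hot-filament figure from the answer's own estimate:

```python
# Compare a mains bulb at its rated supply vs. 12 V DC, using Ohm's law (V = I * R).
# FILAMENT_OHMS is the answer's estimate of hot-filament resistance.

FILAMENT_OHMS = 220.0

for volts in (120.0, 12.0):
    amps = volts / FILAMENT_OHMS   # I = V / R
    watts = volts * amps           # P = V * I
    print(f"{volts:>5.0f} V -> {amps:.3f} A, {watts:.1f} W")

# At 12 V the filament dissipates well under a watt -- far too little to glow.
```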