Without knowing the voltage, I cannot tell you how many amps a 60 watt light uses.
If you have a 12 volt system in a car, then a 60 watt light will pull 5 amps.
If you have a 120 volt system in a house, then a 60 watt light will pull 1/2 amp.
If you have a 240 volt system in an industrial building, then a 60 watt light will pull 1/4 amp.
I = W/E: divide watts by volts. Since household wiring is basically 110 to 120 volts, a 60 watt bulb works out to roughly half an amp.
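If you'd rather script the same I = W/E arithmetic, here is a minimal Python sketch (the 60 watt figure and the voltages are just the examples from the answers above):

```python
# amps = watts / volts for a simple load; numbers match the examples above
def amps(watts: float, volts: float) -> float:
    return watts / volts

for volts in (12, 120, 240):
    print(f"60 W at {volts} V -> {amps(60, volts):.2f} A")
# 60 W at 12 V -> 5.00 A
# 60 W at 120 V -> 0.50 A
# 60 W at 240 V -> 0.25 A
```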
About 1/2 amp.
The simple way to do this is to divide the RATED wattage by the RATED voltage (120 V in most cases). Then if your actual voltage is lower than 120 V, your amps will be slightly below 1/2 amp. If your actual voltage is higher than 120 V, your amps will be slightly higher than 1/2 amp. At 110 V you get about 0.46 amps.
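Here is a short Python sketch of that rated-versus-actual point, treating the filament as a fixed resistance sized from its rating (a simplification; a real filament's resistance changes with temperature, so this only approximates the figures above):

```python
# Size the resistance from the rating (R = V_rated^2 / W_rated), then
# compute the current at the actual supply voltage (I = V_actual / R).
def actual_amps(rated_watts: float, rated_volts: float, actual_volts: float) -> float:
    resistance = rated_volts ** 2 / rated_watts
    return actual_volts / resistance

print(round(actual_amps(60, 120, 120), 2))  # 0.5 amp at the rated 120 V
print(round(actual_amps(60, 120, 110), 2))  # about 0.46 amp at 110 V
```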
UK answer: the current on a 240 volt supply is ¼ amp for a 60 watt bulb.
Each 32-watt bulb in a 48-inch fluorescent light typically draws around 0.27 amps. Therefore, a two-bulb setup would draw approximately 0.54 amps in total.
1 amp
A 65-watt light bulb operating at 120 volts draws approximately 0.54 amps of current. You can calculate this by dividing the wattage (65 watts) by the voltage (120 volts) to get the amperage.
To calculate the amperage of a 40-watt bulb, you need to use the formula: Amps = Watts / Volts. If the bulb operates at 120 volts (standard for US households), the amperage will be 0.33 amps (40 watts / 120 volts).
A 15 amp, 120 volt circuit can in principle handle about 30 60-watt bulbs: each 60 watt bulb draws 0.5 amps, so you divide the circuit's amp rating (15 amps) by the current draw per bulb (0.5 amps). In practice, circuits are usually loaded to only 80 percent of their rating (12 amps) for continuous loads, which works out to about 24 bulbs.
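As a sketch of that capacity arithmetic (the 80 percent derating for continuous loads is common code practice; check your local requirements):

```python
import math

# Divide the (derated) circuit amperage by the per-bulb current draw
# at the supply voltage to estimate how many bulbs fit on the circuit.
def max_bulbs(circuit_amps: float, bulb_watts: float, volts: float = 120.0,
              derate: float = 0.8) -> int:
    amps_per_bulb = bulb_watts / volts               # 0.5 A for a 60 W bulb at 120 V
    return math.floor(circuit_amps * derate / amps_per_bulb)

print(max_bulbs(15, 60))              # 24 bulbs with the circuit derated to 12 A
print(max_bulbs(15, 60, derate=1.0))  # 30 bulbs if you loaded the full 15 A
```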
It is drawing 0.06 amps.
A 120 volt table lamp with a 75 watt bulb will pull 0.625 amps. With a 100 watt bulb it will pull 0.833 amps. And with a modern fluorescent 13 watt bulb it will pull 0.108 amps.
No, they do not draw the same current. The current drawn by an electrical device is determined by the power (watts) and voltage (volts) using the formula: current (amps) = power (watts) / voltage (volts). So the 12 volt 50 watt bulb draws a higher current than the 230 volt 50 watt bulb: about 4.2 amps versus roughly 0.22 amps.
Watts = Volts x Amps x Power Factor. An incandescent light bulb is a resistive load, so PF = 1, which gives 60 W / 120 V = 1/2 amp.
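For loads that are not purely resistive, the same relation can be rearranged to solve for amps; here is a small Python sketch (the 0.9 power factor in the second example is purely illustrative):

```python
# amps = watts / (volts * power_factor); PF = 1 for a plain incandescent bulb
def amps_from_watts(watts: float, volts: float, power_factor: float = 1.0) -> float:
    return watts / (volts * power_factor)

print(round(amps_from_watts(60, 120), 2))       # 0.5 A for a 60 W incandescent
print(round(amps_from_watts(32, 120, 0.9), 2))  # 0.3 A for a 32 W load at an assumed PF of 0.9
```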
It's 75/120, and the answer is in amps: about 0.625 amps.
To calculate the amperage, you can use the formula: Amps = Watts/Volts. For a 65-watt light bulb at 120 volts, the amperage would be 0.54 amps.
Not enough to worry about. That's like asking how many amps the memory presets on your radio draw. It is in the 0.001 to 0.01 amp range.