Watts = Current × Volts for a resistive load. At a residential voltage of 120 VAC, that gives Amps = 12,000 / 120 = 100 amps. That much current can kill you.
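The arithmetic above can be sketched in a couple of lines of Python (the `amps` helper is just for illustration, not a real library call):

```python
# Illustrative helper: current drawn by a resistive load.
def amps(watts, volts):
    """Current in amps = power in watts / voltage in volts."""
    return watts / volts

# A 12 kW (12,000 W) load on a 120 VAC residential circuit:
print(amps(12_000, 120))  # 100.0
```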
To convert kilowatts to amps, you need to know the voltage of the circuit. The formula is: Amps = (kilowatts × 1,000) / (volts × power factor). For example, at 120 V with a power factor of 1, a 12 kW load draws 100 A.
To calculate the amperage, use the formula: Amps = Watts / Volts. In this case, 50 watts / 12 volts = 4.17 amps. So, a 50 watt 12V light will draw approximately 4.17 amps of current.
You need the volts times the amps to equal 100 watts. At 12 V that is 8.33 amps; at 200 V it is 0.5 amps.
To calculate the amperage, use the power formula: Amperage (A) = Power (W) / Voltage (V). For a 5 watt bulb at 12 volts, the current drawn is about 0.42 amps (5 W / 12 V ≈ 0.42 A).
For a 100 watt 12 volt lamp, you should use a wire gauge size of at least 18 AWG to ensure it can handle the current without overheating. It's always best to refer to the manufacturer's recommendations for the specific lamp you are using.
To find the amperage produced by a 170 watt, 12 volt solar panel, you can use the formula: Amps = Watts / Volts. In this case, 170 watts / 12 volts = 14.17 amps.
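The same Amps = Watts / Volts division applies to panel ratings; a short sketch (the `panel_amps` name is made up for this example):

```python
def panel_amps(watts, volts):
    # Nameplate current of a panel: rated power over rated voltage.
    return watts / volts

# A 170 W panel at 12 V:
print(round(panel_amps(170, 12), 2))  # 14.17
```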
It depends on how many amps (current) flow at that voltage. Watts = Volts × Amps, e.g. 12 volts at 5 amps = 60 watts.
A 60 watt bulb at 12 volts will pull 5 amps of current.
You need the voltage and the amps it can supply, then use the "magic triangle" formula, Watts = Amps × Volts. For example, 400 mA at 12 volts works out to 0.4 × 12 = 4.8 watts.
No, they do not draw the same current. The current drawn by an electrical device is determined by power and voltage using the formula: Current (amps) = Power (watts) / Voltage (volts). So the 12 volt 50 watt bulb draws a higher current (about 4.17 A) than the 230 volt 50 watt bulb (about 0.22 A).
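A quick check of that comparison in Python (the `amps` helper is just for illustration):

```python
def amps(watts, volts):
    # Current = power / voltage.
    return watts / volts

bulb_12v = amps(50, 12)    # about 4.17 A
bulb_230v = amps(50, 230)  # about 0.22 A
print(bulb_12v > bulb_230v)  # True
```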
That is like asking how many apples equal 250 oranges: the units don't convert directly. A watt is a volt times an ampere, so you also need the current.
Watts = Volts × Amps, so Amps = Watts / Volts. On a 240 V mains, a 60 W bulb draws 0.25 amps. On a 12 V system (car/auto), a 60 W bulb draws 5 amps. On a 110 V mains, a 60 W bulb draws about 0.55 amps.
To determine the fuse size for a 200 watt amplifier, divide the power rating by the voltage of the system. On a 12 V system the current would be around 16.67 A, so a 20 A fuse would be appropriate.
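That "round up to the next standard size" step can be sketched as follows; the list of fuse sizes is an assumption for the example, not an official table:

```python
# Assumed list of common fuse sizes, in amps (illustrative only).
STANDARD_FUSES = [5, 10, 15, 20, 25, 30]

def fuse_for(watts, volts):
    """Smallest standard fuse at or above the load current."""
    load = watts / volts
    return next(size for size in STANDARD_FUSES if size >= load)

# 200 W amplifier on a 12 V system (load is about 16.67 A):
print(fuse_for(200, 12))  # 20
```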
1 watt = 1 amp × 1 volt. So:
In a house: 5 amps × 115 volts = 575 watts.
In a car: 5 amps × 12 volts = 60 watts.
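The same 5 amp load at the two voltages, worked in Python (the `watts` helper is just for illustration):

```python
def watts(volts, amps):
    # Power = voltage times current.
    return volts * amps

print(watts(115, 5))  # 575  (house circuit)
print(watts(12, 5))   # 60   (car circuit)
```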