The power formula (not Ohm's law, which relates volts, amps, and resistance) tells you watts equals volts times amps: 115 x 5 = 575.
1 watt = 1 amp * 1 volt. So, in a house: 5 amps * 115 volts = 575 watts. In a car: 5 amps * 12 volts = 60 watts.
The formula you are looking for is Watts = Amps x Volts, so Amps = Watts/Volts. This comes to a 4 amp load. The minimum fuse size would be 5 amps.
Watts = Volts x Amps, so Amps = Watts/Volts. On a 240V mains, a 60W bulb draws 0.25 amps. On a 12V system (car/auto), a 60W bulb draws 5 amps. On a 110V mains, a 60W bulb draws about 0.55 amps.
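The bulb calculations above can be sketched in Python (a minimal illustration; the 60 W bulb and the three supply voltages are the ones from the answer):

```python
def amps(watts: float, volts: float) -> float:
    """Current drawn by a load, from the power formula I = P / V."""
    return watts / volts

# A 60 W bulb at various supply voltages:
print(amps(60, 240))  # 0.25 A on 240 V mains
print(amps(60, 12))   # 5.0 A in a 12 V car
print(amps(60, 110))  # ~0.55 A on 110 V mains
```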
5-115 = -110
1 × 115 = 115 and 5 × 23 = 115.
To find the number of amps in a circuit with 115 volts and a power rating in watts, use the formula Amps = Watts / Volts. So, if you have a device that runs at 115 volts and consumes 575 watts, the current is exactly 5 amps (575 watts / 115 volts = 5 amps).
The wattage would be 500 watts. This is calculated by multiplying the amperage (5 amps) by the voltage (100 volts), resulting in 500 watts of power.
A transformer does not use power; it transforms voltage from one value to another. The output amperage is governed by the connected load. If the load wattage is higher than the wattage rating of the transformer, then either the primary or secondary fuse will blow, or the transformer will burn up if the fuses are the wrong size. The maximum primary amperage can be found with the equation Amps = Watts/Volts: A = W/E = 600/120 = 5 amps. The same equation is used for calculating the maximum secondary amperage: A = W/E = 600/12 = 50 amps.
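The transformer limits above can be computed with the same equation; here is a short sketch using the 600 W rating and 120 V/12 V windings from the answer:

```python
def max_winding_current(rating_watts: float, winding_volts: float) -> float:
    """Maximum current a winding can carry: A = W / E."""
    return rating_watts / winding_volts

RATING_W = 600  # transformer rating from the answer

print(max_winding_current(RATING_W, 120))  # primary: 5.0 A
print(max_winding_current(RATING_W, 12))   # secondary: 50.0 A
```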
The switch will use no wattage whatsoever. The load on that switch is what uses power. If you know the load is 5 amps then 5 amps at 120 volts is 600 watts.
You would need to know the wattage of each lamp. Multiply the lamp wattage x 5 = total watts, then divide the total watts by 230 volts (or whatever voltage you will connect to) = amps. Example: 250 watts x 5 = 1250 watts; 1250 watts / 230 volts = 5.43 amps.
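The two-step lamp calculation can be written out as a script (the 250 W lamp wattage and 230 V supply are the example values from the answer):

```python
LAMP_WATTS = 250    # example wattage of one lamp
N_LAMPS = 5
SUPPLY_VOLTS = 230

total_watts = LAMP_WATTS * N_LAMPS      # 1250 W for all five lamps
total_amps = total_watts / SUPPLY_VOLTS  # I = P / V
print(f"{total_watts} W draws {total_amps:.2f} A at {SUPPLY_VOLTS} V")
```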
You can determine the amps drawn by any power-consuming device by dividing the watts by the volts. Example: if the bulb is 60 watts and your supply is 120 volts, then 60/120 = 0.5 amps. Or you could use an amp probe.
To calculate the amperage at 12 volts based on 1.5 amps at 5 volts, assuming the same power is delivered at both voltages, use the formula P = IV (Power = Current x Voltage). First, find the power at 5 volts: P = 1.5A * 5V = 7.5 watts. Then, using P = IV at 12 volts, solve for current: 7.5W = I * 12V, so I = 0.625A. So 1.5 amps at 5 volts translates to 0.625 amps at 12 volts.
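That constant-power conversion can be sketched as a small function (an illustration assuming an ideal, lossless converter, as the answer implicitly does):

```python
def current_at(new_volts: float, amps: float, volts: float) -> float:
    """Current at a new voltage, assuming the same power is delivered.

    P = I * V is held constant: compute the power at the original
    voltage, then divide by the new voltage.
    """
    power = amps * volts      # watts at the original voltage
    return power / new_volts  # amps at the new voltage

print(current_at(12, 1.5, 5))  # 0.625 A at 12 V
```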
The product would be VA (volt-amperes).
A 5 amp fuse can protect many different wattages; it depends on the voltage of the circuit the fuse is protecting. Use the formula Watts = Volts x Amps. For example: 120 volts x 5 amps = 600 watts, 240 volts x 5 amps = 1200 watts, 480 volts x 5 amps = 2400 watts, and 600 volts x 5 amps = 3000 watts.
It depends on how many amps (current) flow at that voltage. Watts = Volts x Amps. e.g. 12 volts @ 5 amps = 60 watts.
A 1.5 kVA source of electrical power has the capacity to supply 100 volts at 15 amps, 300 volts at 5 amps, or 1000 volts at 1.5 amps.
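The point of the answer above is that any volts/amps pair whose product equals the VA rating is within the source's capacity; a quick check in Python (using the 1.5 kVA rating and voltages from the answer):

```python
CAPACITY_VA = 1500  # a 1.5 kVA source

# For each voltage, the maximum current is the rating divided by volts:
for volts in (100, 300, 1000):
    max_amps = CAPACITY_VA / volts
    print(f"{volts} V at up to {max_amps} A")  # 15.0, 5.0, 1.5
```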