To calculate the current (in amps), use the formula: Current (Amps) = Power (Watts) / Voltage (Volts). For example, a 65-watt power supply running on 240 volts draws 65 / 240 ≈ 0.27 amps.
To answer this you have to know how many volts will be used. Once you know the voltage, you can calculate the current by dividing the wattage by the voltage. For example, an electric heater rated at 700 watts plugged into a 115-volt outlet will draw 700 / 115 ≈ 6.09 amps of current.
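A minimal sketch of that watts-over-volts calculation (Python is an assumption here; none of these answers include code, and the function name is made up for illustration):

```python
def amps_from_watts(watts: float, volts: float) -> float:
    """Current in amps from power in watts and voltage in volts (I = P / V)."""
    return watts / volts

print(amps_from_watts(65, 240))   # ~0.27 A (the 65 W supply above)
print(amps_from_watts(700, 115))  # ~6.09 A (the 700 W heater above)
```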
An 18,000-watt generator supplying power at 240 volts would deliver 75 amps (18,000 watts ÷ 240 volts = 75 amps).
To calculate the current drawn by a motor running at 3736 watts on 230 volts, use the formula Amps = Watts / Volts: 3736 watts / 230 volts ≈ 16.24 amps. Note that amps are not "consumed per hour"; the motor draws about 16.24 amps continuously, and over one hour it consumes 3736 watt-hours (about 3.7 kWh) of energy.
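A short sketch of that current-versus-energy distinction (again assuming Python, as above):

```python
watts, volts, hours = 3736, 230, 1

current_a = watts / volts   # current drawn: ~16.24 A, constant while running
energy_wh = watts * hours   # energy consumed over one hour: 3736 Wh (~3.7 kWh)

print(f"current: {current_a:.2f} A, energy: {energy_wh} Wh")
```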
To find the amperage, you need to know the voltage of the circuit. The formula is Amps = Watts / Volts. Assuming a standard US voltage of 120 volts, 87 watts at 120 volts is approximately 0.725 amps.
No. Your power supply must be able to supply rated voltage (12 volts) and rated current (3 amps).
No
A 1.5 kVA source of electrical power has the capacity to supply 100 volts at 15 amps, 300 volts at 5 amps, or 1000 volts at 1.5 amps.
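A minimal sketch of why all three pairs fit (Python assumed; any volt/amp pair whose product is 1500 VA is within the capacity):

```python
capacity_va = 1500  # 1.5 kVA

for volts in (100, 300, 1000):
    amps = capacity_va / volts
    print(f"{volts} V at {amps} A = {volts * amps:.0f} VA")
```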
Volts = Amps × Resistance; therefore, Amps = Volts / Resistance.
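A sketch of that Ohm's law rearrangement (Python assumed; the function name and the 60-ohm example are made up for illustration):

```python
def amps_from_ohms_law(volts: float, resistance_ohms: float) -> float:
    """Ohm's law rearranged: I = V / R."""
    return volts / resistance_ohms

print(amps_from_ohms_law(120, 60))  # 2.0 A through a 60-ohm load at 120 V
```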
No, you cannot. The power supply output of 1.2 amps is undersized. You would need a power supply rated at 3 amps or larger.
Amps alone cannot give you a kilowatt without a voltage being applied to the question. Watts = Amps × Volts, so Amps = 1000 / Volts.
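A short sketch showing how the current for one kilowatt depends entirely on the voltage (Python assumed; the three voltages are illustrative examples):

```python
watts = 1000  # one kilowatt

for volts in (120, 240, 480):
    print(f"1 kW at {volts} V draws {watts / volts:.2f} A")
```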
Rephrase your question, as it doesn't make sense as written. If the primary side of the transformer is 480 volts, 3 phase, the transformer can be supplied from a breaker as large as 180 amps. If 480 volts, 3 phase, is your secondary, then you can supply up to 180 amps to your loads.
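For reference, full-load line current for a three-phase transformer is I = kVA × 1000 / (√3 × V). A minimal sketch (Python assumed; the 150 kVA rating below is not stated in the question and is an assumption chosen because it reproduces the 180 A figure above):

```python
import math

def three_phase_full_load_amps(kva: float, volts: float) -> float:
    """Full-load line current for a three-phase transformer: I = kVA * 1000 / (sqrt(3) * V)."""
    return kva * 1000 / (math.sqrt(3) * volts)

# Assumed 150 kVA rating -- not given in the question -- which yields ~180 A at 480 V.
print(three_phase_full_load_amps(150, 480))  # ~180.4 A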
Any value; you must supply the resistance (Amps = Volts / Resistance).