Watts are a measure of power, and they are the same regardless of voltage, but the current (amps) required to produce that power depends on the voltage.
The basic formula is:
Watts (W) = Volts (V) × Amps (A)
Same wattage, different voltage — very different current.
This matters a lot in things like wiring and battery systems: low voltage systems (like 12V) need much higher current to deliver the same power, which means thicker wires and more heat loss.
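As a quick sanity check on that tradeoff, here is a minimal Python sketch; the 1200 watt load is just an illustrative figure, not from any particular system:

```python
# Minimal sketch: same wattage at different voltages gives very different current.
# The 1200 W load is a made-up example figure, not from any specific system.

def amps_for(watts: float, volts: float) -> float:
    """Current required to deliver `watts` at `volts` (W = V x A)."""
    return watts / volts

load_watts = 1200.0
for volts in (12.0, 120.0):
    print(f"{load_watts:.0f} W at {volts:.0f} V -> {amps_for(load_watts, volts):.1f} A")
# 1200 W at 12 V  -> 100.0 A  (needs much thicker wire)
# 1200 W at 120 V -> 10.0 A
```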
4800
Amps × Volts = Watts, so Amps × 12 = 600, and 600 / 12 = 50 amps. You would want some reserve capacity, so I'd go with something between a 60 and a 100 amp rated transformer. Transformers are rated in volt-amps, which is usually calculated the same way as watts, although the term "watts" technically does not apply to transformers. So you need a 600 volt-amp transformer or, as Redbeard has suggested, an 800 or 1000 volt-amp transformer. That's a lot of current for a 12 volt system, so I recommend you double-check your requirements. You will need #2 gauge wire if your requirements are correct.
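Here is a short sketch of that sizing arithmetic, assuming the 600 watt load from the question and a roughly 1.5x reserve factor (an illustrative choice that lands in the suggested 800 to 1000 VA range):

```python
# Sketch of low-voltage transformer sizing (illustrative numbers only).
load_watts = 600.0   # the load from the question
volts = 12.0
headroom = 1.5       # reserve capacity factor; an assumption, not a code requirement

amps = load_watts / volts                 # 600 / 12 = 50 A
min_va = load_watts                       # transformers are rated in volt-amps
recommended_va = load_watts * headroom    # 900 VA, within the 800-1000 VA advice

print(f"Load current: {amps:.0f} A")
print(f"Minimum rating: {min_va:.0f} VA, recommended: {recommended_va:.0f} VA")
```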
If we assume you are using a common 15 amp lighting circuit and switch, with 120 volts powering the bulbs, then you need to keep the load at 80% of the 15 amp worst case, or 12 amps. Watts = amps × volts for standard incandescent bulbs, so 12 × 120 = 1440 watts.
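The 80% continuous-load rule is easy to script; this sketch just re-derives the 1440 watt figure above:

```python
# 80% rule for a continuous load on a breaker (re-deriving the numbers above).
breaker_amps = 15.0
volts = 120.0

max_continuous_amps = breaker_amps * 0.8            # 12 A
max_continuous_watts = max_continuous_amps * volts  # 1440 W

print(f"{max_continuous_amps:.0f} A -> {max_continuous_watts:.0f} W of bulbs, max")
```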
About 180 watts assuming a 90% conversion efficiency.
The American Wire Gauge table shows 8 gauge as safe for 24 amps and 10 gauge for 15 amps. If the circuit is going to be used at capacity (2400 watts in this case), 8 or 10 gauge is the minimum; if the load is constant, use 8 gauge. The voltage rating of a wire depends on the thickness and material of its insulation. So 20 amps at 120 volts is 2400 watts of power, and 20 amps at 12 volts is 240 watts of power; both would require the same gauge of wire, but the higher voltage would need better insulation.

This is a voltage drop question. A #1 copper conductor will limit the voltage drop to 3% or less when supplying 20 amps over 500 feet on a 120 volt system.
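For the voltage drop part, here is a rough sketch; the resistance figure of about 0.124 ohms per 1000 feet for #1 copper is a commonly published value, but treat it as an assumption and check your own wire tables:

```python
# Rough voltage drop check for #1 copper at 20 A over 500 ft on a 120 V system.
ohms_per_1000ft = 0.124   # assumed value for #1 AWG copper; verify against a table
one_way_feet = 500.0
amps = 20.0
volts = 120.0

resistance = ohms_per_1000ft * (2 * one_way_feet) / 1000.0  # round-trip length
drop_volts = amps * resistance
drop_pct = 100.0 * drop_volts / volts

print(f"Drop: {drop_volts:.2f} V ({drop_pct:.1f}%)")  # about 2.5 V, ~2%, under 3%
```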
To convert amps to watts in a 12-volt application, you can use the formula: Watts = Volts x Amps. Therefore, in a 12-volt circuit, if you have 1 amp of current, the power consumption would be 12 watts (12V x 1A).
The amp-hour capacity of a battery remains the same whether it is connected to a 12-volt DC load or a 120-volt AC inverter, so the battery would still have 100 amp-hours regardless of the inverter voltage. What changes is the current drawn from the battery: a load on the 120-volt side pulls roughly ten times as many amps from the 12-volt side, plus inverter losses, so the battery's roughly 1200 watt-hours of stored energy is the more useful figure for estimating runtime.
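A sketch of that runtime arithmetic, assuming a hypothetical 120 watt AC load and 90% inverter efficiency:

```python
# Runtime of a 12 V battery feeding an AC load through an inverter.
# The 120 W load and 90% efficiency are illustrative assumptions.

battery_ah = 100.0
battery_volts = 12.0
load_watts = 120.0
inverter_efficiency = 0.90

energy_wh = battery_ah * battery_volts                 # 1200 Wh stored
battery_draw_watts = load_watts / inverter_efficiency  # ~133 W from the battery
battery_draw_amps = battery_draw_watts / battery_volts # ~11.1 A on the 12 V side
runtime_hours = energy_wh / battery_draw_watts         # ~9 h, ignoring Peukert losses

print(f"Battery side: {battery_draw_amps:.1f} A, runtime about {runtime_hours:.1f} h")
```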
On a 12 volt system, the 80 watt load draws about 6.7 amps and the 120 watt load draws 10 amps.
Not necessarily. A 240 volt window air conditioner draws about half the current of a 120 volt unit with the same BTU rating, which slightly reduces resistive losses in the supply wiring, but the energy the unit consumes is set by its efficiency, not by its supply voltage. Compare EER ratings rather than assuming 240 volts saves energy.
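This sketch shows why the current difference mostly matters for the wiring rather than the energy bill; the 1800 watt draw and 0.1 ohm of round-trip wiring resistance are illustrative assumptions, not real unit specs:

```python
# I^2 * R wiring loss for the same power drawn at 120 V vs 240 V.
power_watts = 1800.0  # assumed air conditioner draw
wiring_ohms = 0.1     # assumed round-trip supply wiring resistance

for volts in (120.0, 240.0):
    amps = power_watts / volts
    loss = amps**2 * wiring_ohms
    print(f"{volts:.0f} V: {amps:.1f} A, {loss:.1f} W lost in the wiring")
# 120 V: 15.0 A, 22.5 W lost
# 240 V: 7.5 A, 5.6 W lost  -> small saving; the unit's EER dominates
```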
Your 12 volt, 2 amp battery charger delivers about 24 watts to the battery (12 volts × 2 amps = 24 watts); it draws somewhat more than that from the wall because of conversion losses.
It's sometimes possible to use the secondary as the primary in a transformer, as long as you keep the overall power (volt-amps) the same or lower and the AC frequency the same. For example, a transformer rated 12 volts AC at 10 amps (120 VA), run in reverse, will only yield 120 volts AC at 1 amp (120 VA).
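A tiny sketch of that volt-amp bookkeeping, using the 120 VA example above:

```python
# Volt-amps are (ideally) conserved across a transformer, so stepping the
# voltage up by 10x cuts the available current by 10x.

va_rating = 120.0  # from the 12 V @ 10 A example

for out_volts in (12.0, 120.0):
    max_amps = va_rating / out_volts
    print(f"{out_volts:.0f} V out -> up to {max_amps:.0f} A")
# 12 V out  -> up to 10 A
# 120 V out -> up to 1 A
```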
12 Volts DC
That's like asking how many apples make 250 oranges: volts and watts measure different things. A watt is a volt times an ampere, so without knowing the current there is no answer.
Volts and watts measure different things: 12 volts is the system voltage, while watts measure the power your loads draw (watts = volts × amps).
No, the battery is DC, not AC.
To power ten 12 volt, 10 watt lights wired in parallel, you would need a 12 volt transformer rated for at least their combined 100 watts (about 8.3 amps), plus some reserve capacity. You would only need 120 volts if the lamps were wired in series, which is unusual.
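A sketch of the two wiring options for those ten lamps (the series case assumes identical lamps sharing the voltage equally):

```python
# Ten 12 V, 10 W lamps: parallel vs series supply requirements.
n_lamps = 10
lamp_volts = 12.0
lamp_watts = 10.0

total_watts = n_lamps * lamp_watts        # 100 W either way

# Parallel: transformer matches lamp voltage, currents add up.
parallel_amps = total_watts / lamp_volts  # ~8.3 A at 12 V

# Series: voltages add up, current is one lamp's worth.
series_volts = n_lamps * lamp_volts       # 120 V
series_amps = lamp_watts / lamp_volts     # ~0.83 A

print(f"Parallel: 12 V, {parallel_amps:.1f} A; Series: {series_volts:.0f} V, {series_amps:.2f} A")
```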
To determine the watts needed to run a 12-volt drill charger, you can use the formula: Watts = Volts x Amps. If you know the amperage of the charger, you can multiply it by 12 volts to find the wattage required.