Assuming a resistive load, the continuous current flowing would be 600/220 ≈ 2.7 amps. The resistance of the load is 220/2.7 ≈ 81 ohms.
If you have a 200 ampere-hour battery that only supplies 24 volts, you can't run your 600-watt device that is designed to run at 220 volts.
For the sake of argument, say your load is an incandescent light bulb designed to work at 24 volts. If you attached the battery, it would try to draw 600/24 = 25 amps, and the resistance of the load would be about 1 ohm.
You need to match the voltage source to the load requirements.
CAVEAT - This example assumes that if a 24-volt battery were used, the 600-watt device was made to work at 24 volts. That is not the same load as a 600-watt device designed for 220 volts. The problem is that the hypothetical question asked does not match reality.
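A minimal sketch of the arithmetic above, assuming purely resistive loads; the function name and the two design points (220 V and 24 V) are just for illustration:

```python
def resistive_load(power_w, design_volts):
    """Current and resistance for a purely resistive load at its design voltage."""
    current_a = power_w / design_volts          # I = P / V
    resistance_ohm = design_volts / current_a   # R = V / I (equivalently V^2 / P)
    return current_a, resistance_ohm

# 600 W device designed for 220 V: roughly 2.7 A through about 81 ohms
print(resistive_load(600, 220))

# 600 W device designed for 24 V: 25 A through about 1 ohm
print(resistive_load(600, 24))
```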
Watts are the product of amps times volts. The amperage in a circuit is governed by the resistance of the load. A battery just supplies the potential as voltage; the load determines how much current is drawn out of the battery. Batteries are rated in amp-hours (Ah), which indicates how long a battery can maintain a specific current.
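As a rough sketch of that relationship (real batteries deliver fewer amp-hours at high discharge rates, so treat this as an idealized estimate; the example numbers are just illustrative):

```python
def runtime_hours(capacity_ah, load_watts, battery_volts):
    """Idealized run time: amp-hours divided by the current the load draws."""
    load_amps = load_watts / battery_volts   # I = P / V
    return capacity_ah / load_amps

# e.g. a 200 Ah, 24 V battery feeding a 600 W load: about 8 hours in the ideal case
print(runtime_hours(200, 600, 24))
```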
The power (in watts) can be calculated by multiplying the current (in amps) by the voltage (in volts). In this case, 10 amps at 12 volts would result in 120 watts of power (10A * 12V = 120W).
Electric power = volts x amps, so 7 volts at 1 amp will produce 7 watts, 7 volts at 5 amps will produce 35 watts, 7 volts at 15 amps will produce 105 watts, and so on. Technically, there is not enough information (just volts) to answer your question, but if you know the amps, you can now figure the answer yourself.
It doesn't really work that way. For each battery there's a given maximum current that the battery can deliver, and for each load (the thing powered by the battery) there's a given current that the load will pull. So to get to 1000 watts you need a load that can use 1000 watts and a battery that can deliver that current. You get amps by taking watts / volts, so 1000/12 ≈ 83 amps. A load capable of pulling 83 A and a battery capable of delivering 83 A will basically give you your 1000 W. Now, one basic car battery will be able to deliver 83 A / 1000 W, even more, but only for a short period of time. Not only will it drain the battery rather quickly, but at that power you're also looking at possible overheating and other nasties. Several batteries in parallel would be a better option, but not that easy to set up well.
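A quick sketch of that current calculation, plus an idealized drain estimate; the 50 Ah car-battery capacity is an assumption for illustration, not part of the original answer:

```python
target_watts = 1000
battery_volts = 12

required_amps = target_watts / battery_volts          # about 83 A
print(f"Required current: {required_amps:.0f} A")

# Hypothetical 50 Ah car battery: idealized drain time at that current
assumed_capacity_ah = 50
print(f"Idealized drain time: {assumed_capacity_ah / required_amps:.1f} hours")
```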
Assuming 100% efficiency in the conversion process, the battery stores 7 amp-hours at 12 volts, so the total energy stored is 12 volts * 7 amp-hours = 84 watt-hours. With a load of 20 watts, the battery would last 84 watt-hours / 20 watts = 4.2 hours when converted to 120 volts AC.
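The same calculation as a sketch, with an optional inverter-efficiency factor added as an assumption (real inverters run well below 100% efficiency):

```python
def inverter_runtime_hours(capacity_ah, battery_volts, load_watts, efficiency=1.0):
    """Run time of an AC load fed through an inverter, from stored watt-hours."""
    stored_wh = capacity_ah * battery_volts   # energy in the battery
    usable_wh = stored_wh * efficiency        # efficiency=1.0 matches the 100% assumption above
    return usable_wh / load_watts

# 7 Ah at 12 V feeding a 20 W load: 4.2 hours at 100% efficiency
print(inverter_runtime_hours(7, 12, 20))
```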
An ampere-hour rating is an indication of how long a battery can supply a specific current. It is not possible to determine the run time when you only give watts; watts are volts times amps, and you did not supply the volts.
Depends on the voltage. Volts x Amps = Watts
Watts = Amps * Volts = 20 amps * 100 volts = 2,000 watts, or 2 kW.
If the wattage of a load is known, then the current can be calculated. Watts equals amps times volts, so you would use the following formula: Amps = Watts / Volts.
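A tiny sketch of that rearrangement; the 1500 W / 120 V numbers are placeholders, not from the question:

```python
def amps_from_watts(watts, volts):
    """Rearranged power formula: I = P / V."""
    return watts / volts

# e.g. a 1500 W load on a 120 V supply draws 12.5 A
print(amps_from_watts(1500, 120))
```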
Volts cause current to flow through the load. The current is measured in amps, and the volts multiplied by the amps gives the power in watts.
The amp-hour capacity of a battery remains the same whether it is connected to a 12-volt DC load or a 120-volt AC inverter. So the battery would still have 100 amp-hours regardless of the inverter voltage.
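A short sketch of why the run time works out the same either way: the amp-hour capacity is fixed, and what matters for run time is the stored energy in watt-hours versus the power the load draws. The 60 W load below is a hypothetical example, and inverter losses are ignored:

```python
capacity_ah = 100
battery_volts = 12
load_watts = 60   # hypothetical load; same power whether run on DC or through an inverter

stored_wh = capacity_ah * battery_volts            # 1200 Wh, regardless of inverter voltage
print(f"Stored energy: {stored_wh} Wh")
print(f"Idealized run time: {stored_wh / load_watts} hours")  # 20 hours, ignoring inverter losses
```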