The transformer itself does not pull current; whatever you connect to its secondary does. Divide 600 (the VA rating) by the output voltage of the transformer and you get the maximum current it can supply without burning up.
At 24V that's 25 amps.
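If it helps, here is a minimal Python sketch of that rule. The function name is just for illustration, and the 240 V case is an added example; only the 600 VA rating and the 24 V figure come from the answer above.

```python
def max_secondary_current(va_rating, output_voltage):
    """Maximum continuous secondary current: VA rating divided by output voltage."""
    return va_rating / output_voltage

print(max_secondary_current(600, 24))   # 25.0 amps at 24 V
print(max_secondary_current(600, 240))  # 2.5 amps at 240 V (example voltage)
```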
At 120 volts it will pull 4.166 amps. At 240 volts it will pull 2.08 amps.
A 120 volt table lamp with a 75 watt bulb will pull 0.625 amps. With a 100 watt bulb it will pull 0.833 amps. And with a modern 13 watt compact fluorescent bulb it will pull 0.108 amps.
There are zero amps in 6600 watts by itself. Watts are the product of amps and volts: W = A x V. To find amperage, use A = W/V, so a voltage value is needed before an amperage can be calculated.
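As an illustration, here is that A = W/V calculation in a small Python sketch. The 120 V and 240 V figures are assumed example voltages, not part of the original answer.

```python
def amps_from_watts(watts, volts):
    """A = W / V; a voltage must be known before amps can be computed."""
    return watts / volts

print(amps_from_watts(6600, 240))  # 27.5 amps at an assumed 240 V
print(amps_from_watts(6600, 120))  # 55.0 amps at an assumed 120 V
```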
Anything that does not pull over 50 amps.
Table 430.248 of the NEC shows that a 1 hp motor operating at full load on 115 V will draw 16 amps, called the Full-Load Current (FLC). Conductors supplying this motor are required to be rated at 125% of FLC, which is 20 amps. Motor circuits are complicated and do not follow the rules of other circuits: this motor, while drawing a maximum of 16 amps at full load and supplied with #12 AWG copper conductors, can be protected by a breaker as large as 40 amps.
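A rough Python sketch of the arithmetic in that answer follows. It only applies the 125% conductor rule and treats 250% of FLC as the breaker ceiling implied by the 40 amp figure above; the actual limits depend on the motor type and protective device, so treat this as illustration, not a code reference.

```python
def motor_circuit_sizing(flc_amps):
    """Illustrative only: conductor ampacity at 125% of FLC,
    breaker sized up to 250% of FLC."""
    conductor_amps = flc_amps * 1.25
    max_breaker_amps = flc_amps * 2.5
    return conductor_amps, max_breaker_amps

print(motor_circuit_sizing(16))  # (20.0, 40.0) for the 1 hp, 115 V motor above
```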
This typically has to do with how many amps you can safely pull from the secondary of the transformer.
It depends on the voltage, which depends on the country. If you know the voltage, divide the wattage by the voltage; the result is the amperage.
It depends on the voltage source. For a resistive load, watts = voltage x voltage / resistance, and amps = voltage / resistance. Example 1: to produce 600 W from a 120 V source, you need a resistance of 120 V x 120 V / 600 W = 24 ohms, which would pull 120 V / 24 ohms = 5 amps. Example 2: to produce 600 W from a 240 V source, you need a resistance of 240 V x 240 V / 600 W = 96 ohms, which would pull 240 V / 96 ohms = 2.5 amps.
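The same arithmetic, written as a short Python sketch (the function name is my own; the 120 V and 240 V cases mirror the two examples above):

```python
def resistive_load(watts, volts):
    """R = V^2 / W, then I = V / R, for a purely resistive load."""
    resistance = volts ** 2 / watts
    amps = volts / resistance
    return resistance, amps

print(resistive_load(600, 120))  # (24.0, 5.0) -> 24 ohms, 5 amps
print(resistive_load(600, 240))  # (96.0, 2.5) -> 96 ohms, 2.5 amps
```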
It depends on the voltage it runs on. The answer is the wattage, 15,000, divided by the voltage. For example, at 240 volts it would draw 62.5 amps.
Seven amps by itself is zero kilowatts. W = A x V, so you need to state a voltage to multiply the amperage by to get watts; then divide by 1,000 to get kilowatts.
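For example, assuming hypothetical supply voltages of 120 V and 240 V (not stated in the question), the conversion looks like this:

```python
def kilowatts(amps, volts):
    """kW = (A x V) / 1000."""
    return amps * volts / 1000

print(kilowatts(7, 120))  # 0.84 kW at an assumed 120 V
print(kilowatts(7, 240))  # 1.68 kW at an assumed 240 V
```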
Using the equation Volts x Amps = Watts, you can take 3000 watts / Volts to get your answer: 3000 W / 240 V = 12.5 A, or 3000 W / 120 V = 25 A. So at 240 volts you will use 12.5 amps for 3000 watts of power, or at 120 volts you will use 25 amps.
Power (watts) = I (amps) x E (voltage), the PIE rule. So 1000 = I x 240, and I = 1000/240 = 4.167 amps.
Approximately 12 amps.
It depends on the size of the LED light and the voltage applied. For example, a 24 volt LED globe light that pulls 8 watts draws 0.333 amps. A 120 volt LED bulb that draws 12 watts will pull 0.1 amps, and the same 12 watt bulb on 240 volts will draw 0.05 amps. It really depends on the watts and the voltage applied; an average would be about 0.1 amps.
At 230 V it will use 5 to 6 amps.