AWG Wire Gage    Ampacity (Amperes)
16               10 A
14               15 A
12               20 A
10               30 A
8                45 A
6                65 A
4                85 A
2                115 A
1                130 A
0                150 A
So if anything, it's probably 12-14 gauge wire.
You're welcome.
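If you want to script that lookup, here is a minimal Python sketch based on the chart above. The dictionary values, the helper name smallest_gauge_for, and the choice to ignore temperature rating, insulation type, bundling, and run length are all simplifying assumptions for illustration, not electrical-code guidance.

```python
# Minimal sketch: pick a wire gauge from the approximate ampacity chart above.
# Ignores temperature rating, insulation type, bundling, and run length.

AMPACITY_BY_GAUGE = {16: 10, 14: 15, 12: 20, 10: 30, 8: 45,
                     6: 65, 4: 85, 2: 115, 1: 130, 0: 150}

def smallest_gauge_for(amps: float) -> int:
    """Return the smallest wire (highest AWG number) rated for at least `amps`."""
    candidates = [gauge for gauge, rating in AMPACITY_BY_GAUGE.items() if rating >= amps]
    if not candidates:
        raise ValueError(f"No gauge in the chart is rated for {amps} A")
    return max(candidates)  # higher AWG number = physically smaller wire

if __name__ == "__main__":
    print(smallest_gauge_for(15))  # -> 14, in line with the 12-14 gauge estimate above
    print(smallest_gauge_for(20))  # -> 12
```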
1.73 × 480 × 22 = 18,268.8, so a three-phase load drawing 22 amps at 480 volts uses roughly 18.3 kW (for three-phase, Watts = 1.73 × Volts × Amps).
Watts = Volts times Amps. Therefore, for a motor rated at 110,000 watts, if the voltage were 220 volts the motor would draw 500 amps, and if the voltage were 4,000 volts it would draw 27.5 amps. The voltages for large, powerful motors tend to be relatively high, typically in the 380-volt to 11,500-volt range.
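As a rough illustration of that arithmetic, here is a small Python sketch that just divides watts by volts. The 110,000 W load and the helper name amps_for_load are assumptions chosen to match the 500 A and 27.5 A figures quoted, not data from the original question.

```python
# Minimal sketch: current drawn by a fixed-wattage load at different supply
# voltages, using the relation Amps = Watts / Volts from the answer above.
# The 110,000 W figure is assumed so the output matches the quoted numbers.

def amps_for_load(watts: float, volts: float) -> float:
    """Return the current in amperes for a load of `watts` at `volts`."""
    return watts / volts

if __name__ == "__main__":
    load_watts = 110_000  # assumed motor rating
    for volts in (220, 4_000):
        print(f"{load_watts} W at {volts} V -> {amps_for_load(load_watts, volts):.1f} A")
```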
To determine the amperage of a 3-phase motor, you would need to know the voltage at which the motor operates. Assuming a standard voltage of 480 volts for industrial applications, a 25 hp 3-phase motor would typically draw around 30-32 amps. This calculation is based on the formula: Amps = (HP x 746) / (Voltage x Efficiency x Power Factor x √3).
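Here is a hedged Python sketch of that formula. The 0.90 efficiency and 0.85 power factor are assumed typical values rather than nameplate data, and the helper name three_phase_amps is made up for this example, so treat the result as an estimate.

```python
# Minimal sketch of the three-phase full-load current formula quoted above:
# Amps = (HP x 746) / (Volts x Efficiency x Power Factor x sqrt(3)).
import math

def three_phase_amps(hp: float, volts: float, efficiency: float = 0.90,
                     power_factor: float = 0.85) -> float:
    """Approximate full-load amps for a three-phase motor (assumed typical eff/PF)."""
    watts = hp * 746  # horsepower converted to watts
    return watts / (volts * efficiency * power_factor * math.sqrt(3))

if __name__ == "__main__":
    print(f"25 hp at 480 V: {three_phase_amps(25, 480):.1f} A")  # roughly 29-30 A
```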
The fuse (or circuit breaker) rating has been exceeded. I'm assuming you do NOT have the AC unit connected to a dedicated circuit. Let's say you have a 15 amp circuit breaker, the AC unit draws 12 amps, and other items on the same circuit draw an additional 5 amps. The total draw of 17 amps exceeds the circuit design capacity, so the breaker trips to prevent an electrical fire. If you have old-style fuses, NEVER EVER replace a fuse with a higher-rated one! I'm sure you'd have problems with your homeowner's insurance paying out when your house burns down. Have an electrician run a new dedicated circuit for the AC unit and you should be good to go.
Amperage, or amps, is a measure of the flow rate of electrical current that is available.
Yes
The total current in the circuit would be 12 amps. When electrical loads are connected in parallel, the currents add up. So if each load draws 6 amps, the total current would be the sum of both loads, which is 6 + 6 = 12 amps.
It limits the current in the circuit to 20 amps. If a load on the circuit draws more than 20 amps, the breaker will trip and interrupt the current to all devices on the circuit.
If you put an 8 amp circuit-breaker in a power circuit that draws more than 8 amps, the circuit-breaker would trip or disconnect the circuit to prevent overheating and potential fire hazards. It is important to always use the correct amperage rating for circuit-breakers to ensure safe operation of electrical circuits.
Full-load amps is the maximum rated current that the motor should draw according to its nameplate rating. Running-load amps is the actual amperage the motor is drawing at the moment the reading is taken. Some motor loads vary depending on whether the load is cyclic; the reading on that type of motor would range anywhere from no-load amps up to full-load amps.
The electrical code book states that a 40 HP, 230-volt, three-phase motor draws 104 amps. For that motor the wire must be rated for 131 amps, with non-time-delay fuses at 300 amps, time-delay fuses at 175 amps, or a 250-amp circuit breaker. When calculating wire sizes and motor protection, the motor's full-load amperage should be taken from the motor's nameplate.
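As a rough illustration, this Python sketch applies the percentage multipliers implied by the answers here (125% for conductor ampacity, 300% for non-time-delay fuses, 175% for time-delay fuses, and 250% for an inverse-time breaker) to a full-load amperage. The helper name motor_protection is made up, and rounding to the next standard device size is ignored, so treat the output as a starting point rather than a code-compliant selection.

```python
# Minimal sketch: scale a motor's full-load amps by the percentages quoted
# in the answers above. Standard device-size rounding is NOT applied here.

def motor_protection(full_load_amps: float) -> dict:
    return {
        "wire_ampacity": full_load_amps * 1.25,        # conductors at 125%
        "non_time_delay_fuse": full_load_amps * 3.00,  # 300%
        "time_delay_fuse": full_load_amps * 1.75,      # 175%
        "circuit_breaker": full_load_amps * 2.50,      # 250%
    }

if __name__ == "__main__":
    # 40 HP, 230 V, three-phase example from the answer above (104 A)
    for name, amps in motor_protection(104).items():
        print(f"{name}: {amps:.0f} A")
```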
Depends on how big the motor is. A stronger motor will draw more amps than a weaker or less efficient motor. For example, a wiper motor draws far less than a starter motor.
It depends on the voltage of the motor, and whether it is single-phase or 3-phase. A 120 VAC 2HP single phase motor draws almost 20 amps, a 240 VAC single-phase 2HP motor draws about 10 amps. A 480 VAC 2HP three-phase motor only draws about 6 amps.
The circuit breaker is sized to the full load amps of the motor times 250%.
I have a single-phase induction motor. It draws 8 amps on start-up and climbs to 14-15 amps when I put a load on it. When I don't have a load it runs at 1 amp and climbs to 2-3 amps. Is it normal operation for this motor to run at the lower number of amps with no load, or is something wrong?
The formula you are looking for is Watts = Amps x Volts.
A 1 horsepower motor typically draws around 10 amps at 120 volts and 5 amps at 240 volts. The actual amperage can vary depending on the efficiency and design of the motor.
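Here is a small Python sketch of that estimate using the single-phase relation Amps ≈ (HP × 746) / (Volts × efficiency × power factor). The 0.75 efficiency and 0.80 power factor are assumed round numbers picked so the result lands near the roughly 10 A and 5 A figures above, and single_phase_amps is a made-up helper name, so the output is only a ballpark.

```python
# Minimal sketch: rough full-load amps for a single-phase motor.
# Efficiency and power factor here are assumptions, not nameplate values.

def single_phase_amps(hp: float, volts: float, efficiency: float = 0.75,
                      power_factor: float = 0.80) -> float:
    return (hp * 746) / (volts * efficiency * power_factor)

if __name__ == "__main__":
    for volts in (120, 240):
        print(f"1 hp at {volts} V: {single_phase_amps(1, volts):.1f} A")
        # roughly 10 A at 120 V and 5 A at 240 V, matching the answer above
```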