6 AWG will handle 50 amps with a voltage drop of about 4 volts over that run. If you go to 4 AWG and still limit the load to 50 amps, the voltage drop falls to about 2.5 volts.
12 AWG
The American Wire Gauge table shows 8 gauge as safe for 24 amps and 10 gauge for 15 amps. If the circuit is going to be used at capacity (2400 watts in this case), 8 or 10 gauge is the minimum; if the load is constant, use 8 gauge. The voltage rating of a wire depends on the insulation thickness and material, not the conductor size. So 20 amps at 120 volts is 2400 watts of power, and 20 amps at 12 volts is 240 watts; both would require the same gauge of wire, but the higher voltage would need better insulation.

This is a voltage drop question. A #1 copper conductor will limit the voltage drop to 3% or less when supplying 20 amps for 500 feet on a 120 volt system.
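As a quick sketch of the arithmetic above (the function name is illustrative, not from any standard library):

```python
# P = V * I: the same 20 A current gives very different power at different voltages,
# which is why wire gauge follows current while insulation follows voltage.
def power_watts(volts, amps):
    return volts * amps

print(power_watts(120, 20))  # 2400 W
print(power_watts(12, 20))   # 240 W
```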
A #6 aluminum conductor will limit the voltage drop to 3% or less when supplying 20 amps for 200 feet on a 240 volt system.
For single phase 30 amps at 120 volts you would need a #8 copper wire with an insulation rating of 90 degrees C.
The minimum allowable wire size would be 10 gauge. For a 75-foot run, however, the voltage drop would be about 5.31 volts. This exceeds the NEC recommendation of no more than 3% (3.6 V on a 120 V circuit), so while heavier wire is not legally required, for best efficiency you should run 8 gauge.
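A minimal sketch of that 3% check, using the common two-wire formula Vd = 2·K·I·L / cmil. The K value and circular-mil areas are standard approximations (with K = 12.9 the #10 drop comes out near 5.6 V rather than the 5.31 V quoted above, which presumably used a slightly different resistivity figure, but the conclusion is the same):

```python
K_COPPER = 12.9                   # ohm-circular-mil per foot, approx. for copper
CMIL = {10: 10_380, 8: 16_510}    # circular-mil area by AWG size

def voltage_drop(awg, amps, one_way_feet, k=K_COPPER):
    """Round-trip (two-wire) voltage drop: Vd = 2 * K * I * L / cmil."""
    return 2 * k * amps * one_way_feet / CMIL[awg]

limit = 0.03 * 120                # NEC-recommended 3% of 120 V = 3.6 V
for awg in (10, 8):
    drop = voltage_drop(awg, 30, 75)
    print(f"#{awg}: {drop:.2f} V, within 3% limit: {drop <= limit}")
```

Running it shows #10 well over the 3.6 V limit and #8 just under it, matching the recommendation to step up to 8 gauge.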
Heavier wires do not necessarily mean better sound. Typically, 18 or 16 gauge wire is sufficient for home audio systems. If you are going to be running extremely long distances, say 100 feet or more, then increase the wire size to 14 or 12 gauge to keep from losing too much audio signal. In a car, 18 gauge wire should be quite sufficient.
A #14 wire will do the job.
10 AWG in copper.
You have to know the maximum current the device or devices you wish to power will draw. There are charts on the internet for wire gauges versus distance that you can follow. For example, a 12 gauge wire will handle 20 amps safely for runs up to 100 feet; go down one gauge (to a heavier wire) for runs over 100 feet. So if you are running 175 feet at 20 amps, you should use 10 gauge to keep the voltage drop acceptable. Voltage has no effect on the rated current, other than requiring a wire or cable rated for the working voltage. 20 amps at 240 volts is the same current as 20 amps at 120 volts; as long as the wire is rated at 240 volts, it will carry 20 amps just as well at any voltage up to 240.
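The rule of thumb above can be sketched as a small lookup. The amps-to-gauge pairings and the "one size heavier per extra 100 feet" step are assumptions taken from common charts, not an NEC calculation:

```python
BASE_GAUGE = {15: 14, 20: 12, 30: 10}  # breaker amps -> typical copper AWG

def pick_gauge(amps, run_feet):
    """Start from the chart gauge, step one size heavier per full 100 ft beyond the first."""
    steps = max(0, (run_feet - 1) // 100)   # e.g. 175 ft -> 1 extra step
    return BASE_GAUGE[amps] - 2 * steps     # common sizes step by 2: 14, 12, 10, 8...

print(pick_gauge(20, 75))    # 12 AWG for a short 20 A run
print(pick_gauge(20, 175))   # 10 AWG, as in the 175 ft example above
```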
18 amps.
12
8 gauge will be sufficient with less than a half volt drop
This is a voltage drop question. To answer this question an amperage is needed.
To answer this question a voltage needs to be given.
You would need at least 3 AWG at 120 volts, which gives about a 4.8 percent voltage drop at the full circuit rating of 37.5 amps (a 30 amp continuous load at the usual 80 percent of rated capacity). For 240 volts you would only need 6 AWG.
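The 80 percent continuous-load arithmetic mentioned above works out as follows (a minimal sketch, variable names illustrative):

```python
# A continuous load may only occupy 80% of a circuit's rating,
# so the full rating is the load divided by 0.8.
continuous_load_amps = 30
circuit_rating_amps = continuous_load_amps / 0.8

print(circuit_rating_amps)  # 37.5
```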