12 ga, 20 amp.
14 ga, 15 amp.
16 ga, 10 amp.
Generally 40 amps continuous or 50 amps surge is safe.
The terminal strip's rating is 15 amps at 600 volts. The voltage does not matter, as long as it is 600 volts or below: the maximum current allowed on the strip is 15 amps. That could be 15 amps at 12 volts, 15 amps at 600 volts, or 15 amps at any voltage in between.
5000 watts
Ohm's Law states Volts = Amps x Resistance. You would need to apply 600 volts across a 3-ohm load to have 200 amps flow in the circuit. It is not clear what you are really asking, or why you mentioned 2 gauge.
To answer this you have to know how many volts will be used. If you know the voltage, you can calculate the current by dividing the wattage by the voltage. For example, an electric heater rated at 700 watts, when plugged into a 115 V outlet, will draw 700 / 115 = about 6.09 amps of current.
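That watts-to-amps calculation can be sketched in Python (the function name is my own, for illustration):

```python
def amps_from_watts(watts, volts):
    """Current drawn by a load: I = P / V."""
    return watts / volts

# 700 W heater on a 115 V outlet:
print(round(amps_from_watts(700, 115), 2))  # about 6.09 A
```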
160 amps at 12v.
You have your own answer. It is 1.5 amps.
Ohm's law: Volts = Amps x Ohms, so Amps = Volts / Ohms. 12 volts / 0.5 ohms = 24 amps.
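The same rearrangement of Ohm's law, as a minimal Python sketch (names are illustrative):

```python
def amps_from_ohms_law(volts, ohms):
    """Ohm's law rearranged for current: I = V / R."""
    return volts / ohms

# 12 V source across a 0.5-ohm load:
print(amps_from_ohms_law(12, 0.5))  # 24.0 A
```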
Depends on power factor, but it should be about 8 Amps.
A #3 copper wire with an insulation rating of 90 degrees C has an ampacity of 105 amps. This is the most common, or standard, insulation that most calculations are based on. It is the insulation that governs the voltage rating of the wire: house wiring cables have insulation rated at 300 volts, most other wiring insulation is rated at 600 volts, and special wires have insulation rated at 1000 volts. The higher the insulation's temperature rating, the higher the allowable current through the wire. #3 copper is rated 55 amps at 60 C, 65 amps at 75 C, 105 amps at 90 C, 120 amps at 110 C, 130 amps at 125 C, and 145 amps at 200 C.
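Those ampacity figures can be captured as a simple Python lookup table (the dictionary and function names are my own; the values are the ones quoted above):

```python
# Ampacity of #3 copper wire by insulation temperature rating (deg C -> amps),
# using the figures quoted in the answer above.
AMPACITY_3_AWG_COPPER = {60: 55, 75: 65, 90: 105, 110: 120, 125: 130, 200: 145}

def ampacity(temp_rating_c):
    """Allowable current for #3 copper at a given insulation rating."""
    return AMPACITY_3_AWG_COPPER[temp_rating_c]

print(ampacity(90))  # 105 A, the standard 90 C rating
```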
4 volts and how many amps? It depends on the amount of current (in amps) flowing at 4 volts. The power formula is Watts = Volts x Amps. If you have 2 amps flowing at 4 volts you are dissipating/consuming 8 watts; if you have 10 amps flowing at 4 volts you are dissipating/consuming 40 watts.
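The two worked examples above, as a short Python sketch (function name is my own):

```python
def watts(volts, amps):
    """Power dissipated by a load: P = V * I."""
    return volts * amps

print(watts(4, 2))   # 8 W
print(watts(4, 10))  # 40 W
```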