A resistive load just describes something like a toaster, electric water heater, or space heater, where the load is relatively constant. The term is used to distinguish it from something like an electric motor, which draws much more current at startup and then drops off significantly once it is running.
In a.c. circuits, a resistive load describes a load whose load current is in phase with its supply voltage. Expressed another way, it is a load having unity power factor. Resistive loads are not necessarily constant - for example, a tungsten-filament lamp has a low resistance when cold and a high resistance at its operating temperature.
The power (in watts) can be calculated by multiplying the current (in amps) by the voltage (in volts). In this case, 10 amps at 12 volts would result in 120 watts of power (10A * 12V = 120W).
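For anyone who wants to verify this, here's a minimal Python sketch of the power calculation, using the values from the example above:

```python
def power_watts(current_amps: float, voltage_volts: float) -> float:
    """Power (W) = current (A) x voltage (V), valid for a resistive (unity power factor) load."""
    return current_amps * voltage_volts

print(power_watts(10, 12))  # 120
```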
It is not recommended to have a resistive load of 12.5 amp on a 15 amp breaker. The general rule is to not load a circuit to more than 80% of its capacity, which in this case would be 12 amps for a 15 amp circuit. Overloading a circuit can lead to overheating and potentially cause a fire hazard.
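The 80% rule described above is easy to check numerically. A small sketch (the 0.80 derating factor comes straight from the answer; breaker sizes are just example inputs):

```python
def continuous_load_limit(breaker_amps: float, derate: float = 0.80) -> float:
    """Maximum recommended continuous load on a breaker, per the common 80% rule."""
    return breaker_amps * derate

limit = continuous_load_limit(15)  # 12.0 A for a 15 A breaker
print(12.5 <= limit)               # False: a 12.5 A load exceeds the recommended limit
```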
To convert kilowatts to amps, you need to know the voltage of the circuit. The formula to calculate amps is: Amps = (kilowatts * 1000) / (volts * power factor). For example, if the voltage is 120V and the power factor is 1, then a 12kW load draws 100A.
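That conversion as a quick Python check (the factor of 1000 turns kilowatts into watts):

```python
def kilowatts_to_amps(kilowatts: float, volts: float, power_factor: float = 1.0) -> float:
    """Amps = (kW * 1000) / (volts * power factor)."""
    return kilowatts * 1000 / (volts * power_factor)

print(kilowatts_to_amps(12, 120))  # 100.0
```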
To calculate the current, use the formula: current (in amps) = power (in watts) / voltage (in volts). In this case, 1.5 watts / 12 volts = 0.125 amps. To convert this to milliamps, multiply by 1000: 0.125 A * 1000 = 125 mA. Therefore, 1.5 watts at 12 volts is equivalent to 125 milliamps.
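The same watts-to-milliamps calculation, sketched in Python:

```python
def watts_to_milliamps(watts: float, volts: float) -> float:
    """I (mA) = P (W) / V (V), scaled by 1000 to convert amps to milliamps."""
    return watts / volts * 1000

print(watts_to_milliamps(1.5, 12))  # 125.0
```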
To calculate watts, multiply the amperage by the voltage. In this case, 12 amps multiplied by 220 volts equals 2640 watts.
It all depends on the load. The formula relating amps, volts, and ohms (resistance of the load) is Ohm's law, E = IR, where E is the voltage, I is the current, and R is the load or circuit resistance. So, if you know the resistance in ohms and the current in amps, you multiply them together to get the voltage of the circuit. Again, it depends on the load: a 12 volt car battery delivers 1.5 amps into an 8 ohm load, whereas a 120 volt circuit delivers 1.5 amps into an 80 ohm load. This is all simplified and assumes a resistive load. If the load is capacitive or inductive, then phase angles come into play and the math is more complicated, using complex numbers and the j-operator.
The amps it draws depends on how big it is - typically 2-12 amps. Check for a manufacturer's plate that shows the wattage. Most of the load in a rice cooker is a resistive heating element, so the amperage will be quite close to the wattage divided by the voltage (220 here).
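As a rough sketch of that estimate - the 700 W nameplate rating below is purely hypothetical; use the wattage from your cooker's actual plate:

```python
def estimated_amps(watts: float, volts: float = 220) -> float:
    """For a mostly resistive load, current is approximately P / V."""
    return watts / volts

# Hypothetical 700 W nameplate rating; check the plate on the actual appliance.
print(round(estimated_amps(700), 2))  # about 3.18 A
```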
Assuming DC and resistive loads, resistance equals voltage across the load, divided by the current through it. In this case 120/10 or 12 ohms.
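That resistance calculation as a one-function Python sketch:

```python
def resistance_ohms(volts: float, amps: float) -> float:
    """R = V / I, assuming DC and a resistive load."""
    return volts / amps

print(resistance_ohms(120, 10))  # 12.0
```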
The total current in the circuit would be 12 amps. When electrical loads are connected in parallel, the currents add up. So if each load draws 6 amps, the total current would be the sum of both loads, which is 6 + 6 = 12 amps.
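The parallel-branch rule above is just addition, but a sketch makes it explicit:

```python
def total_parallel_current(*branch_amps: float) -> float:
    """Currents in parallel branches simply add to give the total circuit current."""
    return sum(branch_amps)

print(total_parallel_current(6, 6))  # 12
```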
The total load in watts would be W = A x V: 20 x 120 = 2400 watts. Any wattage higher than this will trip the breaker and shut the circuit off. The theoretical minimum resistive load is V/I = 120/20 = 6 ohms; the lower the resistance, the higher the current. Usually you don't want to operate above the 80% point, so the practical figure would be 120/16 = 7.5 ohms.
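All three figures from that answer can be reproduced in a few lines:

```python
volts, breaker_amps = 120, 20

max_watts = volts * breaker_amps                 # 2400 W: above this, the breaker trips
min_resistance = volts / breaker_amps            # 6.0 ohms at the full 20 A
safe_resistance = volts / (0.80 * breaker_amps)  # 7.5 ohms at the 80% operating point

print(max_watts, min_resistance, safe_resistance)  # 2400 6.0 7.5
```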
In a 12 VDC circuit with a 1 kΩ load, there will be 12 mA of current. (Ohm's law: Volts = Amps * Ohms, so Amps = Volts / Ohms.)
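Checking that one-liner in Python (scaling by 1000 converts amps to milliamps):

```python
volts, ohms = 12, 1_000
milliamps = volts * 1000 / ohms  # I = E / R, expressed in mA
print(milliamps)  # 12.0
```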
12-2 (#12-2 conductor) wire doesn't "pull" 20 amps; rather, its ampacity rating is 20 amps. #12 copper wire is rated for a total load of 20 amps, so protect it with a breaker rated no higher than 20 amps.