A 100 ohm resistor carrying a current of 0.3 amperes would, by Ohm's law, have a potential difference of 30 volts across it. A current of 0.3 amperes through a potential difference of 30 volts would, by the power law, dissipate 9 watts. You need at least a 10 watt resistor, although it is better to use a 20 watt resistor.
E = IR: 30 = (0.3)(100)
P = IE: 9 = (30)(0.3)
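The arithmetic above can be checked with a short Python sketch, using the resistor value and current from the question:

```python
# Values from the question: a 100 ohm resistor carrying 0.3 A.
R = 100.0   # resistance in ohms
I = 0.3     # current in amperes

E = I * R   # Ohm's law: voltage across the resistor (30 volts)
P = I * E   # power law: power dissipated (9 watts)

print(E, "volts")
print(P, "watts")
```

A 10 watt part is the nearest standard rating above 9 watts, which is why the answer suggests 10 watts as the minimum and 20 watts for headroom.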
It depends on the voltage applied across it. But the maximum continuous current is limited by the power rating of the resistor: it is the power rating divided by the applied voltage, or equivalently the square root of the power rating divided by the resistance.
A resistor doesn't have a power factor. However, if a circuit is purely resistive, the power factor will be one when a voltage is applied and a current flows in the circuit. The power factor is a measure of the relative phase of the current and voltage in a circuit.
A typical resistor will burn out when it dissipates power in excess of double its power dissipation rating for an extended period of time. The power dissipated by a resistor is equal to I²R or E²/R, where:
E = the voltage across the resistor
I = the current through the resistor
R = the resistance of the resistor
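The two equivalent dissipation formulas can be sketched in Python; the 0.1 A and 470 ohm figures are example values, not from the question:

```python
def power_from_current(I, R):
    """Power dissipated by a resistor: P = I^2 * R."""
    return I**2 * R

def power_from_voltage(E, R):
    """Power dissipated by a resistor: P = E^2 / R."""
    return E**2 / R

# Example: 0.1 A through a 470 ohm resistor drops E = IR = 47 V,
# so both formulas should agree at 4.7 W.
print(power_from_current(0.1, 470), "watts")
print(power_from_voltage(47, 470), "watts")
```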
Who can tell? The power rating of a resistor simply tells us the maximum power that resistor is capable of handling; it doesn't tell us anything about the actual power being dissipated at any given current. So, to find the voltage drop across that resistor, you will need to find out its resistance and multiply that value by the current you specify.