Q: Why are some appliances rated in watts and others amps?

Best Answer
When they give you WATTS they also give you VOLTS. Using some very simple math you can then figure out AMPS.

WATTS = Amps x Volts

Wiki User (13y ago)

More answers

AnswerBot (5mo ago)

Appliances are rated in watts to indicate the total power they consume, which is the product of the voltage and current they draw (P = V x I). Some appliances may also be rated in amps, which measures the current they require from the power source. Both ratings are important for determining electrical loads and ensuring safety.
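
If it helps, here is a minimal Python sketch of that relationship; the function names and the example figures are just for illustration, not from the answers above:

```python
# Minimal sketch of the relationship discussed above: power (watts) = volts x amps.
def power_watts(volts: float, amps: float) -> float:
    """Power in watts drawn at the given voltage and current."""
    return volts * amps

def current_amps(watts: float, volts: float) -> float:
    """Current in amps drawn by a load of the given wattage."""
    return watts / volts

# Example: an appliance rated 1200 W on a 120 V supply draws 10 A.
print(current_amps(1200, 120))  # 10.0
print(power_watts(120, 10))     # 1200
```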

Related questions

What type of power is used by appliances in the home?

Home appliances are rated in watts or amps.


What is the VA rating for common appliances?

For all intents and purposes, the VA rating is the same as the wattage rating of an appliance. VA is an electrical classification meaning volt-amps. The formula for watts is: Watts = Amps x Volts.


How many amps does a griddle use?

I assume you are asking about an average household kitchen griddle. Most counter-top kitchen appliances are rated in the 1,000 to 1,250 watt range. Since Amps = Watts divided by Volts, a griddle in that range draws roughly 8.3 to 10.4 amps on a 120 volt supply.


Can you use 20 watts or 30 watts in a fuse?

No, watts are a measure of power while fuses are rated in amperes (amps). To determine the fuse rating, you need to calculate the current in amps by dividing the power in watts by the voltage. Then, choose a fuse that is rated equal to or slightly higher than the calculated current in amps.
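
As a rough illustration, here is a small Python sketch of that calculation; the list of fuse sizes and the example appliance are assumptions, not values from the answer above:

```python
# Rough sketch of choosing a fuse from the calculated current draw.
# The list of standard fuse sizes below is an assumption for illustration.
STANDARD_FUSE_AMPS = [3, 5, 13]

def pick_fuse(watts: float, volts: float) -> int:
    """Return the smallest listed fuse rating at or above the load current."""
    load_amps = watts / volts
    for rating in STANDARD_FUSE_AMPS:
        if rating >= load_amps:
            return rating
    raise ValueError("load exceeds the largest fuse size in the list")

# Example: a 700 W appliance on a 230 V supply draws about 3 A, so a 5 A fuse is chosen.
print(pick_fuse(700, 230))  # 5
```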


How many home appliances can be run using a 2.5 KVa generator?

It depends on the amperages of the appliances. You should be able to draw Amps = Watts / Volts = 2500 / 120 = 20.8 amps at 120 volts.
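
A tiny Python sketch of that arithmetic, assuming the 2.5 kVA rating can be treated like watts (i.e. a power factor of 1):

```python
# Available current from a 2.5 kVA generator on a 120 V supply,
# treating the VA rating like watts (unity power factor assumed).
generator_va = 2500
supply_volts = 120

available_amps = generator_va / supply_volts
print(round(available_amps, 1))  # 20.8
```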


How many amps does a microwave use?

A typical microwave rated at 1100 watts draws 10 amps of current. This is calculated by dividing the wattage by the voltage: 1100 / 110 = 10 amps.


How do you decide which appliances can be used in a circuit without overloading it?

Each appliance should have a rating label showing the amps or watts it uses. Add up these figures to see whether they exceed the capacity of the circuit. Amps = Watts / Volts.
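
A short Python sketch of that check; the appliance wattages and the 15 amp circuit capacity are made-up example figures:

```python
# Add up appliance loads (converted from watts to amps) and compare
# against the circuit's capacity. All figures are illustrative.
supply_volts = 120
circuit_capacity_amps = 15
appliance_watts = {"kettle": 1500, "toaster": 900, "lamp": 60}

total_amps = sum(w / supply_volts for w in appliance_watts.values())
print(f"Total load: {total_amps:.1f} A of {circuit_capacity_amps} A available")
if total_amps > circuit_capacity_amps:
    print("This combination would overload the circuit.")
```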


Does 50 amps rated at 240 volts equal 100 amps rated at 120 volts?

When you multiply amps by volts, the product is watts. Using the formula W = Amps x Volts: 50 amps x 240 volts = 12,000 watts and 100 amps x 120 volts = 12,000 watts, so the two ratings represent the same amount of power.


How can you calculate which fuse to use for different appliances?

The basic equation is Watts divided by Volts equals Amps: W / V = A.


How many amps does a double 100 watt spotlight use?

If each spotlight is rated at 100 watts, together they would use 200 watts. To convert watts to amps, you can use the formula: Amps = Watts / Volts. Assuming a standard voltage of 120V in a household setting, the double 100 watt spotlight would use approximately 1.67 amps.


How many amps are needed for 24 hours?

It depends on how many electrical appliances you have and what they are. If you are on a 110 volt supply, the current (amps) at any time is the kilowatts you are running times 1000 (to get watts) divided by 110.
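
A small Python sketch of that calculation, assuming a hypothetical average running load of 0.5 kW:

```python
# Average current on a 110 V supply, plus the amp-hours that implies
# over 24 hours. The 0.5 kW running load is an assumed example figure.
running_kilowatts = 0.5
supply_volts = 110

average_amps = running_kilowatts * 1000 / supply_volts
amp_hours = average_amps * 24
print(f"{average_amps:.1f} A average, about {amp_hours:.0f} Ah over 24 hours")
```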


200 watts to amps 12v?

To convert watts to amps, you can use the formula: Amps = Watts / Volts. In this case, to convert 200 watts at 12 volts to amps, it would be: 200 watts / 12 volts = 16.67 amps. So, 200 watts at 12 volts is approximately 16.67 amps.