
Author Topic: Power Loss in Transformers  (Read 981 times)


BakerDad12

  • Trendsetter
  • Posts: 126
  • Respect: +1
Power Loss in Transformers
« on: May 02, 2020, 01:17:18 pm »
Hey guys, the textbook says that the supplied power is always P = VI but the power lost is P = I^2R. I don't understand why. Also, do the formulas for power supplied and power lost only apply to transformers and electrical distribution lines, or to every other application as well?

One more thing: transformers are used to step up the voltage so that the current is decreased before the power is distributed. But I thought current increases as voltage increases (e.g. V = IR)?

Einstein_Reborn_97

  • MOTM: April 20
  • Forum Regular
  • Posts: 91
  • There is no substitute for hard work.
  • Respect: +44
Re: Power Loss in Transformers
« Reply #1 on: May 02, 2020, 03:12:43 pm »
Quote
Hey guys, the textbook says that the supplied power is always P = VI but the power lost is P = I^2R. I don't understand why. Also, do the formulas for power supplied and power lost only apply to transformers and electrical distribution lines, or to every other application as well?
Hey. The supplied power as a function of voltage and current is P = VI. Both coils of the transformer have resistance, which means they heat up when a current flows through them. This resistance is a source of power loss. Recall Ohm's law, V = IR. If you substitute IR in place of V in the first equation, you end up with P = I^2R, which represents the energy lost per second to resistive heating of the coils (the power loss).
Power supplied and power lost apply to any electrical circuit: any network in which a voltage source drives a current through wires/coils that have resistance. (Remember that power is defined as the amount of energy transferred or converted per unit time, so the concept also comes up in motion, forces and gravity.)
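If it helps, here's a small Python sketch of that substitution. All of the V, I and R values below are made up purely for illustration, not taken from any real circuit:

Code:
# Supplied power P = V * I; loss across a resistance P_loss = I^2 * R.
# Illustrative values only (assumed, not from the thread):
V = 240.0   # supply voltage (volts)
I = 5.0     # current (amperes)
R = 2.0     # resistance of the wires/coils (ohms)

P_supplied = V * I          # power delivered by the source
V_drop = I * R              # Ohm's law: voltage dropped across the resistance
P_loss_a = V_drop * I       # P = VI applied to the resistance alone
P_loss_b = I ** 2 * R       # same quantity after substituting V = IR

print(P_supplied)           # 1200.0 W supplied
print(P_loss_a, P_loss_b)   # 50.0 50.0 -- identical, as the substitution predicts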

Quote
One more thing: transformers are used to step up the voltage so that the current is decreased before the power is distributed. But I thought current increases as voltage increases (e.g. V = IR)?
In an ideal transformer, power in = power out, and P = VI. So if the voltage increases, the current has to decrease to keep the power the same. Note: transformers can also step down the voltage.
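And here's a quick Python sketch of why stepping up the voltage helps before distribution. The power and line resistance are assumed values, just to show the pattern:

Code:
# Ideal transformer: power in = power out, so P = VI is fixed and
# raising V forces I down. Line loss is then I^2 * R_line.
P = 10_000.0    # power to transmit (watts, assumed value)
R_line = 4.0    # resistance of the transmission line (ohms, assumed value)

for V in (250.0, 25_000.0):     # before vs after a 100:1 step-up
    I = P / V                   # from P = VI
    P_loss = I ** 2 * R_line    # resistive heating in the line
    print(f"V = {V:>8.0f} V  I = {I:>5.1f} A  loss = {P_loss:.2f} W")

# Output:
# V =      250 V  I =  40.0 A  loss = 6400.00 W
# V =    25000 V  I =   0.4 A  loss = 0.64 W
# Stepping the voltage up 100x cuts the current 100x and the I^2R loss 10000x.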
HSC 2020: Advanced English | Advanced Mathematics | Physics | Chemistry | Biology | Studies of Religion 2

My HSC Journey Journal