ruffrecords
Well-known member
When designing a linear power supply it is well known that you usually calculate the ac current rating of the winding as some factor times the dc current. Part of the reason is that the peak dc output voltage is 1.414 times the ac winding rms voltage, so the ac current needs to be at least 1.414 times the dc current because power in must be at least as much as power out. The current multiplier is usually 1.6 for a full wave bridge. See Sowter info here:
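As a rough sketch of that rule of thumb (assuming the usual 1.6 multiplier; the true factor depends on winding resistance and capacitor size):

```python
# Rough AC winding current rating for a full-wave bridge with a
# capacitor-input filter, using the ~1.6 multiplier mentioned above.
def winding_current(i_dc, multiplier=1.6):
    """Approximate rms AC winding current for a given DC load current."""
    return multiplier * i_dc

i_dc = 2.0  # amps DC load
print(f"AC winding current: {winding_current(i_dc):.2f} A rms")  # 3.20 A
```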
http://www.sowter.co.uk/rectifier-transformer-calculation.php
One design question is how big to make the reservoir capacitor. For low ripple, bigger is better. This also makes regulation easier because it tends to reduce the voltage drop across the regulator and hence its dissipation. In a 12V 2A supply I use a 22000uF capacitor, which gives a peak-to-peak ripple of less than 1V.
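The ripple figure follows from the standard full-wave estimate dV ≈ I/(2fC), assuming the capacitor supplies the load for most of each half cycle:

```python
# Peak-to-peak ripple estimate for a full-wave rectifier: the capacitor
# is recharged at twice the mains frequency, so dV ~ I / (2*f*C).
i_load = 2.0    # A, DC load current
f_mains = 50.0  # Hz mains (recharge happens at 2*f for full-wave)
c = 22e-3       # F, i.e. 22000uF reservoir capacitor

ripple_pp = i_load / (2 * f_mains * c)
print(f"Peak-to-peak ripple: {ripple_pp:.2f} V")  # ~0.91 V, under 1 V
```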
Recently however, I have discovered that designing using these formulae tends to lead to quite significant temperature rises in the transformer, which raises the question of what is an acceptable temperature rise in a toroidal mains transformer. I then came across this data sheet at Farnell:
http://www.farnell.com/datasheets/2200017.pdf?_ga=2.243874893.939696366.1502612378-493727691.1475222435
This is unusual in that not only does it separately list iron and copper losses, but it also specifies a temperature rise, which for a typical 50VA toroid is a whopping 50 degrees Celsius, and this assumes an ambient of 40 degrees. It is not clear if this is core temperature rise or the rise for the transformer as a whole, but either way it implies significant temperature rises are not unusual.
Then most recently I came across this piece by a transformer manufacturer.
http://www.precision-inc.com/power-toroid-p-1296-l-en.html
which gives a graph of dc output power versus rated VA of the transformer for various core sizes and dc voltages. The dc applications section begins by saying "A transformer VA rating must always exceed the WDC rating due to the equivalent circuit series resistance, rectifier voltage drop and high peak currents during the charging of output capacitor".
This is the first time I have seen mention of high peak capacitor charging currents in a transformer manufacturer's literature. The really interesting part is that for low voltages like 12V, you should design for a dc output of about 55% of rated VA, which gives a current multiplication factor of 1.8 rather than the usual 1.6.
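Working through those figures (the 55% rule and the 12V 2A supply are taken from above; the arithmetic just shows where the 1.8 comes from):

```python
# If dc output power should be about 55% of the transformer's VA
# rating, the implied current multiplication factor is 1/0.55 ~ 1.8.
v_dc, i_dc = 12.0, 2.0
p_dc = v_dc * i_dc           # 24 W dc output
va_required = p_dc / 0.55    # ~43.6 VA transformer needed
factor = 1 / 0.55            # ~1.82, vs the usual 1.6
print(f"VA required: {va_required:.1f} VA, multiplier: {factor:.2f}")
```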
I have experimented with one transformer that at present is giving what I consider to be an unacceptable temperature rise. It has windings for three different supply voltages. By using just the one that consumes about 2/3rds of the power, the temperature rise is quite modest - just warm to the touch.
All this is leading me to the conclusion that, for a conservative design, it might be best to add up all the dc powers, double the total and call that the VA rating of the transformer.
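That conservative rule can be sketched as follows (the individual winding loads here are made-up examples, not from any real design):

```python
# Conservative transformer sizing: sum the dc output powers of all
# windings, then double the total to get the VA rating.
# The (voltage, current) pairs below are illustrative only.
dc_loads = [(12.0, 2.0), (5.0, 1.0), (15.0, 0.5)]   # (V, A) per output
p_dc_total = sum(v * i for v, i in dc_loads)        # 36.5 W total
va_rating = 2 * p_dc_total                          # 73 VA
print(f"Total dc power: {p_dc_total} W -> choose a {va_rating} VA transformer")
```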
I am no expert on transformers so any insights would be welcome.
Cheers
Ian