ruffrecords said:
"Adding a resistor to the output of a transformer does not make it work harder. The only thing that makes it work harder is a bigger signal voltage."

squarewave replied:
Eeehh, if by "work" we mean power, then increasing the load might increase the current and therefore the power. Presumably the intent is to push the transformer into higher distortion. The current will actually depend on the source impedance of the supply, though, which here is presumably phantom power and is not particularly low, so the voltage could simply drop. In that case the power won't really increase, or at least not nearly enough to make the transformer emit higher harmonics. Of course, this is all wild speculation without a schematic or further explanation.
squarewave said:
"Eeehh, if by 'work' we mean power, then increasing the load might increase the current and therefore the power. [...] Of course, this is all wild speculation without a schematic or further explanation."

ruffrecords replied:
Transformer distortion depends primarily on the voltage applied to the windings. Increasing the load will simply increase the losses in the transformer.
ruffrecords said:
"Transformer distortion depends primarily on the voltage applied to the windings. Increasing the load will simply increase the losses in the transformer."

squarewave replied:
Interesting. I didn't know that. I thought distortion was a function of how much magnetic field has been built up, and more specifically how close that field is to saturating the core. I also thought magnetic fields were a function of current, not voltage.
squarewave continued:
So suppose I drive a 600:600 transformer from a very low source impedance, so that the voltage across the primary is constant regardless of load, then (a) load the secondary with 600 ohms and measure the distortion, and (b) load it with 150 ohms and measure again. Will there really be no increase in distortion between cases (a) and (b), even though four times as much current flows through the transformer in case (b)?
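For concreteness, the two load cases in this thought experiment can be put in numbers. A minimal sketch, assuming a 1:1 (600:600) transformer held at 1 V rms by the stiff source; the drive level is an illustrative assumption, not from the thread:

```python
V = 1.0  # rms volts held across the primary by the low-impedance source

# Secondary load current for the two cases; with a 1:1 turns ratio the
# same current is reflected into the primary.
for r_load in (600.0, 150.0):
    i_load = V / r_load
    print(f"{r_load:>5.0f} ohm load: {i_load * 1e3:.2f} mA")
```

The 150 ohm case draws exactly four times the current of the 600 ohm case, which is the factor the question turns on.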
ruffrecords replied:
Jump to paragraph 1.1.3, where Bill says: "Because secondary current flow is in the opposite direction, it creates magnetic flux which opposes the excitation flux. This causes the impedance of the primary winding to drop, resulting in additional current being drawn from the driving source, which creates additional flux just sufficient to completely cancel that created by the secondary. The result, which may surprise some, is that flux density in a transformer is not increased by load current. This also illustrates how load current on the secondary is reflected to the primary."

And it is flux density that determines distortion (see section 1.3.1).

squarewave replied:
Ahh, that is interesting. And it kind of makes sense, because the increased load current flows into the load and so cannot contribute to increasing the flux density.
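The point can be sketched with the standard transformer EMF equation, B_peak = V / (4.44 f N A), together with a simple magnetizing-branch model: the net core flux is set by the small magnetizing current, while the reflected load current cancels the secondary's opposing flux. All component values below are illustrative assumptions, not from the thread:

```python
import math

def peak_flux_density(v_rms, f_hz, turns, core_area_m2):
    # Transformer EMF equation: B = V / (4.44 f N A).
    # Note that load current does not appear anywhere in this expression.
    return v_rms / (4.44 * f_hz * turns * core_area_m2)

V, f = 1.0, 1000.0   # drive level and frequency (assumed)
N, A = 2000, 1e-4    # primary turns and core cross-section in m^2 (assumed)
Lm = 10.0            # magnetizing inductance in henries (assumed)

B = peak_flux_density(V, f, N, A)
i_mag = V / (2 * math.pi * f * Lm)   # magnetizing current sets the net flux

for r_load in (600.0, 150.0):
    i_reflected = V / r_load   # load current reflected 1:1 into the primary
    # The extra primary current cancels the secondary's opposing flux,
    # so the net core flux is still set by i_mag alone.
    i_primary = math.hypot(i_mag, i_reflected)  # the two are ~90 deg apart
    print(f"{r_load:>5.0f} ohm: primary {i_primary * 1e3:6.2f} mA, "
          f"B unchanged at {B * 1e3:.2f} mT")
```

Heavier loading quadruples the primary current, but B comes out the same in both cases, which is Bill's point: flux density, and hence distortion, tracks the applied voltage, not the load.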