SSLtech
Well-known member
As part of the Bloo compressor kit project, and by way of ensuring that the product really is every bit authentic to the original, we've been doing some more in-depth T4B research.
Some discoveries have been most enlightening. We ran some tests along the lines mentioned in the UAudio webzine article on T4B aging and response patterns, and the Bloo T4 behaves in exactly the same way. On our bench today we had (amongst other experimental variants) three original UA T4Bs and three Bloo recreations. One thing which struck us was the difference in distortion on the various T4s -not compression-related distortion, but distortion with no GR applied. Removing the T4 from our test "mule" gave a baseline distortion figure of 0.08% THD at 1kHz, +4dBm in/out level, 600Ω in/out impedance.
Fascinating thing: the distortion creeps downward after you pass tone through the optical unit for a while! Looking at the schematic, it became apparent that the unit doesn't even have to be powered on to achieve this -it seems that just having a signal present across the shunt LDR reduces the distortion over time!
The distortion improvement can be dramatically accelerated by "blasting" the LDR with signal. (Note: signal, *NOT* light! Blasting it for a long period with high gain-reduction settings on a working compressor will burn up the EL panel, not to mention stress the driver tube!) By blasting +30dBm into one T4B for about 2 minutes, we reduced its distortion from 0.4% to an indicated 0.08% -essentially unreadable above the noise floor and residual distortion of the test mule.
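To put those levels in perspective, here's a quick sketch of the standard dBm-to-voltage conversion (the function name and the 600Ω default are mine, not from the original post; 0dBm is the usual 1mW reference):

```python
import math

def dbm_to_vrms(level_dbm, impedance_ohms=600.0):
    """Convert a power level in dBm to RMS voltage across a given impedance."""
    power_watts = 1e-3 * 10 ** (level_dbm / 10.0)  # 0 dBm = 1 mW reference
    return math.sqrt(power_watts * impedance_ohms)

# The +4dBm test level works out to a modest ~1.23 V RMS across 600 ohms,
print(round(dbm_to_vrms(4), 2))   # 1.23
# while the +30dBm "blast" is a full watt, roughly 24.5 V RMS.
print(round(dbm_to_vrms(30), 1))  # 24.5
```

So the "blast" is hitting the LDR with about twenty times the voltage (four hundred times the power) of the normal test level, which is presumably why it works so much faster than ordinary signal.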
However, the distortion seems to creep back in after the T4 has been unused for a while, and right now I'm wondering if longer, slower 'Burn-in' produces a longer-lasting result...
Anyhow, suffice it to say that on one of the first T4B modules made, the distortion appears to be permanently too low to measure, and our current line of thinking is that it might be because of how long it sat on the bench with tone being fed through it -running sweeps, checking thresholds, and so on...
Anyhow... here's a practical benefit for anyone with an LA-2A or clone: if you're going to be using it on anything delicate, you might like to try running 1kHz through it overnight at a decent level. Put the 'compress/limit' switch into 'compress', leave the unit powered off, and feed plenty of signal to the input. By morning you should have a well burned-in T4B module, and I'm fairly confident you'll have unmeasurable T4B distortion...
I'll post more as I discover it, but this is really interesting to me. Anyone have any idea why this is happening on a physical level? What is it about cadmium-sulphide LDRs that behaves this way?
This has nothing to do with the behaviour mentioned in the UA webzine article, by the way... this is very different. The UA article only discusses optical reactions and behaviours... I'm not even totally sure that they know about this effect...
Keith