When it came to broadcast, we did things differently... When I did the 318A distribution amp for ABC, it had to have -60 dB of isolation between splits (8 outputs) when +4 dBm (in those days) was wrongly put into an output, and the grand test was to take a walkie-talkie, push it into the guts, and click it on and off many times; if you saw anything on anything, it was rejected. The 318A had -105 dB CMRR at 90 kHz.

The networks actually created the ELCO connector, as they needed a connection that could handle +8 into 150 ohms (called "fader level"). The original design was the ELCO that had all the pins in parallel (you may have asked yourself why they didn't do that; it's still available), but it had a lot of capacitance/crosstalk at 20 kHz, so they staggered the pins, which is the ELCO of today.

We did the distribution system (and intercom system) for the '84 Olympics with ABC, and there was a roll-off of 1 dB at 20 kHz. They wanted to fail the whole system, until I pulled out the ELCO pins and connected them outside the shell, showing that it was the connector they specified that was the problem... They accepted it.

It's a different universe in a broadcast system. It is totally common to have (say) a 1 kHz tone appearing on random remote lines... into a channel or out a bus... preparing for the next segment of the programming. Then any sort of crosstalk becomes a problem.
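To put rough numbers on those levels and on why connector capacitance shows up at 20 kHz, here's a back-of-the-envelope sketch in Python. The 600 ohm / 150 ohm impedances and the single-lumped-capacitor RC model are my assumptions for illustration only; the real parallel-pin ELCO situation was distributed capacitance between many pins, not one shunt cap.

```python
import math

def dbm_to_vrms(dbm, r_ohms):
    """Convert a dBm level (power referenced to 1 mW) to RMS volts
    across the given load impedance."""
    p_watts = 1e-3 * 10 ** (dbm / 10)
    return math.sqrt(p_watts * r_ohms)

def rolloff_db(f_hz, r_source, c_farads):
    """Loss in dB at f_hz of a single-pole low-pass formed by the
    source impedance and a shunt capacitance (connector + cable)."""
    mag = 1 / math.sqrt(1 + (2 * math.pi * f_hz * r_source * c_farads) ** 2)
    return 20 * math.log10(mag)

def c_for_rolloff(f_hz, r_source, loss_db):
    """Shunt capacitance that produces loss_db at f_hz for a given
    source impedance (the RC magnitude response solved for C)."""
    mag = 10 ** (-loss_db / 20)
    return math.sqrt(1 / mag ** 2 - 1) / (2 * math.pi * f_hz * r_source)

# "+4 dBm" into the classic 600 ohm line, "+8 fader level" into 150 ohms
print(f"+4 dBm into 600 ohms = {dbm_to_vrms(4, 600):.3f} Vrms")
print(f"+8 dBm into 150 ohms = {dbm_to_vrms(8, 150):.3f} Vrms")

# Total shunt C that would account for 1 dB down at 20 kHz from a
# 600 ohm source (assumed impedance) -- comes out in the nF range
c = c_for_rolloff(20e3, 600, 1.0)
print(f"C for 1 dB loss at 20 kHz from 600 ohms: {c * 1e9:.1f} nF")
```

The point of the last calculation is that a 1 dB droop at 20 kHz from a low-impedance source implies nanofarads of total shunt capacitance, far more than a single decent connector contributes, which is consistent with the parallel-pin ELCO (and the cabling behind it) being the culprit rather than the amp.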
Bri
Also, in those days, many stations had the transmitters on the roof. That was hot with RF. You only begin to understand grounding when you fix that stuff...