> I can find only 650 ohm resistors in my area.
MOST "600 ohm" gear is not very fussy. 650 will be fine for most work. I have used 470.
> Does this method apply to the Urei 1178 & LA4 compressors (1983)?
The LA4, I know, is solid-state, and probably does not care what you load it with, anything from 500 ohms on up.
> standard set long ago by the phone companies, and since the audio industry had no real clue how to amplify a signal, we looked to them.
Well, no. All audio IS telephony, just variations on Bell's idea and many practical improvements. And through the first half of the 20th century, there was more money in telephones than in all other audio combined. And in the US, Ted Vail had a far-reaching vision of AT&T's future, which included making enough money to fund a LOT of research. In computers, all the "new" ideas were discovered at IBM long before; likewise, all audio is solidly grounded in Bell Labs work. There has never been an operation quite like Bell Labs 1920-1970, though some of the other mega-telcos gave us a lot too.
Late-comers include the MGM movie sound project, and operations at the BBC and other big broadcasters. But many of these guys started in telephone operations.
And when you need racks, connectors, etc.: why invent your own when WE (Western Electric) makes excellent ones by the thousands, and Kellogg and others copy them at a lower price?
> didn't we used to have a 0dB operating level as a pro standard as well?
0dB relative to what?
Since the late 1930s, the reference has been 1 milliwatt, in any impedance as long as you say what it is (or have a transformer to convert to the desired impedance). This is the "dBm". (Before that they used a meter referenced to 6mW in 500Ω; I have a pair.)
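As a sketch (the helper names are mine, not any standard library): the dBm is just power referenced to 1 milliwatt, and the voltage it corresponds to depends on the impedance you state.

```python
# Sketch of the dBm convention: power referenced to 1 milliwatt, in whatever
# impedance you declare.
import math

def dbm_to_watts(dbm):
    return 1e-3 * 10 ** (dbm / 10)

def dbm_to_vrms(dbm, z_ohms):
    # P = V^2 / Z, so V = sqrt(P * Z)
    return math.sqrt(dbm_to_watts(dbm) * z_ohms)

print(f"0 dBm in 600 ohms = {dbm_to_vrms(0, 600):.3f} V rms")   # ~0.775 V
print(f"0 dBm in 150 ohms = {dbm_to_vrms(0, 150):.3f} V rms")   # ~0.387 V
# The older 6 mW / 500-ohm meter reference works out to:
print(f"6 mW in 500 ohms = {math.sqrt(6e-3 * 500):.3f} V rms, "
      f"{10 * math.log10(6):+.1f} dB above 1 mW")               # ~1.732 V, +7.8 dB
```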
The maximum level that the US telco network allowed on their lines was +8dBm measured with a mechanical meter. Higher levels will leak into adjacent lines in the same cable; +8dBm is often audible in other lines but not really annoying. (Bell Labs did LOTS of testing, and knew things like annoyance factors to double-zero precision.) So a meter on the sending end of the line from your radio studio in the city to the transmitter in the field should never read higher than +8dBm. Because the meter can't read millisecond peaks, you actually need somewhere between +18dBm and +26dBm of drive power to handle the peaks of a signal that reads +8dBm on a meter.
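To put those figures in volts, here is my own arithmetic, assuming the same 600-ohm line; the 10-18dB peak spread is taken from the paragraph above:

```python
# Rough numbers for the +8 dBm line limit and the peak drive it implies,
# assuming a 600-ohm line.
import math

def dbm_to_vrms(dbm, z_ohms=600.0):
    return math.sqrt(1e-3 * 10 ** (dbm / 10) * z_ohms)

print(f"+8 dBm meter reading ~ {dbm_to_vrms(8):.2f} V rms on the line")   # ~1.95 V
for peak_dbm in (18, 26):
    print(f"peaks at +{peak_dbm} dBm ({peak_dbm - 8} dB over the meter) "
          f"~ {dbm_to_vrms(peak_dbm):.2f} V rms")                         # ~6.2 V and ~15.5 V
```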
When I was young, all broadcast boxes were calibrated to +8dBm.
Both 150Ω and 600Ω systems existed. Actually everything from 100Ω to 900Ω has been made. But a 2-winding transformer can give both 150Ω and 600Ω with a minor change, and these two standards were close enough to all existing standards that everybody moved to 150Ω and 600Ω.

The actual impedance of old-style open telephone lines (on glass insulators) is really more like 700-900Ω, but they work fine fed from 600Ω. The actual impedance of multi-pair cable, standard for at least 60 years, is more like 120-150Ω, and it feeds nicely from 150Ω connections. However, the 1938 VU meter needed voltage to hide its rectifier drop, so it was specified in terms of 600Ω. And there are a lot more in-house lines than long outside lines. And lines shorter than a mile don't really have an impedance at audio frequencies.

Most operations standardized on 600Ω for all in-house use, only transforming to 150Ω when going outside the building. However, one major network stuck to 150Ω for everything, even though it meant re-tapping standard boxes and adding transformers (or buffers) in front of all VU meters.
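The transformer trick is just the turns-squared rule. A hypothetical sketch, not any particular transformer's spec: reflected impedance scales with the square of the turns ratio, so two identical windings connected in series versus parallel give a 4:1 impedance change, which is exactly 600 versus 150 ohms.

```python
# Sketch of why one 2-winding transformer covers both standards: impedance
# reflects through a transformer as the square of the turns ratio.
def reflected_impedance(z, turns_ratio):
    """Impedance z seen through a winding scaled by the given turns ratio."""
    return z * turns_ratio ** 2

z_series = 600.0                                     # two identical windings in series
z_parallel = reflected_impedance(z_series, 0.5)      # same windings paralleled: half the turns
print(z_series, z_parallel)                          # 600.0 150.0
```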
In AM broadcasting it was customary to assume that peaks were 10dB to 12dB above the meter reading. In fact it is a statistical thing, like a lottery: there are lots of $100 winners, some $10,000 winners, and very rare $1,000,000 winners. Likewise there are lots of 6dB peaks, a 12dB peak every minute or so, and a 16dB peak every few hours. One or two very small clips per song are no big deal in radio, where static is high and you only hear it once. But in recording you may hear the track over and over, without static, so you notice rare clipping. And much broadcast stuff, especially transistor and chip, wasn't really happy over +20dBm.

So sometime in the 1960s/1970s it became a recording-studio custom to calibrate everything to +4dBm in 600Ω. This gives 16dB of headroom in an amp that goes sour above +20dBm. While I have seen arguments for using 18dB and even 20dB peak factors, speech/music peaks 20dB above the VU meter reading are very rare, like once a year.
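The arithmetic behind that custom, as a toy check using only the values from the paragraph above:

```python
# Headroom check for the "+4" custom: meter "0" at +4 dBm, amplifier goes
# sour above +20 dBm, leaving 16 dB for program peaks.
operating_level_dbm = 4
clip_level_dbm = 20
headroom_db = clip_level_dbm - operating_level_dbm
print(f"headroom = {headroom_db} dB")
for peak_over_meter_db in (6, 12, 16, 20):
    margin = headroom_db - peak_over_meter_db
    verdict = "clean" if margin >= 0 else f"clips by {-margin} dB"
    print(f"peak {peak_over_meter_db} dB over the meter: {verdict}")
# 6 and 12 dB peaks are clean, a 16 dB peak just fits, the rare 20 dB peak clips by 4 dB.
```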
You are right: working "+8dBm" gear in our "+4dBu" world means a little worse signal-to-noise, but less distortion. The broadcast gear that got accepted into the recording world mostly had plenty of S/N to spare.
You also have to remember that the VU meter and the maximum telco line level happen to come together around 0.5% distortion, because the rectifier drop is non-linear. If you take the pad off the VU meter so "0" means +4dBm, distortion is more like 1%. In 1938, 0.5% THD was acceptable; in 1970s recording practice, it was not. So meters got buffers, and you could run a meter at +4dBm without getting 1% THD on your signal.
Good peak-reading meters came later: some in the late 1950s, but in a big way in the late 1960s. With a true peak meter, you don't do all this 10/16/20dB headroom stuff: you KNOW if your peaks are hitting the nasty zone of your amplifiers. So now we work with 0dBFS, where the reference is the analog voltage that translates to a digital "all-ones" condition, the maximum the digital channel can handle. We just make sure we never reach 0dBFS (but come reasonably close reasonably often, so we are not down in the digital noise, and so we have the loudest tune on the radio).
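A minimal sketch of the dBFS convention (my own helper, normalizing full scale to 1.0):

```python
# 0 dBFS is the level whose digital code is "all ones" (full scale);
# every real signal sits at or below it.
import math

def dbfs(amplitude, full_scale=1.0):
    """Level of a peak amplitude relative to digital full scale."""
    return 20 * math.log10(abs(amplitude) / full_scale)

print(f"{dbfs(1.0):+.1f} dBFS")   # +0.0 dBFS -- the hard ceiling
print(f"{dbfs(0.5):+.1f} dBFS")   # -6.0 dBFS
print(f"{dbfs(0.1):+.1f} dBFS")   # -20.0 dBFS
```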