> is there something special about line level?
"Line level" implies high enough to stay away from the universal noise, low enough not to overload on reasonable size amps.
The other extremes are:
- Microphone level, which ranges from 0.1 microvolt to over 1 volt.
- Loudspeaker level, which often demands unreasonably large amplifiers.
Line level originally comes from telephony. It IS the level on a telephone line. Early telephone systems had no amplifiers, so line level is also earphone level. Telephones were not practical until microphones were devised that gave a "line level" (or earphone-level) output. Actually, the carbon mike "is" an amplifier: it takes battery power and modulates it with weak speech power to give enough electrical audio power to hear in an earphone.
So line level, in telephony, is a compromise between the maximum practical output of a carbon mike, and the minimum usable signal into an earphone.
If, instead of high-output microphones, we had developed high-gain earpieces, we would have had trouble with line noise. Miles of overhead wire will pick up lots of atmospheric static.
People accept that a telephone is pretty much unity-gain and imperfect: it won't carry everything from a whisper to a shout cleanly.
The carbon mike is hot, but has high noise and distortion with limited bandwidth. All the "better" mikes have lower output.
While communicated speech has a narrow range of level, fancy speech and music have a wide range.
So for "good" radio and recording work, we need gain, and adjustable gain, right after the microphone.
We need to put the 100+dB range of performance through a 40dB-60dB recorder, wire-line, or radio link. We need to adjust gain between the mike and the "line" (most recorders and transmitters expect "line" input, fairly hot, with levels already set near optimum).
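To put rough numbers on that gain adjustment, here is a sketch in Python. The 2mV mike output is an assumed figure for a dynamic mike on speech, and dBu (the voltage-referenced cousin of dBm in 600-ohm systems) uses the classic 0.775V reference:

```python
import math

def dbu(volts):
    """Convert RMS volts to dBu (0 dBu = 0.775 V RMS)."""
    return 20 * math.log10(volts / 0.775)

mic_v  = 0.002   # assumed: roughly 2 mV from a dynamic mike on speech
line_v = 1.228   # +4 dBu nominal line level, about 1.23 V RMS

gain_db = dbu(line_v) - dbu(mic_v)
print(f"mic {dbu(mic_v):.0f} dBu -> line {dbu(line_v):.0f} dBu: "
      f"preamp needs about {gain_db:.0f} dB of gain")
```

With those assumed numbers the preamp has to supply on the order of 56dB, which is why mike preamps have big, adjustable gain controls.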
Line level from a carbon mike is about 1 milliwatt average, with very wide variation depending on the talker.
Maximum "line" level, while originally the level of a loud carbon mike, soon became the maximum level of one (or a few) "small" tubes. Tubes don't get any cheaper below about 1 Watt dissipation, which is good for about 100mW audio. For best fidelity on noisy lines, more than 1 mW average levels were used. US Telco standards limited level to avoid interfering with other customers in the same cable: +8dBm or 6mW on a VU-like meter. For most early work, rare clipping was not the biggest problem, so peaks were assumed to be -12dB above meter reading or +20dBm (100mW). As productions got more advanced, rare clipping became an issue, and most recording gear was calibrated to +4dBm nominal to give 16dB headroom within a +20dBm max system.
> I always assumed that you need the higher level as a standard interface between equipment
It sure can be convenient.
> and an extra umph to handle long cable runs.
Levels far above the millivolts of modern microphones are helpful in overwhelming interference. One Volt (or so) is fine. We could design for much higher line levels to reduce interference further, but above around 1mW nominal (100mW peak) the cost of bigger output stages becomes a problem, and few lines need that much level.
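For scale: converting those powers to voltage across the classic 600-ohm telephone-line impedance shows why "One Volt (or so)" lines up with the 1mW nominal (a quick sketch):

```python
import math

Z = 600  # ohms: classic telephone/broadcast line impedance

def volts_for_mw(mw):
    """RMS voltage that dissipates the given milliwatts in Z ohms."""
    return math.sqrt((mw / 1000) * Z)

print(f"1 mW   (0 dBm)   -> {volts_for_mw(1):.3f} V RMS")
print(f"100 mW (+20 dBm) -> {volts_for_mw(100):.2f} V RMS")
```

So the nominal line sits near 0.775V RMS, and peaks run up toward 8V, which is comfortably above millivolt-level interference but still cheap to drive.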
Most lines are run with very small net loss. NYDave's mile-long line only loses a few dB. If you get into many-mile lines, you don't use a standard gear line output stage, you use a special Line Amp with +30dBm output (most can peak over +36dBm).
> where the AD converters accept mic level signals and microphones more commonly have mic level eqs, variable pads or just digital outputs.
First we designed mikes "hot" because we had no amplifiers. Then with amplifiers, we soon designed mikes as "weak" as possible, sacrificing output to get more bandwidth. So the soft-sound output of many mikes is right at the lowest level an analog system can handle cleanly. In general, digital systems can't work at such low levels (though those clever digital boys are making a liar out of me).
Also, with no analog gain control, and no restriction on mikes or performers, the range of level is 0.2 microvolts to 2 Volts. While ultimately we will deliver this in a 16-bit range of 2V down to about 30 microvolts (the output of a CD player), our job as recordists is to adjust the level for a nice playback. If we had to take 0.2 microvolts to 2 Volts direct at the A/D converter, we'd need 24 bits of A/D converter, with the bottom bits around a tenth of a microvolt. That may still be beyond the limit of audio A/Ds. Of course we could put a fixed gain, say 100X, between the mike and the A/D, to keep the lowest levels up where an A/D can read them properly; but then the loudest mikes and performances come out at hundreds of volts, which is far more than a practical A/D really wants to see.
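The bit-count reasoning can be checked in a few lines (a sketch; the roughly-6dB-per-bit rule of thumb is assumed):

```python
import math

v_min = 0.2e-6   # 0.2 microvolts: quietest mike and performer
v_max = 2.0      # 2 volts: hottest mike and performer

range_db = 20 * math.log10(v_max / v_min)   # total dynamic range
bits = math.ceil(range_db / 6.02)           # ~6.02 dB per bit
lsb_24 = v_max / 2 ** 24                    # step of a 24-bit A/D at 2 V full scale

print(f"range: {range_db:.0f} dB -> needs about {bits} bits")
print(f"24-bit LSB at 2 V full scale: {lsb_24 * 1e6:.2f} microvolts")
```

The 0.2-microvolt-to-2-Volt span is 140dB, which lands right at 24 bits, with the bottom step around 0.12 microvolts, matching the "tenth of a microvolt" figure above.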
So we still want analog gain with an analog gain control. Of course for $0.50 we can hide this inside the mike along with a $5 A/D converter. Then we need an upstream path so I can sit in my easy chair and adjust the gain of the mike in the studio or high in the balcony.
FWIW, one of the mega-mike makers has a "USB mike" intended for project studios that eats air-waves and puts out USB digital. So the whole analog path is an inch long. USB obviously has an upstream path so you can set analog gain, and other frills too if they want (though once in digital, you could shape it in the CPU). While this is very cute for a new studio, it doesn't make sense for people with legacy analog gear, all intended for line-level analog in/out.