This might shed a little light on what the CCA Ultimate series was...and what that mystery line amp was really for.
Nearly every console of the era (Gates, CCA, RCA, LPB, and a few others), and before, used a passive mixing network designed to present a constant impedance regardless of how many channels were on or off, and regardless of fader position. The attenuators/faders were stepped attenuators that maintained a constant load on the source and presented a constant source impedance to the mixing network. The line inputs were entirely passive (shown as such in the block diagram you posted), with the source device going only through switching and a transformer, primarily because the faders and mixing network were unbalanced and the sources were balanced. The attenuators had a "normal" position of about 2:00, at which point they provided about 10-12dB of attenuation. Since the mixing was passive, it was also very lossy, with the total loss depending on the channel count. The mixing network was then followed by a "booster" amp, typically 30dB or so, to make up the losses and get everything back to a normal line level of +4dBu or +8dBu (both were studio standards). There wouldn't be a line amp per input channel; there would be one per bus, so in this case of two stereo busses there would have been four, not counting monitor boosters and cue amps. The whole thing depended on 600 ohm sources driving 600 ohm loads.
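To put rough numbers on that loss, here's a back-of-the-envelope sketch in Python. The 20·log10(N) figure is the textbook insertion loss of a matched resistive combining network; the channel count and fader setting are assumptions for illustration, not CCA's actual figures.

```python
import math

def resistive_combiner_loss_db(n_inputs: int) -> float:
    """Approximate insertion loss of a matched resistive combining
    network: about 20*log10(N) from any one input to the output."""
    return 20 * math.log10(n_inputs)

# Assumed example: an 8-channel bus with the fader at its "normal"
# (~2 o'clock) position providing roughly 12 dB of attenuation.
n_channels = 8
fader_loss_db = 12
network_loss_db = resistive_combiner_loss_db(n_channels)

total_loss_db = fader_loss_db + network_loss_db
print(f"Mixing network loss:     {network_loss_db:.1f} dB")
print(f"Fader + network loss:    {total_loss_db:.1f} dB")
# -> roughly 30 dB of "booster" (makeup) gain needed per bus to get
#    back to +4 or +8 dBu line level.
```

That's where the ~30dB booster per bus comes from: the makeup gain simply mirrors what the fader and passive network throw away.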
The Altec faders, indeed all stepped faders of the day, were well built to last almost forever, but suffered from two rather serious problems. First, the step size, which was 2dB, sometimes more. The limited number of steps (30 or so?) meant 2dB jumps in level, except for the highest attenuation positions near fully CCW, where the last two jumps could be bigger. There was no smooth fading! It only sounded smooth if it was done quickly, and there was no fine adjustment of level, just 2dB steps. The whole point was constant impedance. The other rather fatal flaw (again, with every manufacturer's faders) was with stereo models: the left and right steps didn't happen simultaneously. Again, if you moved the fader quickly it was perceived as tracking smoothly, but fine adjustments could result in 2dB channel balance errors. There was a "Cue" detent switch at the full CCW position that placed one or both channels into a cue circuit for setting up tapes and cuing records. Switching a channel off actually removed the fader from the mix network and substituted a resistor (that persistent desire for constant impedance). Channel switching was fairly complex: Program to the right, center off, Audition to the left...times 2 for stereo. That's why the big Switchcraft leaf switches with precisely "timed" contacts were used.
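For anyone curious why a stepped attenuator can hold a constant 600 ohm match at every setting, here's a generic constant-impedance bridged-T pad worked out per 2dB step. This isn't Altec's exact ladder topology, just a standard textbook structure that shows the principle: the series arms stay fixed at Z0 and only the bridge and shunt resistors change with the loss.

```python
Z0 = 600.0  # the console's design impedance, ohms

def bridged_t_pad(z0: float, loss_db: float):
    """Resistor values for a constant-impedance bridged-T pad.
    Both series arms are fixed at z0; only the bridge and shunt
    resistors change with the loss, so source and load always see
    z0 no matter where the attenuator is set."""
    k = 10 ** (loss_db / 20)      # voltage attenuation ratio
    r_bridge = z0 * (k - 1)       # resistor bridging the two series arms
    r_shunt = z0 / (k - 1)        # resistor from the T junction to ground
    return r_bridge, r_shunt

# 2 dB steps, as in the stepped faders of that era (illustrative only)
for step in range(1, 6):
    loss = 2 * step
    rb, rs = bridged_t_pad(Z0, loss)
    print(f"{loss:2d} dB step: bridge = {rb:7.1f} ohms, shunt = {rs:7.1f} ohms")
```

Every step presents the same 600 ohms looking in and looking out, which is exactly what the passive mix network needed.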
By the mid 1970s, manufacturers started to figure out that a continuous (non-stepped) fader was desirable, and at about the same time linear faders were becoming popular. That meant the entire topology had to change. Mixing busses were still passive, but now buffered. Line amps with make-up gain were still required. While the old stepped faders were incredibly reliable, that reliability was slowly traded for smooth, wide-range attenuation. And channel tracking became an even bigger problem, because early faders by Duncan (found in the UREI broadcast boards known as the "Mod One") didn't track worth a hoot, and would actually shift channel levels if the slider knob was slightly rotated. Improvements in one area, more problems in others. The Duncan faders had a short life span, and would become noisy or just fail at certain points of the travel. We didn't get around those problems until low-cost monolithic VCAs appeared (from dbx) and the now legendary Penny & Giles vertical faders arrived. Cost was traded off for reliability and tracking. The most popular broadcast boards of the late 1970s and 1980s used either P&G stereo faders or linear-taper faders driving VCAs. One design by Auditronics had the odd failure mode that when a fader became dirty or intermittent, the VCA would suddenly go to full gain. Yikes, that was a bad idea!
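A toy model shows why that failure mode is so nasty. The control law, voltage scaling, and "open wiper floats to 0 V" behavior below are all assumptions for illustration; real parts (the dbx 2150 family, for instance) use different scales and polarities, and the Auditronics circuit details aren't in front of me.

```python
def vca_gain_db(control_volts: float, db_per_volt: float = 6.0) -> float:
    """Toy dB-linear VCA: each volt of control voltage adds
    db_per_volt of attenuation. Purely illustrative numbers."""
    return 0.0 - db_per_volt * control_volts

def control_voltage(fader_position: float, wiper_open: bool = False) -> float:
    """Assumed control law: 0 V with the fader full up, +10 V full down.
    A dirty or open wiper is modeled as the control line floating
    back to 0 V instead of holding the attenuation voltage."""
    return 0.0 if wiper_open else 10.0 * (1.0 - fader_position)

# fader_position: 1.0 = full up, 0.0 = full down
print(f"fader full up:   {round(vca_gain_db(control_voltage(1.0)))} dB")
print(f"fader full down: {round(vca_gain_db(control_voltage(0.0)))} dB")
print(f"open wiper:      {round(vca_gain_db(control_voltage(0.0, wiper_open=True)))} dB  <- snaps to full gain")
```

If the fader's job is to tell the VCA how much to attenuate, a dead wiper means "no attenuation," and the channel jumps to full gain on air instead of muting.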
I'm not sure of the benefit of repurposing such a beast. Certainly not for the faders. If you add all those 500 modules...well, OK, but that's adding a whole lotta electronics driving 600 ohm loads. Talk about burning up power. The transformers in the line amps add distortion (this was pre-Jensen, before transformers got good), but I guess people like distortion these days. Hope you like IMD, 'cause you'll definitely have it. The lossy summing network will punish you with noise from the extra makeup gain required. Remember, these broadcast boards were designed with much less concern for noise and more concern for headroom and longevity. The gain structure would offer as much as 25-30dB of headroom post-fade, and the inputs were passive, so it would be hard to clip anything. Jocks could peg the VU meter without any issues other than damaging the meter. The transformers would squish, of course. But to get that, you let the noise floor come up. In broadcast it didn't matter, because an AM transmitter couldn't do better than 50dB S/N anyway, and FM was only a little better at 68dB. Tape was, of course, tape...pre high-output, high-bias formulations, Scotch 111, so a high noise floor there too. And carts...no better. The basic audio quality of any of these consoles is, frankly, poor, but so was everything else back then, so what didn't matter then might be more of a concern today. We are, after all, doing (fake) 24 bits, right?
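To make that trade-off concrete, here's a rough gain-structure sketch. Every figure in it (booster noise, operating level, headroom) is an assumption for illustration, not a measurement of any particular console; the point is how the transmission chain of the day masked the noise penalty of passive summing.

```python
import math

# Back-of-the-envelope gain structure; all figures are assumptions.
bus_loss_db = 30            # fader + passive mixing network loss
booster_ein_dbu = -110      # assumed equivalent input noise of the booster amp
line_level_dbu = 8          # +8 dBu operating level
headroom_db = 26            # assumed post-fade headroom above operating level

# The booster's gain equals the bus loss, so its input noise is
# raised by that amount at the bus output.
bus_output_noise_dbu = booster_ein_dbu + bus_loss_db
snr_db = line_level_dbu - bus_output_noise_dbu

print(f"Noise at the bus output:     {bus_output_noise_dbu} dBu")
print(f"Console S/N at +8 dBu:       {snr_db} dB")
print(f"Headroom above +8 dBu:       {headroom_db} dB")
print("AM transmitter S/N:          ~50 dB")
print("FM transmitter S/N:          ~68 dB")
print(f"Ideal 24-bit dynamic range:  ~{20 * math.log10(2 ** 24):.0f} dB")
```

Against a 50dB AM chain the makeup-gain noise disappears in the wash; against a 24-bit converter it's the first thing you'll hear.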
They do have a nice vintage look, though!
Watch out for the monitor muting and "on-air" sign switching on the mic inputs.