> use power transformers as output transformers to interface with say an 8-ohm speaker. How do you figure out what the impedances of a power transformer would be?
Assume that the "120V" winding likes 1K-2K, and the "240V" winding likes 4K-8K. Working at lower impedance increases copper loss and leakage-inductance treble loss, but extends bass response. Working at higher impedance cuts bass response and may run into capacitance losses.
Use the voltage ratio (squared) to estimate the secondary impedance.
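A quick sketch of that arithmetic (the 1.5K primary guess and the 6.3V winding are illustrative numbers only, not from any particular transformer):

```python
# Rough impedance estimate for a power transformer pressed into output-transformer duty.
# The 1.5K primary guess and the 6.3V "speaker-side" winding are illustrative only.

def reflected_impedance(z_known, v_known, v_other):
    """Impedance seen at the other winding: scale by the voltage ratio, squared."""
    return z_known * (v_other / v_known) ** 2

z_primary = 1500.0   # assume the "120V" winding likes roughly 1K-2K
v_primary = 120.0
v_secondary = 6.3    # e.g. a heater winding facing the 8-ohm speaker

z_secondary = reflected_impedance(z_primary, v_primary, v_secondary)
print(f"The 6.3V winding looks like about {z_secondary:.1f} ohms")       # ~4.1 ohms

# Or work the other way: what does an 8-ohm speaker on the 6.3V winding
# look like from the 120V winding?
z_back = reflected_impedance(8.0, v_secondary, v_primary)
print(f"8 ohms on the 6.3V winding reflects to about {z_back:.0f} ohms")  # ~2900 ohms
```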
For low distortion, keep the audio signal down to 1/3rd, preferably 1/10th, of the winding's rated voltage. Since the peak plate swing in a transformer-loaded stage is roughly equal to B+, that means a "240V" (339V peak) winding is only good for a B+ voltage like 100-150V, preferably more like 30-40V.
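The same figuring in a short sketch, assuming (as above) that peak plate swing in a transformer-loaded class-A stage roughly equals B+:

```python
import math

# Back-of-envelope signal limits for a "240V" power winding used for audio,
# assuming peak plate swing in a transformer-loaded class-A stage roughly equals B+.

def signal_limit(rated_rms, derating):
    """Max audio signal for a given derating factor: RMS and peak volts."""
    sig_rms = rated_rms / derating
    return sig_rms, sig_rms * math.sqrt(2)

rated = 240.0

for derating in (3, 10):
    rms, peak = signal_limit(rated, derating)
    print(f"1/{derating} of rated: {rms:.0f}V RMS ({peak:.0f}V peak) "
          f"-> B+ somewhere around {peak:.0f}V")
# 1/3  -> 80V RMS, 113V peak: B+ around 100-150V
# 1/10 -> 24V RMS,  34V peak: B+ around 30-40V
```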
When measuring bass response, signal level matters. Bass response improves as you shift from very-very low level to medium level, then falls off at high level. This is true of all transformers, and is another view of bass distortion. So after you figure your maximum signal level, check bass response at -6dB, -20dB, and -40dB, to be sure it won't bass-cut weak signals too much.
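Spelling out those test levels, using the 1/3-of-240V figure from above as the assumed maximum signal:

```python
# Bass-response test levels below the maximum signal.
# The 80V RMS maximum is just the 1/3-of-240V example worked above.

max_signal_rms = 80.0

for db_down in (6, 20, 40):
    v_test = max_signal_rms * 10 ** (-db_down / 20)
    print(f"-{db_down} dB: about {v_test:.1f}V RMS")
# -6 dB -> ~40V, -20 dB -> 8V, -40 dB -> 0.8V
```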
> since they're designed for single-frequency use (60Hz), they're wound with no regard for winding and leakage capacitance.
That would be true if customers only used them for resistive loads. But most get used with rectifier-capacitor loads. Then leakage inductance can severely kill the DC output voltage. The spiky current waveform in the rectifier is equivalent to high harmonics, which, if suppressed by leakage inductance, will give disappointingly low voltage and rejected shipments.
Anyway, these things are all built as CHEAP as possible, with minimum material. Minimum copper leads to low leakage inductance and low capacitance. To keep them from burning up on the power line, they use fairly high-permeability iron, and it is the ratio of iron permeability to air permeability that sets the frequency range of a simple transformer. So for simple commercial reasons they are not bad; it would cost more to make them NOT pass most of the audio band.
> They're also not designed to carry direct current,
No, but. There are two ways to deal with DC current. Use much more iron, or slip some air in the iron to keep flux density off the steep part of the curve. If all transformers were priced by the pound, obviously any tranny with DC should be air-gapped. But transformers today are not priced per pound: audio transformers are specialty items with specialty prices; power transformers are commodity items with incredibly competitive pricing (and vast quantity of over-stock on the surplus market). So the same job might be done with a 1VA special gapped core for $70, or with a 20VA commodity non-gapped core working at 1VA for $20.
Also, the smallest trannies, around 10VA, are very often run with half-wave rectifiers that put DC flux in the core. Since this is obviously the cheapest class of work, buyers don't expect precise operation, but the iron must not DC-saturate, suck huge AC current, and burn up. So there must be some tolerance for DC flux, at least in the smallest trannies.
As a very rough starting point, figure the AC current that the winding is rated for, and keep DC current to 1/10th of that (for E-I laminations, NOT for toroids). And if the next larger standard size is only a couple bucks more, get the bigger one. Too big won't do much harm; too small sure will. The main limits on "too big" are really price and weight, and in the 10VA-50VA range the cost and weight are usually not problems.
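A sketch of that rule of thumb (the 20VA rating and 240V winding are example numbers, not a recommendation):

```python
# Rule-of-thumb DC budget for an E-I power transformer winding (NOT a toroid).
# The 20VA rating and 240V winding are example numbers only.

def dc_budget(va_rating, winding_volts, derating=10):
    """Rated AC current of the winding, and roughly 1/10th of it as a DC allowance."""
    i_ac = va_rating / winding_volts
    return i_ac, i_ac / derating

i_ac, i_dc = dc_budget(20, 240)
print(f"Rated AC current: {i_ac * 1000:.0f} mA; keep DC under about {i_dc * 1000:.1f} mA")
# Rated AC current: 83 mA; keep DC under about 8.3 mA
```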
Do NOT run DC through a toroid! You can get away with it in a stacked E-I core because the butt-lap joints between the iron sheets are in effect very small air gaps, delaying DC saturation. Toroids, made by winding a single long strip of iron into a ring, have no effective gap at all, and will saturate with DC current far-far-far smaller than their AC current rating. Since you might need a toroid 100 times bigger than a proper gapped core, or 10 times bigger than an "ungapped" stack core, and toroids cost more per VA, they are poor ideas for any DC application.