Word clock sync, thoughts

GroupDIY Audio Forum


Andy Peters

Consider the usual audio converter box, with a delta-sigma chip, and the box has a word clock input. From the word clock input, we synthesize the modulator clock using a PLL. From that modulator clock, we also derive a bit clock for the I2S data shift register and an LRCLK which tells the shift register how to frame the serial data bits. One might presume that LRCLK indicates "the start of the sample frame," or the sampling instant.

Because of how the PLL works, the LRCLK divider output aligns with the incoming reference (the word clock). The BCLK divider is also phase-aligned with the incoming reference. (Of course LRCLK and the data must be synchronous with BCLK.)

Further, consider a black-box PLL which synthesizes the modulator clock from the incoming reference clock but doesn't give you access to the internal divider. In other words, you don't get BCLK and LRCLK from the divider for free. Instead, you must use an external divider to derive those two clocks. But without some means of resetting those external dividers, there's no way to ensure that the resulting LRCLK will line up with the input word clock. The edge of LRCLK could be anywhere within the sample period, on any edge of the 512x modulator clock.
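To make the free-running-divider problem concrete, here is a toy Python sketch (my own illustration, not code from any actual converter): a divide-by-512 counter turns the 512x modulator clock into LRCLK, and the position of its edge within the sample period depends entirely on whatever count it happened to power up with.

```python
# Behavioral sketch (not a hardware model): a free-running divide-by-512
# counter derives LRCLK from MCLK, but its phase depends entirely on the
# arbitrary count value it started at.
MCLK_PER_SAMPLE = 512  # 512x oversampled modulator clock

def lrclk_edges(start_count, n_ticks):
    """Return MCLK tick indices where the divider wraps (LRCLK rising edges)."""
    edges = []
    count = start_count
    for t in range(n_ticks):
        count = (count + 1) % MCLK_PER_SAMPLE
        if count == 0:
            edges.append(t)
    return edges

# Once the PLL is locked, WCLK rising edges land on MCLK ticks 0, 512, 1024, ...
# Two dividers that powered up with different counts disagree on the edge:
print(lrclk_edges(start_count=0,   n_ticks=2048)[:2])  # -> [511, 1023]
print(lrclk_edges(start_count=137, n_ticks=2048)[:2])  # -> [374, 886]
```

Both outputs run at exactly the word rate (edges are 512 ticks apart), but their phase relative to the reference is arbitrary, which is exactly the situation described above.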

The question: does it matter that the LRCLK used to drive the converters isn't aligned with the word clock?

My answer is simple: no.

But not for the reason you might think.

I have a MOTU 828 mk2 and another 8-channel ADC box. The two are connected via ADAT lightpipe. The 828 sources word clock to the ADC box, which slaves to it.

This led me to speculate: if I send the same signal, say a snare drum hit, to an input on the 828 and to an input on the ADC box, is there a delay between the two? What started me down this path was a non-obvious (to most) musing: the converters in the two units have different group delays, and since group delay is specified in samples (and is sometimes fractional!), it was obvious that even if we sample at the same instant, the propagation delays through the converters would differ, so the two signals would not line up.

In this set-up, there is also the latency in getting the samples into the ADAT serializer, and getting them back out of the ADAT deserializer in the 828, and then putting them into whatever buffer along with the samples from the 828's internal converters.

I did an experiment, feeding a snare sample into one channel of each converter box, recording, and looking at the waveforms in the DAW. Sure enough, the two signals were clearly offset by many samples. (I will re-create this experiment and post pictures.) As such, it is apparent that we can never sample the same signal with the two different boxes and have the samples align.

Knowing that this delay was non-zero and possibly significant, when I used this set-up to record live shows I always ensured that the drums went into one of the boxes and the other instruments into the other. That is, I didn't put the kick drum on the 828 and the snare on the other box.

And that's why I think the I2S LRCLK which ultimately feeds the converter chip doesn't need to be edge-aligned with the incoming word clock. Obviously it needs to be exactly the same frequency (the PLL's job), but the phase within the sample period doesn't matter.

Please, shoot holes in my hypothesis.

-a
 
Interesting.
If you send the snare sample into two different channels on the same converter, are they exactly aligned?
 
[You appear to be discussing observed behavior; here I'm talking about what I think should be]
Andy Peters said:
The question: does it matter that the LRCLK used to drive the converters isn't aligned with the word clock?
I'd split that question in two halves: short term (cycle-to-cycle) and long term (average).

On a cycle-to-cycle basis you really don't want a converter to track a possibly jittery WCLK; this is why you have a PLL.

On a long term average I would expect a converter/transmitter to have a constant phase offset between external WCLK and internal LRCLK. I would argue that a well-designed converter will work to minimize this offset, and that different devices from the same make/model should be expected to have identical phase offset (and latency) when fed with identical WCLK signals. Receivers should accept incoming bitstreams (AES/SPDIF) with arbitrary phase offset to WCLK.

(Maybe this is my phased array background speaking, but I cannot fathom why a converter would not abide by this)

Andy Peters said:
I have a MOTU 828 mk2 and another 8-channel ADC box. The two are connected via ADAT lightpipe. The 828 sources word clock to the ADC box, which slaves to it.

[...]

In this set-up, there is also the latency in getting the samples into the ADAT serializer, and getting them back out of the ADAT deserializer in the 828, and then putting them into whatever buffer along with the samples from the 828's internal converters. [...]
In a heterogeneous setup like yours, it is indeed reasonable to expect different latency in the paths, for the reasons you mention. I would hope this latency difference is constant (and thus correctable), though.

JDB.
 
jdbakker said:
Andy Peters said:
The question: does it matter that the LRCLK used to drive the converters isn't aligned with the word clock?
[You appear to be discussing observed behavior; here I'm talking about what I think should be] I'd split that question in two halves: short term (cycle-to-cycle) and long term (average).

On a cycle-to-cycle basis you really don't want a converter to track a possibly jittery WCLK; this is why you have a PLL.

On a long term average I would expect a converter/transmitter to have a constant phase offset between external WCLK and internal LRCLK. I would argue that a well-designed converter will work to minimize this offset, and that different devices from the same make/model should be expected to have identical phase offset (and latency) when fed with identical WCLK signals. Receivers should accept incoming bitstreams (AES/SPDIF) with arbitrary phase offset to WCLK.

I think you're misunderstanding my question. Assume that it's an ADC design, so there is no AES/SPDIF receiver. We would like to sync this ADC to some external word clock source.

LRCLK is one of the I2S inputs to the converter. It is an output of the PLL, divided down from the VCO output (MCLK). It's not the raw word clock input. 

That said, in the usual sort of PLL (think 4046 and its friends, and of course you know this), lock is achieved when LRCLK aligns with the word clock. We divide the VCO output down back to the word clock rate and use that divided signal as an input to the phase comparator. It's convenient that we need a word-rate clock (LRCLK) to drive our converters. This addresses your cycle-to-cycle concern, as we simply don't use the word clock for anything other than the PLL reference clock input.

Regarding your long-term concern, in the 4046-type PLL, the phase offset between WCLK and the generated LRCLK is zero when locked.
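As a toy illustration of why that is (a deliberately simplified bang-bang model of my own, not a model of the actual 4046 phase comparator): because the comparator sees the divided-down VCO output, i.e. LRCLK itself, any phase offset between LRCLK and WCLK produces a correction, and the loop only stops correcting when the offset has been driven to zero.

```python
# Toy bang-bang sketch (an assumption for illustration, not any real chip):
# the phase comparator sees the feedback divider's output (our LRCLK) and
# slews it one MCLK tick per word period until it sits on the WCLK edge.
def settle(offset_ticks, max_periods=1024):
    """Slew the feedback divider's phase (in MCLK ticks relative to the
    WCLK edge) toward zero, one tick per word period."""
    for _ in range(max_periods):
        if offset_ticks == 0:
            break  # locked: LRCLK edge sits on top of the WCLK edge
        # comparator output: LRCLK late -> swallow a tick, early -> stretch
        offset_ticks += -1 if offset_ticks > 0 else 1
    return offset_ticks

print(settle(137))   # -> 0
print(settle(-300))  # -> 0
```

Whatever phase the divider starts at, the locked state is zero offset, because zero offset is the only state where the comparator stops pushing.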

But what if my PLL is, as I suggested, a black box where you don't get access to the internal divider, so you must use an external divider to generate LRCLK? That LRCLK may have some phase offset from the original WCLK input, unlike the 4046 design.

Does that phase offset ultimately matter? The effect is that the ADC which sources the word clock has a different sample instant than the one which receives it, but we are talking at most one sample period, about 20.8 µs at 48 kHz.

Ideally, as you suggest, that phase offset would be constant, not only with every power cycle or resync, but also the same from unit to unit (two ADC boxes have the same offset).



I can imagine some mechanism to ensure that my external-divider-generated LRCLK is phase-aligned with the incoming word clock. A few lines of VHDL, I think, in a small CPLD.
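For what it's worth, here is the behavior those few lines of HDL would implement, sketched as a Python stand-in (function and signal names are my own invention): detect the word-clock edge and synchronously reload the external divider, so every LRCLK edge lands on a WCLK edge regardless of how the counter powered up.

```python
# Behavioral sketch of the fix: instead of free-running, the external
# divider reloads on each WCLK rising edge (in real hardware, a WCLK edge
# detected and re-timed in the MCLK domain drives a synchronous reset).
MCLK_PER_SAMPLE = 512

def aligned_lrclk_edges(start_count, wclk_edges, n_ticks):
    """Divider that reloads on each WCLK edge instead of free-running."""
    edges = []
    count = start_count
    wclk = set(wclk_edges)
    for t in range(n_ticks):
        if t in wclk:
            count = 0          # synchronous reload on the reference edge
            edges.append(t)    # LRCLK edge now coincides with WCLK
        else:
            count = (count + 1) % MCLK_PER_SAMPLE
            if count == 0:
                edges.append(t)
    return edges

wclk = [0, 512, 1024, 1536]
# Regardless of the divider's power-up count, the edges line up with WCLK:
print(aligned_lrclk_edges(137, wclk, 2048))  # -> [0, 512, 1024, 1536]
print(aligned_lrclk_edges(499, wclk, 2048))  # -> [0, 512, 1024, 1536]
```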
 
Common practice is not to split like instruments across different converters, as you do get the phase issues mentioned. I avoid this even when it's the same brand of converter.

Imo wclk is outdated and it makes more sense to sync to a higher frequency these days.
 
john12ax7 said:
Common practice is not to split like instruments across different converters, as you do get the phase issues mentioned. I avoid this even when it's the same brand of converter.

That's what I assume is standard practice.

john12ax7 said:
Imo wclk is outdated and it makes more sense to sync to a higher frequency these days.

Distributing the higher-frequency modulator clock can make sense; after all, we run gigabit rates down cheap twisted-pair cable all the time. But I suppose that makes my question moot, in the sense that I'm asking whether it matters that the start of the sample in unit A isn't exactly the same as the start of the sample in unit B.

And the duck just dropped down from the ceiling with the magic word, which of course is "skew" between channels.

Now to combine the two halves of your response: I run into digital mixing consoles that have expansion units to provide for more inputs than come standard. I have to believe that the communication between the remote expansion unit and the main processing unit, over Cat-6 cable or whatever, has a known latency and that the guys designing the mixer recognize that having all samples occur at the same time with zero latency or skew between ALL inputs and ALL outputs is paramount.

-a
 
If everything is properly compensated, then it theoretically shouldn't matter. And in practice you could probably be off a little and it still wouldn't make a big difference.

So I would say it should be identical with both converters, but it doesn't necessarily need to be for many applications.
 