Receiver Tolerance Testing – with crosstalk!
As I said in my last post, “Deterministic Jitter for High Speed Serial Receiver Tolerance Testing,” the idea behind stressed-receiver tolerance testing is to subject a receiver to the worst case compliant input signal and see if it can still operate at or below the maximum allowed BER (bit-error rate), usually 1E-12.
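To get a feel for what a BER of 1E-12 implies for test time, here’s a quick back-of-the-envelope calculation. This is my own sketch, not part of any specification; the 10.3125 Gbit/s line rate is just the familiar 10GBASE-R example.

```python
# Back-of-the-envelope: how long must we watch one lane to be
# ~95% confident of catching at least one error if the true BER
# is exactly the 1e-12 target?
import math

bit_rate = 10.3125e9   # example 10GBASE-R line rate, bits per second
ber_target = 1e-12

# From P(no errors in n bits) = exp(-n * BER), requiring 95% confidence
# of seeing an error gives n = -ln(0.05) / BER, roughly 3 / BER bits.
bits_needed = -math.log(1 - 0.95) / ber_target
seconds = bits_needed / bit_rate

print(f"bits needed: {bits_needed:.2e}")
print(f"test time:   {seconds:.1f} s")
```

Roughly three trillion bits, or a few hundred seconds per lane at this rate, which is why BER testing dominates receiver test time.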
The thinking is that if the receiver can operate at the maximum allowed BER under worst case conditions, then it can work in any conditions with even the cheapest imaginable (but compliant) transmitters and channels. We’re done! Book my flight, I’m taking it to manufacturing.
That statement, “worst case compliant input signal,” isn’t, however, very well defined. Given that a technology specification confines design to a finite volume of the multi-dimensional space of restricted parameters, how can we possibly know which compliant signal is worst for a given receiver? But we have to do something and the idea of subjecting a receiver to a “worst case, but compliant” signal sounds good, even if it just flirts with reality.
In the more elaborate high speed serial specifications, receivers are tested with input signals that include a combination of some or all of: a stressful test pattern, random jitter and noise, inter-symbol interference, sinusoidal jitter, spread spectrum clocking, bounded uncorrelated jitter, and, coming soon to a technology dear to you: crosstalk.
Receiver tolerance testing in 40/100 Gigabit Ethernet (which is composed of parallel differential lanes: 4 × 10 Gbit/s for 40 Gbit/s; 10 × 10 Gbit/s or 4 × 25 Gbit/s for 100 Gbit/s) provides two separate tests, both of which include crosstalk stress.
Using more than one test is a better way to constrain “worst case but compliant.” A given design has to survive both cases and a given receiver will find one worse than the other. Of course, having two tests can also allow greater design freedom. Any time a hard, single guideline is imposed, design space is eliminated.
Two “interference tolerance tests” determine the receiver’s ability to tolerate a combination of jitter and crosstalk. Both tests have the same levels of sinusoidal jitter and random jitter, but one has large crosstalk with moderate insertion loss while the other has moderate crosstalk and large insertion loss. As each lane’s receiver is tested, the tests require crosstalk on all of the other lanes. The aggressors (called disturbers in 40/100 GbE jargon) must operate at the prescribed voltage swing and rise/fall times.
The crosstalk stress is defined as “integrated crosstalk noise,” which I describe in an application note, “Critical 100 Gigabit Ethernet Testing: Equalization, DP-QPSK, and Crosstalk Compliance.” ICN (integrated crosstalk noise) combines the NEXT (near-end crosstalk) and FEXT (far-end crosstalk) noise from all the aggressors. The NEXT and FEXT of each aggressor is calculated from DC to the Nyquist frequency, assuming the peak disturber differential output amplitude and a minimal receiver response.
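To make the power-sum idea concrete, here’s a simplified sketch of an ICN-style calculation. The coupling responses, amplitudes, and frequency grid are all hypothetical stand-ins for measured S-parameters, and the real 802.3ba ICN definition adds weighting functions for the transmitter and receiver responses, so treat this as an illustration of the idea, not the compliance formula.

```python
# Simplified sketch of an integrated-crosstalk-noise (ICN) style
# calculation: power-sum the NEXT and FEXT coupling magnitudes of
# every aggressor from (near) DC to Nyquist, scale by the disturbers'
# peak differential amplitude, and report an RMS noise voltage.
# Illustration only; NOT the exact 802.3ba weighted formula.
import numpy as np

f_nyquist = 5.15625e9                    # Nyquist for a 10.3125 GBd lane
f = np.linspace(1e6, f_nyquist, 2000)    # uniform frequency grid (Hz)

# Hypothetical coupling magnitudes (linear scale) for two NEXT and
# two FEXT aggressors, standing in for measured crosstalk S-parameters.
next_mags = [10**(-30/20) * np.ones_like(f),       # flat -30 dB coupling
             10**(-35/20) * (f / f_nyquist)]       # coupling rising with f
fext_mags = [10**(-40/20) * (f / f_nyquist),
             10**(-45/20) * (f / f_nyquist)]

a_next = 0.8   # assumed peak differential amplitude of NEXT disturbers (V)
a_fext = 0.8   # assumed peak differential amplitude of FEXT disturbers (V)

def power_sum(mags, amplitude):
    # Treat each aggressor's coupled noise as independent: sum the
    # band-averaged |H|^2 of every aggressor (uniform grid, so the
    # mean equals the integral divided by the bandwidth).
    return amplitude**2 * sum(np.mean(m**2) for m in mags)

icn = np.sqrt(power_sum(next_mags, a_next) + power_sum(fext_mags, a_fext))
print(f"ICN (rms): {icn * 1e3:.2f} mV")
```

Note how the NEXT terms dominate here: near-end coupling is typically flatter and stronger, which is why the two interference tolerance tests trade crosstalk against insertion loss rather than lumping everything together.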
It’s important that the aggressors carry uncorrelated test signals. That is, the test pattern on each disturber has to be unique. If they’re the same, the crosstalk from different aggressors arrives with fixed relative timing and interferes coherently. For example, two aggressors with little skew (or whose skew is a multiple of the bit period) will jostle the victim at the same time, every time, piling up the noise more than in a real system.
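A toy numerical experiment shows the pile-up. This is my own sketch with made-up parameters: crosstalk is crudely modeled as a scaled copy of each aggressor’s bit-to-bit transitions. With identical patterns the coupled noise adds by voltage, roughly in proportion to the number of aggressors; with independent patterns it adds by power, roughly as the square root.

```python
# Toy demonstration of why disturber patterns must be uncorrelated:
# sum the crosstalk from 4 aggressors at the victim, once with
# identical data patterns and once with independent random patterns,
# and compare the RMS coupled noise.
import numpy as np

rng = np.random.default_rng(0)
n_bits, n_aggressors, coupling = 10_000, 4, 0.05

def rms_noise(patterns):
    # Crosstalk couples mainly through signal edges; model each
    # aggressor's contribution as its bit-to-bit transitions
    # (np.diff) scaled by a fixed coupling coefficient.
    noise = sum(coupling * np.diff(p) for p in patterns)
    return np.sqrt(np.mean(noise**2))

one_pattern = rng.choice([-1.0, 1.0], n_bits)
same = [one_pattern] * n_aggressors                                  # correlated
indep = [rng.choice([-1.0, 1.0], n_bits) for _ in range(n_aggressors)]

print(f"identical patterns,   rms noise: {rms_noise(same):.3f}")
print(f"independent patterns, rms noise: {rms_noise(indep):.3f}")
```

With four aggressors the correlated case comes out about twice (√4 times) the uncorrelated case, which is exactly the unrealistic over-stress the unique-pattern requirement is meant to avoid.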