Measuring ISI at high data rates is impossible
The frequency and loss response of a channel varies with the local frequency content of the signal passing through it. Symbols effectively interfere with each other because their frequency content depends on their neighbors. At high data rates, traces on a PCB behave like absurdly complex waveguides. Digital signals become electromagnetic waves propagating through the FR-4 dielectric with just a tenuous grasp of the conducting trace. The complexity is reflected in their messy frequency responses, but at a glance, the most pronounced effect is that traces respond like low-pass filters (really horrible low-pass filters).
Consider two extremes. First, a long sequence of 1s followed by a long sequence of 0s. At the transition, the channel's low-pass nature stretches the fall time. If the voltage hasn't dropped below the midpoint of the swing by the sampling instant, roughly half a bit period after the transition, the first 0 will be mistaken for a 1 - a bit error. Eventually, though, the voltage swings all the way to the lower rail.
Now consider the other extreme, a long alternating sequence: 1010 1010 and so on. As the signal transitions from high to low to high (etc.), it oscillates about the vertical center of the eye but never reaches the rails. As long as the swing exceeds the sensitivity of the slicer, there shouldn't be any errors.
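To see both extremes at once, here's a minimal sketch that runs the two patterns through a one-pole RC low-pass standing in for a lossy channel. The cutoff-to-bit-rate ratio and oversampling factor are made-up illustration values, not measurements of any real trace.

```python
import math

def lowpass_run(bits, samples_per_bit=32, fc_over_bitrate=0.2):
    """Filter an NRZ bit pattern (+1/-1) through a one-pole low-pass."""
    # Discrete one-pole IIR: y += alpha * (x - y), with the cutoff set to
    # fc_over_bitrate times the bit rate (an illustrative assumption).
    alpha = 1 - math.exp(-2 * math.pi * fc_over_bitrate / samples_per_bit)
    y, out = 0.0, []
    for b in bits:
        x = 1.0 if b else -1.0
        for _ in range(samples_per_bit):
            y += alpha * (x - y)
            out.append(y)
    return out

long_runs   = [1] * 20 + [0] * 20   # settles all the way to the rails
alternating = [1, 0] * 20           # never reaches the rails
peak_runs = max(abs(v) for v in lowpass_run(long_runs))
peak_alt  = max(abs(v) for v in lowpass_run(alternating))
print(peak_runs > peak_alt)  # True: the alternating pattern swings less
```

The long runs settle essentially to the rails, while the alternating pattern never gets there; whether the reduced swing still clears the slicer's sensitivity decides whether it causes errors.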
Transitions of a perfect digital signal all look the same, but on the complex waveguides we call "circuits," "interconnects," or even "cables," that low-pass nature is only a weak approximation of the actual transfer function. At each transition, the waveform follows a trajectory that depends on the values of some number of preceding bits, and in addition to causing bit errors, the variation in those trajectories can confuse the clock-recovery circuit and/or cause baseline drift.
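One way to picture this dependence: sample the channel's pulse response once per bit period and treat the sampled value of each bit as a weighted sum of that bit and its neighbors. The tap values below are hypothetical, chosen only to show that the same 1 lands at different voltage levels depending on the bits that came before it.

```python
# Hypothetical pulse response sampled once per bit period; taps[1] is the
# main cursor, taps[0] a precursor, taps[2:] postcursors. Values invented.
taps = [0.05, 0.6, 0.2, 0.1]

def sampled(bits):
    """Sampled value of each NRZ bit (+1/-1) through the tapped channel."""
    x = [1.0 if b else -1.0 for b in bits]
    out = []
    for i in range(len(x)):
        acc = 0.0
        for k, t in enumerate(taps):
            j = i - k + 1  # align the main cursor taps[1] with bit i
            if 0 <= j < len(x):
                acc += t * x[j]
        out.append(acc)
    return out

# The same final 1 lands at different levels depending on its history:
print(round(sampled([0, 0, 0, 1])[-1], 3),
      round(sampled([1, 1, 1, 1])[-1], 3))  # prints 0.3 0.9
```

A slicer thresholding at 0 still calls both samples a 1, but the margin differs by a factor of three, and that history-dependent wander is exactly what stresses the clock recovery and baseline.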
A complete analysis of ISI, and of how receivers tolerate it, would require a test pattern that includes every permutation of bits capable of producing a unique transition trajectory.
How long does the test pattern have to be?
The number of adjacent symbols that interfere with a given symbol corresponds to the length of the channel's pulse response. A specific pulse response is shown in the figure. By pulse, I mean a single logic 1 buried in a long stretch of 0s. I'm not referring to the impulse response, although you can get the pulse response from the impulse response by convolving it with a rectangular pulse one bit period wide, or from the S-parameters by running them through channel-simulation software. The duration of the smearing of a single bit betrays how many other bits a given bit can affect.
A transmission channel acts like a low-pass filter, distorting pulses as they pass.
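That convolution is easy to sketch. The impulse response below is a made-up decaying exponential, not a measured channel; the point is only that convolving it with a one-bit-wide rectangular pulse yields a pulse response that smears across several bit periods.

```python
import math

samples_per_bit = 8
# Hypothetical impulse response: a decaying exponential, not a real channel.
h = [math.exp(-n / 10) for n in range(64)]
rect = [1.0] * samples_per_bit  # a single bit: high for one unit interval

def convolve(a, b):
    """Plain discrete convolution, full length."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

pulse = convolve(h, rect)
peak = max(pulse)
# Count how many bit periods the pulse stays above 10% of its peak.
width_bits = sum(1 for v in pulse if v > 0.1 * peak) / samples_per_bit
print(width_bits > 2)  # True: one bit smears across several bit periods
```

With these made-up numbers the pulse stays above a tenth of its peak for a few bit periods, so any given bit meaningfully disturbs several of its neighbors.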
Because a channel's pulse response has a fixed duration, more symbols interfere as data rates rise. For example, a typical FR-4 backplane pulse response lasts a few ns. At 2.5 Gb/s that's less than ten bits, but at 25 Gb/s it's over 75 bits. To "perform a complete analysis," you'd need a test pattern that includes every permutation of 75 bits, which means a pattern at least 2^75 bits long. Such a pattern would be absurdly long. In fact, at 25 Gb/s it would take almost 48,000 years to transmit.
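The arithmetic behind that figure, assuming the pattern is packed as tightly as possible so that every one of its 2^75 bits starts a fresh 75-bit window (a de Bruijn-style sequence, the shortest exhaustive pattern):

```python
# Back-of-the-envelope arithmetic from the text.
bits_in_pattern = 2 ** 75  # shortest sequence containing every 75-bit window
bitrate = 25e9             # 25 Gb/s
seconds = bits_in_pattern / bitrate
years = seconds / (365.25 * 24 * 3600)
print(f"{years:,.0f} years")  # roughly 48,000 years
```

Note this is the optimistic lower bound; naively concatenating all 2^75 distinct 75-bit words would make the pattern another 75 times longer.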
Maybe I'm under- or overestimating humanity, but I reckon that in 48,000 years we're either transmitting at much higher rates or we're long gone. Alternatively, if you can convince your boss that it's necessary, you and a couple thousand generations of your progeny will have job security. Sweet.