How do I know if my simulation correlates to reality?

September 07, 2015

The attention the major test and measurement companies paid to comparing simulation and measurement stood out at this year’s PCI-SIG (Peripheral Component Interconnect Special Interest Group) DevCon. Both Keysight and Tektronix focused their entire press briefings on the topic.

So I asked around.

As simulation has become recognized as the better, cheaper, faster way to design PCBs and circuits, the question of its accuracy has become ever more important. That question is usually phrased: “How do I know if my simulation correlates to reality?”

Never one to mince words, Al Neves, Chief Technologist at Wild River Technology, suggests, “Don’t trust the EDA [electronic design automation] tool, all of them have issues and user problems.”


Figure 1 Example of simulation-measurement comparison, courtesy of Wild River Technology.

A measurement-simulation comparison should put the simulation on trial, not the test setup and not the analysis implementation.

To make sure that we get what we’re after, apples to apples, dust to dust, Sarah Boen, Solutions Marketing Manager at Tektronix, says, “Either remove, or ‘de-embed,’ the effects of the measurement circuit—like the scope, probes, cables, breakout channels—that are not in the simulation or include models of them in the simulation. In either case, you need to measure the S-parameters of the measurement circuit itself. The S-parameters give the phase and frequency response of transmission, reflection, and, if you measure everything on the board, crosstalk. Your test equipment should be able to de-embed the effects or your simulation can include them in the model.”
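The de-embedding step Boen describes can be sketched, one frequency point at a time, with the textbook 2×2 S-to-T (cascade) matrix conversions. This is a generic illustration of the math, not Tektronix’s implementation; the function names and the example fixture values below are mine.

```python
import numpy as np

def s_to_t(s):
    """Convert a 2x2 S-matrix to a cascade (T) matrix.
    Convention: [b1, a1] = T [a2, b2], so cascaded networks
    multiply left-to-right: T_total = T_A @ T_B."""
    s11, s12 = s[0, 0], s[0, 1]
    s21, s22 = s[1, 0], s[1, 1]
    return (1.0 / s21) * np.array([[s12 * s21 - s11 * s22, s11],
                                   [-s22, 1.0]])

def t_to_s(t):
    """Convert a cascade (T) matrix back to a 2x2 S-matrix."""
    t11, t12 = t[0, 0], t[0, 1]
    t21, t22 = t[1, 0], t[1, 1]
    return np.array([[t12 / t22, t11 - t12 * t21 / t22],
                     [1.0 / t22, -t21 / t22]])

def de_embed(s_meas, s_fix_in, s_fix_out):
    """Strip measured input/output fixture S-parameters from a
    measured 2-port, leaving the DUT, at one frequency point."""
    t_dut = (np.linalg.inv(s_to_t(s_fix_in)) @ s_to_t(s_meas)
             @ np.linalg.inv(s_to_t(s_fix_out)))
    return t_to_s(t_dut)
```

In practice you run this across every frequency point of the measured S-parameter sweep; production tools also handle 4-port and mixed-mode cases, which this single-ended sketch ignores.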


Figure 2 The slide that Tektronix presented to the PCI-SIG press briefing (hacked by Ransom, Copyright 2015 PCI-SIG).

If de-embedding puts you uncomfortably close to the test equipment’s noise floor, I’d put the measurement circuit response in the simulation.

Heidi Barnes, World Wide Applications Engineer for High Speed Digital of Keysight Technologies, says, “It’s important to study the trade-offs and differences between simulation and measurement. Sometimes it is as simple as going back to school on sampling theory and realizing that simulators, real-time scopes, and sampling oscilloscopes all have different approaches to band-limited data. At high frequencies, it is amazing how many people forget the realities of reflections and the importance of clearly defined reference planes for both measurements and simulations. Verification in both the time, TDR/TDT, and frequency, S-parameter, domains is critical to make sure that the simulation set-up matches the measurement set-up.”
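One way to internalize Barnes’s point about band-limited data is the classic rise-time rule of thumb, t_r ≈ 0.35/f3dB. The sketch below pushes an ideal step through a single-pole low-pass, a crude stand-in for an instrument’s bandwidth limit (real scope responses are higher-order), and measures the 10–90% rise time:

```python
import numpy as np

def rise_time_after_bandlimit(f3db_hz, fs_hz=1e12, n=2000):
    """10-90% rise time of an ideal step after a single-pole
    low-pass filter with the given -3 dB frequency."""
    t = np.arange(n) / fs_hz
    y = 1.0 - np.exp(-2 * np.pi * f3db_hz * t)  # one-pole step response
    t10 = t[np.searchsorted(y, 0.1)]  # first sample at/above 10%
    t90 = t[np.searchsorted(y, 0.9)]  # first sample at/above 90%
    return t90 - t10
```

A 1 GHz bandwidth yields roughly a 350 ps edge regardless of how fast the actual signal is, which is exactly the kind of mismatch that appears when a simulator’s bandwidth assumptions differ from the scope’s.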


Figure 3 The slide that Keysight presented to the PCI-SIG press briefing (hacked by Ransom, Copyright 2015 PCI-SIG).

Make sure that you use the exact same software for analyzing measured and simulated data. Boen recommends staying true to the AMI component of your IBIS/AMI models. “Make sure that you have the same model in the scope and in the simulator. For example, the reference receiver equalizer should be identical to the silicon implementation—not just in method but in parameterization—if they aren’t the same, you won’t know what you’re comparing.”

Once we believe that the simulation and measurement are analyzing identical systems, it’s time to compare.

“One should always have a ‘calibration’ technique for both measurements and simulations,” Barnes says. “My favorite is to start simple and just verify that the simulator or measurement calibration can accurately handle phase and frequency for Insertion Loss and Reflections by looking at a series resonant impedance discontinuity like the Beatty standard we presented in a DesignCon tutorial last year with Wild River Technology and Speed Edge.”
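To see why a Beatty-style discontinuity makes a good calibration structure, here is a lossless sketch using textbook ABCD parameters for a 25 Ω section embedded in a 50 Ω system. The length and effective permittivity are illustrative values I chose, not the dimensions of the DesignCon structure:

```python
import numpy as np

C0 = 299_792_458.0  # speed of light, m/s

def beatty_s11(freq_hz, z_line=25.0, z_ref=50.0,
               length_m=0.0254, eps_eff=4.0):
    """|S11| of a lossless low-impedance line section in a z_ref
    system, from the transmission line's ABCD parameters."""
    v = C0 / np.sqrt(eps_eff)
    beta_l = 2 * np.pi * freq_hz * length_m / v
    a = np.cos(beta_l)
    b = 1j * z_line * np.sin(beta_l)
    c = 1j * np.sin(beta_l) / z_line
    d = np.cos(beta_l)
    denom = a + b / z_ref + c * z_ref + d
    return np.abs((a + b / z_ref - c * z_ref - d) / denom)
```

The reflection peaks at |S11| = 0.6 at the quarter-wave frequency and drops to a null at every half-wave multiple, so both the magnitude and the frequencies of those features give you sharp, predictable targets for checking phase and frequency accuracy.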

Personally, I’m with Boen. She likes to go for the jugular and start with BER (bit error ratio) analysis. It probably won’t work on your first pass, but the odds that measured and simulated horizontal and vertical bathtub plots, BER(t) and BER(V), would align without genuine simulation-measurement agreement are lottery-like. Boen says, “If the bathtub plots are close, then look deeper, compare a range of transmitter output settings to look at the trend between the measured and simulation data and different channel models.”
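For readers who haven’t computed one, a horizontal bathtub curve can be sketched under a purely Gaussian random-jitter assumption; real links add deterministic jitter, so this only shows the shape of the thing being compared, not a model you should correlate against:

```python
import math

def bathtub(t_ui, rj_sigma_ui=0.02, ui=1.0):
    """Horizontal bathtub BER(t) for purely Gaussian random jitter.
    t_ui: sampling position across the unit interval, 0..1.
    Each crossing edge contributes 0.5*erfc(dist / (sqrt(2)*sigma))."""
    left = 0.5 * math.erfc(t_ui / (math.sqrt(2) * rj_sigma_ui))
    right = 0.5 * math.erfc((ui - t_ui) / (math.sqrt(2) * rj_sigma_ui))
    return left + right
```

The walls fall off by decades per tiny fraction of a unit interval, which is why matching bathtub plots is such a demanding test: any disagreement between measured and simulated jitter statistics shows up magnified on a log-BER axis.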

If overlaid plots of measured and simulated data agree qualitatively, it’s worth some statistical analysis. You can estimate the uncertainty in your measured data by repeating measurements and propagating known factors, like detector noise, through the analysis algorithms. Remember, transmitter equalization (e.g., de-emphasis) and receiver CTLE (continuous time linear equalization) both amplify noise. Once you have error bars, you can run a chi-squared test and make a rigorous quantitative assessment of simulation accuracy.
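The chi-squared step is a few lines once the error bars exist. A minimal sketch, with names of my own choosing:

```python
import numpy as np

def reduced_chi_squared(measured, simulated, sigma, n_fit_params=0):
    """Reduced chi-squared between measured and simulated samples.
    sigma: one-standard-deviation error bars on the measured points,
    from repeated measurements plus propagated noise terms.
    A value near 1 means the simulation agrees with the data to
    within its uncertainty; much greater than 1 means it doesn't."""
    residuals = ((np.asarray(measured) - np.asarray(simulated))
                 / np.asarray(sigma))
    dof = residuals.size - n_fit_params  # degrees of freedom
    return float(np.sum(residuals ** 2) / dof)
```

Subtract one degree of freedom for each parameter you tuned to make the simulation fit, otherwise the statistic flatters the model.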

And remember what Neves says, “Verify heavily.”


