Signal integrity is about ensuring that the ones and zeros transmitted arrive at the receiver as intended, while power integrity is about delivering enough clean current to the drivers and receivers that send and receive those ones and zeros. Keeping the data paths and power supply clean while minimizing crosstalk, noise, jitter, and inter-symbol interference (ISI) helps ensure proper circuit operation.
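To make ISI concrete, here is a toy sketch (not tied to any particular tool or standard; the channel tap values are invented for illustration) of how a dispersive interconnect smears the tails of earlier bits into later ones, pulling received levels toward the decision threshold:

```python
def transmit(bits, taps):
    """Convolve an ideal NRZ bit stream with a channel impulse response.

    Each received sample is a weighted sum of the current symbol and the
    symbols that came before it -- the lingering "tails" of earlier
    symbols are the inter-symbol interference.
    """
    levels = [1.0 if b else -1.0 for b in bits]  # ideal +/-1 V symbols
    out = []
    for i in range(len(levels)):
        acc = 0.0
        for k, tap in enumerate(taps):
            if i - k >= 0:
                acc += tap * levels[i - k]
        out.append(acc)
    return out

# Hypothetical lossy channel: main cursor 0.7, post-cursors 0.2 and 0.1
taps = [0.7, 0.2, 0.1]
rx = transmit([1, 1, 0, 1, 0, 0], taps)

# The '0' following two '1's is degraded by ISI: it lands at
# -0.7 + 0.2 + 0.1 = -0.4 V instead of the ideal -1.0 V.
```

The lone zero in a run of ones is exactly the pattern-dependent case that equalization (and the measurements discussed below) exists to handle.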
In high-performance systems, including those based on multi-die architectures, there are interconnects between the package, substrate, PCB, and backplane. All of these need to be assessed for signal and power rail quality; otherwise, there’s the risk of failure. Ideally, any issues should be detected via pre-silicon analysis. Waiting until you’re at the bring-up lab to find SI and PI problems is simply too late in the design project.
However, the combination of today’s faster data rates and more complex protocols is making it harder to comply with SI and PI requirements. Consider DDR memories as an example. As each generation delivers faster data transfer rates, newer and more complex equalization-based measurements have emerged. DDR5, for instance, calls for testing parameters such as jitter sensitivity, voltage sensitivity, stressed eye tests, and loopback output timing to comply with the standard.
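The pass/fail logic behind an eye-opening measurement can be sketched in a few lines. This is a simplified illustration only: the sample values and the minimum-opening threshold below are invented, not taken from the DDR5 specification, and a real stressed-eye test involves far more (injected jitter, voltage stress, and mask geometry):

```python
def eye_height(one_samples, zero_samples):
    """Worst-case vertical eye opening at the sampling instant (volts):
    the gap between the lowest received 'one' and the highest received
    'zero'. Noise and ISI shrink this gap."""
    return min(one_samples) - max(zero_samples)

# Hypothetical receiver samples, already degraded by noise and ISI
ones = [0.62, 0.58, 0.71, 0.55]
zeros = [-0.60, -0.52, -0.66, -0.57]

HYPOTHETICAL_MIN_EYE = 0.9  # illustrative requirement, in volts

opening = eye_height(ones, zeros)      # 0.55 - (-0.52) = 1.07 V
passes = opening >= HYPOTHETICAL_MIN_EYE
```

Simulation-based SI tools apply checks of this flavor across millions of bits and corner conditions, which is exactly the kind of analysis that is impractical to defer to the bring-up lab.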
What’s needed to overcome these challenges are smarter, more automated chip design and signal and power integrity analysis tools. With such tools, design engineers can uncover system-level design issues earlier, during the simulation phase, minimizing the risk of chip defects stemming from SI and PI bugs.