Sjoerd Op ’t Land is a researcher and high-speed specialist at Technolution. Gerard Rauwerda and Erwin Kerkdijk are business developers at Technolution.

21 October

Increasing digitization and the corresponding security requirements, smart industry and the enormous growth of cloud applications and data centers are generating a growing need for high data speeds. Other trends are feeding the appetite for bandwidth, too, such as more online communication and remote working. But the limits of what’s physically possible are coming into sight and there’s no bending the laws of nature. How do we protect signal integrity and still realize high speeds – and all this within acceptable project budgets?

By far the best way to transfer large quantities of data is with light. Communication through fiber-optic cables can achieve dazzling speeds, up to hundreds of terabits per second in test situations, and the energy loss in glass fibers is minimal. While light is the ideal vehicle for data, data journeys always begin and end with electricity – dozens of centimeters of PCB traces, with bumps and transitions. This is where the bandwidth bottleneck is situated.

High-speed data transfer across electrical connections faces many challenges. Ideally, zeros and ones are transmitted by sending low and high voltage pulses, respectively, across a copper trace and the same voltage appears at the end of the trace so that the information arrives in good order. In reality, things are more complicated. The faster the bits follow each other, the more the signal begins to behave like an electromagnetic wave. Bends, changes of layers and connectors cause echoes or reflections, as a result of which successive bits begin to interfere with each other. The rapid voltage fluctuations for the zeros and ones are partially lost in the form of heat, and they may also interfere with adjacent transmissions due to crosstalk. These effects all make it more difficult for the receiver to distinguish between zeros and ones. The combination of these (and other) features constitutes the signal integrity (SI) of a connection.
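As a rough illustration of the reflection effect described above (our own sketch, not from the article): when a trace of impedance Z1 meets a discontinuity of impedance Z2, the standard transmission-line formula Γ = (Z2 − Z1)/(Z2 + Z1) gives the fraction of the wave amplitude that echoes back. The 65-ohm via value below is a hypothetical example.

```python
def reflection_coefficient(z1: float, z2: float) -> float:
    """Fraction of the incident wave amplitude reflected at an
    impedance step from z1 to z2 (standard transmission-line formula)."""
    return (z2 - z1) / (z2 + z1)

# Hypothetical example: a nominally 50-ohm trace meeting a 65-ohm via transition
gamma = reflection_coefficient(50.0, 65.0)
print(f"reflected fraction: {gamma:.3f}")  # ~0.130, i.e. roughly 13% echoes back
```

A perfectly matched transition (Z1 = Z2) reflects nothing; the larger the mismatch, the stronger the echo that interferes with the bits that follow.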

Designing a high-speed system affects SI in many ways, both positively and negatively. The length, shape and distance between circuits, the connectors used, the mix of serial and parallel, and the quality of the power supply (power integrity or PI) all play a role. Not to mention environmental factors during use, such as temperature (the system itself also generates heat) and humidity. Or the fact that no chip or circuit board is exactly the same. There are more than twenty designable system components in total that could impact SI and PI. That’s a lot of factors to control – and each one of them is enough to occupy a PhD student for a couple of years.

Simulate, measure, demonstrate

At Technolution Advance in Gouda, we’ve been developing high-speed applications for our clients for years. We combine this practical experience with our own research into optimizing the SI/PI of electrical data connections. We share the new knowledge and insights this yields with our clients, helping them stay ahead of the market with their innovations.

Simulations are one of the tools we use in our research into high-speed connections. They offer insight and make it possible to perform sensitivity analyses that can’t be done through measurement. A simulator can calculate the behavior of a circuit board and connectors based on Maxwell’s equations of electromagnetism. Models of chips determine whether bits arrive intact at their destination. Thanks to these simulations, we can decide pre-layout what the best architecture is and what type of material should be used. But we also use simulations post-layout, to check that the architecture as realized still offers sufficient SI. We then measure to verify that our simulations include the correct physical factors, which means the simulations’ reliability is enhanced after each measurement.

Technolution Advance’s high-speed demonstrator. The breadboard is a testing ground for varying connection paths and possible disruptive factors. The measurements of the self-evaluating FPGA (Virtex Ultrascale+) and the results of the vector network analysis are compared to the simulations (Keysight ADS). In this way, the simulation models for the SI/PI become more and more refined.

We develop demonstrator projects to thoroughly test the theory. One of our tools is an FPGA-based demonstrator that can transport up to 50 Gbit/s per lane. The FPGA can visualize the SI on the chip using an eye pattern. In addition, we’ve designed a breadboard, a large circuit board with almost 200 different structures whose impact on the SI we want to examine. The breadboard can be linked to the FPGA so that we can actually measure the end-to-end SI. If the simulation is correct, the eye pattern of the measurement will be identical to that of the simulation. This helps us define our best practices, which can then be deployed straightaway in projects for clients.
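To illustrate the eye-pattern idea in a few lines (a simplified NumPy sketch with hypothetical names, not the FPGA’s actual on-chip implementation): the received waveform is sliced into unit intervals and the slices are overlaid. Noise, reflections and jitter blur the overlay and “close” the eye, which is why the eye opening is such a direct measure of SI.

```python
import numpy as np

def fold_into_eye(waveform: np.ndarray, samples_per_ui: int) -> np.ndarray:
    """Overlay a sampled waveform in unit-interval (UI) slices.
    Each row of the result is one UI; plotting all rows on top of
    each other yields the classic eye pattern."""
    n_ui = len(waveform) // samples_per_ui
    return waveform[: n_ui * samples_per_ui].reshape(n_ui, samples_per_ui)

# Toy example: a noisy NRZ bit stream, 8 samples per bit
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 64)
signal = np.repeat(bits, 8).astype(float) + 0.05 * rng.standard_normal(64 * 8)
eye = fold_into_eye(signal, samples_per_ui=8)
print(eye.shape)  # (64, 8): 64 overlaid unit intervals
```

Comparing such an overlay from a measurement with one from a simulation is, in essence, the check described above: if the model captures the right physics, the two eye patterns should coincide.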

Right approach

As experts in high-speed electronics, FPGA development, embedded systems and application software, we develop medical, cryptographic and diagnostic systems with Ethernet, PCI Express, USB, DDR, HDMI and other high-speed interfaces. In our experience, it’s essential for any high-speed development project to give SI and PI the attention they require from the start, just like we do with electromagnetic compatibility and producibility. The right approach can make it possible to realize reliable electrical connections at terabit speeds – within time and budget.

In the quest for very high speeds, we balance theoretical modeling and simulations on the one hand with a pragmatic, results-focused approach on the other. We’re keen to share our knowledge and experience in the field with clients and partners, in a context of co-creation that spans the entire development path, from the first back-of-the-envelope calculations to determining the compliance testing strategy for complex devices with high-speed data connections.

Edited by Nieke Roos