Background

Paving the way for terabits per second

Sjoerd Op ’t Land is a researcher and high-speed specialist at Technolution. Gerard Rauwerda and Erwin Kerkdijk are business developers at Technolution.

Reading time: 5 minutes

Increasing digitization and the corresponding security requirements, smart industry and the enormous growth of cloud applications and data centers are generating a growing need for high data speeds. Other trends are feeding the appetite for bandwidth, too, such as more online communication and remote working. But the limits of what's physically possible are coming into sight, and there's no bending the laws of nature. How do we protect signal integrity while still realizing high speeds, all within acceptable project budgets?

By far the best way to transfer large quantities of data is with light. Communication through fiber-optic cables can achieve dazzling speeds, up to hundreds of terabits per second in test situations, and the energy loss in glass fibers is minimal. But while light is the ideal vehicle for data, every data journey begins and ends with electricity: dozens of centimeters of PCB traces, with bumps and transitions. This is where the bandwidth bottleneck lies.
