Paul van Gerven
19 September

Canada’s Xanadu is the latest company to claim a quantum advantage. While delivering an impressive result, the demonstration doesn’t represent progress for every aspect of quantum computing, says one expert.

Currently, a popular way to demonstrate a computational advantage in optical quantum computing is boson sampling. This technique is quite difficult to grasp for the uninitiated, but it’s useful to think of it as a simple marble run. Balls are inserted on top, run down a track and at the bottom, ‘randomly’ end up in one of several holes with scores attached to them. The distribution of the balls is a straightforward statistical matter.

Now think of boson sampling as a highly advanced marble run, in which photons are injected into the device and also ‘land’ in one of many output slots. Calculating the output distribution from a given input configuration is hard to do classically because of the quantum mechanical weirdness happening inside the device. Inter-particle interference produces a kind of randomness that has proven hard to deal with using conventional computers.
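To give a flavor of why this is hard, here is a minimal Python sketch of the textbook (Fock-state) boson sampling rule due to Aaronson and Arkhipov: the amplitude for a given input-output configuration is the permanent of a submatrix of the interferometer’s unitary, and computing permanents is famously expensive. Note that Xanadu’s machine runs the Gaussian variant of boson sampling, where hafnians of squeezed-state matrices play the permanent’s role, but the hardness argument is similar in spirit. The 4-mode unitary below is an arbitrary toy example, not Xanadu’s device.

```python
import itertools

import numpy as np


def permanent(M):
    """Permanent via brute-force sum over permutations: O(n * n!)."""
    n = M.shape[0]
    return sum(
        np.prod([M[i, p[i]] for i in range(n)])
        for p in itertools.permutations(range(n))
    )


# Toy interferometer: a random 4-mode unitary (QR of a random matrix).
rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(A)

# Single photons enter modes 0 and 1; the amplitude for detecting one
# photon in each of modes 2 and 3 is the permanent of the corresponding
# 2x2 submatrix of U. The probability is its squared magnitude.
sub = U[np.ix_([0, 1], [2, 3])]
p = abs(permanent(sub)) ** 2
```

The factorial cost of the permanent is exactly what blows up as more photons are added, which is why a few hundred photons already puts the sampling task out of classical reach.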

The more photons involved, the longer it takes supercomputers to reproduce the output. And the bigger the gap in calculation time between a classical computer and a boson sampling setup, the less likely it is that the former will ever catch up through faster hardware or better algorithms.

Xanadu’s setup, showing from left to right: the photon source, fiber-based loops of different sizes with programmable parameters and a demultiplexer that sends the outputs to different photon-number-resolving (PNR) detectors. Credit: Xanadu

Earlier this year, Canadian company Xanadu squeezed a record 219 photons through a home-built quantum processing unit. In mere milliseconds, it completed a boson sampling experiment that would take the world’s best algorithms on the fastest supercomputers over 9,000 years. Xanadu claims the system is over 50 million times faster than previously reported photonic setups.

There’s no doubt the experiment takes quantum computing yet another step forward, says optical quantum computing expert Jelmer Renema, who was involved in the experiment that previously held the quantum advantage record. “They built a bigger system with more photons. It’s an impressive result,” says Renema. But while there’s no denying that size matters, the quality of the quantum experiment matters too. And on that front, Renema has a few notes.


Full disclosure: Renema works at Enschede-based optical quantum company Quix, a competitor of Xanadu. Renema is Quix’s CTO as well as assistant professor in the Adaptive Quantum Optics group at the University of Twente. Bits&Chips asked Renema the scientist to discuss Xanadu’s results, but it’s good to keep the potential conflict of interest in mind. Then again, his two personas aren’t in the habit of contradicting each other, Renema quips.

Any experiment has imperfections and boson sampling is no exception, Renema explains. “The scientific community is still grappling with the finer details of how these imperfections affect computational performance. We do know that minimizing them is important, probably just as important as the size of the system. It’s simply not possible to compensate for imperfections by adding more photons.”

Consider the phenomenon of decoherence. In qubit-based quantum computers, this is typically explained as an error-causing disturbance of delicate quantum states, brought about by interactions with the environment. In photonic quantum computers, decoherence is a slightly different animal, but it has a comparable effect on performance. Another major factor is connectivity: the degree to which input and output channels can talk to each other. In a fully connected system, photons entering through an input have an equal chance of exiting through any output channel as long as there’s no quantum weirdness going on.
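That interference-free baseline is easy to make concrete. The toy simulation below (my illustration, not Xanadu’s model) treats a fully connected device with no quantum effects as the marble run from earlier: each photon independently lands in a uniformly random output mode, and a classical computer reproduces the statistics trivially.

```python
import random
from collections import Counter

# Toy classical baseline for a fully connected device: with no
# interference, each photon independently picks a uniformly random
# output mode, like marbles dropping into holes.
random.seed(1)
modes, shots = 8, 100_000
counts = Counter(random.randrange(modes) for _ in range(shots))

# Every mode collects roughly shots / modes photons; nothing here is
# hard to simulate classically.
```

It’s the inter-particle interference inside the real device that warps this flat distribution into one that classical machines struggle to reproduce.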

In terms of decoherence and connectivity, Xanadu’s results aren’t as good as the previous photonic quantum advantage demonstration, says Renema. So while the experiment is progress on one front, it’s a retreat on another.

Renema also takes issue with the way Xanadu validates its results. Perhaps that shouldn’t come as a surprise in a field focused on outdoing classical computers: there’s no way to verify the results directly, so researchers have to fall back on specially devised statistical tests to demonstrate the validity of their results.

“Xanadu used a test developed by Google, but colleagues and I have already poked holes in that test. In simple terms, the test involves trying to spoof the results of the boson sampler by classical means. If it’s easy to tell whether a particular type of test has been used, it’s not a very good test. We’ve shown that’s the case for Google’s test. Other tests might not yield equally rosy results,” states Renema.

Fault-tolerant

Bits&Chips emailed Xanadu for comment on the issues raised by Renema, but the company only provided a general response. “The constraints do not prevent a demonstration of a quantum computational advantage. The team behind our photonic quantum processor Borealis is now squarely focused on building a fault-tolerant quantum computer heavily based on integrated photonics. The success of Borealis helped us validate key technologies necessary for this goal,” writes Jonathan Lavoie, who led the development at Xanadu.