TUE Martijn Heck

Martijn Heck is the scientific director of the Eindhoven Hendrik Casimir Institute.

6 September 2023

The public scrutiny of the LK-99 ‘super-superconductor’ harbors important lessons for the scientific community, argues Martijn Heck.

This summer wasn’t the scientific silly season it used to be. Korean researchers announced a superconductivity breakthrough, publishing their results on LK-99, a supposed room-temperature, ambient-pressure superconductor. Such a material is the holy grail of almost every technology, from energy to medical imaging. And a sure-bet Nobel Prize. The work appeared on the arXiv repository, which allows researchers to upload pre-prints of their papers before peer review has taken place.

Obviously, this was an open invitation for other experts in the field to try to reproduce the findings. Social media exploded with new results, a few supporting but most refuting the claims. After only a few weeks, the scientific journal Nature posited that LK-99 isn’t a ‘super-superconductor’ after all, putting an end to a month of scientific excitement and hope.

So, have we been looking at a substantial scientific process, or just hyped-up social-media silliness?

The good part is that we’ve experienced on a global scale the process of open science and the drive for reproducibility. By uploading pre-prints, the authors shared their results at a very early stage for the world to scrutinize. Results were flowing in day after day, expanding our insight into the material. This is how science should be conducted in its purest form. Moreover, it put science in general and superconductivity in particular on the radar of the general public. The excitement and discussions about the potential impact of LK-99 went far beyond the scientific in-crowd.


The bad part is, however, that this process started with flawed work. Experts immediately stated that the papers weren’t well-written. The claims’ credibility was low from the get-go. Even the authors themselves had disagreements about whether this work was ready for publication. The process of open science requires that people do their job well. Otherwise, the process is moot and just a waste of time and money.

The ugly part is that in most other academic fields, sloppy work isn’t weeded out so easily. The superconductivity topic is hot, which means that the media immediately picked up on it. It also meant that fellow researchers were eager to try to reproduce the results, out of curiosity but also to piggyback on the fame train. And even though their work turned out to be flawed, we can assume that the authors anticipated the hype and at least tried to double-check their findings, to avoid losing face publicly. So, let’s think about what this means for fields less sexy. Fields where reproducing the results might take months or years, instead of days. This whole process wouldn’t have transpired so quickly there – if ever at all.

The LK-99 episode showed us the process under the most ideal of circumstances, and it makes us painfully aware that this process won’t happen in 99 percent of all other cases, leading to what we call the reproducibility crisis. In some fields, a substantial share of the scientific work can’t be reproduced. Think about all the costs involved: other scientists building on non-existent results and thus failing in their work. Or medical trials failing to see significant results after spending tens of millions of euros.

The LK-99 case has shown us that we should look first and foremost to the scientists themselves to solve the reproducibility crisis. Errors, incompetence and fraud are key problems, and it’s the responsibility of the whole team of authors to avoid them.

So, my suggestion for the scientific community and funding agencies: don’t put so-called top scientists who publish over a hundred papers per year on a pedestal. They’re likely not very up-to-date with all the ins and outs of ‘their’ papers and therefore aren’t helpful in solving this existential crisis. Don’t encourage and reward quantity of publication over quality.

Most importantly, don’t confuse quality with citations and journal impact factor. Quality will become clear after 10-20 years, when we look back on the work and its wider impact. Does it still hold and did it help us? Yes, that takes time. But science was never meant to grant professors tenure, get politicians elected or secure startups a new funding round. Science has only one goal: to increase our knowledge, one correct step after another.