Benchmarking for quantum technologies

In quantum physics, atoms, molecules, or photons are used to store information. © Pixabay

Does a device do what it is supposed to do? This question is not only asked in everyday life. Researchers working with quantum technologies also want to know what novel instruments can do. A team led by Prof. Jens Eisert, a physicist at the Dahlem Center for Complex Quantum Systems of Freie Universität Berlin and at Helmholtz-Zentrum Berlin, together with researchers from Sorbonne Université in Paris, has published an overview of tools that can currently be used to compare and certify quantum devices. The review article is published in Nature Reviews Physics.

To what extent is a device doing what it should be doing? This is the task of benchmarking and verification. In the German context, everybody is aware of the TÜV, the institution that makes sure that cars, and basically all machines, work precisely as anticipated. At the same time, it should be clear that any endeavor involving highly precise technological devices will at some point arrive at this question.

This is all the more true for the emergent quantum technologies, the field of research dedicated to creating information technologies of a new kind, in which single quantum systems, that is, individual atoms, molecules, or light quanta, are used as carriers of information.

Experimental and technological reality

Ideas of secure information transmission, or even the quantum computer that is receiving so much attention these days, are slowly becoming an experimental and technological reality: the 53-qubit quantum computer realized by Google, which outperformed classical supercomputers on a specific sampling task, has made international headlines. And for such new applications, in which information is stored in a particularly fragile fashion in quantum systems, tools and ideas of benchmarking are particularly important.

Writing in Nature Reviews Physics, and in collaboration with researchers from Sorbonne Université in Paris, a team of physicists from the Dahlem Center for Complex Quantum Systems at Freie Universität Berlin and the Helmholtz-Zentrum Berlin, led by Prof. Jens Eisert, has published a survey of the wide array of tools currently available for the certification and benchmarking of quantum devices.

Many of these tools, some of which were pioneered at the Dahlem Center, are already in use in state-of-the-art laboratories around the world as experimentalists tackle increasingly large and complex quantum systems.

A framework for standards

However, as quantum science has already begun to move from the laboratory towards the marketplace, the need for rigorous yet robust methods will become even more urgent. A transparent framework for standards and best practices is essential for the birth of any new technology, and it will be a crucial stepping stone towards the much-anticipated quantum industries of the future.

The requirements for certification and benchmarking tools are as diverse as the applications of quantum technology themselves. From computation and simulation, through sensing and metrology, to communication and cryptography, each task comes with priorities that determine the optimal method of certification. How much diagnostic information should a certification tool provide? What overheads are acceptable in terms of time or cost? Is the certificate robust against hacking or other malicious interference, particularly for applications in encryption and data security?

As well as these questions, which are common to all technological standards, the nature of quantum physics brings unique challenges. For example, the power of quantum computation allows for the solution of some problems whose answers cannot even be efficiently checked with a classical computer. Another issue is that many quantum phenomena are extremely fragile, and any certification technique must be carefully designed so as not to interfere with or damage the device under investigation.

To survey the myriad techniques that have arisen to meet these challenges, the authors introduce a framework that classifies protocols according to the information that may be extracted, the underlying assumptions, and the resources required to execute the protocol. As well as providing an accessible overview for newcomers to assess the relative strengths and weaknesses of available tools, this panoramic approach may serve as a starting point to improve, expand and, where possible, standardise the many methods that currently exist.
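
One widely used benchmarking tool of this kind is randomized benchmarking, which condenses a device's performance into a single average error rate extracted from the decay of sequence fidelity with sequence length. The following minimal sketch, not taken from the review itself, uses Python with NumPy and SciPy to simulate such decay data for a single qubit under depolarizing noise with an assumed error rate (p_true, a hypothetical value chosen for illustration) and to recover that rate by fitting the standard exponential decay model.

    # Illustrative simulation of a randomized-benchmarking-style experiment
    # on a single qubit. Gates are assumed to suffer depolarizing noise with
    # (hypothetical) error probability p_true; the survival probability after
    # a length-m sequence plus its inverse then decays as 1/2 + (1/2)(1 - p)^m.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(seed=1)

    p_true = 0.01                      # assumed depolarizing error per gate
    shots = 2000                       # measurement repetitions per length
    lengths = np.arange(1, 201, 10)    # benchmarking sequence lengths m

    # Ideal survival probabilities and simulated finite-statistics data.
    ideal = 0.5 + 0.5 * (1.0 - p_true) ** lengths
    observed = rng.binomial(shots, ideal) / shots

    # Fit the standard decay model A * f^m + B and read off the error rate.
    def decay(m, A, f, B):
        return A * f ** m + B

    (A_fit, f_fit, B_fit), _ = curve_fit(decay, lengths, observed, p0=[0.5, 0.99, 0.5])
    print(f"assumed error rate per gate:   {p_true:.4f}")
    print(f"estimated error rate per gate: {1.0 - f_fit:.4f}")

In a real experiment the observed probabilities would come from running random gate sequences on the device itself, but the analysis step, fitting a decay curve and reporting a single figure of merit, illustrates the kind of benchmarking protocol the review surveys.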