Superior by all means!

In 2019, Google announced that its quantum chip "Sycamore" had, for the first time, solved a task faster than any classical computer. Chinese researchers have now cracked the same problem on an ordinary computer, without any quantum hardware at all.

In 2019, Google opened the race for the quantum computer. For the first time, the team of the Google Quantum AI Lab announced in the journal "Nature" that a quantum chip had solved a special computing task in 200 seconds for which the world's best supercomputer would need 10,000 years. Newspapers around the world ran headlines such as "Google leaves supercomputers behind" and "Proof of quantum supremacy", and asked: "The Sputnik moment of quantum physics?"

Now Chinese scientists have carried out the same calculation in a few hours on ordinary processors. A true supercomputer, they write in a paper that first appeared on a preprint server and has now been published in "Physical Review Letters", could even solve the task in a few seconds and comfortably beat Google's quantum chip "Sycamore". Farewell, quantum supremacy?

The technology's promises remain unbroken

The new algorithm takes at least a little of the shine off Google's claim, Greg Kuperberg, a mathematician at the University of California, told "Science": "It is less exciting to be 300 steps from the summit than to actually reach the summit." Researchers at IBM had already doubted that a supercomputer would need 10,000 years for the calculation. In its estimate, they wrote, Google had not properly used the capacities of the supercomputer "Summit" at Oak Ridge National Laboratory. However, the authors of that paper are also direct competitors of the Google researchers.

In any case, the promises of the technology remain unbroken. Research ecosystems for quantum computing are springing up all over the world. Start-ups are shooting up like mushrooms, and companies such as Google and IBM are outbidding each other to wire ever more qubits together on a single chip. Only the solution of a practical problem is still missing.

The task that Sycamore solved in 2019 was deliberately designed so that it would be extremely difficult for a conventional computer but as easy as possible for a quantum computer. Put simply, the test consisted of a calculation, useless in itself, that produces complicated random numbers. The Google researchers had a circuit of coupled qubits, the quantum mechanical counterpart of classical bits, perform many randomly chosen operations, repeated the sequence millions of times and recorded the results. For comparison, the whole procedure was then simulated on a conventional supercomputer.

Because, unlike the bits of an ordinary computer, qubits can not only take on the states 0 and 1 but can also remain in a superposition of these states, the 53 qubits of the Sycamore chip allow a parallel representation of 2^53 states.
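To put that number in perspective, here is a small back-of-the-envelope calculation in Python (purely illustrative, not taken from either paper): it counts the amplitudes a brute-force classical simulator would have to store for 53 qubits and the memory this would require.

    # Illustrative arithmetic only: the full quantum state of 53 qubits.
    n_qubits = 53
    amplitudes = 2 ** n_qubits          # 9_007_199_254_740_992 complex amplitudes
    bytes_per_amplitude = 16            # one double-precision complex number
    memory_bytes = amplitudes * bytes_per_amplitude
    print(f"{amplitudes:.3e} amplitudes, roughly {memory_bytes / 1e15:.0f} petabytes")
    # -> 9.007e+15 amplitudes, roughly 144 petabytes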

Starting with all qubits set to 0, the Google researchers had the quantum chip run through 20 random operations per round and then read out the state of the qubits. Roughly speaking, quantum waves that initially represent all possible results sloshed back and forth between the qubits; each readout produced a 53-digit random sequence of 0s and 1s. They repeated the process a few million times. The interactions between the qubits generated interference, which amplified some random sequences and wiped out others, which is why some results occur more often than others. How likely any single sequence of digits is can only be determined after countless runs. In the end, a characteristic probability distribution emerged.
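The protocol can be imitated on a very small scale. The following Python sketch is a toy version with only 4 qubits and an invented gate set (random single-qubit rotations plus CZ gates, nothing like Sycamore's actual circuits): it simulates the state vector, draws bitstrings with their quantum probabilities and so builds up exactly the kind of skewed, characteristic distribution described above.

    import numpy as np

    # Toy sampling experiment: 4 qubits, 20 layers of random rotations and
    # CZ gates, then bitstrings drawn with Born-rule probabilities.
    rng = np.random.default_rng(0)
    n, depth, shots = 4, 20, 100_000

    def apply_1q(state, gate, q):
        # contract a 2x2 gate with qubit q of the state vector
        psi = state.reshape([2] * n)
        psi = np.tensordot(gate, psi, axes=([1], [q]))
        return np.moveaxis(psi, 0, q).reshape(-1)

    def apply_cz(state, q0, q1):
        # CZ flips the sign of every component where both qubits are 1
        psi = state.reshape([2] * n).copy()
        idx = [slice(None)] * n
        idx[q0], idx[q1] = 1, 1
        psi[tuple(idx)] *= -1
        return psi.reshape(-1)

    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                                  # all qubits start in |0>

    for layer in range(depth):
        for q in range(n):                          # random single-qubit rotations
            theta, phi = rng.uniform(0, 2 * np.pi, 2)
            gate = np.array([[np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
                             [np.exp(-1j * phi) * np.sin(theta), np.cos(theta)]])
            state = apply_1q(state, gate, q)
        for q in range(layer % 2, n - 1, 2):        # alternate the entangling pairs
            state = apply_cz(state, q, q + 1)

    probs = np.abs(state) ** 2                      # interference skews the distribution
    counts = np.bincount(rng.choice(2 ** n, size=shots, p=probs), minlength=2 ** n)
    print("probabilities range from", probs.min().round(4), "to", probs.max().round(4))
    print("counts per 4-bit result:", counts)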

Classical computers that simulate the circuit have to painstakingly track every conceivable sequence of computing steps. As the number of qubits grows, the effort for this increases immeasurably; the theoretically predicted limit is around 48 qubits. For a quantum computer, on the other hand, the computing time remains manageable, since it can output a single random result practically at the push of a button once it has run through the operations.
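The exponential blow-up can be made concrete with a few lines of Python (again purely illustrative; the roughly 48-qubit threshold is the figure quoted above, not something derived here):

    # Each additional qubit doubles the number of amplitudes a brute-force
    # state-vector simulation has to keep in memory.
    bytes_per_amplitude = 16
    for n in (30, 40, 48, 53):
        memory_gib = (2 ** n) * bytes_per_amplitude / 2 ** 30
        print(f"{n:2d} qubits: 2^{n} amplitudes, about {memory_gib:,.0f} GiB")
    # 30 qubits still fit in a laptop's RAM; at around 48 qubits even the
    # largest machines run out of memory.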

The Chinese researchers led by the theoretical physicist Pan Zhang have now represented the 2019 task as a large three-dimensional network of so-called tensors. The network consists of 20 layers, one for each operation cycle that Sycamore ran through at the time, and each layer consists of 53 points, one for each of the 53 qubits. Running the simulation then essentially boils down to multiplying all of these tensors together. The calculation took 15 hours on 512 graphics processors and indeed delivered the expected probability distribution. On a supercomputer, Zhang and his colleagues write, the calculation would take only a few dozen seconds, ten billion times faster than the Google team estimated in 2019.
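In very rough outline, and greatly scaled down, the tensor idea looks like this. The Python sketch below uses a toy 2-qubit circuit with Hadamard and CZ gates, nothing like the researchers' actual 53-qubit network or their optimized contraction order on 512 GPUs: every gate becomes a small tensor, and one output amplitude is obtained by contracting the whole network with np.einsum.

    import numpy as np

    # Toy tensor network: |00> -> H on each qubit -> CZ -> H on each qubit,
    # then project onto the output bitstring "01". Every gate is a tensor;
    # the amplitude is one contraction over all shared wire indices.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard as a 2x2 tensor
    CZ = np.diag([1.0, 1, 1, -1]).reshape(2, 2, 2, 2)  # CZ as a rank-4 tensor
    zero = np.array([1.0, 0.0])                        # input state |0> on each wire
    bit0, bit1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])  # output bits 0 and 1

    amplitude = np.einsum(
        "a,b,ca,db,efcd,ge,hf,g,h->",
        zero, zero,   # inputs on wire 0 and wire 1
        H, H,         # first layer of single-qubit gates
        CZ,           # entangling layer, indices (out0, out1, in0, in1)
        H, H,         # second layer of single-qubit gates
        bit0, bit1,   # the measured bitstring "01"
    )
    print(abs(amplitude) ** 2)   # probability of reading "01" from this toy circuit: 0.25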

It was to be expected that research on classical computers and the search for better algorithms would not stand still either. What matters now is to finally find practical applications that demonstrate a quantum advantage. Or, as Dominik Hangleiter, a quantum physicist at the University of Maryland, told "Science": "The Google experiment has done what it should do, which is to start this race." Who will win in the end and build a true universal quantum computer, however, remains open.
