Scientists have made great strides in the development of large-scale quantum computers.
“Noise” remains the biggest problem for the development of quantum computers, and it must be addressed before they can be widely used in the revolutionary ways proposed. The new work suggests a way to characterise this noise, which in turn could open up a way to control it and develop much better quantum computing systems.
Quantum computers could change the way we use technology, allowing the resolution of problems that are impossible with current equipment. But to do that, their noise must be low enough for them to be reliable.
The noise problem remains critical to creating useful quantum equipment. In short, noise is the result of the errors introduced as quantum scientists manipulate the “qubits” that power a quantum computer, and it must be suppressed before any system can be used reliably.
Noise becomes a bigger problem the more qubits there are and the larger the system grows, which means it is a particular barrier to building the kinds of large quantum computers expected to deliver revolutionary new technology in the future.
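The scaling problem can be illustrated with a toy calculation (this is a generic back-of-the-envelope model, not the paper's analysis): if each qubit independently suffers an error with some small probability per operation, the chance that a whole circuit finishes error-free shrinks rapidly as qubits are added.

```python
# Toy illustration (not from the paper): assume each qubit has an
# independent error probability p per circuit layer. The probability
# that NO error occurs anywhere falls off exponentially with both the
# number of qubits and the circuit depth.
def success_probability(p: float, n_qubits: int, depth: int) -> float:
    """Probability that a circuit of n_qubits and depth layers runs error-free."""
    return (1 - p) ** (n_qubits * depth)

if __name__ == "__main__":
    # With a per-operation error rate of 0.1% and 100 layers,
    # watch reliability collapse as the machine grows.
    for n in (5, 50, 500):
        print(f"{n:4d} qubits: {success_probability(0.001, n, 100):.6f}")
```

Under these assumed numbers, a 5-qubit device still succeeds most of the time, while a 500-qubit device essentially never does, which is why understanding and correcting noise becomes urgent precisely for large machines.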
To overcome it, scientists must be able to understand how noise works through a quantum system. So far, they have only been able to do this for very small devices.
But new research, published in Nature Physics, describes a new algorithm capable of working on larger-scale quantum computing devices.
It has already been used successfully on the IBM Quantum Experience, an online platform that allows researchers to make use of the company's quantum computing systems.
The researchers found that the algorithm was able to successfully diagnose system noise, finding problems that had not previously been detected.
If quantum computers are to be successful, they will need to be accurately calibrated to minimise noise and errors. But they will also need to be able to correct those errors if important calculations are to be relied upon.
To be able to do this, quantum scientists will need to know where errors are likely to be introduced. Knowing that will allow them to optimize their error correction for specific problems, rather than doing it generically.
The new algorithm allows scientists to better estimate how many such errors to expect and where they may appear, and it could be built into future devices to allow them to correct errors more effectively.
“This protocol opens up endless opportunities for new diagnostic tools and practical applications,” the researchers write in the new work, noting that it could be used in various ways to improve how quantum computers handle the noise they generate.
“The results are the first implementation of provably rigorous and scalable diagnostic algorithms capable of running on current quantum devices and beyond,” said Robin Harper of the University of Sydney, lead author of the new work.
“Efficient learning of quantum noise” is published today in Nature Physics.