Quantum computing can sound like something out of science fiction, the stuff of time travel or shrinking people down to nothing. But as Yahoo! Finance reminds us, the technology is real, and Silicon Valley is increasingly interested in it.
Over the past few months, Amazon, Google, and Microsoft have each announced quantum computing chips of their own. Nvidia is preparing to host its first Quantum Day on March 20 as part of GTC, its annual developer conference. IBM, Intel, and many other companies specializing in quantum technologies have their own chips as well.
Quantum computers are forecast to perform, in minutes or hours, complex calculations that would take ordinary computers thousands of years to complete. What does this mean in practice? Potentially major advances in materials science, chemistry, medicine, and beyond, as scientists and researchers gain the ability to run calculations they could only dream of on today's machines.
What is a quantum computer?
To understand quantum computers, it helps to start with ordinary ones. Laptops, smartphones, and even smartwatches have processors, the "brains" of modern technology.
Every processor contains transistors: tiny switches inside the chip that respond to an electrical signal. Chipmakers often brag about how many billions of transistors their products pack in. Apple says its M4 chips for the Mac and iPad hold 28 billion transistors, while Nvidia claims 208 billion for its Blackwell chips.
A processor runs applications and programs by turning electrical impulses into binary code, a machine language made up of ones and zeros. Each one or zero is called a bit.

Strings of bits form complex sequences of instructions which, once processed, become the videos and images you see on screen, the games you play, and the applications you use during the workday.
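As a rough illustration (a sketch of our own, not code from any of the companies above), the Python snippet below shows how an ordinary string breaks down into the ones and zeros a processor actually handles; the helper name to_bits is hypothetical:

```python
# A minimal sketch: encode a short string as the bits (ones and
# zeros) a processor works with. Pure illustration, not hardware code.

def to_bits(text: str) -> str:
    """Return each character of `text` as eight binary digits."""
    return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

print(to_bits("Hi"))  # -> 01001000 01101001
```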
A series of electrical pulses passes through a chip's transistors, producing different combinations of ones and zeros along the way. The chip interprets these signals to access the computer's memory or storage, and as the commands grow more complex they can represent everything from a finger swiping across a phone screen to a page opening when you click a link on a website.
This binary system runs through every computing device, whether a computer, medical equipment, an automobile, or any other technology with a central processor.
Quantum computers, however, use subatomic objects known as quantum bits, or qubits, rather than classical bits. This is a completely different way of storing and processing data: instead of being a 1 or a 0 like a bit, a qubit can exist as 1 and 0, or any combination of the two, simultaneously, thanks to superposition.
Scientists don't know if a qubit is a one or a zero until they observe it. In the quantum world, a system can be in several different states at the same time, and without taking measurements, it is impossible to know exactly what state it is in.
This ability to be 1 and 0 at once enables a form of parallel processing that allows quantum computers to perform certain calculations much faster than conventional computers.
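To make superposition and measurement a little more concrete, here is a toy simulation, written for a classical computer and in no way real quantum hardware code; the Qubit class and its two-amplitude model are our own illustrative construction. Measuring the qubit yields 0 or 1 with probabilities given by the squared magnitudes of its amplitudes:

```python
import random
from math import sqrt

class Qubit:
    """Toy single-qubit model: two amplitudes with |alpha|^2 + |beta|^2 = 1."""

    def __init__(self, alpha: complex, beta: complex):
        norm = abs(alpha) ** 2 + abs(beta) ** 2
        assert abs(norm - 1.0) < 1e-9, "amplitudes must be normalized"
        self.alpha, self.beta = alpha, beta

    def measure(self) -> int:
        # Measurement collapses the superposition: 0 with probability
        # |alpha|^2, 1 with probability |beta|^2.
        return 0 if random.random() < abs(self.alpha) ** 2 else 1

# An equal superposition reads out 0 about half the time and 1 the rest.
samples = [Qubit(1 / sqrt(2), 1 / sqrt(2)).measure() for _ in range(1000)]
print(sum(samples) / len(samples))  # roughly 0.5
```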
There are several ways to control qubits: superconducting systems that cool the qubits to nearly absolute zero, lasers that interact with individual atoms, or photons, particles of light, held in a high vacuum.
The giant, octopus-like gold machines often pictured as quantum computers are essentially cooling systems built around superconducting qubits, keeping them cold enough to function properly.
Quantum computers also do not operate on their own. They are connected to ordinary computers, which control how the quantum system's qubits are manipulated and read out the results.
Quantum errors
So quantum computers can use the superposition of qubits to run calculations at high speed compared with conventional computers. Then why aren't we already using them to solve the world's hardest problems? Because of quantum errors.
Because of their tiny size, qubits are extremely sensitive to interference from the outside world. Even a single stray atom can disturb them, causing them to lose information.
According to scientists, today's quantum systems can suffer an error as often as once every 300 quantum operations. For a quantum computer to be useful, however, errors would need to occur no more than once in a trillion operations. That is an enormous gap.
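Taking those figures at face value, a quick back-of-the-envelope calculation shows the size of the gap:

```python
# Rough scale of the reliability gap, using the figures quoted above
# (assumed: ~1 error per 300 operations today, ~1 per 10^12 needed).
current_error_rate = 1 / 300
required_error_rate = 1 / 1e12
print(current_error_rate / required_error_rate)  # ~3.3e9: a gap of billions
```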
Companies attack the problem with quantum error correction, which Microsoft, Google, and Amazon have all emphasized in their latest chips. The core idea of quantum error correction is redundancy: spreading information across extra qubits so that errors can be detected and fixed.
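A classical analogy gives the flavor of that redundancy, though real quantum codes are far more involved (qubit states cannot simply be copied, so the redundancy is built out of entanglement instead). The sketch below, our own illustration, stores one logical bit in three physical bits and recovers it by majority vote:

```python
from collections import Counter

# Classical analogy for redundancy-based error correction: a 3-bit
# repetition code. One logical bit is stored in three physical bits,
# so a single flipped bit is outvoted when decoding.

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def decode(bits: list[int]) -> int:
    """Majority vote recovers the logical bit despite one error."""
    return Counter(bits).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1           # noise flips one physical bit
print(decode(codeword))    # still prints 1
```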
Quantum error correction is enormously resource-hungry, however, and researchers are still working to bring that overhead down. Until they do, quantum computers will remain too unreliable for tasks such as chemistry calculations, discovering new compounds, or understanding how atoms interact.
Do we need quantum computers for everything?
Although quantum computers will be needed to run certain algorithms and tackle certain hard problems, they will not replace classical supercomputers, even once researchers solve the problem of quantum errors.
They are not designed for everyday tasks such as launching apps faster on a smartphone or PC; that is what regular computers are for. In other words, don't expect to come home, switch on your personal quantum computer, and start watching Netflix. Your regular computer handles that just fine.