So-called AI has received a lot of attention over the past couple of years. A different type of computing, quantum computing, has not received as much attention.
As was, and is, the case with so-called AI, quantum computing has been overhyped recently. Quantum computing requires two things: 1) hardware big enough to handle a specific problem and 2) software capable of handling that problem on the given hardware.
Progress on quantum hardware has been slow over the past 50 years, and it is likely to continue, albeit slowly. Quantum software has been a different story. Some algorithms were created years ago, but adapting them to a specific situation is frequently tricky, and creating new algorithms is also tricky; there are stubborn issues that don't occur with conventional software. To date, quantum computers have never been used to solve real-life problems.
This may change at some point, but it will probably take a few years if it happens at all.
A quantum computer is a conventional computer with special quantum hardware attached. The quantum hardware manipulates one or more qubits; a qubit can store information but does so very differently from a conventional bit. Each time you add a qubit, you double the amount of information the hardware can represent: three qubits can hold twice as much as two qubits, four qubits twice as much as three, and so on. In addition, the stored information can be manipulated very efficiently. In operation, the conventional computer directs the qubits, and under the right conditions the combination of conventional and quantum hardware can produce a result faster than a conventional computer alone.
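To make the doubling concrete, here is a minimal sketch that simulates the bookkeeping on a conventional machine using NumPy. It is only an illustration of the state-vector description, not real quantum hardware; the loop range and the Hadamard gate are assumptions chosen for the example.

```python
import numpy as np

# An n-qubit state is described by 2**n complex amplitudes, so each
# added qubit doubles the size of the description.
for n in range(1, 6):
    state = np.zeros(2**n, dtype=complex)
    state[0] = 1.0                     # start in the all-zeros basis state
    print(f"{n} qubit(s): {state.size} amplitudes")

# Manipulating the stored information amounts to applying a unitary matrix
# to the state vector. Example: a Hadamard gate puts one qubit into an
# equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(H @ np.array([1.0, 0.0], dtype=complex))   # roughly [0.707, 0.707]
```

Note that a classical simulation like this pays the exponential cost explicitly, which is part of why the compact quantum representation is attractive in the first place.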
Two properties of a given piece of quantum hardware need to be considered: 1) the number of qubits (the more qubits, the better) and 2) the implementation design. There are different ways to implement the quantum hardware, each with its own advantages and disadvantages, and these trade-offs fall into several broad categories.
A quantum computation involves three steps:
1) Load the problem's information into the qubits, i.e., prepare the quantum state.
2) Manipulate the qubits to carry out the computation.
3) Measure the qubits and return the result to the conventional hardware.
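As a concrete, simulated walk-through of these three steps for a single qubit, here is a minimal NumPy sketch. The choice of gate, the shot count, and the random seed are assumptions made for the example; real hardware performs these steps physically rather than on a stored state vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: load information into the qubit by preparing the state |0>.
state = np.array([1.0, 0.0], dtype=complex)

# Step 2: manipulate the qubit by applying a Hadamard gate, producing an
# equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = H @ state

# Step 3: measure and hand the result back to the conventional computer.
# Each measurement yields a classical bit drawn according to |amplitude|^2,
# not the amplitudes themselves.
probs = np.abs(state) ** 2
shots = rng.choice([0, 1], size=1000, p=probs)
print("fraction of 1s over 1000 shots:", shots.mean())
```

Running it prints a fraction close to 0.5, reflecting the equal superposition created in step 2.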
Perhaps surprisingly, steps 1 and 3 are the difficult ones; step 2 is generally straightforward. Given infinite time, step 1 would always be possible. However, the available time is limited: step 1 must be completed within the decoherence time, the window before the quantum state degrades. Exactly what is and is not possible under this limit is not completely understood and is an active area of research.
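For a back-of-the-envelope feel for how tight that window can be, the sketch below simply divides an assumed decoherence time by an assumed gate time; both numbers are illustrative assumptions, since real values vary widely across hardware designs.

```python
# Illustrative (assumed) numbers only.
decoherence_time_s = 100e-6   # assumed coherence window: 100 microseconds
gate_time_s = 50e-9           # assumed time per operation: 50 nanoseconds

# Rough budget for how many sequential operations fit inside the window.
max_sequential_gates = decoherence_time_s / gate_time_s
print(f"rough gate budget before decoherence: {max_sequential_gates:.0f}")
```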
Step 3 is limited by the Heisenberg Uncertainty Principle. It is not possible to transfer all the information from a quantum state to the conventional hardware without some loss. There are ways to deal with this, but it puts hard limits on what is possible.
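One simple way to see the loss, whatever its ultimate physical explanation, is that a computational-basis measurement only reports outcome probabilities. Two states that differ only by a relative phase therefore look identical from those statistics alone. The two single-qubit states below are standard textbook examples chosen to illustrate this.

```python
import numpy as np

# Two different states that differ only by a relative phase.
plus  = np.array([1.0,  1.0], dtype=complex) / np.sqrt(2)   # (|0> + |1>)/sqrt(2)
minus = np.array([1.0, -1.0], dtype=complex) / np.sqrt(2)   # (|0> - |1>)/sqrt(2)

# A computational-basis measurement sees only |amplitude|^2, so both states
# produce identical statistics; the phase never reaches the conventional
# hardware from this measurement alone.
print(np.abs(plus) ** 2)    # [0.5, 0.5]
print(np.abs(minus) ** 2)   # [0.5, 0.5]
```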
The bottom line is that many potential applications of quantum computers will prove to be impractical and/or better implemented on conventional hardware. The computation must be sufficiently complex, and there must be practical solutions to steps 1 and 3. Only then do the advantages of a quantum computer exceed the disadvantages.
See Book List for AI, Machine Learning and Quantum Computing