U.S. Quantum Efforts Gain Momentum


I have been researching quantum computing since 2015. A U.S. Government Accountability Office (GAO) report affords an opportunity to evaluate the current state of the technology.

Along with this review of the GAO report, I highly recommend Maurizio Di Paolo Emilio’s recent quantum computing overview.

In January 2019, GAO launched a Science, Technology Assessment, and Analytics team. Since then, the agency has released many technology reports on topics such as AI, blockchain, brain-computer interfaces and augmented reality, cybersecurity, IoT and quantum computing.

The table below summarizes the findings of the GAO quantum computing report released on Oct. 19.


Why are qubits different?

A quantum bit, or qubit, is the basic building block for quantum computing. The qubit is analogous to the digital bit used in current computers, but it has unique characteristics that can dramatically increase computing capabilities.

The qubit has two unique characteristics—superposition and entanglement. Superposition is a property of quantum physics that allows qubits to exist in a combination of states simultaneously. When measured, a qubit will assume one of those states, destroying the superposition.

Quantum technologies also use entanglement, in which qubits are inherently linked. When one qubit is acted upon, it can then reveal information about the other linked qubits.

Superposition and entanglement dramatically increase the capability of quantum computers for many applications, but not for all. The range of applications is expected to grow as new technology, algorithms and other advances become available.
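To make superposition and entanglement concrete, here is a minimal state-vector sketch in Python. This is an illustration using standard textbook conventions, not anything from the GAO report: a Hadamard gate puts one qubit into superposition, and a CNOT gate entangles it with a second qubit, producing a Bell state.

```python
import numpy as np

# Toy two-qubit state-vector simulator (standard textbook gates).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # entangling gate

# Start both qubits in the |00> state.
state = np.zeros(4)
state[0] = 1.0

# Put qubit 0 into superposition, then entangle it with qubit 1.
state = np.kron(H, I2) @ state
state = CNOT @ state

probs = np.abs(state) ** 2   # Born rule: probability of each outcome
print(probs)                 # ~[0.5, 0, 0, 0.5]: only 00 and 11 occur
```

Measuring the first qubit yields 0 or 1 at random, collapsing the superposition; because the qubits are entangled, the second qubit is then guaranteed to match.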

Quantum computers perform calculations based on the laws of quantum mechanics, which govern the behavior of particles at the sub-atomic level.

Physical qubits

Development of quantum computers requires physical qubits as the basic building blocks. Physical qubits can be divided into two categories—naturally-occurring particles and artificial structures. Natural particles include atoms, trapped ions and photons.

Artificial qubits mimic particles, and include superconducting circuits, quantum dots and defects in a crystal. One example is a nitrogen atom within a diamond’s carbon lattice, known as a color center.

Manipulating the quantum properties of each qubit and entangling multiple qubits are required for developing quantum computer programs. Manipulations are performed with lasers, microwaves, electric or magnetic fields.

The GAO report provides an overview of how physical qubits work and their status in the development of quantum computers.


Superconducting circuits and trapped-ion particles are among the most common approaches for developing quantum computers. Superconducting versions are used by D-Wave, Google, IBM and Rigetti. Trapped-ion quantum computers are attracting investments, and are available from Honeywell and IonQ. IonQ completed its special purpose acquisition company IPO on Oct. 1. Rigetti is in the midst of its SPAC IPO.

Quantum segments

Two segments, quantum computers and quantum communications, are interconnected and likely to develop together. Here we focus primarily on computing. The two main quantum computing methods are analog and gate-based quantum computers.

The analog version requires an initial set of qubits representing all possible solutions to a problem. The approach leverages the properties of superposition and entanglement, enabling a set of qubits to evolve and identify an optimal solution.

A mature analog version is a quantum annealing machine capable of solving specific problems without performing computations on individual qubits. The approach could be used for drug design, organ donor matching, traffic flows and other transportation planning.
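Quantum annealing itself requires quantum hardware, but its classical cousin, simulated annealing, illustrates the underlying idea of letting a system evolve toward a low-energy, optimal configuration. A toy classical sketch (an analogy, not a quantum algorithm and not from the report) that minimizes a simple cost function:

```python
import math
import random

# Classical simulated annealing on a toy problem: find the integer x
# that minimizes (x - 3)^2. This only illustrates the "evolve toward
# an optimal solution" idea behind annealing.
def cost(x):
    return (x - 3) ** 2          # minimum at x = 3

random.seed(0)
x = 50                           # arbitrary starting point
temperature = 10.0
while temperature > 0.01:
    candidate = x + random.choice([-1, 1])
    delta = cost(candidate) - cost(x)
    # Always accept improvements; accept worse moves with a probability
    # that shrinks as the temperature is lowered.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.99          # cooling schedule

print(x)                         # converges near the optimum, x = 3
```

A real quantum annealer explores the solution landscape with superposition and tunneling rather than random thermal moves, but the shape of the procedure is similar.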

Quantum annealing machines can be purchased or accessed via the cloud for applications development. Quantum annealing machines cannot solve some problems—at least for now.

Other analog quantum computing technologies include adiabatic quantum computing and quantum simulation.

Gate-based quantum computers are similar to digital computers in that quantum gates are analogous to logic gates. Gate-based quantum computers containing hundreds of physical qubits are available now.

They are considered noisy, intermediate-scale quantum (NISQ) machines; “noisy” refers to their inability to correct errors.

Limited qubit counts and the inability to correct errors constrain computational resources. That means users can only tackle certain problems, and must do so efficiently.

Gate-based quantum computers also differ from analog versions in their predicted ability to fully correct errors and break problems down into gate-based operations.

Future systems will use error-correction methods, including logical qubits that allow for versatile and error-free quantum computers. The number of physical qubits needed for one logical qubit could range from a few hundred to more than 15,000, with the exact number dependent on factors such as error rate, physical qubit performance and the choice of quantum error-correction codes.
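The overhead implied by those figures is easy to work out. In the back-of-the-envelope sketch below, the per-logical-qubit cost range comes from the report; the 1,000-logical-qubit machine size is an arbitrary example chosen for illustration:

```python
# Physical-qubit overhead for an error-corrected machine.
# Per-logical-qubit costs (300 to 15,000+) are the range cited in the
# GAO report; the 1,000-logical-qubit target is a hypothetical example.
logical_qubits = 1_000

for physical_per_logical in (300, 1_000, 15_000):
    total = logical_qubits * physical_per_logical
    print(f"{physical_per_logical:>6} physical/logical -> {total:,} physical qubits")
```

Even at the low end, a modest error-corrected machine would need hundreds of thousands of physical qubits, which is why today's devices with hundreds of qubits remain in the NISQ era.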

Quantum R&D

The 2018 National Quantum Initiative Act established a 10-year U.S. quantum initiative program to accelerate development of quantum applications. It also promotes investment in workforce development and improves coordination of federal quantum information science research.

A National Quantum Coordination Office is responsible for coordinating federal and university research. The National Institute of Standards and Technology established a quantum consortium to identify future measurement, standards and cybersecurity needs in support of the U.S. quantum technology industry.

Meanwhile, the National Science Foundation established five multidisciplinary centers for quantum research and education while the Energy Department has formed five national quantum information science research centers.

The 2019 National Defense Authorization Act authorized the Defense Department’s quantum R&D effort. The measure also accelerates quantum information science along with technology transition and deployment across the U.S. military.

Amendments to the most recent DoD spending bills include creating quantum information science workforce development plans, reducing the risk of cybersecurity threats posed by quantum information science and developing ethical guidelines for deployment. At least one quantum information science research center was also established. Additional provisions promote collaboration among U.S. government, industry and university researchers along with compiling a list of research challenges that can be addressed by quantum technology over the next three years.

Quantum complications

Quantum information remains fragile and can be irreversibly lost through interactions with the environment, a process known as decoherence. Coherence time refers to how long a qubit maintains a superposition or entangled state before decoherence, a factor that limits how long a qubit can be used.

Moreover, quantum information cannot be copied, and measurement disrupts the information. This prevents the use of classical error-correction technology. Quantum error-correction techniques have been proposed and demonstrated, but remain difficult to implement.

Correction procedures use many error-prone physical qubits working in tandem with classical processing to create a system that mimics a single, stable qubit—known as a logical qubit.
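The "many noisy components mimic one stable one" idea has a classical ancestor in the repetition code, sketched below with classical bits standing in for qubits and a majority vote standing in for syndrome measurement. This is only an analogy: as noted above, quantum information cannot be copied, so real quantum codes detect errors indirectly rather than by comparing copies.

```python
import random

# Classical 3-bit repetition code: an analogy for how redundancy plus
# classical processing turns many error-prone carriers into one
# reliable logical bit. Not an actual quantum error-correction code.
random.seed(1)

def encode(bit):
    return [bit, bit, bit]            # one logical bit -> three copies

def noisy_channel(bits, flip_prob):
    # Each copy flips independently with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)        # majority vote fixes single flips

trials = 10_000
flip_prob = 0.05
raw_errors = sum(random.random() < flip_prob for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0), flip_prob)) != 0
                   for _ in range(trials))

print(raw_errors / trials)     # ~0.05: error rate with no protection
print(coded_errors / trials)   # ~0.007: a logical error needs 2+ flips
```

The protected error rate falls from roughly 5 percent to under 1 percent because a logical error now requires at least two simultaneous flips; quantum codes pay a much larger overhead to achieve the analogous suppression.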

Building on the theories of physicist Richard Feynman and a key algorithm introduced by Peter Shor in 1994, researchers demonstrated the first quantum logic gate based on individual qubits.

Since then, experiments have demonstrated that quantum error correction is possible, a necessary step for cost-effective quantum computing and communication by reducing excess noise that destroys quantum information.

Quantum computing has advanced over the last two decades, with successive demonstrations culminating in a 76-qubit, photon-based system in 2020. A 1,000-plus qubit device is planned by 2023.

Those advances have fostered intense global competition to lead in quantum computing. The GAO ranked the leaders based on the number of research papers and computing and communications applications.


The U.S. currently leads in the publication of quantum computing research papers, while China leads in quantum communications research. Canada, Germany, Japan and the U.K. are also among the leaders.
