Andrea Viliotti

Willow: the new quantum chip from Google Quantum AI

Google Quantum AI’s new quantum technology takes shape in the Willow chip, a quantum processor designed to overcome current barriers in quantum computing and pave the way toward large-scale machines that are genuinely useful. Introduced by Hartmut Neven and his team, Willow marks a firm step forward, thanks to improved qubit coherence times, integrated error-correction strategies, and experimental results that surpass the performance of today’s most powerful classical computing infrastructures. Developed at Google Quantum AI’s dedicated facilities, this new processor lays the groundwork for real, commercially relevant applications, confirming the potential of next-generation quantum technologies.


Willow’s performance and quantum error correction

The Willow chip emerges in a research landscape where quantum computing has been grappling for decades with a crucial obstacle: the rapid onset of errors in qubits, the elementary units of quantum information processing. A qubit is a physical entity capable of representing and manipulating information by leveraging principles of quantum physics, such as the superposition of states. Unwanted interactions with the external environment degrade its state, leading to errors that accumulate as the number of qubits grows. If these errors are not effectively corrected or reduced, the ability of a quantum computer to surpass the performance of a classical system diminishes until it disappears. At the core of this issue lies quantum error correction, namely techniques intended to preserve the information processed by the device.
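
To make these concepts concrete, here is a minimal, illustrative Python/NumPy sketch (not Google's code) of a single qubit in superposition, the probabilistic nature of measurement, and how a bit-flip error corrupts the state:

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1> is a unit vector in C^2.
# Here, an equal superposition such as a Hadamard gate produces from |0>.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
psi = np.array([alpha, beta], dtype=complex)

# Born rule: measuring yields 0 or 1 with probabilities |alpha|^2 and |beta|^2.
probs = np.abs(psi) ** 2
samples = np.random.default_rng(0).choice([0, 1], size=10_000, p=probs)
print(probs, np.bincount(samples))  # ~[0.5 0.5] and roughly 5000/5000

# A toy noise event: a bit-flip (X) error swaps the two amplitudes.
# In a real device, such errors accumulate as qubit counts and runtimes grow.
X = np.array([[0, 1], [1, 0]])
print(X @ psi)
```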


Willow demonstrates a remarkable result in the quantum computing landscape: it operates “below threshold,” showing that it can reduce errors exponentially as the number of qubits increases. Moving from an array of nine physical qubits to one with twenty-five and then forty-nine, the system halved the logical error rate with each scale-up. Achieving this condition is a milestone the scientific community has pursued since the 1990s, when the very idea of quantum error correction was first formalized. The result is both practical and conceptual: it shows that beyond a certain quality threshold, enlarging the error-correcting code improves the system's overall performance rather than degrading it.
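
The scaling law behind “below threshold” can be sketched numerically. The snippet below uses the textbook surface-code approximation, with hypothetical constants chosen so that each distance step roughly halves the logical error, mirroring the behavior described above; the numbers are illustrative, not Willow's measured data:

```python
# Textbook surface-code scaling: the logical error rate at code distance d is
#   eps_d ~ A * (p / p_th) ** ((d + 1) / 2)
# Below threshold (p < p_th), each distance step d -> d + 2 multiplies the
# logical error rate by p / p_th < 1: errors shrink as the code grows.

A = 0.1          # hypothetical prefactor
p = 0.0047       # hypothetical physical error rate
p_th = 0.01      # hypothetical threshold, so p / p_th ~ 0.47 (roughly halving)

for d, grid in [(3, "3x3"), (5, "5x5"), (7, "7x7")]:
    eps = A * (p / p_th) ** ((d + 1) / 2)
    print(f"distance {d} ({grid} qubit grid): logical error ~ {eps:.2e}")
# distance 3: ~2.2e-02, distance 5: ~1.0e-02, distance 7: ~4.9e-03
```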


Such behavior is no accident but arises from a series of structural and logical optimizations. Willow is a superconducting chip produced in one of the rare facilities fully dedicated to manufacturing quantum processors, in this case located in Santa Barbara. The controlled manufacturing environment enabled an increase in the quantum coherence of the qubits, meaning their ability to maintain the superposition of states without the quantum signal deteriorating in a matter of moments. Measured in microseconds (µs), this parameter was brought to about 100 µs with Willow, representing about a fivefold improvement over previous results. Having more stable qubits means that they can interact for longer periods and handle greater computational complexity without losing useful information.
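
The practical value of longer coherence can be read off a simple exponential decay model. The sketch below compares survival probabilities at a previous-generation T1 of roughly 20 µs (implied by the fivefold improvement) against Willow's roughly 100 µs; the workload duration is a hypothetical figure for illustration:

```python
import numpy as np

# Energy-relaxation model: a qubit prepared in |1> is still in |1> after
# time t with probability P(t) = exp(-t / T1).
def survival(t_us: float, t1_us: float) -> float:
    return float(np.exp(-t_us / t1_us))

t = 20.0  # microseconds of computation, a hypothetical workload
for t1_us, label in [(20.0, "previous (~20 us)"), (100.0, "Willow (~100 us)")]:
    print(f"{label}: P(coherent after {t} us) = {survival(t, t1_us):.2f}")
# ~0.37 versus ~0.82: a fivefold longer T1 leaves far more time budget for
# gates and error-correction cycles before the quantum state decays.
```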


At the same time, the chip’s architecture was not designed merely to increase the quality of individual qubits, but to ensure that the entire system can be configured, via tunable components, to compensate for defective or underperforming qubits, realigning the whole array to a homogeneous level of performance. This strategy is combined with high-frequency calibration protocols that act on each qubit and its interactions, intervening via software to keep errors low and fully exploit the processor’s hardware reconfigurability.


The results achieved with Willow demonstrate that error correction is now genuinely implementable and useful on the path to large-scale quantum computers. The realization of a logical qubit—a set of physical qubits working together to represent a single, more stable qubit suitable for prolonged computations—marks the crossing of a historic threshold. It is no longer just a theoretical concept or an elusive goal, but a phenomenon observed experimentally. This has strategic implications for the future: if it is possible to build a chip whose performance improves as it grows in size, it becomes conceivable to reach configurations large enough to tackle computational problems currently beyond the reach of classical machines.
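
The logic of a logical qubit can be illustrated with the simplest code of all: a three-bit repetition code decoded by majority vote. Willow actually uses surface codes, which also protect against phase errors, so the following Python sketch is purely pedagogical:

```python
import numpy as np

# One logical bit stored redundantly in 3 physical bits; independent
# bit-flips with probability p are corrected by majority vote, so the
# logical error rate falls to ~3p^2, below p whenever p < 0.5.
rng = np.random.default_rng(1)
p, trials = 0.05, 100_000

codeword = np.zeros((trials, 3), dtype=int)          # logical 0 -> (0, 0, 0)
flips = (rng.random((trials, 3)) < p).astype(int)    # independent errors
received = codeword ^ flips
decoded = (received.sum(axis=1) >= 2).astype(int)    # majority vote

print("physical error rate:", p)                     # 0.05
print("logical error rate :", decoded.mean())        # ~3 * p**2 ~ 0.0073
```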


Benchmarking and comparison with classical supercomputers

Evaluating a quantum computer means comparing it to its classical counterparts—machines that still dominate the scene in high-performance computing. To test Willow, random circuit sampling (RCS) was used, a benchmarking procedure that has become a standard in the field. RCS has the quantum computer sample the outputs of random quantum circuits, a task that classical machines find extremely difficult to simulate, with the difficulty growing exponentially in the number of qubits and the depth of the circuit. The idea behind the test is to verify whether the quantum processor can perform, in a reasonable time, a task that a classical computer would need an impractically long time to execute under the same conditions. If the quantum computer shows a clear advantage, it means we are coming closer to applications that cannot be reproduced on even the best classical systems.
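
For readers who want to see the shape of the benchmark, below is a heavily simplified RCS-style sketch using Google's open-source Cirq library: random single-qubit layers alternate with entangling layers, and the output bitstrings are sampled. Real RCS circuits use far more qubits, greater depth, and calibrated two-qubit gates; this toy version only conveys the structure:

```python
import random
import cirq

random.seed(0)
qubits = cirq.GridQubit.rect(2, 2)
single_qubit_gates = [cirq.X ** 0.5, cirq.Y ** 0.5, cirq.Z ** 0.5]

# Alternate layers of random single-qubit gates with entangling CZ layers.
circuit = cirq.Circuit()
for _ in range(8):
    circuit.append(random.choice(single_qubit_gates)(q) for q in qubits)
    circuit.append(cirq.CZ(qubits[i], qubits[i + 1]) for i in (0, 2))
circuit.append(cirq.measure(*qubits, key="m"))

# Sampling the circuit: a state-vector simulator must track 2^n complex
# amplitudes, which is exactly what becomes intractable as n and depth grow.
result = cirq.Simulator().run(circuit, repetitions=1000)
print(result.histogram(key="m"))
```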


Willow performed RCS in less than five minutes, an extremely short time compared to what one of the world’s most powerful supercomputers would need. Even assuming optimal classical resources, the corresponding computation would require about 10^25 years, an astronomical span that vastly exceeds the age of the universe. It should be emphasized that this benchmark is not directly related to a practical application of interest to businesses or the real economy. It is a stress test, a baseline for understanding whether quantum power surpasses classical limits. It establishes a fixed point: Willow demonstrated an enormous scaling advantage, opening a gap that is hard to bridge with classical methods. Classical supercomputers will no doubt improve, with better algorithms and more advanced memory hierarchies, but the rate at which the quantum chip’s performance is growing suggests the gap will only widen.
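A back-of-the-envelope calculation shows where the classical wall comes from: a brute-force state-vector simulation of n qubits must store 2^n complex amplitudes. (The 10^25-year estimate rests on far more sophisticated cost models, but the exponential scaling is the same.)

```python
# Memory needed to hold the full state vector of n qubits:
# 2^n amplitudes at 16 bytes each (double-precision complex).
for n in [30, 53, 70, 105]:
    tib = (2 ** n) * 16 / 2 ** 40
    print(f"{n:>3} qubits: {tib:.3g} TiB of amplitudes")
# 30 qubits fit on a workstation (~0.016 TiB); 53 qubits already need
# ~1.3e5 TiB; 105 qubits would need ~5.9e20 TiB. Tensor-network methods
# trade memory for time, but the cost remains exponential in circuit size.
```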

This experiment does not merely state that Willow is “faster” at a specific task. Its deeper significance lies in showing that quantum computers can already carry out tasks that are intractable for classical computers, even if the task itself does not yet have a direct commercial application. It is like opening a door to a world where modeling natural phenomena, exploring advanced materials, understanding complex systems, and investigating solutions in fields such as drug chemistry can be approached with a more powerful and flexible methodology.


Reference performance and future goals

Willow was born in a unique production environment, a facility dedicated to the fabrication of quantum chips and designed to maximize quality and yield. Willow’s evolution did not stop at increasing coherence times: the device features 105 qubits, a nontrivial number for such an advanced chip, and it was optimized for two-qubit logic operations, faster readout, and uniform quality across the entire processor. The T1 time scale—how long a qubit maintains its quantum state before decaying—now approaches 100 µs, indicating that engineering the system with optimized connectivity between qubits and continuous calibration strategies is the right path toward greater stability and reliability.


A stated goal is to move beyond mere demonstrations of superiority over classical models in non-application-specific tasks, aiming for results that are useful in the real world. Thus far, research has concentrated on two main areas: on the one hand, benchmarks like RCS that certify a performance gap over classical supercomputers; on the other, quantum simulations of physical systems with scientific value that classical computers can still reproduce, albeit with difficulty. The ultimate goal is to combine the two: performing a calculation that cannot be replicated by classical machines and that also has practical relevance. The path might lead to applications in the pharmaceutical sector, the development of more efficient batteries, and the investigation of complex reactions, driving research in directions not yet explored with conventional approaches. The message is clear: rather than insisting on qubit count alone, it is necessary to maintain and increase the chip’s quality and the reliability of its operations, to reach the threshold at which quantum computing becomes a strategic element across industrial and scientific contexts.


Conclusions

Technologies like Willow emerge in a scenario where the boundary between what is efficiently computable and what is not is being redefined. Today, companies face a complex landscape made up of investments in established classical technology and new hopes placed in quantum machines. It is inevitable that there will be a hybrid phase in which the cooperation between quantum and classical hardware, along with the development of targeted software, will help identify the problems best suited to each paradigm. It makes no sense to expect a sudden leap into a reality where quantum tech supplants everything that came before; rather, what is emerging is a slow but steady approach toward unprecedented levels of performance.


The real stakes lie in the ability to redesign business models and understand when and how the data processed by a quantum computer can open the door to discoveries and solutions hitherto out of reach. It is like having a new tool capable of modeling aspects of the physical world otherwise unmanageable: not necessarily faster, but different and complementary. It is important not to focus solely on numerical comparisons with classical supercomputers, but to fully understand the strategic and competitive implications: where will this exponential error correction capability, this extended coherence, and this ability to reduce the gap between theoretical potential and practical implementation lead us? For businesses, figuring out how to integrate or leverage quantum computing will be like learning a new language: it will require time, training, searching for partners and expert consultants, and above all, an open vision.


The deeper reflection lies in considering that quantum computing is not merely a race for raw power, but a step toward a different conception of computation. Companies that are already investing in understanding the significance of these advancements should not simply ask whether a particular technology is faster or more efficient, but rather how its approach to problems can highlight unexpected dynamics, new metrics of value, and strategic pathways not yet explored. This ability to reshape computational thinking—and not merely to surpass the runtimes of a classical machine—offers the potential for nontrivial competitive advantages and a more profound comprehension of the complex systems that companies confront. The evolution of Willow and similar devices should not be viewed as an isolated event, but rather as a process, a continuum of refinements, a gradual alignment of managerial thinking with new technological and intellectual coordinates. For those who know how to seize its benefits, the promise will not be a sudden jolt, but the acquisition of analytical tools capable of making the strategic fabric of enterprises more versatile, resilient, and open to a future still waiting to be interpreted.

 
