Willow can achieve in five minutes what current supercomputers would need ten septillion years to complete.
Google has introduced a groundbreaking quantum computer prototype that marks a significant milestone in computational capability. The new machine completes in just five minutes a benchmark computation that today's fastest supercomputers would need ten septillion years, a span vastly exceeding the age of the universe. The advancement, announced Monday, is powered by “Willow,” Google’s latest quantum computing chip.
The Willow chip represents a leap forward in quantum computing by doubling the number of qubits, or quantum bits, compared to its predecessor, Sycamore. This increase in qubit count has enabled significant progress in reducing quantum computing errors. Error reduction is critical to transitioning quantum computers from theoretical constructs to practical devices that can revolutionize scientific research and discovery.
| Willow System Metrics | |
| --- | --- |
| Number of qubits | 105 |
| Average connectivity | 3.47 (4-way typical) |
| **Quantum Error Correction (Chip 1)** | |
| Single-qubit gate error (mean, simultaneous) | 0.035% ± 0.029% |
| Two-qubit gate error (mean, simultaneous) | 0.33% ± 0.18% (CZ) |
| Measurement error (mean, simultaneous) | 0.77% ± 0.21% (repetitive, measure qubits) |
| Reset options | Multi-level reset (\|1⟩ state and above); leakage removal (\|2⟩ state only) |
| T₁ time (mean) | 68 µs ± 13 µs |
| Error correction cycles per second | 909,000 (surface code cycle = 1.1 µs) |
| Application performance | Λ₃,₅,₇ = 2.14 ± 0.02 |
| **Random Circuit Sampling (Chip 2)** | |
| Single-qubit gate error (mean, simultaneous) | 0.036% ± 0.013% |
| Two-qubit gate error (mean, simultaneous) | 0.14% ± 0.052% (iswap-like) |
| Measurement error (mean, simultaneous) | 0.67% ± 0.51% (terminal, all qubits) |
| Reset options | Multi-level reset (\|1⟩ state and above); leakage removal (\|2⟩ state only) |
| T₁ time (mean) | 98 µs ± 32 µs |
| Circuit repetitions per second | 63,000 |
| Application performance | XEB fidelity at depth 40 = 0.1% |
| Estimated time on Willow vs. classical supercomputer | 5 minutes vs. 10²⁵ years |
Quantum computers differ fundamentally from the classical computers used daily. While classical computers process information using transistors that toggle between “1” and “0,” quantum computers rely on subatomic particles manipulated into quantum states. These particles, known as qubits, leverage two key principles of quantum mechanics (a minimal code sketch follows the list):

- Superposition: a qubit can exist in a blend of the “1” and “0” states simultaneously, rather than holding only one value at a time.
- Entanglement: the states of multiple qubits can become correlated so strongly that they must be described as a single system, no matter how far apart the qubits are.
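To make superposition concrete, here is a minimal sketch assuming nothing more than NumPy (illustrative code, not Google's tooling): a qubit is modeled as a two-component complex state vector, a Hadamard gate puts it into an equal superposition, and measurement collapses it to an ordinary bit.

```python
# Minimal illustrative sketch: a qubit as a 2-component complex state vector.
import numpy as np

rng = np.random.default_rng(0)

ket0 = np.array([1, 0], dtype=complex)                        # the "0" state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                  # equal superposition of "0" and "1"
probs = np.abs(state) ** 2        # Born rule: probability = |amplitude|^2

# Measuring collapses the superposition to a classical bit.
samples = rng.choice([0, 1], size=1000, p=probs)
print(probs)                      # [0.5 0.5]
print(samples.mean())             # ~0.5: half the shots read 0, half read 1
```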
Despite their immense potential, quantum computers face significant challenges due to error rates. Qubits are highly sensitive and prone to interference from even minute disruptions, such as cosmic rays. Observing a qubit directly collapses its quantum state, turning it into a classical bit and negating its quantum properties. Consequently, error correction in quantum systems requires indirect observation through other qubits.
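The flavor of that indirect observation can be shown with a toy classical analogue, a three-bit repetition code (far simpler than the surface code Willow actually runs, and offered only as an illustration): parity checks between pairs of bits reveal where a flip occurred without ever reading out the encoded value itself.

```python
# Toy classical analogue of indirect error detection (NOT the surface code):
# a 3-bit repetition code whose parity checks locate a flipped bit
# without exposing the logical value being protected.

def encode(bit):
    return [bit, bit, bit]            # logical 0 -> 000, logical 1 -> 111

def syndrome(c):
    # Each check reports only "same or different" for a pair of bits,
    # so it pinpoints the error location, not the data.
    return (c[0] ^ c[1], c[1] ^ c[2])

def correct(c):
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(c))
    if flip is not None:
        c[flip] ^= 1                  # undo the single-bit error
    return c

word = encode(1)
word[0] ^= 1                          # inject one bit-flip error
print(syndrome(word))                 # (1, 0): checks localize the flip...
print(correct(word))                  # ...and decoding restores [1, 1, 1]
```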
As quantum computers increase in size, error rates have historically escalated, creating a bottleneck for scaling these systems. However, Google’s Quantum AI division has demonstrated an error correction method in which error rates fall exponentially as the qubit array grows: each step up in the size (distance) of the error-correcting code cut the logical error rate by a factor of roughly 2.14, the Λ figure reported in the table above. This breakthrough represents a pivotal achievement in building scalable, practical quantum computers; the sketch below shows the arithmetic.
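As a back-of-envelope reading of that table figure, and assuming a purely illustrative starting error rate at distance 3, the suppression compounds like this:

```python
# Sketch of exponential error suppression using the published factor
# Lambda ≈ 2.14 per step in surface-code distance (3 -> 5 -> 7 -> ...).
LAMBDA = 2.14
eps_d3 = 3.0e-3   # assumed illustrative logical error rate at distance 3

for d in (3, 5, 7, 9, 11):
    steps = (d - 3) // 2   # each +2 in distance divides the rate by Lambda
    print(f"distance {d:2d}: ~{eps_d3 / LAMBDA**steps:.1e} errors per cycle")
```

Each added layer of qubits, in other words, buys more than a 2× reduction in logical errors, which is why the suppression compounds exponentially rather than merely linearly.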
While Google’s advancements in quantum error correction bring the field closer to realizing its transformative potential, substantial challenges remain. Scaling quantum computers to operational sizes will require further innovation. Nonetheless, the long-term possibilities are profound.
Quantum computers are expected to drive breakthroughs in materials science and biology by enabling calculations at molecular and atomic levels that are infeasible for classical computers. Additionally, their applications may extend well beyond today’s known use cases, unlocking opportunities yet to be imagined.
Google’s unveiling of the Willow chip and its associated advancements in quantum error correction represents a monumental step forward in quantum computing. While practical, large-scale quantum computers are still on the horizon, this milestone underscores the technology’s potential to reshape scientific discovery and address computational challenges that are currently insurmountable.