Interview with Google's Quantum AI team: why Willow is a landmark breakthrough
Speaking of quantum computing, the American theoretical physicist Richard Feynman once made the memorable remark, "Nature is not classical, and if you want to model nature, you'd better make it quantum mechanical."
For the past 30 years, quantum computing has faced a fundamental challenge: as the number of qubits increases, the error rate rises dramatically. That seemingly insurmountable chasm is now finally being bridged.
In October 2019, Google published a quantum computing breakthrough in the journal Nature: its superconducting quantum chip Sycamore took only 200 seconds to complete a calculation that would have taken the world's fastest supercomputer 10,000 years. At the time, Google CEO Sundar Pichai compared it to the "Wright Brothers' 12-second maiden flight".
On December 10, 2024, Google published results for its latest quantum chip, Willow, in Nature. The work once again achieves milestone breakthroughs in two main areas:
- First, Willow achieves an exponential reduction in error rate as the number of qubits increases. By scaling the encoded qubit arrays from 3×3 to 5×5 to 7×7, the logical error rate was roughly halved at each step. This had been a formidable challenge for the field ever since Peter Shor introduced quantum error correction in 1995.
- Second, and more widely discussed, is its breakthrough in computing power. On the Random Circuit Sampling (RCS) benchmark, Willow completed in under five minutes a calculation that would take today's fastest supercomputer, Frontier, 10^25 years, that is, 10,000,000,000,000,000,000,000,000 years. To put that number in perspective, Google says it far exceeds the age of the universe.
Techwalker participated in Google's video briefing, where Hartmut Neven, founder and head of Google Quantum AI, said: "When we founded the Google Quantum AI team in 2012, the vision was to build a useful, large-scale quantum computer that leverages quantum mechanics as we know it today, nature's 'operating system', to drive scientific discovery, develop useful applications, and tackle some of society's key challenges."
Google's Quantum AI team is led by Hartmut Neven, and Google has built a dedicated quantum chip fabrication facility in Santa Barbara as part of its deep investment in the field.

Interestingly, Julian Kelly, Google's Director of Quantum Hardware, noted at the briefing that Google's earlier quantum chip, Sycamore, was built in a shared cleanroom at the University of California, Santa Barbara, where Google's lab was announced in 2013. Willow, by contrast, was produced at Google's own dedicated superconducting chip fabrication facility, which gives researchers more tools and more powerful capabilities, allows better control of manufacturing process parameters, and improves yield and consistency.
"You can think of Willow as essentially inheriting all the best aspects of Sycamore, but achieving an even bigger milestone."Julian Kelly said.
Exponential quantum error correction: below threshold
Quantum bits (qubits), the basic units of computation in quantum computers, are very "unstable": they tend to lose information through interaction with their surroundings, and typically the more qubits you use, the more errors accumulate. Errors are therefore one of the biggest challenges facing quantum computing.
But Google turned this around with Willow: as more qubits are used, errors drop dramatically instead. Google tested larger and larger arrays of physical qubits, expanding from a 3×3 grid of encoded qubits to a 5×5 grid and then a 7×7 grid, and the logical error rate was cut roughly in half with each expansion. In other words, Google achieved an exponential reduction in error rates.
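To make that scaling concrete, here is a minimal sketch (in Python, with assumed numbers rather than Google's measured values) of how a logical error rate that is halved with every expansion of the grid translates into exponential suppression:

```python
# Illustrative sketch: exponential suppression of the logical error rate as the
# encoded grid grows (3x3 -> 5x5 -> 7x7). The starting rate and suppression
# factor are assumptions for illustration, not Google's measured values.

def logical_error_rate(distance: int, base_rate: float = 3e-3, suppression: float = 2.0) -> float:
    """Error rate for a distance-d code, assuming each step from d to d+2
    suppresses the error by a constant factor ("halved at each step")."""
    steps = (distance - 3) // 2          # expansions beyond the 3x3 grid
    return base_rate / (suppression ** steps)

for d in (3, 5, 7):
    print(f"{d}x{d} grid: ~{logical_error_rate(d):.1e} logical errors per cycle")
```

Because the rate shrinks by a constant factor per step, each further expansion drives the error down by another factor, which is exactly why operating below threshold matters.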

A brief explanation: quantum error correction groups a number of physical qubits and makes them work together so that errors can be detected and corrected. Such a group, whether a 3×3, 5×5, or 7×7 array, encodes a single "logical qubit".
In this simplified picture, the physical qubit at the center stores the actual quantum information (the data qubit), while the eight surrounding physical qubits act as auxiliary qubits (also called ancilla or syndrome qubits). A 3×3 arrangement therefore stores only one qubit's worth of information, but it protects that information from being corrupted by environmental noise.
It is like shipping a fragile item (the quantum information): the center holds the fragile item itself (the data qubit), and the eight surrounding positions are the packing foam (the auxiliary qubits). Nine spots are used, but only the one in the center carries the actual cargo; the "packing foam" simply makes the shipment more secure and reliable.
This explains why quantum computers need so many physical qubits, and why the number of logical qubits actually available for computation is much smaller than the raw qubit count. For example, to store 10 qubits of information, a 3×3 scheme would need 90 physical qubits (10×9), a 5×5 scheme would need 250 (10×25), and a 7×7 scheme would need 490 (10×49). This redundancy is necessary because it guarantees the reliability of quantum computation.
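The overhead arithmetic in the paragraph above can be reproduced directly; the helper below is an illustrative sketch under the article's simplified assumption that one d×d grid encodes one logical qubit:

```python
# Reproduces the overhead arithmetic above, assuming (per the simplified
# picture) that each logical qubit uses one grid_size x grid_size array.

def physical_qubits_needed(logical_qubits: int, grid_size: int) -> int:
    return logical_qubits * grid_size ** 2

for grid in (3, 5, 7):
    n = physical_qubits_needed(10, grid)
    print(f"{grid}x{grid} scheme: {n} physical qubits to hold 10 logical qubits")
```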
"We're hoping that as these sets get bigger and bigger, the error correction gets better and better so that the quantum bits get more and more accurate. The problem is that as these things get bigger and bigger, there's more and more opportunity for error, so we need equipment that's good enough so that as we make these things bigger and bigger, the error-correction capability can overcome these additional errors that we're introducing into the system." Michael Newman, a research scientist at Google Labs, said at the briefing.
Google says this goal had gone unfulfilled for 30 years, until Willow's breakthrough: each time the logical qubit grows in size, from 3×3 to 5×5 to 7×7, the error rate drops again, so the overall trend is exponential suppression.
It is like stacking building blocks: normally, the higher the stack, the more likely it is to topple. Google's result means the blocks can not only be stacked higher, but actually become more stable as they rise. This strongly suggests that practical, very large quantum computers can indeed be built.
This breakthrough is known in the industry as operating "below threshold": the ability to increase the number of qubits while decreasing the number of errors. In the Nature paper, the researchers write: "While many platforms have demonstrated different features of quantum error correction, no quantum processor has so far explicitly demonstrated below-threshold performance."
"There's really no point in doing quantum error correction if it's not below threshold, and that's really the key factor in realizing this technology in the future." Julian Kelly added: "The quality of the quantum bits themselves has to be good enough for error correction to take place, and our error correction demonstrations show that at the integrated system level, where everything works at the same time, it's not just a question of the number of quantum bits, or the T1 or double-quantum-bit error rates. This is one of the reasons why this challenge has been so difficult to solve for so long."
"Willow brings us closer to running practical, business-relevant algorithms that can't be replicated on traditional computers." Hartmut Nevin said.
5 minutes to complete a calculation compared to 10^25 years for Frontier
To measure Willow's performance, Google used the Random Circuit Sampling (RCS) benchmark. "Pioneered by Google's Quantum AI team and now widely used as a standard in the field, RCS is the classically hardest benchmark that can be run on a quantum computer today," Hartmut Neven explained.
Specifically, RCS is used to demonstrate the rapidly growing gap between quantum and classical computers, and to show how quantum processors are pulling away at a double-exponential rate and will continue to outperform classical computers as qubit counts grow. It involves generating random quantum circuits (sequences of quantum gates applied to qubits in a seemingly arbitrary order), running them, and measuring their outputs.
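For readers unfamiliar with the benchmark, the sketch below builds and simulates a tiny random circuit with plain NumPy. It is a toy illustration of the idea, not Google's RCS implementation, and the qubit count, depth, and gate choices are arbitrary assumptions:

```python
# Toy illustration of random circuit sampling with plain NumPy: build a small
# random circuit (random single-qubit gates interleaved with CZ gates), apply
# it to |0...0>, and sample bitstrings from the output distribution.
import numpy as np

rng = np.random.default_rng(0)
n_qubits, depth = 4, 8
dim = 2 ** n_qubits
CZ = np.diag([1, 1, 1, -1]).astype(complex)       # two-qubit controlled-Z gate

def random_unitary_2x2():
    """Random 2x2 unitary via QR decomposition of a random complex matrix."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(m)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def single_qubit_layer():
    """Independent random single-qubit gates on every qubit."""
    layer = np.array([[1.0 + 0j]])
    for _ in range(n_qubits):
        layer = np.kron(layer, random_unitary_2x2())
    return layer

def cz_on(i):
    """CZ between neighboring qubits i and i+1 in the tensor ordering."""
    return np.kron(np.kron(np.eye(2 ** i), CZ), np.eye(2 ** (n_qubits - i - 2)))

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                                    # start in |0000>
for layer in range(depth):
    state = single_qubit_layer() @ state          # "seemingly arbitrary" gates
    state = cz_on(layer % (n_qubits - 1)) @ state # entangle a neighboring pair

probs = np.abs(state) ** 2
probs /= probs.sum()                              # guard against rounding drift
samples = rng.choice(dim, size=10, p=probs)
print([format(s, f"0{n_qubits}b") for s in samples])
```

At 4 qubits this is trivial to simulate; the point of RCS is that the same sampling task becomes astronomically expensive for classical machines once circuits are deep and qubit counts reach the scale of chips like Willow.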
As noted at the start of this article, Willow's performance on the RCS test was astonishing: it completed in under five minutes a computation that would take Frontier, today's fastest supercomputer, 10^25 years. "It confirms the idea that quantum computation occurs in many parallel universes, which is consistent with the notion that we live in a multiverse, first proposed by David Deutsch," Hartmut Neven said.
Figure: Computational cost is strongly influenced by available memory. Google's estimates therefore consider a range of scenarios, from the ideal case of unlimited memory to more practical implementations that can run in parallel on GPUs.
Asked at the briefing how close we are to seeing quantum computers in real-world applications, Hartmut Neven pointed to drug discovery, fusion reactors, fertilizer production, quantum machine learning, and electric vehicle batteries.
On drug discovery: "About 75% of small-molecule drugs are metabolized by the enzyme P450, which is essentially a hurdle that small-molecule drugs must get past. The enzyme is not yet fully understood, and quantum computers are expected to model it much better. Google is working on this application, attempting to understand the P450 enzyme complex with a quantum computer."
On machine learning: "AI is everywhere now, but it's important to recognize that there are many fundamental computational problems, such as hard optimization problems or integer factorization, that can't be solved by learning alone, because you would need huge amounts of training data. That's where quantum computers can help."
Charina Chou, Director and COO of Google Quantum AI, added: "AI today mainly means machine learning, which requires large numbers of training samples. The phenomenal success of ChatGPT, for example, rests on the huge amount of available training data. Quantum computing can help here too. Google already has work under way in this area that will give us algorithms able to extract more value from magnetic resonance imaging (MRI) and nuclear magnetic resonance (NMR). These new quantum algorithms can act as an atomic ruler, giving very precise distances between the nuclei in a molecule. So quantum computing can help collect otherwise inaccessible training datasets, which is another important connection to AI."
In addition, Charina Chou pointed out that "the greatest opportunity to simulate nature may lie in quantum mechanical systems", and that Google is collaborating with many large companies, academic institutions, and startups in the fields of physics, chemistry, and materials science to explore scenarios where quantum computing can be used in various fields.

Systems engineering is key
For Hartmut Neven, systems engineering is the key to designing and fabricating a quantum chip: all of the chip's components, such as single- and two-qubit gates, qubit reset, and readout, must be designed carefully and integrated simultaneously. If any component lags, or if two components don't work well together, overall system performance suffers.
"Maximizing system performance therefore cuts across all aspects of our process, from chip architecture and fabrication to gate development and calibration. willow achieves results by evaluating quantum computing systems holistically, rather than one factor at a time."
Willow currently shows best-in-class performance on both of these system benchmarks (quantum error correction and random circuit sampling). In addition, Willow's T1 time (a measure of how long a qubit can retain an excitation, a key quantum computing resource) is close to 100 µs (microseconds), roughly a 5-fold improvement over the 20 µs of the Sycamore chip.
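As a rough back-of-the-envelope illustration (not a figure from Google), the fraction of an excitation that survives a circuit of duration t decays roughly as exp(-t/T1), so a longer T1 directly buys more usable circuit depth. The 10 µs circuit time below is an assumed value:

```python
# Back-of-the-envelope: the probability a qubit still holds its excitation
# after time t decays roughly as exp(-t / T1). Circuit duration is assumed.
import math

circuit_duration_us = 10.0                        # assumed circuit time, microseconds
for label, t1_us in [("Sycamore, T1 ~ 20 us", 20.0), ("Willow, T1 ~ 100 us", 100.0)]:
    survival = math.exp(-circuit_duration_us / t1_us)
    print(f"{label}: ~{survival:.0%} of the excitation survives")
```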
For those who want to evaluate quantum hardware and compare it across platforms, Google provided a chart of key specifications:

Chart: Willow's performance on a number of metrics
When asked, "From the 53-qubit Sycamore in 2019, to the new results of the 105-qubit Willow now, Google's technological route on quantum computing seems to be more focused on quality than quantity, does this mean that the industry's general pursuit of a 'more quantum bits' route needs to be adjusted?" This question when Hartmut Nevin told TechWalker:
Quantum computers require both quantity and quality. Simply increasing the number of qubits is not enough, because if the error rate is too high, those qubits cannot be used effectively. It is like a high-spec computer that crashes constantly: the configuration doesn't matter if it can't run reliably.
If a quantum computer's gate operations have an error rate of one in a thousand, then after about a thousand operations the system will most likely have suffered an error. In practice, each qubit needs to perform at least ten gate operations, so a system with 100 qubits runs on the order of a thousand operations, and the per-gate error rate needs to be held to roughly one in 100,000 for the computation to stay reliable.
By contrast, some other designs claim thousands of qubits but have error rates as high as 1 in 50 or 1 in 200; in that case it is impossible to use them all at once before the whole computation collapses. "That's why Google chose to focus on improving the quality of the qubits first, because increasing the quantity only makes sense once the quality problem is solved."
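The quality-versus-quantity argument above can be sanity-checked with simple arithmetic: assuming each of N gate operations fails independently with probability p, the chance of an error-free run is roughly (1-p)^N. The sketch below uses the article's figures of 100 qubits and at least ten gates per qubit; it is my own illustrative arithmetic, not Google's published analysis:

```python
# Sanity check of the error-budget argument: probability of an error-free run
# when each of N gate operations fails independently with probability p.

def success_probability(gate_error: float, n_operations: int) -> float:
    return (1.0 - gate_error) ** n_operations

n_ops = 100 * 10                    # 100 qubits, at least 10 gate operations each
for p in (1e-3, 1e-5):
    print(f"gate error {p:g}: ~{success_probability(p, n_ops):.0%} chance of no error in {n_ops} operations")
```

With a one-in-a-thousand gate error, roughly two runs in three already contain a mistake; pushing the error toward one in 100,000 is what keeps a 1,000-operation computation reliable.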
Google's research team says they are developing new techniques to scale up the system. Current efforts are focused on reducing the error rate so that it meets the requirements for quantum error correction. As the technology matures, the number of quantum bits will gradually increase.
Google's Quantum Computing Journey
So far, Google has conducted two different types of experiments for quantum computing.
On the one hand, it has run the RCS benchmark, which measures performance against conventional computers but has no known real-world application.
On the other hand, it has carried out scientific simulations of quantum systems, which have led to some new scientific discoveries but remain within reach of conventional computers.

Figure: Random Circuit Sampling (RCS), while challenging for conventional computers, has yet to demonstrate practical commercial applications.
At the video briefing, Google's Quantum AI team presented its quantum computing roadmap. Google says the roadmap focuses on unlocking the full potential of quantum computing by developing large-scale computers capable of complex, error-corrected computation, with milestones that lead toward high-quality quantum hardware and software for meaningful applications. As the chart shows, the roadmap contains six milestones, of which Google has completed two so far.
Google's Quantum Computing Roadmap
Reflecting on this quantum computing journey, Hartmut Neven wrote on Google's official website:
"My colleagues sometimes ask me why I left the burgeoning field of artificial intelligence to focus on quantum computing. My answer is that both technologies will prove to be the most transformative of our time, but advanced AI will benefit greatly from quantum computing. This is why I named our lab Quantum AI."
"Quantum algorithms have fundamental scaling laws, as we saw in RCS, and many of the underlying computational tasks critical to AI have similar scaling advantages. As a result, quantum computing will be essential for collecting training data that is inaccessible to traditional machines, training and optimizing certain learning architectures, and modeling systems where quantum effects are important. This includes helping us discover new drugs, designing more efficient batteries for electric cars, and accelerating progress in fusion and new energy alternatives.Many of the future game-changing applications won't work on traditional computers; they're waiting for quantum computing to unlock them."