Willow and The Race to Quantum Advantage (Feat. Kris Kaczmarek)
The future is closer, but more expensive than you thought
Reading time: 10 minutes.
After the Google Willow announcement and my interview with Semiqon on silicon spin qubits, I wanted to share a quick state of play on quantum.
Re Willow, my two main takeaways:
First, Google hit a major milestone by getting error rates below the fault-tolerance threshold, which is crucial for quantum error correction (QEC). This isn’t just about fixing errors—it’s about creating a scalable way to build logical qubits that can handle long, complex computations. Most quantum systems struggle because error rates grow as the system gets bigger. But staying below this threshold flips the script: more physical qubits actually lead to exponentially fewer logical errors. That’s a big deal for making quantum processors practical at scale. That said, the error rates achieved by Willow depend a lot on specific hardware factors like how long qubits stay stable, how precise the gates are, and how well the system avoids environmental noise. Expanding this success to other quantum platforms will need major advances in materials and better error-correcting code designs.
Second, Google’s results show that raw qubit counts are becoming a pretty meaningless metric for evaluating quantum computers. It’s no longer about how many qubits you have; it’s about how good and useful they are. For example, 10,000 low-quality qubits might not hold a candle to 100 high-quality logical qubits. This shift encourages the industry to focus on real applications and the actual performance of quantum systems. The challenge, though, is that we still don’t have a clear, standardized way to measure and compare systems, something like FLOPS in classical computing. Until we do, it’s hard for end-users to judge what’s actually useful.
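One aside on that first takeaway before the interview. Here’s a minimal back-of-the-envelope sketch in Python of why “below threshold” matters so much; the distance-3 error rate and the suppression factor are placeholders I picked for illustration, not Willow’s published figures. The point is the shape of the trade: below threshold, logical errors fall exponentially with code distance while the physical-qubit cost only grows quadratically.

```python
# Back-of-the-envelope sketch of sub-threshold error suppression in a
# surface code. Illustrative placeholder numbers, not Google's published figures.

def logical_error_rate(eps_at_d3: float, lam: float, distance: int) -> float:
    """Extrapolate logical error per cycle from the distance-3 value,
    assuming each d -> d+2 step suppresses errors by a factor lam."""
    steps = (distance - 3) // 2
    return eps_at_d3 / (lam ** steps)

eps_d3 = 3e-3   # assumed logical error rate at distance 3 (placeholder)
lam = 2.0       # assumed suppression factor per distance step (placeholder)

for d in (3, 5, 7, 11, 15, 25):
    physical = 2 * d * d - 1   # data + measure qubits in one surface-code patch
    eps = logical_error_rate(eps_d3, lam, d)
    print(f"distance {d:2d}: ~{physical:4d} physical qubits, logical error/cycle ~ {eps:.1e}")
```

Above threshold the same exercise runs in reverse: adding qubits makes the logical qubit worse, which is why crossing that line is the milestone.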
I had more questions, though, and I couldn’t think of anyone better to ask than Kris, a quantum physicist turned VC. Read on if you like quantum. Or context. Or the interview format. Lots to like.
3 things I learned
The Manufacturability Paradox: While established silicon manufacturing seems like an advantage for certain quantum approaches, it might actually be harder to improve these highly optimized processes than to develop entirely new ones. Some "lower quality" manufacturing processes for ion traps are performing better because the precision requirements aren't as extreme. Feels counter-intuitive to me. Is this the actual contrarian bet we are all looking for?
The Talent Misallocation Problem: A surprising bottleneck in quantum progress isn't a lack of talent, but rather how it's being used. Many brilliant minds who could be working on breakthrough algorithms are instead being sent to conferences and sales meetings - a result of VC funding pressures to show constant progress. We all sort of know this is true, but it seems fixable. Spicy take.
Winner-Takes-All Dynamic: Despite all the different approaches, the first company to demonstrate clear quantum advantage in any practical application could trigger a rapid consolidation of talent and capital around their approach. Huge if true, because the story then becomes first to market trumps better tech. It’s the path dependency story. Bullish for superconducting and trapped ions, probably.
Implications for VCs
The End of Full-Stack Bets: Building a billion-dollar full-stack quantum computing startup is even harder now than it was a few years ago. Instead, capital-efficient opportunities exist in focused $100M+ hardware plays, algorithm development, and systems integration. The key is finding companies that can demonstrate advantage in specific verticals (especially chemistry and materials science) without requiring massive capital deployment. Phasecraft, Qoro Quantum, and Haiqu spring to mind.
Founder Profile: While early quantum companies were led by brilliant researchers focused on qubit counts and coherence times, the next wave needs pragmatic executives who can translate quantum advantage into business value. Look for founders who prioritize clear engineering and commercial milestones over purely academic achievements and can communicate quantum's value in specific business terms.
Milestones: We still don’t have clear technical milestones that translate into practical performance; even logical qubit counts aren’t a useful measure on their own. There is no equivalent of a raw FLOPS metric that would translate into expected performance to benchmark against. Look for new business-value milestones built around specific algorithms for specific applications like chemistry or molecular dynamics.
Hey Kris, you have a background in quantum memory, and I've always wondered whether we need quantum memory to go along with quantum processors. Well, do we?
Well, different people mean different things by quantum memory. There are 3 main types:
Quantum RAM - an addressable memory for quantum states. This is at a very low TRL (technology readiness level), with no good implementation proposals yet.
Photonic memories for quantum communications - these are "quantum repeaters" needed because signals attenuate in optical fiber and can't be amplified.
Short-lived quantum memories for enhancing photonic quantum computing - this is what we developed at Oxford.
The key issue with photonic quantum computing is that basic operations are probabilistic. If you’re going to make a crude comparison to other platforms, in conventional photonic approaches you're looking at base fidelities of 1-10% compared to 99.9% in ion traps. This means you need extensive error correction just to match ion trap performance before adding traditional error correction.
Given those low fidelities, how do you address that challenge?
Companies like Sparrow Quantum are developing deterministic photon gates, and at ORCA we made significant progress on multiplexing - trying many times and selecting successful outcomes. PsiQuantum does this over space with many photon sources and a switching network. With our quantum memories, we could do it over time by buffering successful photons for synchronized release.
So essentially it acts as a cache memory?
Exactly. We developed a practical way to build this photonic quantum cache memory.
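Quick aside from me to make the multiplexing and cache idea concrete. A toy model, with numbers I made up rather than ORCA’s or PsiQuantum’s: if the elementary photonic operation succeeds with some small probability p, repeating it N times, across parallel sources in space or across time slots buffered in a memory, makes at least one success very likely.

```python
# Toy model of multiplexing a probabilistic photonic operation.
# p is an assumed per-attempt success probability in the 1-10% range
# quoted above; all numbers are illustrative, not any company's specs.

def multiplexed_success(p: float, attempts: int) -> float:
    """Probability that at least one of `attempts` independent tries succeeds."""
    return 1.0 - (1.0 - p) ** attempts

p = 0.05
for n in (1, 10, 50, 100):
    print(f"{n:3d} attempts -> success probability {multiplexed_success(p, n):.3f}")
```

Spatial multiplexing pays for those attempts with many parallel sources and a fast switching network; temporal multiplexing pays with a memory that holds a successful photon until the rest of the circuit is ready, hence the cache analogy.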
So with such poor fidelities compared to other approaches, what's the argument for pursuing photonic quantum computing at all?
Good question. The key advantage is scaling and networking. Photons can travel long distances over room-temperature fiber using existing infrastructure. Even other platforms like IonQ's ion traps plan to use photonic links to connect their quantum computers. There's broad consensus that photonics will be essential for networking quantum computers. So the claim is: why not start with photonics in the first place?
Right, so why not start and end with light. I get it. So what is PsiQuantum's billion-dollar bet on this approach?
They're betting on multiplexing and manufacturing scale - using existing photonic fab facilities to put many components on a chip. But reality is proving tricky. Even one basic resource state needs millions of components, and the chips are very lossy. They switched from silicon photonics to silicon nitride for lower loss, but it's hard to achieve both low loss and high switching speeds.
Re manufacturing advantages, my crude take is that some approaches can use existing semiconductor manufacturing while others need to invent their manufacturing too, the old two Hail Marys. Could we categorize approaches as "easy" vs "hard" to manufacture?
It's more complex than that. While we have silicon processes, quantum requirements demand 10x better performance and tighter tolerances. Sometimes improving an optimized process is harder than developing a new one. Ion trap researchers argue their "lower quality" new processes actually perform better because the precision requirements aren't as extreme.
So manufacturability might be a red herring - you still need extreme precision either way?
Exactly. All quantum platforms need millions or billions of high-quality qubits. The debate is: do you make something scalable quantum, or make something quantum scalable? Ion traps have perfect qubits and need manufacturing solutions. PsiQuantum has great manufacturing but needs quantum improvements. So far, starting with good quantum properties and improving manufacturability has shown faster progress.
That makes intuitive sense: manufacturing is “easier” than science. Given the billions being invested, what's the main limiting factor preventing faster progress - talent, technical challenges?
There are two aspects. On hardware, it's mainly time and money - fab runs are expensive and time-consuming. We have enough engineering talent from classical photonics and semiconductors. But on the software/algorithm side, we're mismanaging talent. Instead of letting brilliant minds focus on breakthrough ideas, we're sending them to conferences and sales meetings. It’s not an easy thing to manage, but you need to let scientists be scientists and incentivise experimentation.
That seems like an obvious problem to fix - why haven't companies tried concentrating talent on core research?
It's largely due to the generalist VC funding model, which is great for scaling but not for breakthroughs. Public funding helps, but needs better incentives than the typical "give us more money next time" approach. While SpaceX shows private funding can work, few founders can plan and execute breakthroughs with such clarity. Most companies end up chasing quick wins to show progress for investors.
Naming no names lol. Okay, looking at earlier companies like Rigetti and D-Wave - why didn't they focus more on making their systems easier to program?
They deserve credit for creating the foundational software stack, OS, and simulators before the hardware existed. At the time, you needed to be full-stack to make the thing work. Today, a quantum startup with $1 billion in the bank would likely focus solely on qubit performance and scaling. So their work compounded, and I don’t think you can call them anything but successes.
That’s interesting, they created more value than they captured. Talking of capturing value, wouldn’t it make more sense to start with a specific application that's impossible on classical computers, like Etched did with their real-time generative gaming demo, and work backwards?
That approach could work if you know exactly what you need - say 100 logical qubits with specific fidelity. Microsoft took this path, determining only topological qubits would satisfy their application requirements and setting out to invent them. But few startup founders can articulate such a clear 10-year vision with specific milestones and funding needs.
What application do you think will demonstrate quantum advantage first?
Quantum chemistry and materials science are the most likely candidates. Most hardware companies are converging on this space - you can see it, for example, in PsiQuantum's and others’ algorithm papers and partnerships.
That could have massive downstream effects - from fusion to AI. Do you see quantum as more of a catalyst than a standalone market? Maybe it’s like trying to define the CMOS market size?
Yes, though many still think of quantum computers like next-generation laptops. More realistic comparisons are to supercomputers, but the NVIDIA analogy is interesting - GPUs were built for gaming graphics, then someone realized neural networks use similar math. Quantum might follow this pattern - built for chemistry but enabling unexpected breakthroughs. The types of computation you can do with these machines will be difficult to anticipate. This is why it does sort of make sense to get the machines out into the world so people can figure out what they want to build.
In which case, why haven't we seen more accessible development tools and open-source initiatives?
Companies are trying - Rigetti and D-Wave released open-source software, and the Unitary Fund supports open-source quantum software development. But the hardware is presently too weak to demonstrate real advantage. We need one clear win - solving a specific chemistry problem better than classical computers - to catalyze the field, and we think it’s coming soon.
Between superconducting, trapped ions, silicon spin, and photonics - which might achieve that first win?
Superconducting became popular and showed some quick wins because it's easier to fabricate and iterate quickly. While there are scaling concerns due to qubit size, they could achieve the first useful demonstrations (see Google's Willow announcement last week). D-Wave is already showing interesting results in condensed matter theory that can be benchmarked against classical supercomputers. But until one of them shows undisputed quantum advantage on a useful problem, we can’t tell who will win.
If someone definitively demonstrates quantum supremacy, could we see a "winner-takes-all" scenario where all resources flow to one approach?
Likely. While people talk about different platforms for different uses, the first to show real advantage will attract massive investment. The solution will be highly optimized across hardware, middleware, and algorithms. From a buyer's perspective, why back alternatives when something works? National interests might maintain some diversity, but one approach will likely dominate.
So speed to market might matter more than having the theoretically best approach?
Exactly. My biggest lesson moving from academia to VC is that technology isn't everything. With enough funding and business strength, you can buy or incorporate better technology later. That's why we focus on founders who combine urgency with clear direction. Many technical founders are too exploratory, while the winners have a clear roadmap and keep moving forward.
Final thoughts on funding and metrics - we're moving away from simple qubit counts, but what's replacing that?
We lack good standardized metrics. Logical qubit count seemed promising but can be misleading - a logical qubit with 50% error rate isn't useful. We need to measure both raw capabilities (like 5-nines fidelity on 2-qubit gates) and practical performance. It's similar to the FLOPS vs. actual performance debate in classical computing.
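An aside on how a raw capability number might translate into practical performance. This is my own crude conversion, not a standard benchmark: if each two-qubit gate fails with probability p and errors simply accumulate, a circuit of roughly ln(2)/p gates is about the most you can run before a failure becomes more likely than not.

```python
# Crude conversion from per-gate error rate to usable circuit size, assuming
# errors accumulate multiplicatively. Illustrative reasoning, not a benchmark.
import math

def max_gates_for_half_success(gate_error: float) -> int:
    """Largest gate count n with (1 - gate_error) ** n >= 0.5."""
    return int(math.log(0.5) / math.log(1.0 - gate_error))

for error in (1e-2, 1e-3, 1e-5):   # 99%, 99.9%, and "5 nines" gate fidelity
    print(f"gate error {error:.0e} -> roughly {max_gates_for_half_success(error):,} gates")
```

Real machines also suffer measurement and idle errors, so treat this only as intuition for why the jump from three to five nines matters, and why error correction is needed to go further.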
Let’s talk shop: what opportunities remain for new quantum startups, given the billions already invested in major players?
Full-stack quantum computing startups face tough odds against established players. But there are opportunities in focused approaches: developing specific chips for $100M instead of $1B, systems integration, combining algorithms with hardware for specific use cases, or pure algorithm development which mainly needs smart people and modest funding. New business models are exciting as well, for example foundries. Plus, quantum sensing and other applications beyond computing remain interesting and less crowded.
Thanks Kris, people should follow you on LinkedIn probably.
Note: I went off to look more closely into quantum sensing this week and will report back. Why? Because NV diamond quantum sensors might, one day, enable non-invasive brain-computer interfaces, that’s why. And because I haven’t covered BCI yet in this newsletter. And I have thoughts.