If you’ve been paying even casual attention to tech news lately, you’ve probably noticed that quantum computing is having a moment. Again. Or maybe still? It’s honestly hard to tell at this point.
In just the past few months, we’ve seen a parade of breathless announcements: Google’s Willow chip claiming “quantum advantage” with an algorithm that runs 13,000 times faster than the best classical supercomputers. IBM unveiling its Quantum Nighthawk processor on a path to “quantum advantage by 2026.” Caltech building a record-shattering 6,100-qubit system. Microsoft, Amazon, and others rolling out their own quantum chips, each more impressive-sounding than the last. Quantum computing stocks have gone on wild roller-coaster rides, with some companies seeing share prices surge by over 3,000 percent in a year.
And then there’s the skepticism. In January 2025, Nvidia CEO Jensen Huang casually mentioned that quantum computing is still 15 to 30 years away from being “truly useful,” and quantum stocks promptly tanked. The term “quantum winter” started circulating again—that dreaded period when hype deflates, funding dries up, and everyone pretends they never believed in the technology in the first place.
So which is it? Are we on the verge of a quantum revolution that will crack encryption, design miracle drugs, and solve climate change? Or are we being sold another round of expensive vaporware by companies desperately trying to justify their billion-dollar valuations?
The answer, as with most things in quantum mechanics, is complicated. And maybe a little bit of both.
What Even Is Quantum Computing (And Why Should You Care)?
Let’s start with the basics, because quantum computing suffers from a serious marketing problem: most people have no idea what it actually is or what it does. Including, if we’re being honest, a lot of people hyping it.
Classical computers, like the one you’re probably reading this on right now, process information using bits that are either 1 or 0, on or off. Quantum computers use quantum bits, or “qubits,” which can exist in a state of superposition, meaning that (loosely speaking) they’re both 1 and 0 at once until you measure them. Qubits can also be entangled, meaning the state of one qubit is intrinsically linked to the state of another, regardless of the distance between them. Einstein famously called this “spooky action at a distance,” which should give you some sense of how weird this all is.
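If you like seeing ideas in code, here’s a minimal sketch of that behavior in plain Python and NumPy, simulating a single qubit on a classical machine (easy for one qubit, hopeless for thousands, which is rather the point). The state vector, the Hadamard gate, and the measurement routine are standard textbook linear algebra, not any vendor’s API.

```python
import numpy as np

rng = np.random.default_rng()

# A qubit's state is a 2-component complex vector: amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0  # amplitudes (1/sqrt(2), 1/sqrt(2))

def measure(state):
    """Measurement collapses the state: outcome k occurs with probability |amplitude_k|**2."""
    probs = np.abs(state) ** 2
    probs = probs / probs.sum()              # guard against floating-point rounding
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0
    return outcome, collapsed

# Before measurement the qubit is in both states at once; afterwards it is definitely one of them.
counts = {0: 0, 1: 0}
for _ in range(1000):
    outcome, _ = measure(superposed)
    counts[outcome] += 1
print(counts)  # roughly {0: ~500, 1: ~500}
```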
This quantum weirdness theoretically allows quantum computers to process massive amounts of information simultaneously, solving certain types of problems exponentially faster than classical computers ever could. The key phrase here is “certain types.” Quantum computers aren’t going to make your web browser load faster or help you edit photos more efficiently. They’re specialized tools for specialized problems.
The dream applications are genuinely exciting: simulating molecular structures for drug discovery and materials science, optimizing complex logistics networks, breaking modern encryption (which is terrifying if you’re a security professional), and modeling climate systems. These are all tasks that would take classical supercomputers thousands or millions of years to compute—or that are simply impossible with current technology.
The catch? Building a quantum computer that can actually do any of this reliably is one of the hardest engineering challenges humanity has ever tackled.
The Error Problem: Why Quantum Computers Are So Damn Fragile
Here’s the fundamental issue that’s plagued quantum computing since day one: qubits are incredibly fragile. Like, absurdly fragile. They’re sensitive to temperature fluctuations, electromagnetic interference, cosmic rays, and basically anything that exists in the physical universe. Even the act of measuring a qubit causes its quantum state to collapse.
This means that quantum computers generate errors. Lots of them. Way more than classical computers. A classical computer might have an error rate of one in a trillion operations. Early quantum computers had error rates closer to one in a hundred. You can’t build anything useful with those odds.
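To see how quickly a 1-percent error rate kills a computation, here’s a quick back-of-envelope check. The numbers are illustrative, not measurements from any specific machine:

```python
# Probability of running N operations with zero errors, given a per-operation error rate p.
def success_probability(p: float, n_ops: int) -> float:
    return (1.0 - p) ** n_ops

# Early quantum hardware: roughly 1 error per 100 operations.
print(success_probability(0.01, 100))      # ~0.37 -- barely a coin flip's chance of surviving 100 ops
print(success_probability(0.01, 10_000))   # ~2e-44 -- effectively zero
# A classical chip at ~1 error per trillion barely notices the same workload.
print(success_probability(1e-12, 10_000))  # ~0.99999999
```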
The solution is quantum error correction, which sounds straightforward until you realize it requires using hundreds or even thousands of “physical qubits” to create a single “logical qubit” that can actually perform reliable calculations. It’s like needing 1,000 employees to do the work of one person, just to make sure nobody makes a mistake. The overhead is staggering.
For years, this created a vicious cycle: as researchers added more qubits to increase computing power, error rates would climb even faster, wiping out any gains. It was like trying to build a skyscraper where every new floor you added made the foundation shakier. Many physicists wondered if quantum computers would ever be practical at all.
But here’s where things get interesting: in 2025, we’re finally starting to see that cycle break.
The Breakthroughs Are Real (Sort Of)
Let’s talk about what’s actually happened recently, because amid all the hype, there are some genuinely impressive developments.
Google’s Willow chip, announced in late 2024 and demonstrated further in October 2025, achieved something called “below-threshold error correction” with 105 qubits. This means that as they added more qubits, the error rate actually went down instead of up—a reversal of the trend that had plagued the field for decades. Their “Quantum Echoes” algorithm ran 13,000 times faster than the best classical algorithm on one of the world’s fastest supercomputers, and crucially, it’s “verifiable”—meaning another quantum computer can reproduce the results to confirm they’re real.
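The “below threshold” idea is easiest to see with the textbook surface-code scaling relation: the logical error rate shrinks exponentially as you grow the code distance d, provided the physical error rate p sits below a threshold p_th. The constants and rates below are illustrative placeholders, not Willow’s actual figures:

```python
# Textbook surface-code scaling: p_logical ~ A * (p / p_th) ** ((d + 1) / 2)
# Below threshold (p < p_th), bigger codes -- i.e., more physical qubits -- mean FEWER logical errors.
def logical_error_rate(p: float, d: int, p_th: float = 0.01, A: float = 0.1) -> float:
    return A * (p / p_th) ** ((d + 1) / 2)

for d in (3, 5, 7):  # each step up in distance costs on the order of d**2 physical qubits per logical qubit
    below = logical_error_rate(p=0.005, d=d)   # physical errors below threshold: improves with d
    above = logical_error_rate(p=0.015, d=d)   # above threshold: gets WORSE with d
    print(f"d={d}: below-threshold {below:.2e}, above-threshold {above:.2e}")
```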
This matters. A lot. As Google researchers put it, this is “the first time in history that any quantum computer has successfully run a verifiable algorithm that surpasses the ability of supercomputers.” Not a toy problem designed to make quantum computers look good, but an actual computation that models physical phenomena and can be cross-checked.
IBM, not to be outdone, announced its Quantum Nighthawk processor in November 2025, claiming it demonstrates “all the key processor components needed for fault-tolerant quantum computing.” The company has laid out an aggressive roadmap calling for quantum advantage by the end of 2026 and fault-tolerant quantum computing by 2029. They’ve also shifted to 300mm wafer fabrication, allowing them to double development speed while boosting the physical complexity of quantum chips by 10 times.
Meanwhile, Caltech researchers built a record-breaking array of 6,100 neutral-atom qubits that maintained superposition for 13 seconds—ten times longer than previous arrays. These qubits operate at room temperature (most quantum computers need to be cooled to temperatures colder than deep space), which could eventually make them much more practical to deploy. The fact that these qubits can be moved around opens the door to more efficient error correction.
Quantinuum, a quantum computing company formed from Honeywell’s quantum division, reported crossing what they call “the last major hurdle to deliver scalable universal fault-tolerant quantum computers by 2029.” They achieved the first fully fault-tolerant non-Clifford gate with logical error rates below physical error rates—geek speak for “we can now do useful quantum operations while detecting and correcting errors better than before.”
In June 2025, IBM partnered with Japan’s RIKEN research institute to use the IBM Quantum Heron processor alongside the Fugaku supercomputer to simulate molecules at a level beyond the ability of classical computers alone. IonQ and Ansys ran a medical device simulation on a 36-qubit computer that outperformed classical high-performance computing by 12 percent—one of the first documented cases of practical quantum advantage in a real-world application.
These aren’t just incremental improvements. They’re addressing the core challenges that have held the field back for decades.
The Money Is Pouring In (Maybe Too Much)
If you want to know whether smart people think quantum computing is real, follow the money. And boy, is there a lot of money moving around.
Quantum computing companies raised $3.77 billion in equity funding during the first nine months of 2025, nearly triple the $1.3 billion raised in all of 2024. National governments had invested $10 billion by April 2025, up from $1.8 billion in all of 2024. The global quantum computing market reached somewhere between $1.8 billion and $3.5 billion in 2025, with projections of $5.3 billion by 2029 and potentially $20.2 billion by 2030.
Stock prices for publicly traded quantum companies have been on an absolute tear. Shares have risen by more than 3,000 percent for some companies over the past year. JPMorgan Chase announced a $10 billion investment in strategic tech companies, including quantum computing. TIME Magazine named “Industry-wide Quantum Chip Advancements” as one of its Best Inventions of 2025.
But—and this is a big but—some of that money is starting to look suspiciously like a bubble.
Quantum Computing Inc. (QCI), at one point valued at over $2 billion despite having virtually no revenue, has been accused of egregious fraud and misrepresentation. The company touted revolutionary photonic quantum chips and partnerships with NASA, but investigative reports and lawsuits in late 2024 and early 2025 painted a much darker picture. One prominent quantum firm hit an $11 billion market cap on only $43 million in revenue—a price-to-sales ratio above 250. These aren’t valuations based on current reality; they’re bets on science fiction becoming science fact.
When JPMorgan’s $10 billion investment announcement hit, quantum stocks rose by roughly 20 percent—even though no specific investments were announced and quantum was lumped in with “frontier and strategic technologies.” The combined market cap increase exceeded the total investment amount. That’s not rational investing; that’s speculation fever.
The Skeptics Have a Point
So let’s talk about the elephant in the room: a lot of very smart people think quantum computing is overhyped, and they have some pretty good arguments.
Jensen Huang’s comment about quantum being 15 to 30 years away wasn’t pulled from thin air. The consensus among many physicists is that while small quantum computers exist and can perform some limited tasks, building truly useful, fault-tolerant quantum computers capable of solving real-world problems at scale remains extraordinarily difficult.
The challenges are daunting:
Qubit quality and quantity: Current quantum computers have dozens to hundreds of qubits. Most useful algorithms require thousands of logical qubits, which translates to millions of physical qubits once you account for error correction overhead. No one has a clear path to scale to that level.
Coherence time: Qubits need to maintain their quantum state long enough to perform calculations. On today’s hardware that window ranges from microseconds (superconducting chips) to seconds (trapped ions and neutral atoms), while complex calculations may need far more operations than fit inside it. It’s like trying to solve a math problem before the numbers fade from the page.
Temperature requirements: Most quantum computers need to operate at temperatures approaching absolute zero—about -273 degrees Celsius. Maintaining these conditions is expensive, energy-intensive, and makes quantum computers impractical for most applications. As IBM’s Oliver Dial noted, these machines belong in data centers “where we can give them the care and attention they need,” not on anyone’s desk.
Error correction overhead: Even with the recent breakthroughs, the overhead for error correction is enormous. You need hundreds of physical qubits to create a single reliable logical qubit, and scaling that up is an engineering nightmare; a rough back-of-envelope sketch of the arithmetic follows after this list.
The connectivity problem: Qubits need to interact with each other to perform calculations, but connecting them without introducing more errors is incredibly difficult. Different qubit technologies (superconducting, ion trap, neutral atom, etc.) have different connectivity constraints, and error correction codes optimized for one platform may not work well for another.
The speed problem: Error correction needs to happen fast—faster than new errors accumulate. A nanosecond gate operation is useless if error correction takes 100 microseconds. Current decoding speeds are still too slow for many applications.
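Here is roughly what those overhead and speed constraints look like when you multiply them out. The inputs (1,000 physical qubits per logical qubit, 2,000 logical qubits for a serious workload, a 1-microsecond error-correction cycle, a billion logical operations) are illustrative assumptions drawn from the ranges quoted above, not any vendor’s specification:

```python
# Back-of-envelope resource estimate for a fault-tolerant machine.
# All inputs are illustrative assumptions, chosen to match the rough ranges in the text.
LOGICAL_QUBITS_NEEDED = 2_000        # plausible scale for a "useful" algorithm
PHYSICAL_PER_LOGICAL = 1_000         # error-correction overhead
EC_CYCLE_SECONDS = 1e-6              # one round of syndrome measurement plus decoding
LOGICAL_OPS_NEEDED = 1e9             # a "GigaQuOp"-class computation

physical_qubits = LOGICAL_QUBITS_NEEDED * PHYSICAL_PER_LOGICAL
runtime_seconds = LOGICAL_OPS_NEEDED * EC_CYCLE_SECONDS

print(f"Physical qubits required: {physical_qubits:,}")                            # 2,000,000
print(f"Runtime at one logical op per cycle: {runtime_seconds / 3600:.1f} hours")  # ~0.3 hours
# And the decoder has to keep up: 2,000 logical qubits producing syndrome data a
# million times per second is a serious real-time classical computing problem in itself.
```

Two million physical qubits and a decoder that never falls behind: that is the scale of the engineering problem, and it is why nobody credible is promising it next quarter.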
Physicist Sabine Hossenfelder, known for her no-nonsense approach to cutting through scientific hype, has been particularly vocal in her skepticism. She points out that “exactly zero” practical quantum algorithms have been demonstrated in real-world conditions so far, and notes that no quantum computing company is currently making a profit.
A comprehensive analysis published in 2025 examined six potential future scenarios for quantum computing, from “fundamental scalability failures” to “transformative breakthroughs.” The most pessimistic scenario argues that the challenges aren’t just engineering problems but may represent fundamental physical limitations. The scaling challenges—cryogenic infrastructure, control system complexity, quantum memory requirements—might grow superlinearly, meaning each additional qubit makes the whole system exponentially more difficult to manage.
The Quantum Winter Question
This brings us to the specter of “quantum winter”—a period where the gap between promises and reality leads to a cooling of interest, investment, and enthusiasm.
It’s happened before in other fields. AI went through multiple winters when researchers couldn’t deliver on lofty promises. Nuclear fusion has been “30 years away” for about 70 years now. The nanotechnology hype of the early 2000s fizzled when people realized manipulating individual atoms was hard. Theranos and its fake blood-testing technology became a cautionary tale about what happens when hype runs too far ahead of reality.
Several factors are aligning that could trigger a quantum winter:
Hype outpacing reality: Expectations are growing faster than the technology itself. When companies promise revolutionary breakthroughs within a few years but fail to deliver practical applications, investors lose patience.
Market immaturity: The technology is still at the “laboratory curiosity” stage. Despite impressive demos, quantum computers aren’t solving real business problems at scale. They’re research tools, not products.
Talent shortage: Only one qualified candidate exists for every three specialized quantum positions globally. The field needs over 250,000 new quantum professionals by 2030. You can’t build an industry without people who understand the technology.
Standards and ecosystem gaps: A 2025 survey of over 300 quantum professionals found that 96 percent will rely on external support to implement quantum error correction, but a lack of training, best practices, and limited resources are slowing progress. The infrastructure to support widespread quantum computing simply doesn’t exist yet.
The fraud factor: Cases like Quantum Computing Inc. damage the field’s credibility. When investors realize they’ve been sold snake oil, they become suspicious of everyone.
Some observers think we’re already entering a quantum winter. Venture capital funding, while still high, started pulling back from some quantum startups in 2025. Gartner’s Hype Cycle for Deep Technologies places quantum computing back at the “Peak of Inflated Expectations,” suggesting that calling 2025 the “year of quantum” is at least three years premature for real, widespread business impact.
Others argue we shouldn’t worry. Technology hype cycles are natural and necessary—they drive innovation, attract talent, and force companies to make progress. As long as researchers continue delivering genuine technical advances, the field will survive the inevitable periods of disillusionment.
The Real Story: Progress Is Happening, Just Not How You Think
Here’s what I think is actually happening: quantum computing is making real, substantive progress, but that progress doesn’t match the breathless headlines or stock market valuations.
Fred Chong, a professor at the University of Chicago and chief scientist for quantum software at ColdQuanta, puts it well: “I think we’re very comfortably in the era of escape velocity. The quantum devices are fairly good, and error correction codes have gotten better. This means that building a big, useful quantum computer is no longer a physics problem but an engineering problem.”
That distinction matters. Physics problems require waiting for fundamental scientific breakthroughs that may or may not happen. Engineering problems require hard work, iteration, and resources, but they’re fundamentally solvable. We know how to engineer hard things. It just takes time and money.
The error correction breakthroughs we’ve seen in 2025 are genuinely significant. Researchers at the University of Osaka developed a much more efficient way to create “magic states”—a key component for useful quantum computing. Oxford physicists achieved the lowest-ever error rate for a quantum logic operation—just one error in 6.7 million operations. QuEra published algorithmic fault tolerance techniques that reduce quantum error correction overhead by up to 100 times.
These advances aren’t just laboratory curiosities. They’re moving timelines for practical quantum computing forward substantially. Steve Brierley, CEO of Riverlane, believes the first error-corrected quantum computer with around 10,000 physical qubits supporting 100 logical qubits and capable of a million quantum operations could arrive as soon as 2027. GigaQuOp machines should be available by 2030-2032, and TeraQuOp machines by 2035-2037.
IBM’s roadmap calls for fault-tolerant quantum computing by 2029. IonQ projects 1,600 logical qubits in 2028, 8,000 in 2029, and 80,000 in 2030. These aren’t pie-in-the-sky predictions; they’re based on concrete technical achievements and clear engineering pathways.
Meanwhile, applications are starting to emerge in narrow domains. Quantum computers won’t replace classical computers for most tasks, but they’re showing promise in specific areas: molecular simulation for drug discovery, optimization problems in logistics and finance, materials science, and certain AI applications. Quantum computing as a service (QCaaS) is becoming a viable business model, allowing companies to access quantum resources without building their own machines.
The Investment Perspective: Patient Capital Required
If you’re thinking about quantum computing as an investment—whether you’re a venture capitalist, corporate strategist, or individual investor—the key word is patience.
Biotech provides a useful analogy. Biotech startups aren’t expected to show customer traction on a seed round. Investors understand that developing drugs takes 10-15 years and billions of dollars. They invest in teams, science, and long-term potential, not immediate revenue.
Quantum computing needs similar patient capital. Companies promising transformative quantum applications within 3-5 years are either lying or delusional. Those making measured progress toward technical milestones over 8-10 year timelines might actually deliver.
The smart money is betting on:
Team quality over flashy claims: In a field this close to the research frontier, the best predictor of success is having brilliant, persistent people who actually understand the physics and the engineering.
Specific applications over general quantum computing: Companies focusing on particular problems—molecular simulation, quantum sensing, quantum communications—have clearer paths to market than those promising to revolutionize everything.
Error correction infrastructure: The companies building the tools, software, and platforms that make quantum error correction practical could capture enormous value regardless of which hardware approach ultimately wins.
Post-quantum cryptography: Quantum computers will eventually break current encryption standards. Companies preparing for that transition have near-term revenue opportunities.
Hybrid quantum-classical systems: The first practical applications will likely combine quantum and classical computing, using each for what it does best; a schematic sketch of that loop follows below.
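For the curious, here is what that hybrid loop typically looks like in skeleton form: a classical optimizer repeatedly calls a quantum subroutine that scores a candidate solution. Everything below is a generic variational-style sketch in plain Python with NumPy and SciPy; the quantum_expectation function is a stand-in for whatever a real quantum processor would evaluate, and none of it is tied to any specific vendor’s SDK.

```python
import numpy as np
from scipy.optimize import minimize

def quantum_expectation(params: np.ndarray) -> float:
    """Stand-in for the quantum half of the loop.

    On real hardware, this would prepare a parameterized circuit, run it many
    times, and return an estimated energy or cost. Here we fake it with a
    classical function plus shot noise so the control flow stays the same.
    """
    ideal = np.sum(np.sin(params) ** 2)          # pretend "energy landscape"
    shot_noise = np.random.normal(0.0, 0.01)     # finite-sample measurement noise
    return ideal + shot_noise

# The classical half: an ordinary optimizer proposes new parameters,
# the "quantum" subroutine scores them, and the loop repeats.
initial_params = np.random.uniform(0, np.pi, size=4)
result = minimize(quantum_expectation, initial_params, method="COBYLA",
                  options={"maxiter": 200})

print("Best parameters found:", np.round(result.x, 3))
print("Estimated minimum cost:", round(result.fun, 4))
```

The point is the division of labor: the quantum processor only handles the piece classical machines struggle with, and a perfectly ordinary classical loop does everything else.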
What you shouldn’t bet on: quick exits, mass consumer adoption, or quantum computers replacing classical computers anytime soon. This is a decade-plus play, minimum.
So, Are We Close to a Breakthrough?
Finally, we can answer the question: are we close to a breakthrough in quantum computing?
The answer is yes—kind of. It depends on what you mean by “breakthrough” and “close.”
If you mean “will quantum computers revolutionize everyday life in the next few years?”—then no. Your laptop won’t be quantum, your phone won’t be quantum, and quantum computers won’t be solving most of the problems regular computers handle just fine.
If you mean “are we approaching the point where quantum computers can reliably solve specific, valuable problems that classical computers cannot?”—then yes, we’re getting close. Maybe 3-7 years for narrow applications, 5-10 years for more substantial use cases, 10-20 years for transformative capabilities.
The key breakthroughs happening right now are in error correction, qubit quality, and coherence times. These advances are moving quantum computing from the “will this ever work?” phase to the “how do we engineer this at scale?” phase. That’s progress.
But here’s the thing: breakthroughs rarely look like breakthroughs while they’re happening. The invention of the transistor in 1947 didn’t immediately lead to smartphones. It took decades of iterative improvements to get from basic transistors to large-scale integrated circuits, and another few decades to put a computer in everyone’s pocket. The same is likely true for quantum computing.
We’re in the middle innings of a very long game. The physics works. The basic concepts are sound. The engineering challenges are being tackled systematically. Progress is real and accelerating. But anyone promising quantum miracles in the next couple years is selling something.
The Bottom Line: Cautious Optimism Is Warranted
Quantum computing in 2025 is a field at a fascinating inflection point. The technology has progressed enough that serious people are taking it seriously. Google, IBM, Microsoft, Amazon, and others are betting billions that they can make this work. Governments are investing tens of billions more. The fundamental problems that seemed insurmountable five years ago are yielding to clever engineering.
At the same time, the hype has gotten out of control. Stock prices disconnected from reality, companies making fraudulent claims, and breathless media coverage promising quantum computers will solve climate change and cure cancer next Tuesday—all of this threatens to trigger a quantum winter that could set the field back years.
The truth, as usual, is somewhere in the middle. Quantum computing is neither vaporware nor a near-term revolution. It’s an emerging technology making genuine progress toward becoming useful for specific, valuable applications. The timeline is measured in years and decades, not months. The applications will be specialized, not universal. The path forward is clearer than it’s ever been, but it’s still long and difficult.
If you’re building a business around quantum computing, focus on real technical milestones, not hype. If you’re investing in the space, think in terms of decades, not quarters. If you’re just watching from the sidelines, lower your expectations for when quantum computers will affect your daily life, but raise your expectations for how much they’ll eventually matter.
The breakthrough isn’t here yet. But for the first time, you can actually see the path to getting there. And that, in itself, might be the biggest breakthrough of all.
Now we just have to walk the path. Which, given that it goes through the quantum realm where everything is simultaneously happening and not happening until you look at it, should be interesting.