Heliox: Where Evidence Meets Empathy 🇨🇦
Join our hosts as they break down complex data into understandable insights, providing you with the knowledge to navigate our rapidly changing world. Tune in for a thoughtful, evidence-based discussion that bridges expert analysis with real-world implications. An SCZoomers podcast.
Independent, moderated, timely, deep, gentle, clinical, global, and community conversations about things that matter. Breathe Easy, we go deep and lightly surface the big ideas.
Curated, independent, moderated, timely, deep, gentle, evidence-based, clinical & community information regarding COVID-19. Publishing since 2017 and focused on COVID-19 since February 2020, with multiple stories per day, we have built a sizeable searchable base of stories to date: more than 4,000 on COVID-19 alone, and hundreds on climate change.
Zoomers of the Sunshine Coast is a news organization with the advantages of deeply rooted connections within our local community, combined with a provincial, national and global following and exposure. In written form, audio, and video, we provide evidence-based and referenced stories interspersed with curated commentary, satire and humour. We reference where our stories come from and who wrote, published, and even inspired them. Using a social media platform gives us a much higher degree of interaction with our readers than conventional media, and provides a significant, positive amplification effect. We expect the same courtesy of other media referencing our stories.
⚛️ The Living Machine: What Quantum Computing Teaches Us About Persistence
Please see the corresponding Substack resource.
We've been sold a particular story about progress. It goes something like this: breakthroughs happen suddenly, genius strikes like lightning, and revolution arrives in a single dramatic moment that changes everything overnight.
The reality, as usual, is messier and more interesting.
Consider what happened recently in a laboratory where scientists managed to keep a quantum computer running for two hours straight. Two hours doesn't sound particularly impressive until you understand that previous attempts measured their success in milliseconds—thousandths of a second. It's the difference between a sprinter managing a single stride and suddenly completing a marathon.
But here's what caught my attention, and what I think matters more than the technical achievement itself: the way they solved the problem tells us something important about how complex systems—including us—survive and thrive.
Neng-Chun Chiu et al., "Continuous operation of a coherent 3,000-qubit system", Nature (2025)
Clearing significant hurdle to quantum computing
Harvard physicists working to develop game-changing tech demonstrate 3,000 quantum-bit system capable of continuous operation
This is Heliox: Where Evidence Meets Empathy
Thanks for listening today!
Four recurring narratives underlie every episode: boundary dissolution, adaptive complexity, embodied knowledge, and quantum-like uncertainty. These aren’t just philosophical musings but frameworks for understanding our modern world.
We hope you continue exploring our other podcasts, responding to the content, and checking out our related articles on the Heliox Podcast on Substack.
About SCZoomers:
https://www.facebook.com/groups/1632045180447285
https://x.com/SCZoomers
https://mstdn.ca/@SCZoomers
https://bsky.app/profile/safety.bsky.app
Spoken word, short and sweet, with rhythm and a catchy beat.
http://tinyurl.com/stonefolksongs
This is Heliox, where evidence meets empathy. Independent, moderated, timely, deep, gentle, clinical, global, and community conversations about things that matter. Breathe easy. We go deep and lightly surface the big ideas. Welcome back. Today we're diving into some fascinating material, really a huge step forward in quantum computing. It really is. Yeah. And the scale involved. Yeah. It's almost hard to wrap your head around. Right. So the hook that grabbed me was this. A quantum machine, just 300 qubits, yeah, quantum bits, could theoretically hold more information simultaneously than all the particles we know of in the universe. That sheer potential, that exponential power, that's always been the dream, hasn't it? Absolutely. But it's been just that, a dream, because the big hurdle was always, well, how do you keep these things stable? Exactly. How do you keep such delicate systems running long enough to actually use that incredible power? It's been the fundamental bottleneck. And that brings us to our mission for this deep dive. We're looking at a major breakthrough from a Harvard-led team working with MIT and QuEra Computing. A big collaboration. Yeah. And they didn't just build a big system, though they did that too, over 3,000 qubits. But the real news is they figured out how to run it continuously. For a long time. Yeah. That's the key. So we need to unpack how they cracked that problem, how they overcame the biggest operational issue: just keeping the machine running. Duration. It's what could potentially move quantum computing from, you know, these amazing lab curiosities towards being actual reliable tools. Okay. So maybe just a quick refresher for everyone on the core quantum advantage. Sure. So classical computers, these bits, simple, right? Zero or one. On or off. Exactly. But qubits, because they use these tiny atomic properties, they can be zero, they can be one, or, and this is the crucial part, they can be both at the same time.
That's superposition. That's superposition. And it leads to this incredible scaling. Right, because in classical computing, if you double your bits, you roughly double your power. More or less. But with quantum, because of superposition and another effect called entanglement, adding qubits doesn't just add power, it multiplies it. Exponentially. Okay, so the power is immense, theoretically. Why haven't we seen massive quantum computers everywhere, then? Our sources really point to one main culprit in these neutral atom systems: atom loss. Yeah, atom loss. It sounds almost mundane, but it's been a massive headache. So the qubits, the individual atoms, they're held in these super delicate traps, right? Like laser beams, tiny optical tweezers. Yeah, very precise, very fragile. And the atoms? Well, they just escape sometimes. Or they lose their quantum state, the information they're holding. Which basically meant that previous experiments were like quick snapshots, one-shot deals. Pretty much. You'd set everything up, run it for a tiny fraction of time before too many atoms were lost, then you'd have to stop, reset, reload all the atoms. That sounds incredibly inefficient. A huge bottleneck. It absolutely crippled progress towards longer computations. You couldn't run complex algorithms. Okay, so let's get into the specifics of this breakthrough then. The Nature paper details a system. How many qubits again? Over 3,000 qubits, which is already impressive, scale-wise. But the headline number isn't just the qubit count. It's the runtime. They kept it going for... Over two hours. Two hours compared to, what, milliseconds or seconds? Exactly. And critically, the way they designed it, they think it could potentially run indefinitely, continuously. Wow. OK, so the continuity is the real story here. It really is.
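The exponential scaling described above is easy to sanity-check numerically. A minimal sketch (the 10^80 particle count is a common rough estimate for the observable universe, not a figure from the episode):

```python
# State-space scaling: n classical bits select one of 2**n states,
# while fully describing n qubits in superposition can take 2**n amplitudes.
def state_space(n_qubits: int) -> int:
    """Number of basis-state amplitudes for an n-qubit register."""
    return 2 ** n_qubits

# The "300 qubits vs. particles in the universe" comparison:
amplitudes = state_space(300)
particles_estimate = 10 ** 80  # rough estimate of particles in the observable universe

print(f"2^300 has {len(str(amplitudes))} digits")  # 91 digits
print(amplitudes > particles_estimate)             # True
```

Doubling the classical register doubles the state count once; each added qubit doubles the amplitude count again, which is why 300 qubits already outruns any particle census.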
One of the lead researchers, Mikhail Lukin, even said that this continuous operation might actually be more important in practice than a specific number of qubits. Really drives home why just having more qubits isn't enough. You need them to stay there and stay working. Precisely. It's like the difference between having a super fast car that runs out of fuel in one lap versus a slightly slower car that can run the whole race. And this idea of continuous operation led to a really interesting framing from Lukin. He described the system becoming... what was it? He said, we can literally reconfigure the atomic quantum computer while it's operating. Basically, the system becomes a living organism. A living organism? That's quite a metaphor. It implies self-repair, adaptation. It does. It suggests they found a way to, like, heal the system on the fly, replacing lost atoms without shutting down the whole process. So how did they do it? What's the engineering magic behind this living quantum computer? Well, they basically built a new kind of architecture, a zoned system specifically designed to feed new qubits in constantly and quickly, tackling that atom loss problem head on. Okay, zoned. So different areas doing different jobs. Exactly. A key part is having a high speed preparation zone that's separate from the main computing zone where the qubits are doing their work. Let's break down the mechanics. What are the tools they're using to shuttle potentially millions of atoms around so fast? It comes down to two main things. First, for moving large numbers of atoms from a source like a reservoir, they use what they call optical lattice conveyor belts. Conveyor belts made of light. Sort of, yeah. Think of waves of light pushing the atoms along. Then, for the really precise work, grabbing individual atoms and placing them exactly where they need to go in the array. That's the tweezers you mentioned. Right, optical tweezers.
These are very tightly focused laser beams that act like microscopic tractor beams for single atoms. Okay, so you've got bulk transport and precision placement. But the key to running continuously must be the speed of replacement, right? How fast can they cycle these atoms? The speed is absolutely critical. They can load atoms into the tweezers at an incredible rate, up to 300,000 atoms per second. 300,000 per second. Yeah. And then, once those atoms are prepared into the qubit state, they can maintain a continuous flow, a flux, of over 30,000 initialized qubits every second. 30,000 usable qubits per second flowing into the system. Correct. And even when they do more complex sorting, arranging them into perfect defect-free blocks, say, batches of 600, they can still manage 15,000 qubits per second. That's still incredibly fast. So over the two hours they demonstrated, how many atoms are we talking about cycling through? Well, the paper mentions over 50 million atoms had cycled through the system in that two-hour period. 50 million. That's staggering. But wait, doesn't moving all those atoms around, all that laser light for the tweezers and conveyors, doesn't that create a ton of noise? Heat? Wouldn't that disturb the working qubits? Yes. That is the multimillion dollar question, isn't it? How do you do all this maintenance without destroying the fragile quantum states you're trying to preserve? Right. Seems like you'd solve one problem, atom loss, just to create another: decoherence. And that's where the cleverness of the architecture really comes in. First, how they built the main array. They did it iteratively, section by section. How many sections? They built the big 3,240-site array in six segments. They call them subarrays. Yeah. And assembling the whole thing took only about half a second. Okay, built in segments. How does the continuous maintenance, the healing, work then? It's a constant rotating cycle.
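A back-of-envelope check of the throughput figures quoted above, assuming the 50 million atoms were spread evenly over the two-hour run (our arithmetic, not a figure from the paper):

```python
# Throughput figures quoted in the episode.
run_seconds = 2 * 60 * 60        # two-hour demonstration
atoms_cycled = 50_000_000        # "over 50 million atoms"

avg_flux = atoms_cycled / run_seconds
print(f"Average flux: {avg_flux:,.0f} atoms per second")  # ~6,944

# The sustained average sits well below the peak rates discussed:
tweezer_load_rate = 300_000  # atoms/s loaded into tweezers
qubit_flux = 30_000          # initialized qubits/s
sorted_flux = 15_000         # qubits/s when sorted into defect-free blocks

assert avg_flux < sorted_flux < qubit_flux < tweezer_load_rate
```

The comfortable margin between the average demand and the peak supply rates is what makes continuous refilling plausible in the first place.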
Roughly every 80 milliseconds, really fast, the system identifies the subarray that's been sitting there the longest. The oldest one. Essentially, yes. It ejects those atoms and instantly refills that segment with a fresh batch, newly prepared from the loading zone. So it's like constantly patching the array piece by piece, faster than the atoms can naturally decay or escape. Exactly. This rapid targeted replacement is what let them keep over 3,000 atoms active for those two hours. Way, way beyond the typical lifetime of maybe 60 seconds you'd get just holding them in tweezers alone. Okay, the mechanics are brilliant, but it still brings us back to that core quantum problem, coherence. How do you run this intense reloading operation right next door to qubits that are actively computing, holding delicate superposition states, without wrecking them? That's the crux of it. You've got loading, transport, preparation, all involving lasers and atom manipulation happening concurrently with the quantum operations. And any stray light, any vibration can cause decoherence, making the qubits lose their quantum magic. So they needed protection. Shields up. Shields up indeed. They implemented two main layers of protection, a physical one and a more, let's say, quantum level one. Okay, start with the physical. What did they do architecturally? Remember the different zones. The big source of noise is the very first stage where they initially trap a cloud of atoms. It's called a MOT, a magneto-optical trap, and it uses intense lasers. Bright and noisy. Very. So to shield the main quantum register from that initial noise, they used a clever angled design for the transport system, the conveyor belts. Angled? How does that help? By using a tilted dual lattice setup, they physically blocked any direct line of sight between the noisy MOT area and the area where the qubits were being stored and manipulated. Less stray light could leak across. So geometry as noise cancellation. Clever.
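A toy calculation, under the simplifying assumption of exponential atom loss, showing why refreshing one of six subarrays every ~80 milliseconds keeps the array essentially full (the model is ours; the numbers are from the discussion above):

```python
import math

# Rotating replacement: the oldest of six subarrays is swapped out
# roughly every 80 ms, so each segment gets refreshed once per rotation.
n_subarrays = 6
cycle_s = 0.080     # one replacement event every ~80 ms
lifetime_s = 60.0   # typical single-atom lifetime just held in tweezers

refresh_period_s = n_subarrays * cycle_s  # time between refreshes of a given segment

# Assuming exponential loss, the fraction of atoms still present when
# their segment's turn comes around again:
survival = math.exp(-refresh_period_s / lifetime_s)

print(f"Each segment refreshed every {refresh_period_s:.2f} s")
print(f"Fraction surviving between refreshes: {survival:.3f}")  # ~0.992
```

Because each segment waits only about half a second between refreshes, versus a ~60-second natural lifetime, losses per rotation stay under one percent, which is the sense in which the patching outruns the decay.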
What was the second layer? The more quantum shield. That was a technique they call qubit shielding. This deals with the more subtle noise, the scattered photons bouncing around from the nearby loading and transport of new atoms right next to the working atoms. How does that work? More lasers. Yes, but a very specific kind. They use light near a particular wavelength, 1,529 nanometers, shone onto the stored working qubits. This light effectively adjusts the energy levels of those qubits slightly. Okay, light shifting. What does that actually do for the qubit? How does it protect it? Think of it like tuning a radio. The noise, the scattered light, is like static at a certain frequency trying to interfere with your station. The 1,529-nanometer light dynamically retunes the qubit's energy levels so they're no longer sensitive to, or resonant with, that specific frequency of noise from the nearby operations. Ah, so you make the working qubits effectively deaf to the noise being generated right next to them during the reload cycle. That's a great way to put it. You create this dynamic temporary shield that protects their quantum state, their coherence, from that specific disturbance. And did it work? Did the measurements show the coherence was preserved? They did. It was quite successful. They measured the coherence time, technically the T2 time, how long the superposition lasts. And with the shielding active, it recovered to about 1.09 seconds. How good is that? It's very close to the ideal reference time of 1.34 seconds they measured in a completely quiet, undisturbed setup. So the shielding restored almost all of the coherence that would have been lost due to the concurrent reloading. That's impressive. So they managed to keep the array full and keep the qubits in their quantum states, even the really fragile superposition ones, while all this maintenance was happening right beside them. Exactly.
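The coherence numbers quoted above can be put side by side in a quick sketch (the exponential-dephasing model is our simplification, not something stated in the episode):

```python
import math

# Coherence times quoted above: ~1.09 s with qubit shielding active
# during reloading, vs. ~1.34 s in a completely quiet reference setup.
t2_shielded = 1.09
t2_reference = 1.34

recovery = t2_shielded / t2_reference
print(f"Coherence time recovered: {recovery:.0%}")  # ~81% of the quiet-setup value

# Under a simple exponential-dephasing model, coherence remaining
# after one ~80 ms replacement cycle in each case:
cycle_s = 0.080
for label, t2 in (("shielded ", t2_shielded), ("reference", t2_reference)):
    print(f"{label}: {math.exp(-cycle_s / t2):.3f} of coherence remains")
```

The point of the comparison is that the shielded value lands close enough to the undisturbed reference that reloading next door costs only a modest fraction of coherence per cycle.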
They showed they could sustain the storage zone with qubits ready for computation, potentially indefinitely, because the reloading cycles in adjacent subarrays didn't ruin the state of the qubits in the working subarrays. This feels like the moment it shifts from a cool science experiment to a potentially viable technology roadmap. Exactly. I think that's fair to say. It addresses a fundamental barrier to scaling up. So let's synthesize this. For you, the listener, what's the big takeaway here? What does this continuous operation, this coherence protection, actually enable? The big picture is that this architecture, this combination of continuous operation and proven coherence shielding, really lays out a practical path towards fault-tolerant quantum computers. That's the holy grail. Fault tolerance, meaning the computer can actively correct errors as it runs. Precisely. And you need long run times and stable qubits to even begin implementing effective error correction. So what's the potential impact then if we get fault-tolerant machines based on this kind of continuous platform? Well, the projections are pretty staggering. The team suggests this could enable quantum computers to perform billions of quantum operations in a single run. Billions, compared to maybe thousands or millions now. Right, and running not just for seconds or minutes, but potentially for days. Okay, and if you have that kind of sustained quantum power, what problems can you tackle? Oh, the list is long. Think about designing new drugs and materials at the molecular level, something classical computers struggle with. Complex financial modeling, optimizing huge logistical networks, potentially revolutionizing AI, and even things like building vastly more precise atomic clocks and sensors. So science, medicine, finance, technology. A broad impact. Definitely. It unlocks a new regime of computational power. We really have arrived at that living organism computer, haven't we?
The chip that heals itself and just keeps going. We have. It's a paradigm shift. And that brings us to a final thought, something for you to chew on. If the hardware problem of continuous operation is maybe not solved, but has a clear path forward, what's the next big challenge? The machine can run continuously now. Right. So if you have this nonstop fire hose of quantum information processing, the bottleneck shifts. Now the pressure is on the software, specifically the error correction protocols. They have to be fast enough and good enough to keep up with a machine that never needs to pause. Exactly. How fast does your error checking and correction need to be to guarantee reliable results from a system that's perpetually running and potentially making errors? The sources hint that these improvements might eventually allow for hundreds of logical, error-corrected qubits with incredibly low failure rates, maybe 10^-8. But getting there means the error correction has to be incredibly efficient. That's the next frontier, matching the speed of error correction to the speed of this continuous quantum operation. A fascinating challenge ahead. Thanks for walking us through this incredible engineering achievement. My pleasure. It's exciting stuff. Thanks for listening today.
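To see why such low logical error rates matter once a machine runs billions of operations per job, here is a rough illustration (our arithmetic, not a figure from the sources):

```python
import math

# Probability that a long run completes without a single logical error,
# given a per-operation logical error rate p: (1 - p)**N.
def p_error_free(ops: float, p_logical: float) -> float:
    # Computed in log space so tiny p and huge N stay numerically stable.
    return math.exp(ops * math.log1p(-p_logical))

ops = 1e9  # "billions of quantum operations in a single run"
for p in (1e-6, 1e-8, 1e-10):
    print(f"p = {p:.0e}: P(error-free run) = {p_error_free(ops, p):.3e}")
```

At a 10^-8 error rate, a billion-operation run still fails almost every time (roughly a 4.5-in-100,000 chance of a clean run), which is exactly why the episode frames fast, efficient error correction, rather than raw qubit count, as the next bottleneck.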
Four recurring narratives underlie every episode: boundary dissolution, adaptive complexity, embodied knowledge, and quantum-like uncertainty. These aren't just philosophical musings, but frameworks for understanding our modern world. We hope you continue exploring our other podcasts, responding to the content, and checking out our related articles at helioxpodcast.substack.com.
Podcasts we love
Check out these other fine podcasts recommended by us, not an algorithm.
Hidden Brain
Hidden Brain, Shankar Vedantam
All In The Mind
ABC listen
What Now? with Trevor Noah
Trevor Noah
No Stupid Questions
Freakonomics Radio + Stitcher
Entrepreneurial Thought Leaders (ETL)
Stanford eCorner
This Is That
CBC
Future Tense
ABC listen
The Naked Scientists Podcast
The Naked Scientists
Naked Neuroscience, from the Naked Scientists
James Tytko
The TED AI Show
TED
Ologies with Alie Ward
Alie Ward
The Daily
The New York Times
Savage Lovecast
Dan Savage
Huberman Lab
Scicomm Media
Freakonomics Radio
Freakonomics Radio + Stitcher
Ideas
CBC
Ladies, We Need To Talk
ABC listen