
Heliox: Where Evidence Meets Empathy
Join our hosts as they break down complex data into understandable insights, giving you the knowledge to navigate our rapidly changing world. Tune in for a thoughtful, evidence-based discussion that bridges expert analysis with real-world implications. An SCZoomers podcast.
Independent, moderated, timely, deep, gentle, clinical, global, and community conversations about things that matter. Breathe Easy, we go deep and lightly surface the big ideas.
Curated, independent, moderated, timely, deep, gentle, evidence-based, clinical & community information regarding COVID-19. Established in 2017 and focused on COVID-19 since February 2020, with multiple stories per day, hence a sizeable searchable base of stories to date: more than 4,000 stories on COVID-19 alone, and hundreds of stories on climate change.
Zoomers of the Sunshine Coast is a news organization with the advantages of deeply rooted connections within our local community, combined with a provincial, national and global following and exposure. In written form, audio, and video, we provide evidence-based and referenced stories interspersed with curated commentary, satire and humour. We reference where our stories come from and who wrote, published, and even inspired them. Using a social media platform means we have a much higher degree of interaction with our readers than conventional media, and it provides a significant, positive amplification effect. We expect the same courtesy of other media referencing our stories.
Heliox: Where Evidence Meets Empathy
Xanadu Aurora: Scalable Photonic Quantum Computer
In the race to build practical quantum computers, a fascinating dark horse is emerging: photonic quantum computing. While most media attention focuses on the superconducting approaches championed by tech giants, a different path using light itself might ultimately prove more practical and scalable.
Scaling and networking a modular photonic quantum computer
This is Heliox: Where Evidence Meets Empathy
Independent, moderated, timely, deep, gentle, clinical, global, and community conversations about things that matter. Breathe Easy, we go deep and lightly surface the big ideas.
Thanks for listening today!
Four recurring narratives underlie every episode: boundary dissolution, adaptive complexity, embodied knowledge, and quantum-like uncertainty. These aren't just philosophical musings but frameworks for understanding our modern world.
We hope you continue exploring our other podcasts, responding to the content, and checking out our related articles on the Heliox Podcast on Substack.
About SCZoomers:
https://www.facebook.com/groups/1632045180447285
https://x.com/SCZoomers
https://mstdn.ca/@SCZoomers
https://bsky.app/profile/safety.bsky.app
Spoken word, short and sweet, with rhythm and a catchy beat.
http://tinyurl.com/stonefolksongs
Welcome to the deep dive. We take the sources you send our way, dig in and pull out the most fascinating, most important insights just for you. That's right. Today, we're plunging into photonic quantum computing. We've got some great material, an overview called Photonic Quantum Computing Explained, a really detailed paper... scaling and networking a modular photonic quantum computer, and an exciting press release from Xanadu. Yeah, and our mission really is to get a handle on the potential of this photonic approach, PQC. PQC, right. We want to highlight what makes it different, you know, its unique advantages, and then look at these recent breakthroughs from Xanadu's Aurora system. Trying to cut through the jargon. Exactly. Get to those aha moments. Understand why this could be such a big deal. Okay, let's dive in. Photonic quantum computing. Sounds... well, pretty complex. What is it, basically? At its heart, it's about using photons, individual particles of light, as your qubits. Qubits, the quantum version of bits. Precisely. The fundamental units for quantum information. But unlike classical bits, they can, you know, be in multiple states at once. That's the quantum magic. Quantum mechanics with light. Okay, intriguing. But how do you actually encode information onto a photon? What do you use? Good question. Mainly, you use key properties of the photons themselves. Polarization is one, that's like the direction the light wave is wiggling. Okay. Or you could use the path the photon takes. Yeah. Those are examples of what's called the discrete variable approach, DV. Discrete, like distinct options, path A or path B, polarization X or Y. Right. Then there's also the continuous variable approach, or CV, which, as the name suggests, uses, well, continuous properties, things like squeezed states of light. Squeezed states. What does that mean? Ah, okay. So imagine light has inherent quantum uncertainty in certain pairs of properties, like position and momentum, but for light. Squeezing means you reduce the uncertainty in one property. But the laws of physics mean you increase the uncertainty in the other. It's like squeezing a water balloon. It gets thinner one way but bulges out the other. We can engineer this squeezing to encode information. Huh. So discrete variables like path, continuous variables like these squeezed states. Yeah. Got it. And I saw the term LOQC, Linear Optical Quantum Computing. Where does that fit? LOQC is fundamental to PQC. It's basically the toolkit. It means using standard optical components, things like beam splitters and phase shifters, things that manipulate light in a linear way, to perform the operations on your photonic qubits. So the basic building blocks for the computation? Exactly. It's how you guide the photons and make them interact to run algorithms. All right, so photons as qubits, encoding using things like polarization or squeezing, manipulated with linear optics, that lays the groundwork. But why this approach? What makes PQC stand out from other ways of building quantum computers? There are some really compelling advantages. A huge one is room temperature operation. Oh really? No extreme cooling? Right. Unlike many other quantum systems that need these massive, expensive cryogenic refrigerators to get down near absolute zero... Which sounds like a nightmare. It's a major engineering challenge. PQC, on the other hand, can potentially work in a much more standard environment.
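To make the water-balloon picture of squeezing a bit more concrete, here is a minimal Python sketch (not from the episode's sources) of the textbook squeezed-vacuum relations: the variance of one quadrature shrinks as e^(-2r) while the other grows as e^(+2r), so their product stays pinned at the vacuum limit. The function name and the hbar = 1, vacuum-variance-1/2 convention are illustrative choices only.

```python
import numpy as np

# Quadrature variances of a squeezed vacuum state (assuming hbar = 1, so the
# vacuum variance is 1/2).  The squeezing parameter r shrinks one quadrature
# and stretches the other; the uncertainty product stays fixed.
def squeezed_variances(r: float):
    var_x = 0.5 * np.exp(-2 * r)   # squeezed quadrature
    var_p = 0.5 * np.exp(+2 * r)   # anti-squeezed quadrature
    return var_x, var_p

for r in (0.0, 0.5, 1.0):
    vx, vp = squeezed_variances(r)
    squeezing_db = 10 * np.log10(vx / 0.5)   # level below vacuum noise, in dB
    print(f"r={r:.1f}: var_x={vx:.3f}, var_p={vp:.3f}, "
          f"product={vx * vp:.3f}, squeezing={squeezing_db:+.1f} dB")
```

Whatever r you pick, the product of the two variances stays at 0.25, which is exactly the "squeeze one side, bulge the other" trade-off described above.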
That could drastically lower costs and complexity, and maybe make it easier to integrate with classical computers eventually. Room temperature. That alone sounds like a game changer in practice. What else? Well, photons are also naturally quite robust against certain kinds of noise. Less easily disturbed. Yeah. They don't interact as strongly with their environment as, say, trapped ions or superconducting qubits sometimes do. Think of it like they're better at ignoring background chatter, less prone to decoherence from some sources. Lower noise qubits. Also a massive plus in the quantum world, which is notoriously fragile. Okay, what about scaling up? Building really big, useful machines? That's another potential killer app for PQC. Scalability and connectivity. This is really interesting. How so? Because photons are already the carriers of information in our global fiber optic network. Ah, the internet backbone. Exactly. So you can, in principle, send photonic qubits over long distances using existing fiber optic cables. This opens the door to building large-scale distributed quantum computers, like connecting modules across a room or a building or maybe even cities. The idea of quantum data centers becomes much more tangible. Wow, quantum data centers linked by fiber. Using infrastructure we largely already have. That's quite a vision. It is. Plus, the sources suggest PQC might use relatively simpler components compared to some alternatives, and photonic qubits are versatile. You can perform a wide range of quantum operations with them. Okay, simpler parts, less noise, room temperature, and this incredible potential for networking and scaling. I'm definitely seeing the appeal now. So let's bring in Xanadu and their Aurora system. This sounds like a big step. It really does seem to be. Xanadu's whole focus has been on photonic quantum computing, trying to make it practical. And on January 22nd, 2025, they announced Aurora. They're calling it the world's first scalable, networked, and modular quantum computer. Scalable, networked, modular... Those words keep coming up. What does Aurora actually consist of? What's the hardware? It's built using four standard server racks. But crucially, these racks are independent modules that are photonically interconnected, networked together. Okay. Four racks talking to each other with light. Exactly. This first version is a 12-qubit machine. Inside, it uses 35 of their custom photonic chips. 35 chips for 12 qubits? Yeah. Gives you a sense of the complexity involved. And it's all connected by, get this, 13 kilometers of optical fiber. 13 kilometers. Inside four racks. Packed in there. Yeah. And again, operating at room temperature, really driving home that advantage. That's impressive engineering. What's Xanadu saying about... where this leads? Their CEO, Christian Weedbrook, was quoted saying they believe they've essentially solved scalability in principle with this modular network design. Solved scalability. That's a bold statement. It is. Their vision is scaling this up, potentially thousands of racks, millions of qubits, making that quantum data center idea a reality. He did add, though, that the next focus is performance, reducing loss and achieving fault tolerance. Right. Scaling is one thing. Making it work reliably is another. And this work was published in Nature. Yes, which is significant. Publication in a top journal like Nature means it's gone through rigorous peer review, adding a lot of credibility to their claims.
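For a rough feel of why fiber networking is attractive, and why loss still matters, here is a back-of-envelope Python sketch. The 13 km figure comes from the episode; the ~0.2 dB/km attenuation is a typical value for telecom fiber around 1550 nm and is an assumption for illustration, not a number from the Aurora paper.

```python
# Back-of-envelope photon survival through optical fiber.  The ~0.2 dB/km
# attenuation is a typical telecom-fiber figure (an assumption, not a number
# from the Aurora paper).
ATTENUATION_DB_PER_KM = 0.2

def fiber_transmission(length_km: float) -> float:
    """Fraction of photons that survive a fiber run of the given length."""
    loss_db = ATTENUATION_DB_PER_KM * length_km
    return 10 ** (-loss_db / 10)

for km in (1, 13, 50):
    t = fiber_transmission(km)
    print(f"{km:>3} km: {ATTENUATION_DB_PER_KM * km:4.1f} dB loss, "
          f"{t * 100:5.1f}% of photons survive")
```

Even at tens of kilometers, a meaningful fraction of photons gets through standard fiber, which is why distributing photonic qubits over existing infrastructure is plausible at all.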
And Aurora isn't their first machine, right? It builds on earlier systems. Exactly. They had systems like X8 and Borealis before. Aurora seems to be the culmination of that development path, really focusing on this modular approach from the ground up. It's like they've been refining the building blocks. So it's a deliberate evolution towards this modular networked architecture. The press release really hammered those three pillars, scalability, modularity, networkability. And mentioned using commercial chips and less cooling. Yeah, using commercially available chip fabrication lines, even if not yet optimized for quantum, suggests a path towards more manufacturable and potentially cost-effective scaling compared to technologies needing highly bespoke tools and ultra-cold components. It's sort of like shifting from artisanal parts to something closer to mass production. Okay, let's dig into the Nature paper a bit more. 35 photonic chips for 12 qubits still sounds like a lot. What are all those chips doing? Right. The paper explains that this 12-qubit setup is really a scale model to demonstrate the architecture works. The key is these rack-mounted modules connected by fiber. Inside, they have 84 squeezers. The things that make the squeezed light states. Exactly. And 36 photon number resolving detectors, detectors that can count individual photons accurately. This whole setup provides 12 physical qubit modes per clock cycle. Wow. 84 squeezers, 36 special detectors, just for 12 qubits. That highlights the resource overhead at this stage. What did they actually demonstrate with all this hardware? One of the headline results was synthesizing a massive entangled state, an 86.4 billion mode entangled cluster state. And critically, this state spanned across physically separate chips in the different modules. 86.4 billion modes entangled across different chips. That's hard to even picture. Why is that important? It demonstrates they can create these highly complex, interconnected quantum resources needed for computation and do it across their network. It's a key sign of the system's potential power and connectivity. They also showed they could run a basic error detection code, the foliated distance-two repetition code, and do the decoding in real time, a first step towards error correction. Billions of entangled modes and real-time error detection. The paper mentioned some key building blocks they demonstrated too. Yeah, they broke it down. Things like heralded synthesis of specific non-Gaussian states, basically creating the right kind of quantum light on demand. Real-time multiplexing, using those special detectors to make efficient use of the photons. Creating the cluster state itself across space and time, using fiber buffers for precise timing. And, crucially, adaptive measurements using fast detectors and feedforward, meaning the system can react to a measurement result almost instantly and change the next step. So creating the states, using them efficiently, weaving them together, and reacting quickly. Sounds like all the essential ingredients are there. Pretty much. It shows the different parts of the system working together. The paper also emphasizes their approach is based on Gottesman-Kitaev-Preskill states, GKP states. Why that specific method? The big appeal of the GKP approach, especially in photonics, is that it aims for deterministic quantum gates using linear optics at room temperature. Deterministic meaning not based on chance. Right.
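A quick bit of arithmetic on the numbers quoted above, as a small Python sketch. The mode count, modes per clock cycle, squeezer count, and detector count are the figures from the episode; the clock rate is an assumed placeholder, not a number from the sources.

```python
# Back-of-envelope arithmetic on the Aurora figures quoted in the episode.
total_modes = 86.4e9        # modes in the entangled cluster state
modes_per_cycle = 12        # physical qubit modes produced per clock cycle
squeezers = 84
pnr_detectors = 36

cycles = total_modes / modes_per_cycle          # ~7.2 billion clock cycles
print(f"Clock cycles to build the state: {cycles:.1e}")
print(f"Squeezers per qubit mode:        {squeezers / modes_per_cycle:.0f}")
print(f"PNR detectors per qubit mode:    {pnr_detectors / modes_per_cycle:.0f}")

# With an assumed 1 MHz clock (a placeholder rate, not from the sources),
# building up that many modes would take on the order of a couple of hours.
assumed_clock_hz = 1e6
print(f"At an assumed {assumed_clock_hz:.0e} Hz clock: "
      f"{cycles / assumed_clock_hz / 3600:.1f} hours")
```

The per-mode ratios (7 squeezers and 3 photon-number-resolving detectors per qubit mode) are what the hosts mean by "resource overhead at this stage."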
Many single photon approaches have probabilistic elements. Gates only work some of the time, and they often need cryogenic detectors. GKP tries to get around that. The main place cryogenics might still be needed is for initially preparing or heralding certain states, but the core processing could be room temp and deterministic. That's seen as a more practical path for scaling. Deterministic room temperature operations. That does sound much more appealing for building large systems. Can you walk us through the main stages of the Aurora architecture described in the paper? Sure. It's roughly three stages, implemented on different photonic integrated circuit chips, or PICs. First, they use something called Gaussian boson sampling, GBS, to prepare initial non-Gaussian quantum states in a heralded way, meaning a signal tells you when you've successfully made one. Okay, stage one. Make the initial state. Stage two involves what they call refineries. These are networks of adaptive interferometers that take these initial states, improve their quality, and entangle them into pairs, specifically GKP Bell pairs. Refineries to clean up the states and make entangled pairs. Got it. Then stage three is the array of quantum processing units, or QPUs. These take the best Bell pairs from the refineries, use fiber connections and more optics to weave them into that big cluster state, and then perform the actual computation by making measurements on the qubits. So state creation, state refinement and pairing, then state weaving and measurement, all on different chips connected by fiber. Exactly. And those fiber links need to be carefully stabilized for phase and polarization to maintain the quantum coherence. You mentioned the refinery improves state quality. How does it do that? It sounded interesting. Yeah, it uses these clever binary tree networks of adaptive beam splitters. Based on measurements within the tree, it can effectively select the best incoming states and perform a kind of breeding operation to enhance their quantum properties, using measurement-based squeezing to get them just right before creating the entangled Bell pairs. Breeding better quantum states. Cool analogy. And these Bell pairs are the building blocks for the big cluster state in the QPU. That's right. The QPU uses careful fiber routing and delay lines to connect these pairs in the right way, both spatially and temporally. It selects the best available pair for each needed link in the cluster, uses phase shifters and other optics, and then performs the final measurements. There's also a classical controller managing this whole process and a real-time decoder figuring out errors. A highly orchestrated dance across multiple chips and fibers. The paper mentioned two main experiments to test all this. Yes. The first was that synthesis of the huge 12 by n mode Gaussian cluster state, the 86.4 billion mode one. This tested most parts of the system except the fast feedforward and the photon counting detectors. And even with significant optical loss, about 14 decibels, they showed the entanglement quality, measured by something called nullifier variances, stayed below the background noise level, which is good. So experiment one showed they could generate massive entanglement reliably, even with imperfect components. What about the second experiment? The second one focused specifically on testing the feedforward and the non-Gaussian state generation.
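Since optical loss keeps coming up in decibels, here is the standard dB-to-transmission conversion as a small Python sketch, applied to the ~14 dB figure above and the ~1% loss budget discussed next. The conversion itself is standard; the specific numbers plugged in are simply the ones quoted in the episode.

```python
import math

def db_to_transmission(loss_db: float) -> float:
    """Fraction of light that survives a path with the given loss in dB."""
    return 10 ** (-loss_db / 10)

def transmission_to_db(transmission: float) -> float:
    """Loss in dB corresponding to a given transmitted fraction."""
    return -10 * math.log10(transmission)

# The ~14 dB of loss quoted for the cluster-state experiment:
print(f"14 dB loss       -> {db_to_transmission(14) * 100:.1f}% transmission")
# The ~1% loss budget (99% transmission) quoted for fault tolerance:
print(f"99% transmission -> {transmission_to_db(0.99):.3f} dB loss")
```

So 14 dB means only about 4% of the light survives that path, while the fault-tolerance budget corresponds to well under a tenth of a decibel, which is the gap the hosts go on to discuss.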
They ran that repetition code error detection experiment using lower quality approximated GKP states. The error detection one. Right. They showed that the real-time decoder could analyze measurement results, estimate error probabilities, and then adapt the next measurement setting accordingly. Comparing this to random measurements showed the feedforward was actively working and influencing the computation based on real-time results. Ah, so that proves the system can react and potentially correct itself, which is critical. Now, you mentioned loss earlier, 14 dB. Okay. The paper must address this challenge, right? It's a known killer for photonics. Oh, yes, definitely. Optical loss is the elephant in the room. They measured the losses in the key optical paths within Aurora, called P1, P2, and P3. How bad is it? Well, it's still quite high compared to where they need to be for fault tolerance. They estimate they need loss budgets around 1% for key paths. But in Aurora, for instance, some of the initial heralding paths had losses around 56%. Combined paths had losses over 95%. Wow, 95% loss versus a 1% target. That's a huge gap. It is a very significant gap. It means most of the photons are getting lost along the way in those specific paths in this current setup. So how can they claim they've solved scalability if the system is losing almost everything? That's the key point the researchers make. This first Aurora build prioritized demonstrating the architecture, the modularity, the networking, proving the concept of scalability works. They explicitly state they didn't optimize for loss in this phase. They used standard, commercially available chip fabrication, which isn't geared for ultra-low-loss quantum photonics yet. Okay, so it was a proof of concept for the architecture's scalability, accepting high loss for now. What's the plan to fix the loss problem then? They're working on it across the board, improving chip design, fabrication techniques, fiber components, connectors, everything. Their target is a 20 to 30 times improvement measured in decibels, which is a logarithmic scale, so it's a big reduction in the insertion loss of each component. 20 to 30 times better components. Yes. They believe if they can hit that target, the current architecture could support fault tolerance without needing fundamental changes to the overall design just because of loss. So better engineering of the existing parts is the path forward. That sounds achievable, if challenging. Did they touch on manufacturing for the really big systems, millions of qubits? Briefly. They acknowledge that scaling to that level demands huge advances in manufacturing. You need incredibly high component density on the chips and mass production methods to make it affordable. They threw out a number. Even for just 100 logical qubits with error correction, their current architecture might need tens of millions of those initial GBS cells. Tens of millions. And potentially tens of thousands of server racks. It's comparable to today's massive classical data centers in scale, but highlights that component performance and manufacturability have to improve together. Tens of thousands of racks. It really drives home the scale of the challenge, even if the modularity is solved. Okay, let's wrap this up. What are the key takeaways from this deep dive? I think the main things are that photonic quantum computing has these really attractive features, room temp operation, potential for networking using fiber.
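The same decibel arithmetic, applied to the loss budgets above, as another hedged Python sketch. The 56%, over-95%, and 1% figures are the ones quoted in the episode, while the "1.0 dB component" is a hypothetical starting point used only to show what a 20 to 30 times improvement in dB would mean.

```python
import math

def loss_fraction_to_db(loss_fraction: float) -> float:
    """Convert a fractional loss (e.g. 0.56 for 56%) to dB of insertion loss."""
    return -10 * math.log10(1 - loss_fraction)

for label, loss in [("heralding path (~56% loss)       ", 0.56),
                    ("combined path (>95% loss)        ", 0.95),
                    ("fault-tolerance budget (~1% loss)", 0.01)]:
    print(f"{label}: {loss_fraction_to_db(loss):5.2f} dB")

# A "20 to 30 times improvement in dB" means each component's insertion loss,
# expressed in dB, shrinks by that factor.  For a hypothetical 1.0 dB part:
for factor in (20, 30):
    new_db = 1.0 / factor
    new_loss_pct = (1 - 10 ** (-new_db / 10)) * 100
    print(f"1.0 dB component improved {factor}x -> {new_db:.3f} dB "
          f"({new_loss_pct:.2f}% loss)")
```

Notably, shrinking a hypothetical 1 dB component by 20 to 30 times lands at roughly 0.03 to 0.05 dB, or around 1% loss, which lines up with the per-path budget the hosts mention.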
And Xanadu's Aurora is a concrete, significant step towards realizing that. They've shown a plausible path to a scalable, modular, networked system. The architecture seems viable, particularly the modularity and room temperature aspects, which are huge for thinking about future quantum data centers. Exactly. But, and it's a big but, the journey isn't over. That Nature paper makes it crystal clear. Reducing optical loss is the critical next step. It's the major hurdle to overcome for fault tolerance in PQC. So Aurora answered a key question about how to build it large and connected. But the next big push is improving the quality, getting those losses way down. Precisely. Nail down the loss problem, and the door to effective error correction in truly useful quantum computers opens much wider. It's been fascinating to explore this. Incredible progress, but still major challenges ahead. So for you, our listener, here's a thought to chew on. Now that a path towards truly scalable and networked quantum computers has been demonstrated, what kind of scientific discoveries or tech breakthroughs, things that seem impossible today, might suddenly come within reach? What could interconnected quantum power unlock? Definitely something to think about. Thanks for joining us on the Deep Dive.