The New Quantum Era

Sebastian Hassinger & Kevin Rowney

Your hosts, Sebastian Hassinger and Kevin Rowney, interview brilliant research scientists, software developers, engineers and others actively exploring the possibilities of our new quantum era.

  • 29 minutes 55 seconds
    Peaked quantum circuits with Hrant Gharibyan

In this episode of The New Quantum Era, Sebastian talks with Hrant Gharibyan, CEO and co‑founder of BlueQubit, about “peaked circuits” and the challenge of verifying quantum advantage. They unpack Scott Aaronson and Yuxuan Zhang’s original peaked‑circuit proposal, BlueQubit’s scalable implementation on real hardware, and a new public challenge that invites the community to attack their construction using the best classical algorithms available. Along the way, they explore how this line of work connects to cryptography, hardness assumptions, and the near‑term role of quantum devices as powerful scientific instruments.

    Topics Covered

    • Why verifying quantum advantage is hard. The core problem: if a quantum device claims to solve a task that is classically intractable, how can anyone check that it did the right thing? Random circuit sampling (as in Google’s 2019 “supremacy” experiment and follow‑on work from Google and Quantinuum) is believed to be classically hard to simulate, but the verification metrics (like cross‑entropy benchmarking) are themselves classically intractable at scale.
    • What are peaked circuits? Aaronson and Zhang’s idea: construct circuits that look like random circuits in every respect, but whose output distribution secretly has one special bit string with an anomalously high probability (the “peak”). The designer knows the secret bit string, so a quantum device can be verified by checking that measurement statistics visibly reveal the peak in a modest number of shots, while finding that same peak classically should be as hard as simulating a random circuit.
    • BlueQubit’s scalable construction and hardware demo. BlueQubit extended the original 24‑qubit, simulator‑based peaked‑circuit construction to much larger sizes using new classical protocols. Hrant explains their protocol for building peaked circuits on Quantinuum’s H2 processor with around 56 qubits, thousands of gates, and effectively all‑to‑all connectivity, while still hiding a single secret bit string that appears as a clear peak when run on the device.
    • Obfuscation tricks and “quantum steganography”. The team uses multiple obfuscation layers (including “swap” and “sweeping” tricks) to transform simple peaked circuits into ones that are statistically indistinguishable from generic random circuits, yet still preserve the hidden peak.
    • The BlueQubit Quantum Advantage Challenge. To stress‑test their hardness assumptions, BlueQubit has published concrete circuits and launched a public bounty (currently a quarter of a bitcoin) for anyone who can recover the secret bit string classically. The aim is to catalyze work on better classical simulation and de‑quantization techniques; either someone closes the gap (forcing the protocol to evolve) or the standing bounty helps establish public trust that the task really is classically infeasible.
    • Potential cryptographic angles. Although the main focus is verification of quantum advantage, Hrant outlines how the construction has a cryptographic flavor: a secret bit string effectively acts as a key, and only a sufficiently powerful quantum device can efficiently “decrypt” it by revealing the peak. Variants of the protocol could, in principle, yield schemes that are classically secure but only decryptable by quantum hardware, and even quantum‑plus‑key secure, though this remains speculative and secondary to the verification use case.
    • From verification protocol to startup roadmap. Hrant positions BlueQubit as an algorithm and capability company: deeply hardware‑aware, but focused on building and analyzing advantage‑style algorithms tailored to specific devices. The peaked‑circuit work is one pillar in a broader effort that includes near‑term scientific applications in condensed‑matter physics and materials (e.g., Fermi–Hubbard models and out‑of‑time‑ordered correlators) where quantum devices can already probe regimes beyond leading classical methods.
    • Scientific advantage today, commercial advantage tomorrow. Sebastian and Hrant emphasize that the first durable quantum advantages are likely to appear in scientific computing, acting as exotic lab instruments for physicists, chemists, and materials scientists, well before mass‑market “killer apps” arrive. Once robust, verifiable scientific advantage is established, scaling to larger models and more complex systems becomes a question of engineering, with clear lines of sight to industrial impact in sectors like pharmaceuticals, advanced materials, and manufacturing.
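    To make the “simple peaked circuit” starting point concrete, here is a toy numpy sketch — not BlueQubit’s actual protocol, which uses an approximate inverse and heavy obfuscation. A random brickwork circuit R followed by its exact inverse acts as the identity, and a final layer of X gates then concentrates all output probability on a designer‑chosen secret bit string. The brickwork layout, layer count, and qubit count below are illustrative assumptions; in a real peaked circuit the mirror structure is obfuscated away and the peak probability is far below 1 yet still detectable in a modest number of shots.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 6                    # toy size; the episode's hardware demo used ~56 qubits
    dim = 2 ** n
    secret = rng.integers(0, 2, size=n)   # the designer's hidden bit string

    def haar_unitary(d):
        """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
        z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
        q, r = np.linalg.qr(z)
        return q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases

    def brick_layer(start):
        """One brickwork layer of random two-qubit gates, offset by `start`."""
        full = np.eye(1)
        i = 0
        if start == 1:                      # identity padding on qubit 0
            full = np.kron(full, np.eye(2)); i = 1
        while i + 1 < n:                    # random gate on each adjacent pair
            full = np.kron(full, haar_unitary(4)); i += 2
        if i < n:                           # identity padding on the last qubit
            full = np.kron(full, np.eye(2))
        return full

    # Forward half: a random brickwork circuit R (4 layers)
    R_layers = [brick_layer(depth % 2) for depth in range(4)]

    # Final layer: X gates on the qubits where `secret` has a 1
    X = np.array([[0, 1], [1, 0]])
    x_mask = np.eye(1)
    for b in secret:
        x_mask = np.kron(x_mask, X if b else np.eye(2))

    state = np.zeros(dim, dtype=complex); state[0] = 1.0
    for L in R_layers:                      # apply R
        state = L @ state
    for L in reversed(R_layers):            # apply R^-1 (the "mirror")
        state = L.conj().T @ state
    state = x_mask @ state                  # shift the peak to `secret`

    probs = np.abs(state) ** 2
    peak = int(np.argmax(probs))
    peak_bits = [(peak >> k) & 1 for k in range(n - 1, -1, -1)]
    print(peak_bits, probs[peak])           # recovers `secret` with probability ~1
    ```

    Because the inverse here is exact, the peak carries probability 1 and the mirror symmetry is trivially visible to a classical attacker; the “swap” and “sweeping” obfuscation tricks discussed in the episode exist precisely to destroy that visible structure while preserving the peak.
    
    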

    The challenge: https://app.bluequbit.io/hackathons/

    12 December 2025, 1:00 pm
  • 36 minutes 53 seconds
    Diamond vacancies and scalable qubits with Quantum Brilliance

    Episode overview
    This episode of The New Quantum Era features a conversation with Quantum Brilliance co‑founder and CEO Mark Luo and independent board chair Brian Wong about diamond nitrogen vacancy (NV) centers as a platform for both quantum computing and quantum sensing. The discussion covers how NV centers work, what makes diamond‑based qubits attractive at room temperature, and how to turn a lab technology into a scalable product and business.

    What are diamond NV qubits? 
    Mark explains how nitrogen vacancy centers in synthetic diamond act as stable room‑temperature qubits, with a nitrogen atom adjacent to a missing carbon atom creating a spin system that can be initialized and read out optically or electronically. The rigidity and thermal properties of diamond remove the need for cryogenics, complex laser setups, and vacuum systems, enabling compact, low‑power quantum devices that can be deployed in standard environments.

    Quantum sensing to quantum computing 
    NV centers are already enabling ultra‑sensitive sensing, from nanoscale MRI and quantum microscopy to magnetometry for GPS‑free navigation and neurotech applications using diamond chips under growing brain cells. Mark and Brian frame sensing not as a hedge but as a volume driver that builds the diamond supply chain, pushes costs down, and lays the manufacturing groundwork for future quantum computing chips.

    Fabrication, scalability, and the value chain 
    A key theme is the shift from early “shotgun” vacancy placement in diamond to a semiconductor‑style, wafer‑like process with high‑purity material, lithography, characterization, and yield engineering. Brian characterizes Quantum Brilliance’s strategy as “lab to fab”: deciding where to sit in the value chain, leveraging the existing semiconductor ecosystem, and building a partner network rather than owning everything from chips to compilers.

    Devices, roadmaps, and hybrid nodes 
    Quantum Brilliance has deployed room‑temperature systems with a handful of physical qubits at Oak Ridge National Laboratory, Fraunhofer IAF, and the Pawsey Supercomputing Centre. Their roadmap targets application‑specific quantum computing with useful qubit counts toward the end of this decade, and lunchbox‑scale, fault‑tolerant systems with on the order of 50–60 logical qubits in the mid‑2030s.

    Modality tradeoffs and business discipline 
    Mark positions diamond NV qubits as mid‑range in both speed and coherence time compared with superconducting and trapped‑ion systems, with their differentiator being compute density, energy efficiency, and ease of deployment rather than raw gate speed. Brian brings four decades of experience in semiconductors, batteries, lidar, and optical networking to emphasize milestones, early revenue from sensing, and usability—arguing that making quantum devices easy to integrate and operate is as important as the underlying physics for attracting partners, customers, and investors.

    Partners and ecosystem 
    The episode underscores how collaborations with institutions such as Oak Ridge, Fraunhofer, and Pawsey, along with industrial and defense partners, help refine real‑world requirements and ensure the technology solves concrete problems rather than just hitting abstract benchmarks. By co‑designing with end users and complementary hardware and software vendors, Quantum Brilliance aims to “democratize” access to quantum devices, moving them from specialized cryogenic labs to desks, edge systems, and embedded platforms.

    6 December 2025, 9:12 pm
  • 49 minutes 26 seconds
    Macroscopic Quantum Tunneling with Nobel Laureate John Martinis

    Episode overview
    John Martinis, Nobel laureate and former head of Google’s quantum hardware effort, joins Sebastian Hassinger on The New Quantum Era to trace the arc of superconducting quantum circuits—from the first demonstrations of macroscopic quantum tunneling in the 1980s to today’s push for wafer-scale, manufacturable qubit processors. The episode weaves together the physics of “synthetic atoms” built from Josephson junctions, the engineering mindset needed to turn them into reliable computers, and what it will take for fabrication to unlock true large-scale quantum systems.

    Guest bio
    John M. Martinis is a physicist whose experiments on superconducting circuits with John Clarke and Michel Devoret at UC Berkeley established that a macroscopic electrical circuit can exhibit quantum tunneling and discrete energy levels, work recognized by the 2025 Nobel Prize in Physics “for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit.” He went on to lead the superconducting quantum computing effort at Google, where his team demonstrated large-scale, programmable transmon-based processors, and now heads Qolab (also referred to in the episode as CoLab), a startup focused on advanced fabrication and wafer-scale integration of superconducting qubits.

    Martinis’s career sits at the intersection of precision instrumentation and systems engineering, drawing on a scientific “family tree” that runs from Cambridge through John Clarke’s group at Berkeley, with strong theoretical influence from Michel Devoret and deep exposure to ion-trap work by Dave Wineland and Chris Monroe at NIST. Today his work emphasizes solving the hardest fabrication and wiring challenges—pursuing high-yield, monolithic, wafer-scale quantum processors that can ultimately host tens of thousands of reproducible qubits on a single 300 mm wafer.

    Key topics

    • Macroscopic quantum tunneling on a chip: How Clarke, Devoret, and Martinis used a current-biased Josephson junction to show that a macroscopic circuit variable obeys quantum mechanics, with microwave control revealing discrete energy levels and tunneling between states—laying the groundwork for superconducting qubits. The episode connects this early work directly to the Nobel committee’s citation and to today’s use of Josephson circuits as “synthetic atoms” for quantum computing.
    • From DC devices to microwave qubits: Why early Josephson devices were treated as low-frequency, DC elements, and how failed experiments pushed Martinis and collaborators to re-engineer their setups with careful microwave filtering, impedance control, and dilution refrigerators—turning noisy circuits into clean, quantized systems suitable for qubits. This shift to microwave control and readout becomes the through-line from macroscopic tunneling experiments to modern transmon qubits and multi-qubit gates.
    • Synthetic atoms vs natural atoms: The contrast between macroscopic “synthetic atoms” built from capacitors, inductors, and Josephson junctions and natural atomic systems used in ion-trap and neutral-atom experiments by groups such as Wineland and Monroe at NIST, where single-atom control made the quantum nature more obvious. The conversation highlights how both approaches converged on single-particle control, but with very different technological paths and community cultures.
    • Ten-year learning curve for devices: How roughly a decade of experiments on quantum noise, energy levels, and escape rates in superconducting devices built confidence that these circuits were “clean enough” to support serious qubit experiments, just as early demonstrations such as Yasunobu Nakamura’s single-Cooper-pair box showed clear two-level behavior. This foundational work set the stage for the modern era of superconducting quantum computing across academia and industry.
    • Surface code and systems thinking: Why Martinis immersed himself in the surface code, co-authoring a widely cited tutorial-style paper “Surface codes: Towards practical large-scale quantum computation” (Austin G. Fowler, Matteo Mariantoni, John M. Martinis, Andrew N. Cleland, Phys. Rev. A 86, 032324, 2012; arXiv:1208.0928), to translate error-correction theory into something experimentalists could build. He describes this as a turning point that reframed his work at UC Santa Barbara and Google around full-system design rather than isolated device physics.
    • Fabrication as the new frontier: Martinis argues that the physics of decent transmon-style qubits is now well understood and that the real bottleneck is industrial-grade fabrication and wiring, not inventing ever more qubit variants. His company’s roadmap targets wafer-scale integration—e.g., ~100-qubit test chips scaling toward ~20,000 qubits on a 300 mm wafer—with a focus on yield, junction reproducibility, and integrated escape wiring rather than current approaches that tile many 100-qubit dies into larger systems.
    • From lab racks of cables to true integrated circuits: The episode contrasts today’s dilution-refrigerator setups—dominated by bulky wiring and discrete microwave components—with the vision of a highly integrated superconducting “IC” where most of that wiring is brought on-chip. Martinis likens the current state to pre-IC TTL logic full of hand-wired boards and sees monolithic quantum chips as the necessary analog of CMOS integration for classical computing.
    • Venture timelines vs physics timelines: A candid discussion of the mismatch between typical three-to-five-year venture capital expectations and the multi-decade arc of foundational technologies like CMOS and, now, quantum computing. Martinis suggests that the most transformative work—such as radically improved junction fabrication—looks slow and uncompetitive in the short term but can yield step-change advantages once it matures.
    • Physics vs systems-engineering mindsets: How Martinis’s “instrumentation family tree” and exposure to both American “build first, then understand” and French “analyze first, then build” traditions shaped his approach, and how system engineering often pushes him to challenge ideas that don’t scale. He frames this dual mindset as both a superpower and a source of tension when working in large organizations used to more incremental science-driven projects.
    • Collaboration, competition, and pre-competitive science: Reflections on the early years when groups at Berkeley, Saclay, UCSB, NIST, and elsewhere shared results openly, pushing the field forward without cut-throat scooping, before activity moved into more corporate settings around 2010. Martinis emphasizes that many of the hardest scaling problems—especially in materials and fabrication—would benefit from deeper cross-organization collaboration, even as current business constraints limit what can be shared.

    Papers and research discussed

    • “Energy-Level Quantization in the Zero-Voltage State of a Current-Biased Josephson Junction” – John M. Martinis, Michel H. Devoret, John Clarke, Physical Review Letters 55, 1543 (1985). First clear observation of quantized energy levels and macroscopic quantum tunneling in a Josephson circuit, forming a core part of the work recognized by the 2025 Nobel Prize in Physics. Link: https://link.aps.org/doi/10.1103/PhysRevLett.55.1543
    • “Quantum Mechanics of a Macroscopic Variable: The Phase Difference of a Josephson Junction” – J. Clarke et al., Science 239, 992 (1988). Further development of macroscopic quantum tunneling and wave-packet dynamics in current-biased Josephson junctions, demonstrating that a circuit-scale degree of freedom behaves as a quantum variable. Link (PDF via Cleland group):
    26 November 2025, 3:39 pm
  • 35 minutes 53 seconds
    Trapped ions on the cloud with Thomas Monz from AQT

    Thomas Monz, CEO of AQT (Alpine Quantum Technologies), joins Sebastian Hassinger on The New Quantum Era to chart the evolution of ion-trap quantum computing — from the earliest breakthroughs in Innsbruck to the latest roll-outs in supercomputing centers and on the cloud. Drawing on a career that spans pioneering research and entrepreneurial grit, Thomas details how AQT is bridging the gap between academic innovation and practical, scalable systems for real-world users. The conversation traverses AQT’s trajectory from component supplier to systems integrator, how standard 19-inch racks and open APIs are making quantum computing accessible in Europe’s top HPC centers, what Thomas anticipates from AQT launching on Amazon Braket, a quantum computing service from AWS, and what it will take for quantum to deliver genuine economic value.

    Guest Bio  
    Thomas Monz is the CEO and co-founder of AQT. A physicist by training, his work has helped transform trapped-ion quantum computing from a fundamental research topic into a commercially viable technology. After formative stints in quantum networks, high-precision measurement, and hands-on engineering, Thomas launched AQT alongside Peter Zoller and Rainer Blatt to make robust, scalable quantum computers available far beyond the university lab. He continues to be deeply engaged in both hardware development and quantum error correction research, with AQT now deploying systems at EuroHPC centers and bringing devices to Amazon Braket.

    Key Topics  

    • From research prototype to rack-ready: How the pain points converting lab experiments into user-friendly hardware led AQT to build its quantum computers in the same form factors and standards as classical infrastructure, making plug-and-play integration with the supercomputing world possible.  
    • Hybrid quantum–HPC deployments: Why systems-level thinking and classic IT lessons (such as respecting 19-inch rack and power standards) have enabled AQT to place ion-trap quantum computers in Germany and Poland as part of the EuroHPC initiative — and why abstraction at the API level is essential for developer adoption.  
    • Error correction and code flexibility: How the physical properties of trapped ions let AQT remain agnostic to changing error-correcting codes (from repetition and surface codes to LDPC), enabling swift adaptation to new breakthroughs via software rather than expensive new hardware — and why end-users should never have to think about error correction themselves.  
    • Scaling and networking: The challenges moving from one-dimensional to two-dimensional traps, the emerging role of integrated photonics, and AQT’s vision for interconnecting quantum computers within and across HPC sites using telecom-wavelength photons.  
    • From local to cloud: What AQT’s move to Amazon Braket means for the range and sophistication of end-user applications, and how broad commercial access is shifting priorities from scientific exploration to real-world performance and customer-driven features.  
    • Collaboration as leverage: How AQT’s open approach to integration—letting partners handle job scheduling, APIs, and orchestration—positions it as a technology supplier while benefiting from advances across Europe’s quantum ecosystem.


    Why It Matters 
    AQT’s journey illustrates how “physics-first” quantum innovation is finally crossing into scalable, reliable real-world systems. By prioritizing integration, user experience, and abstraction, AQT is closing the gap between experimental platforms and actionable quantum advantage. From better error rates and hybrid deployments to global cloud infrastructure, the work Thomas describes signals a maturing industry rapidly moving toward both commercial impact and new scientific discoveries.

    Episode Highlights  

    • How Thomas’s PhD work helped implement the first three-qubit ion-trap gates and formed the foundation for AQT’s technical strategy.  
    • The pivotal insight: moving from bespoke lab systems to standardized products allowed quantum hardware to be deployed at scale.  
    • The surprisingly smooth physical deployment of AQT machines across Europe, thanks to a “box-on-a-truck” design.  
    • Real talk on error correction, the importance of LDPC codes, and the flexibility built into trapped-ion architectures.  
    • The future of quantum networking: sending entangled photons between HPC facilities, and the promise of scalable cluster architectures.  
    • What cloud access brings to the roadmap, including new end-user requirements and opportunities for innovation in error correction as a service.


    ---- 

    This episode offers an insider’s perspective on the tight coupling of science and engineering required to bring quantum computing out of the lab and into industry. Thomas’s journey is a case study in building both technology and market readiness — critical listening for anyone tracking the real-world ascent of quantum computers. In the spirit of full disclosure, Sebastian is an employee of AWS, working on quantum computing for the company, though he is not a member of the Braket service team. 

    18 November 2025, 4:44 pm
  • 33 minutes 32 seconds
    Quantum Materials and Nano Fabrication with Javad Shabani


    Guest: Dr. Javad Shabani is Professor of Physics at NYU, where he directs both the Center for Quantum Information Physics and the NYU Quantum Institute. He received his PhD from Princeton University in 2011, followed by postdoctoral research at Harvard and UC Santa Barbara in collaboration with Microsoft Research. His research focuses on novel states of matter at superconductor-semiconductor interfaces, mesoscopic physics in low-dimensional systems, and quantum device development. He is an expert in molecular beam epitaxy growth of hybrid quantum materials and has made pioneering contributions to understanding fractional quantum Hall states and topological superconductivity.

    Episode Overview

    Professor Javad Shabani shares his journey from electrical engineering to the frontiers of quantum materials research, discussing his pioneering work on semiconductor-superconductor hybrid systems, topological qubits, and the development of scalable quantum device fabrication techniques. The conversation explores his current work at NYU, including breakthrough research on germanium-based Josephson junctions and the launch of the NYU Quantum Institute.

    Key Topics Discussed

    Early Career and Quantum Journey
    Javad describes his unconventional path into quantum physics, beginning with a double major in electrical engineering and physics at Sharif University of Technology after discovering John Preskill's open quantum information textbook. His graduate work at Princeton focused on the quantum Hall effect, particularly investigating the enigmatic five-halves fractional quantum Hall state and its potential connection to non-abelian anyons.

    From Spin Qubits to Topological Quantum Computing
    During his PhD, Javad worked with Jason Petta and Mansur Shayegan on early spin qubit experiments, experiencing firsthand the challenge of controlling single quantum dots. His postdoctoral work at Harvard with Charlie Marcus focused on scaling from one to two qubits, revealing the immense complexity of nanofabrication and materials science required for quantum control. This experience led him to topological superconductivity at UC Santa Barbara, where he collaborated with Microsoft Research on semiconductor-superconductor heterostructures.

    Planar Josephson Junctions and Material Innovation
    At NYU, Javad's group developed planar two-dimensional Josephson junctions using indium arsenide semiconductors with aluminum superconductors, moving away from one-dimensional nanowires toward more scalable fabrication approaches. In 2018-2019, his team published groundbreaking results in Physical Review Letters showing signatures of topological phase transitions in these hybrid systems.

    Gatemon Qubits and Hybrid Systems
    The conversation explores Javad's recent work on gatemon qubits—gate-tunable superconducting transmon qubits that leverage semiconductor properties for fast switching in the nanosecond regime. While indium arsenide's piezoelectric properties may limit qubit coherence, the material shows promise as a fast coupler between qubits. This research, published in Physical Review X, represents a convergence of superconducting circuit techniques with semiconductor physics.

    Breakthrough in Germanium-Based Devices
    Javad reveals exciting forthcoming research accepted in Nature Nanotechnology on creating vertical Josephson junctions entirely from germanium. By doping germanium with gallium to make it superconducting, then alternating with undoped semiconducting germanium, his team has achieved wafer-scale fabrication of three-layer superconductor-semiconductor-superconductor junctions. This approach enables placing potentially 20 million junctions on a single wafer, opening pathways toward CMOS-compatible quantum device manufacturing.

    NYU Quantum Institute and Regional Ecosystem
    The episode discusses the launch of the NYU Quantum Institute under Javad's leadership, designed to coordinate quantum research across physics, engineering, chemistry, mathematics, and computer science. The Institute aims to connect fundamental research with application-focused partners in finance, insurance, healthcare, and communications throughout New York City. Javad describes NYU's quantum networking project with five nodes across Manhattan and Brooklyn, leveraging NYU's distributed campus fiber infrastructure for short-distance quantum communication.

    Academic Collaboration and the New York Quantum Ecosystem
    Javad explains how NYU collaborates with Columbia, Princeton, Yale, Cornell, RPI, Stevens Institute, and City College to build a Northeast quantum corridor. The annual New York Quantum Summit (now in its fourth year) brings together academics, government labs including AFRL and Brookhaven, consulting firms, and industry partners. This regional approach complements established hubs like the Chicago Quantum Exchange while addressing New York's unique strengths in finance and dense urban infrastructure.

    Materials Science Challenges and Interfaces
    The conversation delves into fundamental materials science puzzles, particularly the asymmetric nature of material interfaces. Javad explains how material A may grow well on material B, but B cannot grow on A due to polar interface incompatibilities—a critical challenge for vertical device fabrication. He draws parallels to aluminum oxide Josephson junctions, where the bottom interface is crystalline but the top interface grows on amorphous oxide, potentially contributing to two-level system noise.

    Industry Integration and Practical Applications
    Javad discusses NYU's connections to chip manufacturing through the CHIPS Act, linking academic research with 200 mm and 300 mm wafer-scale operations at NY CREATES. His group also participates in the Co-design Center for Quantum Advantage (C2QA) based at Brookhaven National Laboratory.

    Notable Quotes

    "Behind every great experimentalist, there is a greater theorist."

    "A lot of these kind of application things, the end users are basically in big cities, including New York...people who care at financial institutions, people like insurance, medical for sensing and communication."

    "You don't wanna spend time on doing the exact same thing...but I do feel we need to be more and bigger."

    12 November 2025, 5:42 pm
  • 39 minutes 40 seconds
    Incubating quantum innovation with Vijoy Pandey of Outshift by Cisco

    Vijoy Pandey joins Sebastian Hassinger for this episode of The New Quantum Era to discuss Cisco's ambitious vision for quantum networking—not as a far-future technology, but as infrastructure that solves real problems today. As the leader of Outshift by Cisco, the company's incubation group, as well as Cisco Research, Vijoy explains how quantum networks are closer than quantum computers, why distributed quantum computing is the path to scale, and how entanglement-based protocols can tackle immediate classical challenges in security, synchronization, and coordination. The conversation spans from Vijoy's origin story building a Hindi chatbot in the late 1980s to Cisco's groundbreaking room-temperature quantum entanglement chip developed with UC Santa Barbara, and explores use cases from high-frequency trading to telescope array synchronization.

    Guest Bio
    Vijoy Pandey is Senior Vice President at Outshift by Cisco, the company's internal incubation group, where he also leads Cisco Research and Cisco Developer Relations (DevNet). His career in computing began in high school building AI chatbots, eventually leading him through distributed systems and software engineering roles including time at Google. At Cisco, Vijoy oversees a portfolio spanning quantum networking, security, observability, and emerging technologies, operating at the intersection of research and product incubation within the company's Chief Strategy Office.

    Key Topics
    • From research to systems: How Cisco's quantum work is transitioning from physics research to systems engineering, focusing on operability, deployment, and practical applications rather than building quantum computers.
    • The distributed quantum computing vision: Cisco's North Star is building quantum network fabric that enables scale-out distributed quantum computing across heterogeneous QPU technologies (trapped ion, superconducting, etc.) within data centers and between them—making "the quantum network the solution" to quantum's scaling problem and classical computing's physics problem.
    • Room-temperature entanglement chip: Cisco and UC Santa Barbara developed a prototype photonic chip that generates 200 million entangled photon pairs per second at room temperature and telecom wavelengths, consuming less than 1 milliwatt of power—enabling deployment on existing fiber infrastructure without specialized equipment.
    • Classical use cases today: How quantum networking protocols solve present-day problems in synchronization (global database clocks, telescope arrays), decision coordination (high-frequency trading across geographically distributed exchanges), and security (intrusion detection using entanglement collapse) without requiring massive qubit counts or cryogenic systems.
    • Quantum telepathy for HFT: The concept of using entanglement and teleportation to coordinate decisions across distant locations without waiting for light-speed classical communication—enabling fairness guarantees for high-frequency trading across data centers in different cities.
    • Meeting customers where they are: Cisco's strategy to deploy quantum networking capabilities alongside existing classical infrastructure, supporting a spectrum from standard TLS to post-quantum cryptography to QKD, rather than requiring greenfield deployments.
    • The transduction grand challenge: Why building the "NIC card" that connects quantum processors to quantum networks—the transducer—is the critical bottleneck for distributed quantum computing and the key technical risk Cisco is addressing.
    • Product-company fit in corporate innovation: How Outshift operates like internal startups within Cisco, focusing on problems adjacent to the company's four pillars (networking, security, observability, collaboration) with both technology risk and market risk, while maintaining agility through a framework adapted from Cisco's acquisition integration playbook.

    Why It Matters
    Cisco's systems-level approach to quantum networking represents a paradigm shift from viewing quantum as distant future technology to infrastructure deployable today for specific high-value use cases. By focusing on room-temperature, telecom-compatible entanglement sources and software stacks that integrate with existing networks, Cisco is positioning quantum networking as the bridge between classical and quantum computing worlds—potentially accelerating practical quantum applications from decades away to 5-10 years while solving immediate enterprise challenges in security and coordination.

    Episode Highlights
    Vijoy's journey from building Hindi chatbots on a BBC Micro in the late 1980s to leading quantum innovation at Cisco. 
    Why quantum networking is "here and now" while quantum computing is still being figured out. 
    The spectrum of quantum network applications: from near-term classical coordination problems to the long-term quantum internet connecting quantum data centers and sensors. 
    How entanglement enables provable intrusion detection on standard fiber networks alongside classical IP traffic. 
    The "step function moment" coming for quantum: why the transition from physics to systems engineering means a ChatGPT-like breakthrough is imminent, and why this one will be harder to catch up on than software-based revolutions. 
    Design partner collaborations with financial services, federal agencies, and energy companies on security and synchronization use cases.
    Cisco's quantum software stack prototypes: Quantum Compiler (for distributed quantum error correction), Quantum Alert (security), and QuantumSync (decision coordination).

    31 October 2025, 3:32 pm
  • 37 minutes 5 seconds
    Nobel Laureate John Martinis Discusses Superconducting Qubits and Qolab

    This episode is a first for the show - a repeat of a previously posted interview on The New Quantum Era podcast! I think you'll agree the reason for the repeat is a great one - this episode, recorded at the APS Global Summit in March, features a conversation with John Martinis, co-founder and CTO of Qolab and newly minted Nobel Laureate! Last week the Royal Swedish Academy of Sciences announced that John would share the 2025 Nobel Prize in Physics with John Clarke and Michel Devoret “for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit.” It should come as no surprise that John and I talked about macroscopic quantum mechanical tunnelling and energy quantization in electrical circuits, since those are precisely the attributes that make a superconducting qubit work for computation.

    The work John is doing at Qolab, a superconducting qubit company seeking to build a million-qubit device, is really impressive, as befits a Nobel Laureate and a pioneer in the field. In our conversation we explore the strategic shifts, collaborative efforts, and technological innovations that are pushing quantum computing closer to scalable, million-qubit systems.

    Key Highlights

    • Emerging from Stealth Mode & Million-Qubit System Paper:
      • Discussion of Qolab’s transition from stealth mode and their comprehensive paper on building scalable million-qubit systems.
      • Focus on a systematic approach covering the entire stack.
    • Collaboration with Semiconductor Companies:
      • Unique business model emphasizing collaboration with semiconductor companies to leverage external expertise.
      • Comparison with bigger players like Google, who can fund the entire stack internally.
    • Innovative Technological Approaches:
      • Integration of wafer-scale technology and advanced semiconductor manufacturing processes.
      • Emphasis on adjustable qubits and adjustable couplers for optimizing control and scalability.
    • Scaling Challenges and Solutions:
      • Strategies for achieving scale, including using large dilution refrigerators and exploring optical communication for modular design.
      • Plans to address error correction and wiring challenges using brute force scaling and advanced materials.
    • Future Vision and Speeding Up Development:
      • Qolab’s goal to significantly accelerate the timeline toward achieving a million-qubit system.
      • Insight into collaborations with Hewlett Packard Enterprise, NVIDIA, Quantum Machines, and others to combine expertise in hardware and software.
    • Research Papers Mentioned in this Episode:
    13 October 2025, 6:36 pm
  • 26 minutes 42 seconds
    Carbon nanotube qubits with Pierre Desjardins

    Pierre Desjardins is the cofounder of C12, a Paris-based quantum computing hardware startup that specializes in carbon nanotube-based spin qubits. Notably, Pierre founded the company alongside his twin brother, Mathieu, making them the only twin-led deep-tech startup that we know of! Pierre’s journey is unconventional: he is a rare founder in quantum hardware without a PhD, drawing instead on engineering and entrepreneurial experience. The episode dives into what drew him to quantum computing and the pivotal role COVID-19 played in catalyzing his career shift from consulting to quantum technology.

    C12’s Technology and Unique Angle

    C12 focuses on developing high-performance qubits using single-wall carbon nanotubes. Unlike companies centered on silicon or germanium spin qubits, C12 fabricates carbon nanotubes, tests them for impurities, and then assembles them on silicon chips as a final step. The team exclusively uses isotopically pure carbon-12 to minimize magnetic and nuclear spin noise, yielding a uniquely clean environment for electron confinement. The result is ultra-low charge noise, enabling the company to build highly coherent qubits with remarkable material purity.

    Key Technical Innovations

    • Spin-Photon Coupling: C12’s system stands out for driving spin qubits using microwave photons, drawing inspiration from superconducting qubit architectures. This enables the implementation of a “quantum bus”—a superconducting interconnect that allows long-range coupling between distant qubits, sidestepping the scaling bottleneck of nearest-neighbor architectures.
    • Addressable Qubits: Each carbon nanotube qubit can be tuned on or off the quantum bus by manipulating the double quantum dot confinement, providing flexible connectivity and the ability to maximize coherence in a memory mode.
    • Stability and Purity: Pierre emphasizes that C12’s suspended architecture dramatically reduces charge noise and results in exceptional stability, with minimal calibration drift, over years-long measurement campaigns—a stark contrast with many superconducting platforms.


    Recent Milestones

    C12 celebrated its fifth anniversary and recently demonstrated the first qubit operation on their platform. The company achieved ultra-long coherence times for spin qubits coupled via a quantum bus, publishing these results in *Nature*. The next milestone is demonstrating two-qubit gates mediated by microwave photons—a development that could set a new benchmark for both C12 and the wider quantum computing industry.

    Challenges and Outlook

    C12’s current focus is scaling up from single-qubit demonstrations to multi-qubit gates with long-range connectivity, a crucial step toward error correction and practical algorithms. Pierre notes the rapid evolution of error-correcting codes, remarking that some codes they are now working on did not exist two years ago. The interview closes with an eye on the race to demonstrate long-distance quantum gates, with Pierre hoping C12 will make industry headlines before larger competitors like IBM.

    Notable Quotes

    • “The more you dig into this technology, the more you understand why this is just the way to build a quantum computer.”
    • “We have the lowest charge noise compared to any kind of spin qubit—this is because of our suspended architecture.”
    • “What we introduced is the concept of a quantum bus… really the only way to scale spin qubits.”


    Episode Themes

    • Entrepreneurship in deep tech without a traditional research background
    • Technical deep dive on carbon nanotube spin qubits and quantum bus architecture
    • Materials science as the foundation of scalable quantum hardware
    • The importance of coherence, noise reduction, and tunable architectures in quantum system design
    • The dynamic evolution of error correction and industry competition


    Listeners interested in cutting-edge hardware, quantum startup journeys, or the science behind scalable qubit platforms will find this episode essential. Pierre provides unique clarity on why C12’s approach offers both conceptual and practical advantages for the future of quantum computing.

    27 September 2025, 4:25 pm
  • 33 minutes 13 seconds
    Quantum sensitivity breakthrough with Eli Levenson-Falk

    Dr. Eli Levenson-Falk joins Sebastian Hassinger, host of The New Quantum Era, to discuss his group’s recent advances in quantum measurement and control, focusing on a new protocol that enables measurements more sensitive than the Ramsey limit. Published in Nature Communications in April 2025, this work demonstrates a coherence-stabilized technique that not only enhances sensitivity for quantum sensing but also promises improvements in calibration speed and robustness for superconducting quantum devices and other platforms. The conversation travels from Eli’s origins in physics, through the conceptual challenges of decoherence, to experimental storytelling, and highlights the collaborative foundation underpinning this breakthrough.

    Guest Bio
    Eli Levenson-Falk is an Associate Professor at USC. He earned his PhD at UC Berkeley with Professor Irfan Siddiqi, and now leads an experimental physics research group working with superconducting devices for quantum information science.

    Key Topics

    • The new protocol described in the paper: “Beating the Ramsey Limit on Sensing with Deterministic Qubit Control.”
    • Beyond the Ramsey measurement: How the team’s technique stabilizes part of the quantum state for enhanced sensitivity—especially for energy level splittings—using continuous, slowly varying microwave control, applicable beyond just superconducting platforms.
    •  From playground swings to qubits: Eli explains how the physics of a playground swing inspired his passion for the field and led to his understanding of the transmon qubit, and why analogies matter for intuition.
    •  Quantum decoherence and stabilization: How the method controls the “vector” of a quantum state on the Bloch sphere, dumping decoherence into directions that can be tracked or stabilized, markedly increasing measurement fidelity.
    •  Calibration and practical speedup: The protocol achieves greater measurement accuracy in less time or greater accuracy for a given time investment. This has implications for both calibration routines in quantum computers and for direct quantum measurements of fields (e.g., magnetic) or material properties.
    •  Applicability: While demonstrated on superconducting transmons, the protocol’s generality means it may bring improved sensitivity to a variety of platforms—though the greatest benefits will be seen where relaxation processes dominate decoherence over dephasing.
    •  Collaboration and credit: The protocol was the product of a collaborative effort with theorist Daniel Lidar and his group, also at USC. In Eli's group, Malida Hecht conducted the experiment.

    Why It Matters
    By breaking through the Ramsey sensitivity limit, this work provides a new tool for both quantum device calibration and quantum sensing. It allows for more accurate and faster frequency calibration within quantum processors, as well as finer detection of small environmental changes—a dual-use development crucial for both scalable quantum computing and sensitive quantum detection technologies.

    Episode Highlights

    •  Explanation of the “Ramsey limit” in quantum measurement and why surpassing it is significant.
    •  Visualization of quantum states using the Bloch sphere, and the importance of stabilizing the equatorial (phase) components for sensitivity.
    •  Experimental journey from “plumber” lab work to analytic insights, showing the back-and-forth of theory confronting experiment.
    •  Immediate and future impacts, from more efficient calibration in quantum computers to potentially new standards for quantum sensing.
    •  Discussion of related and ongoing work, such as improvements to deterministic benchmarking for gate calibration, and the broader applicability to various quantum platforms.
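
    As an illustrative aside, the tradeoff behind the Ramsey limit discussed above can be sketched with standard textbook formulas (this is a generic sketch, not code or data from the paper): the Ramsey fringe contrast decays with the dephasing time T2, so the frequency sensitivity of a conventional Ramsey measurement peaks at an evolution time of t = T2.

```python
import numpy as np

# Generic textbook sketch of a Ramsey measurement (illustrative only,
# not code from the paper): pi/2 pulse, free evolution for time t,
# second pi/2 pulse. Dephasing damps the fringe contrast as exp(-t/T2).
def ramsey_probability(delta, t, T2):
    """Excited-state probability vs. qubit detuning `delta`."""
    return 0.5 * (1.0 + np.exp(-t / T2) * np.cos(delta * t))

# Frequency sensitivity follows the fringe slope |dP/d(delta)|, whose
# envelope 0.5 * t * exp(-t / T2) peaks at t = T2: waiting longer gains
# phase but loses contrast. That optimum is the conventional Ramsey limit.
T2 = 10.0
t = np.linspace(0.01, 5 * T2, 5001)
envelope = 0.5 * t * np.exp(-t / T2)
t_opt = float(t[np.argmax(envelope)])
print(round(t_opt, 1))  # -> 10.0, i.e. t_opt = T2
```

    The stabilization protocol discussed in the episode is precisely a way around this textbook optimum, by keeping part of the state coherent beyond the usual contrast decay.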

    If you enjoy The New Quantum Era, subscribe and tell your quantum-curious friends! Find all episodes at www.newquantumera.com.

    19 September 2025, 5:48 pm
  • 37 minutes 51 seconds
    Mechanical Quantum Memories with Mohammad Mirhosseini

    Assistant Professor Mohammad Mirhosseini (Caltech EE/APh) explains how his group built a mechanical quantum memory that stores microwave-photon quantum states far longer than typical superconducting qubits, and why that matters for hybrid quantum architectures. The discussion covers microwave photons, phonons, optomechanics, coherence versus lifetime (T2 vs. T1), current speed bottlenecks, and implications for quantum transduction and error mechanisms. The conversation centers on Mirhosseini's December 2024 paper, “A mechanical quantum memory for microwave photons,” detailing strong coupling between a transmon and a long‑lived nanomechanical oscillator for storage and retrieval of nonclassical states.

    Guest

    Mohammad Mirhosseini is an Assistant Professor of Electrical Engineering and Applied Physics at Caltech, where his group engineers hybrid superconducting–phononic–photonic systems at millikelvin temperatures for computing, communication, and sensing. He completed his PhD at the University of Rochester’s Institute of Optics and was a postdoc in Oscar Painter’s group at Caltech before starting his lab. His recent team effort demonstrates mechanical oscillators as compact, long‑lived quantum memories integrated with superconducting circuits.


    Key topics

    • What “microwave photons” are and how qubits emit/absorb single microwave photons in circuit QED analogously to atoms and optical photons.
    • Why “memory” is missing in today’s quantum processors and how a dedicated long‑lived storage element can complement fast but dissipative superconducting qubits.
    • Optomechanics 101: mapping quantum states between electrical and mechanical degrees of freedom, with phonons as the quantized vibrational excitations.
    • T1 vs. T2: demonstrated order‑of‑magnitude gains in lifetime (T1) and more modest current gains in coherence (T2), plus paths to mitigate dephasing.
    • Present bottleneck: state conversion between qubit and oscillator is about 100× slower than native superconducting operations, with clear engineering avenues to speed up.
    • Quantum transduction: leveraging the same mechanical intermediary to bridge microwave and optical domains for interconnects and networking.
    • Two‑level system (TLS) defects: shared decoherence mechanisms across mechanical oscillators and superconducting circuits and why comparing both can illuminate materials limits.

    Why it matters

    Hybrid architectures that pair fast processors with long‑lived memories are a natural route to scaling, and mechanical oscillators offer lifetimes far exceeding conventional superconducting storage elements while remaining chip‑integrable. Demonstrating nonclassical state storage and retrieval with strong qubit–mechanics coupling validates mechanical oscillators as practical quantum memories and sets the stage for on‑chip transduction. Overcoming current speed limits and dephasing would lower the overhead for synchronization, buffering, and possibly future fault‑tolerant protocols in superconducting platforms.


    Episode highlights

    • A clear explanation of microwave photons and how circuit QED lets qubits create and absorb them one by one.
    • Mechanical memory concept: store quantum states as phonons in a gigahertz‑frequency nanomechanical oscillator and read them back later.
    • Performance today: roughly 10–30× longer T1 than typical superconducting qubits with current T2 gains of a few ×, alongside concrete strategies to extend T2.
    • Speed trade‑off: present qubit–mechanics state transfer is ~100× slower than native superconducting gates, but device design and coupling improvements are underway.
    • Roadmap: tighter coupling for in‑oscillator gates, microwave‑to‑optical conversion via the same mechanics, and probing TLS defects to inform both mechanical and superconducting coherence.


    14 September 2025, 4:57 pm
  • 54 minutes 21 seconds
    A Programming Language for Quantum Simulations with Xiaodi Wu

    In this episode, host Sebastian Hassinger sits down with Xiaodi Wu, Associate Professor at the University of Maryland, to discuss Wu’s journey through quantum information science, his drive for bridging computer science and physics, and the creation of the quantum programming language SimuQ.

    Guest Introduction

    • Xiaodi Wu shares his academic path from Tsinghua University (where he studied mathematics and physics) to a PhD at the University of Michigan, followed by postdoctoral work at MIT and a position at the University of Oregon, before joining the University of Maryland.
    • The conversation highlights Wu’s formative experiences, early fascination with quantum complexity, and the impact of mentors like Andy Yao.

    Quantum Computing: Theory Meets Practice

    • Wu discusses his desire to blend theoretical computer science with physics, leading to pioneering work in quantum complexity theory and device-independent quantum cryptography.
    • He reflects on the challenges and benefits of interdisciplinary research, and the importance of historical context in guiding modern quantum technology development.

    Programming Languages and Human Factors

    • The episode delves into Wu’s transition from theory to practical tools, emphasizing the major role of human factors and software correctness in building reliable quantum software.
    • Wu identifies the value of drawing inspiration from classical programming languages like FORTRAN and SIMULA—and points out that quantum software must prioritize usability and debugging, not just elegant algorithms.


    SimuQ: Hamiltonian-Based Quantum Abstraction

    • Wu introduces SimuQ, a new quantum programming language designed to treat Hamiltonian evolution as a first-class abstraction, akin to how floating-point arithmetic is fundamental in classical computing.
    • SimuQ enables users to specify Hamiltonian models directly and compiles them to both gate-based and analog/pulse-level quantum devices (including IBM, AWS Braket, and D-Wave backends).
    • The language aims to make quantum simulation and continuous-variable problems more accessible, and serves as a test bed for new quantum software abstractions.

    Analog vs. Digital in Quantum Computing

    • Wu and Hassinger explore the analog/digital divide in quantum hardware, examining how SimuQ leverages the strengths of both by focusing on higher-level abstractions (Hamiltonians) that fit natural use cases like quantum simulation and dynamic systems.


    Practical Applications and Vision

    • The conversation highlights targeted domains for SimuQ, such as quantum chemistry, physics simulation, and machine learning algorithms that benefit from continuous-variable modeling.
    • Wu discusses his vision for developer-friendly quantum tools, drawing parallels to the evolution of classical programming and the value of reusable abstractions for future advancements. 

    Listen to The New Quantum Era podcast for more interviews with leaders in quantum computing, software development, and scientific research.

    5 September 2025, 2:54 pm