The Pathfinder Framework
1 NNF Quantum Computing Programme, Niels Bohr Institute, University of Copenhagen, Denmark
2 Quantum Foundry Copenhagen, Denmark
The Novo Nordisk Foundation Quantum Computing Programme (NQCP), Quantum Foundry Copenhagen (QFC) and partners are committed to the mission of realizing quantum computing hardware and algorithms capable of solving otherwise intractable challenges in the materials and life sciences. With a focus on enabling the development of utility-scale fault-tolerant quantum computing (FTQC), we rely on a large interdisciplinary R&D effort and a broad collaborative ecosystem, and we integrate education and outreach to foster development of the quantum workforce. The programme is structured into three overarching phases: the Preparation Phase (2023–2024), the Pathfinder Phase (2025–2029), and the Scaling Phase (2030–2034), each named to reflect its primary focus. With the preparation phase complete, we are now ready to launch the pathfinder phase, which will define a clear roadmap toward achieving the mission's objectives by 2035. A key focus of the pathfinder phase is to assess quantum processor platforms on their potential to scale with universal logic gates of the required precision and runtime.
The pathfinder framework consists of three interconnected sub-frameworks:
- The Mission Ladder – A due-diligence framework guiding and evaluating the path to utility-scale quantum computing.
- Mission-Driven Project Management – A framework for coordinating mission-driven projects across multiple teams.
- Data-Driven Workflow Framework – A methodology for utilizing statistical data to inform decision-making.
Here we introduce the framework concepts that support the mission-driven and data-driven management of the pathfinder phase.
The mission ladder is a framework that guides the mission-driven activities across the program. The Applications & Algorithms (A&A) area spans six levels, from use-case identification to computational resource estimation, while quantum processor hardware (HW) R&D is organized into nine levels, analogous to hardware technology readiness levels (TRLs) for the mission. Each level has defined requirements for Quality, Speed and Scale metrics, informed by resource estimates derived from the A&A algorithms.
Figure 1. The FTQC mission ladder for metric-tree resource estimation
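As a rough illustration (not the NQCP implementation), a ladder level and its Quality, Speed, and Scale requirements could be encoded as sketched below; the class names, level descriptions, and target values are hypothetical placeholders.

```python
# Hypothetical sketch of a mission-ladder level; names and targets are
# placeholders, not official NQCP/QFC requirements.
from dataclasses import dataclass, field

@dataclass
class MOIRequirement:
    name: str     # metric of interest, e.g. "two-qubit gate error"
    pillar: str   # "Quality", "Speed", or "Scale"
    target: float # required value at this ladder level
    unit: str

@dataclass
class LadderLevel:
    track: str    # "A&A" (six levels) or "HW" (nine levels)
    level: int
    description: str
    requirements: list[MOIRequirement] = field(default_factory=list)

# Example hardware level with illustrative targets only.
hw_level = LadderLevel(
    track="HW",
    level=3,
    description="Multi-qubit device with repeatable calibration",
    requirements=[
        MOIRequirement("two-qubit gate error", "Quality", 1e-3, "dimensionless"),
        MOIRequirement("QEC cycle time", "Speed", 1e-6, "s"),
        MOIRequirement("physical qubit count", "Scale", 50, "qubits"),
    ],
)
```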
The three (interdependent) Quality, Speed and Scale pillars provide a foundation for mission-driven project management, establishing clear objectives based on key metrics of interest (MOIs) across the mission ladder. Notably, for systems featuring long-range qubit connectivity—whether within a quantum processing unit (QPU) or via distributed entanglement between QPUs—the metric tree will exhibit bidirectional dependencies.
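To make the dependency structure concrete, the sketch below encodes a toy MOI tree as a directed graph; the metric names and edges are illustrative assumptions, chosen only to show how long-range connectivity can introduce a bidirectional dependency.

```python
# Toy metric-of-interest (MOI) dependency graph; metric names and edges are
# illustrative, not the programme's actual metric tree.
moi_tree = {
    # metric -> metrics it depends on
    "logical error rate":      ["physical gate error", "QEC code distance"],
    "algorithm runtime":       ["QEC cycle time", "logical qubit count"],
    "logical qubit count":     ["physical qubit count", "QEC code distance"],
    # Bidirectional dependency introduced by long-range connectivity: routing
    # affects achievable gate error, while gate error limits usable connectivity.
    "physical gate error":     ["long-range connectivity"],
    "long-range connectivity": ["physical gate error"],
}

def dependencies(metric: str) -> list[str]:
    """Direct dependencies of a given MOI."""
    return moi_tree.get(metric, [])

print(dependencies("logical error rate"))  # ['physical gate error', 'QEC code distance']
```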
The mission-driven project management framework (figure 2a) is structured around the mission ladder: projects are defined by objectives directly linked to key MOI requirement estimates, and deliverables and milestones are framed in terms of MOI resource requirements across the levels of the ladder. The framework ensures alignment of interdependent projects on a common programme roadmap, as well as coordination between the science and engineering teams within NQCP and QFC.
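One possible encoding of such a project record, with milestones framed directly as MOI targets on the ladder, is sketched below; the project name, team, level, targets, and dates are invented for illustration.

```python
# Hypothetical project record for a shared roadmap; all names, levels, targets,
# and dates are invented for illustration.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Milestone:
    moi: str        # metric of interest the milestone is framed in
    target: float
    unit: str
    due: date

@dataclass
class Project:
    name: str
    team: str
    ladder_track: str   # "A&A" or "HW"
    ladder_level: int   # mission-ladder level the objective targets
    milestones: list[Milestone] = field(default_factory=list)

roadmap = [
    Project(
        name="Two-qubit gate fidelity improvement",
        team="HW",
        ladder_track="HW",
        ladder_level=4,
        milestones=[Milestone("two-qubit gate error", 5e-4, "dimensionless", date(2026, 6, 30))],
    ),
]
```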
Figure 2. a) Mission-driven projects are defined by clear objectives aligned with the mission TRLs in Figure 1. Deliverables and milestones are set based on the MOI resource requirements from the MOI tree and integrated into a unified roadmap to ensure programme-wide visibility of dependencies. b) All generated data are stored in the data warehouse. Critical parameters assumed relevant for process stabilization or optimization are stored in the labeled database for data-driven workflows. Insights from parameter correlation analyses refine workflow inputs and may also inform adjustments to the MOI tree assumptions.
For data-driven workflows (figure 2b), we employ parameter correlation analysis across projects, structured around four fundamental parameter classes: design, control, monitors, and metrics. High-quality parameter generation is ensured through robust, version-controlled protocols spanning modelling, design, fabrication, measurement, and analysis.
We categorize parameters into two input and two output classes:
- Design parameters (DPs) – input parameters fixed by device and process design.
- Process control parameters (PCPs) – input parameters set and adjusted during fabrication and measurement.
- Process monitor parameters (PMPs) – output parameters that monitor the state of the process.
- Metrics of interest (MOIs) – output performance metrics evaluated against the mission-ladder requirements.
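As an illustration of how such records might be tagged in the labeled database, the sketch below uses hypothetical sample identifiers, protocol versions, and parameter names; only the four-class categorization itself comes from the text.

```python
# Sketch of a labeled-database record tagged by parameter class; sample IDs,
# protocol versions, and parameter names are hypothetical.
from enum import Enum

class ParameterClass(Enum):
    DP = "design parameter"             # input
    PCP = "process control parameter"   # input
    PMP = "process monitor parameter"   # output
    MOI = "metric of interest"          # output

record = {
    "sample_id": "W12-D03",             # hypothetical identifier
    "protocol_version": "fab-v1.4.2",   # version-controlled protocol reference
    "parameters": [
        {"name": "junction_width_nm", "class": ParameterClass.DP, "value": 120.0},
        {"name": "anneal_temp_C", "class": ParameterClass.PCP, "value": 180.0},
        {"name": "film_resistivity", "class": ParameterClass.PMP, "value": 3.2},
        {"name": "T1_us", "class": ParameterClass.MOI, "value": 85.0},
    ],
}
```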
Process stabilization / automation. Data-driven R&D depends on efficient parameter correlation analysis, which can only be achieved with high-quality data. In HW R&D, process stabilization is critical, as high-quality data depends on process reproducibility through stringent process control. This involves assessing MOI variations within and across samples under given input parameters. PMPs are used to identify the origin of MOI variability through MOI vs. PMP correlations under fixed inputs. Process variability may then be addressed through process automation (e.g. real-time, closed-loop adjustment of PCPs), with significant time savings gained by ensuring process control before optimization.
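A minimal sketch of this stabilization step, assuming illustrative parameter names and fabricated example values, could look as follows: with DP/PCP inputs nominally fixed, PMPs are ranked by their correlation with an MOI.

```python
# Sketch of MOI-vs-PMP correlation under fixed inputs; column names and values
# are illustrative, not measured data.
import pandas as pd

# Each row: one device measured under nominally identical DP/PCP inputs.
df = pd.DataFrame({
    "film_resistivity": [3.1, 3.4, 2.9, 3.6, 3.0],  # PMP
    "resist_thickness": [410, 395, 402, 420, 398],  # PMP
    "T1_us":            [92, 71, 98, 63, 95],       # MOI
})

# Rank PMPs by the strength of their correlation with the MOI; a strong
# correlation flags the process step whose PCP should be stabilized first.
ranking = df.corr()["T1_us"].drop("T1_us").abs().sort_values(ascending=False)
print(ranking)
```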
Systems and component optimization. Performance optimization is conducted using design of experiments, where critical DPs or PCPs are systematically varied to refine MOI outcomes through DP/PCP vs. MOI correlations. Since high-level MOIs are often more expensive to obtain (i.e. associated with longer process turnaround times), a key step is to establish strong correlations with lower-level MOIs that enable faster optimization cycles. The labeled database provides clarity across workstreams and facilitates efficient systems engineering. A goal is to leverage supervised machine learning algorithms to fully automate and streamline the performance optimization process.
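The sketch below illustrates the idea under assumed names and synthetic data: a PCP is swept in a small design of experiments, and a fast, low-level metric is checked as a proxy for a slow, high-level MOI.

```python
# Illustration of a DOE-style correlation check with synthetic data; the swept
# PCP, the fast low-level metric, and the high-level MOI are all hypothetical.
import numpy as np

rng = np.random.default_rng(0)
anneal_temp = np.linspace(150, 210, 8)                          # PCP swept in the DOE
junction_R = 6.0 - 0.01 * anneal_temp + rng.normal(0, 0.05, 8)  # fast, low-level metric
t1_us = 20 + 25 * (6.0 - junction_R) + rng.normal(0, 2, 8)      # slow, high-level MOI

# If the fast metric tracks the high-level MOI, it can serve as the optimization
# target between full-characterization runs, shortening the cycle time.
print(f"PCP vs MOI correlation:         {np.corrcoef(anneal_temp, t1_us)[0, 1]:+.2f}")
print(f"fast metric vs MOI correlation: {np.corrcoef(junction_R, t1_us)[0, 1]:+.2f}")
```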
Numerous quantum processor candidates are being explored as potential solutions for utility-scale fault-tolerant quantum computing, each qubit modality offering distinct advantages and limitations. In the pathfinder phase of the NQCP/QFC programme, our goal is to identify solutions that meet the stringent hardware requirements for quality, speed, and scalability, as defined by impactful use-case algorithms. As outlined in this document, we will address this challenge using our pathfinder framework, which supports a broad, interdisciplinary, and cohesive systems engineering approach that remains both mission- and data-driven.
For more details on the programme, see NQCP – University of Copenhagen and Quantum Foundry Copenhagen.
Cite as: NQCP & QFC, The pathfinder framework (2025). URL: nqcp.ku.dk/pathfinder-framework, DOI: doi.org/10.60612/DATADK/KJIS0S
* Corresponding author: Peter Krogstrup, krogstrup@nbi.dk