Blockchain, Smart Contracts and Decentralized Systems

Blockchain and decentralized systems enable secure, transparent transactions and governance through technologies such as decentralized finance (DeFi) and decentralized autonomous organizations. This includes personal-sovereignty applications and smart-contract-based communities. The area merits attention because blockchain is already transforming finance and data management through real deployments such as cryptocurrencies, and further exploration can foster economic inclusion and resistance to centralization in an increasingly digital world.

One extension is decentralized freelance contract networks that leverage smart contracts to automate payments and dispute resolution for short-term gigs, improving transparency and reducing platform fees. Inspired by 2025's DeFi growth, these networks connect workers directly with clients via AI-based matching. They merit study because X posts emphasize escaping centralized platforms like Fiverr for higher earnings through referrals, and Quora insights highlight them as an innovative way to secure extended stays and projects, fostering trust in global collaborations.

Another direction is blockchain-based startup ideation tools or games, which would use decentralized ledgers to secure and collaborate on new tech concepts, enabling crowdsourced validation and IP protection for ideas in AI and biotech. With 2025 trends from McKinsey highlighting quantum computing and synthetic biology, these tools could facilitate tokenization of concepts for early funding. They are worthy of immediate exploration: WEF reports note their transformative potential in democratizing innovation and bypassing traditional VC barriers, while X discussions reveal real-time applications in niches like modular robotics, giving founders secure, community-driven development paths.

Cryptography

Cryptology ePrint Archive: beginning in 1996 with the Theory of Cryptography Library, and from 1998 with the IACR Cryptology ePrint Server, the forerunners of the Cryptology ePrint Archive have been publishing preprint papers relevant to the field of cryptology.

Research Frontiers in Cryptology or Cryptography

The field of cryptology is undergoing its most significant transformation in decades, driven by the concurrent rise of quantum computing and the escalating demand for privacy in a data-centric world. This report provides an exhaustive exploration of the research frontiers that are defining the future of secure communication and computation. It moves beyond a surface-level review to deliver a multi-layered analysis of the technical underpinnings, strategic implications, and practical challenges shaping the cryptographic landscape.

The most immediate and urgent frontier is the global migration to Post-Quantum Cryptography (PQC). The standardization of the first quantum-resistant algorithms by the U.S. National Institute of Standards and Technology (NIST) in 2024 marks the beginning of a multi-year, multi-billion-dollar transition for governments and industries worldwide. This report details the technical specifics of the new standards—ML-KEM, ML-DSA, SLH-DSA, and HQC—which are based on diverse mathematical foundations such as lattices, hashes, and codes. This portfolio-based approach represents a strategic shift away from the monoculture of classical cryptography, mandating a new level of risk management and architectural agility. However, the transition is not a simple "drop-in" replacement; the new algorithms introduce significant performance and size trade-offs that necessitate careful planning, protocol re-engineering, and, in many cases, hardware upgrades, particularly in resource-constrained sectors like automotive and the Internet of Things (IoT).

Concurrently, a longer-term vision for a physically secure communication infrastructure is being pursued through Quantum Key Distribution (QKD). This report provides a comparative analysis of major international QKD initiatives, revealing a field shaped as much by geopolitical strategy as by technological progress. China has established a clear lead in large-scale deployment with its integrated satellite-fiber network, while Europe is pursuing a continent-wide infrastructure for digital sovereignty. In contrast, the United States remains more cautious, with security agencies highlighting the practical limitations and vulnerabilities of QKD, preferring the more pragmatic path of PQC. The ultimate vision of a global quantum internet, however, remains a distant scientific frontier, contingent on solving the profound scientific and engineering challenges of quantum repeaters, which are essential to overcome the fundamental distance limitations imposed by photon loss.

Beyond quantum resistance, the algorithmic frontier is rapidly advancing to address the challenge of protecting data in use. This report provides a deep dive into three transformative paradigms of Privacy-Enhancing Technology (PET):

  1. Homomorphic Encryption (HE): Enabling direct computation on encrypted data, crucial for secure cloud computing and outsourced data analysis.
  2. Zero-Knowledge Proofs (ZKPs): Allowing for the verification of computational integrity without revealing underlying secret data, a technology revolutionizing blockchain scalability and verifiable machine learning.
  3. Secure Multi-Party Computation (MPC): Facilitating collaborative analysis among multiple parties on their combined private data without any single party having to disclose its inputs.

These technologies, while computationally intensive, are foundational to the future of secure data collaboration and trustworthy artificial intelligence. Their development is increasingly driven by specific application needs, such as the invention of Verifiable Delay Functions (VDFs) to ensure fairness in blockchain consensus mechanisms.

This report synthesizes these complex and intersecting frontiers into a strategic roadmap. For the near term, the imperative is a well-planned migration to PQC, centered on cryptographic discovery and agility. For the medium and long term, continued investment in fundamental research—from quantum repeaters to more efficient PETs—is critical for maintaining security leadership. The future of cryptology will be defined not by a single silver-bullet solution, but by the sophisticated management of a diverse portfolio of cryptographic tools tailored to a new era of computational threats and data-driven opportunities.


Part I: The Post-Quantum Transition: Standardization and Implementation

The most pressing and consequential frontier in modern cryptology is the global transition away from classical public-key algorithms, which are rendered insecure by the advent of large-scale quantum computers. This section details the nature of this threat, the international standardization effort to counter it, the technical foundations of the new cryptographic standards, and the immense practical challenges associated with their global implementation.

Section 1: The NIST Post-Quantum Cryptography Standardization Landscape

The imperative to develop and standardize new forms of public-key cryptography is a direct response to a specific, well-understood threat posed by the theoretical power of quantum computation. This threat has catalyzed a global, multi-year effort to establish a new foundation for digital security.

1.1. The Quantum Threat: Shor's Algorithm and the "Harvest Now, Decrypt Later" Imperative

The security of the vast majority of today's public-key infrastructure—including the RSA, Diffie-Hellman (DH), and Elliptic Curve Cryptography (ECC) algorithms that protect everything from web traffic to financial transactions—is predicated on the computational difficulty of two related mathematical problems: integer factorization and the discrete logarithm problem.1 For classical computers, solving these problems for sufficiently large numbers is practically intractable, estimated to take billions of years.2

In 1994, mathematician Peter Shor developed a quantum algorithm that fundamentally alters this security assumption.1 Shor's algorithm demonstrates that a sufficiently powerful quantum computer, known as a cryptographically relevant quantum computer (CRQC), could solve both integer factorization and the discrete logarithm problem in polynomial time, effectively breaking the cryptographic foundations of our current digital world.1 The construction of a CRQC, while an immense engineering challenge, is a stated goal of nations and corporations worldwide, with expert consensus placing its arrival as a credible threat within the next few decades.1

This future threat creates an immediate vulnerability due to the "Harvest Now, Decrypt Later" (HNDL) attack model, also referred to as "Store Now, Decrypt Later" (SNDL).5 In this scenario, adversaries are actively intercepting and storing encrypted data today with the intention of decrypting it once a CRQC becomes available.7 Any data that must remain confidential for a long period—such as government secrets, intellectual property, financial records, or personal health information—is already at risk. This reality transforms the quantum threat from a future problem into a present-day data security imperative, necessitating an urgent transition to cryptographic algorithms that are resistant to attacks from both classical and quantum computers.5

1.2. An Overview of the NIST PQC Competition and its Outcomes

In response to this threat, the U.S. National Institute of Standards and Technology (NIST) initiated its Post-Quantum Cryptography (PQC) Standardization Project in 2016.2 Rather than developing standards internally, NIST launched a global, public competition, inviting cryptography experts from around the world to submit candidate algorithms for quantum-resistant key establishment and digital signatures.2

The process was designed to be open and transparent, fostering years of rigorous public scrutiny and cryptanalysis from the global academic and industrial communities. The first round, which closed in 2017, saw 82 submissions from teams in 25 different countries.2 Over the subsequent years, NIST managed a multi-round evaluation process, winnowing the candidates based on security analyses, performance benchmarks on various platforms, and implementation characteristics.11 This collaborative effort culminated in NIST's July 2022 announcement of the first four algorithms selected for standardization.12 Following a period of public comment on draft standards, the final versions of the first three standards were officially published in August 2024, marking a pivotal moment in the transition to a quantum-safe cryptographic era.11

1.3. The First Wave of Standards: FIPS 203, 204, and 205

The initial set of finalized standards provides a core toolkit for quantum-resistant public-key cryptography, covering both key encapsulation for confidentiality and digital signatures for authentication and integrity.14

  • FIPS 203, Module-Lattice-Based Key-Encapsulation Mechanism Standard (ML-KEM): Based on the CRYSTALS-Kyber submission, ML-KEM is a Key-Encapsulation Mechanism (KEM) designed to establish a shared secret between two parties over an insecure channel.15 It serves as the quantum-resistant replacement for key exchange protocols like Elliptic Curve Diffie-Hellman (ECDH) and is intended for use in applications such as Transport Layer Security (TLS) handshakes and Virtual Private Networks (VPNs).5 The standard specifies three parameter sets—ML-KEM-512, ML-KEM-768, and ML-KEM-1024—offering increasing security levels roughly equivalent to AES-128, AES-192, and AES-256, respectively.5
  • FIPS 204, Module-Lattice-Based Digital Signature Standard (ML-DSA): Based on the CRYSTALS-Dilithium submission, ML-DSA is a digital signature algorithm intended as the primary replacement for RSA and ECDSA signatures.15 It is a versatile, high-performance algorithm suitable for general-purpose applications such as code signing, document signing, and authenticating protocol messages.5 It provides three parameter sets: ML-DSA-44, ML-DSA-65, and ML-DSA-87.5 A brief signing sketch using open-source bindings follows this list.
  • FIPS 205, Stateless Hash-Based Digital Signature Standard (SLH-DSA): Based on the SPHINCS+ submission, SLH-DSA is a stateless hash-based digital signature scheme.15 Its security is derived from the well-understood properties of cryptographic hash functions, offering a more conservative security foundation compared to the newer assumptions of lattice-based cryptography.5 This high level of assurance comes at the cost of significantly larger signatures and slower signing performance, making it suitable for high-value, long-term applications like government archives, legal documents, or critical infrastructure where long-term robustness is paramount over performance.5 SLH-DSA supports numerous parameter sets across NIST security categories 1, 3, and 5.5
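As a concrete illustration of how these primitives are exposed to developers, the sketch below signs and verifies a message with ML-DSA using the open-source liboqs-python bindings. The module name (oqs), the algorithm identifier string ("ML-DSA-65", which older liboqs releases expose as "Dilithium3"), and the availability of the binding on a given system are assumptions; this is a minimal sketch, not a reference implementation.

```python
# Minimal ML-DSA sign/verify sketch using the liboqs-python bindings.
# Assumes `pip install liboqs-python` and that the installed liboqs build
# exposes the "ML-DSA-65" algorithm name (older builds use "Dilithium3").
import oqs

message = b"firmware image v1.2.3"
alg = "ML-DSA-65"  # assumed identifier; check oqs.get_enabled_sig_mechanisms()

# Signer side: generate a key pair and sign the message.
with oqs.Signature(alg) as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

# Verifier side: only the public key, message, and signature are needed.
with oqs.Signature(alg) as verifier:
    assert verifier.verify(message, signature, public_key)
    print(f"ML-DSA signature of {len(signature)} bytes verified")
```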

1.4. The Principle of Cryptographic Diversity: The Selection of HQC

A central theme of the NIST PQC process has been the avoidance of a cryptographic monoculture. The widespread reliance on algorithms based on the related mathematical problems of integer factorization and discrete logarithms created a single point of failure that Shor's algorithm exploits. To mitigate this risk in the post-quantum era, NIST has deliberately pursued cryptographic diversity—the practice of standardizing algorithms based on different underlying mathematical problems.1

This strategy is exemplified by the selection of both a lattice-based signature (ML-DSA) and a hash-based signature (SLH-DSA). It was further reinforced in March 2025 with the conclusion of the "fourth round" of the competition, which focused on alternative KEMs. From this round, NIST selected Hamming Quasi-Cyclic (HQC) for standardization as an additional KEM.5

HQC's security is based on the hardness of problems in coding theory, specifically the Quasi-Cyclic Syndrome Decoding problem.18 This mathematical foundation is entirely different from the structured lattices that underpin ML-KEM.8 The selection of HQC provides a robust backup to ML-KEM; in the event that a breakthrough classical or quantum attack is discovered against lattice-based cryptography, the global community will have a standardized, vetted alternative from a different mathematical family to fall back on.10 This portfolio approach—managing a set of algorithms with different risk profiles—is a defining characteristic of the post-quantum cryptographic landscape and mandates that organizations build systems with the flexibility to adapt to new cryptographic primitives over time, a concept known as crypto-agility.5 NIST plans to release a draft standard for HQC in approximately one year, with finalization expected by 2027.10

Section 2: Technical Deep Dive and Performance Analysis of Standardized Algorithms

The new PQC standards are built upon mathematical foundations that are fundamentally different from their classical predecessors. Understanding these foundations, along with their concrete performance trade-offs and implementation security challenges, is essential for architects and engineers tasked with deploying them.

2.1. Mathematical Foundations: Lattices, Hashes, and Codes

The security of the new cryptographic standards rests on the presumed difficulty of several distinct classes of mathematical problems for both classical and quantum computers.

  • Lattice-Based Cryptography (ML-KEM, ML-DSA, FALCON): This is the most prominent family among the new standards, valued for its strong balance of security, efficiency, and versatility.2 Its security is derived from the hardness of problems on high-dimensional geometric structures known as lattices.1 The two core problems are:
    • Learning With Errors (LWE): In its module variant (Module-LWE), this problem involves solving a system of noisy linear equations over a polynomial ring, such as finding a secret vector of polynomials s given a public matrix A and a vector t = A⋅s + e, where e is a vector of small "error" or "noise" polynomials.21 The security of ML-KEM (Kyber) is based on the hardness of this problem.16 A toy numerical instance of this problem is sketched after this list.

    • Short Integer Solution (SIS): In its module variant (Module-SIS), this problem involves finding a short, non-zero vector z such that A⋅z=0 for a public matrix A.21 The security of ML-DSA (Dilithium) is based on the hardness of Module-SIS and Module-LWE.21

      The use of structured lattices—specifically, module lattices over polynomial rings—allows for highly efficient implementations. Operations like polynomial multiplication can be performed rapidly using the Number Theoretic Transform (NTT), an analogue of the Fast Fourier Transform.21

  • Hash-Based Cryptography (SLH-DSA): The security of hash-based signatures is exceptionally conservative, as it relies only on the security properties of the underlying cryptographic hash function (e.g., SHA-256 or SHAKE).5 The core idea is to use a one-time signature scheme (like a Lamport signature) and build a large structure, a "hypertree" of hash values, on top of it to sign multiple messages.21 SLH-DSA (SPHINCS+) is stateless, meaning it does not require the signer to keep track of which one-time keys have been used, a significant improvement over earlier stateful hash-based schemes. This minimal set of security assumptions provides very strong, long-term assurance against future cryptanalytic breakthroughs but results in very large signatures and slow signing times.5
  • Code-Based Cryptography (HQC): This family of cryptography dates back to the McEliece cryptosystem from 1978 and is one of the oldest public-key proposals.10 Its security is based on the hardness of decoding a random linear error-correcting code, a problem known to be NP-hard.24 HQC's security specifically relies on the Quasi-Cyclic Syndrome Decoding (QCSD) problem.18 An adversary is given a syndrome (the result of multiplying a noisy vector by a parity-check matrix) and must find the original low-weight error vector. The long history of resistance to cryptanalysis gives code-based schemes a high degree of confidence.10 HQC uses quasi-cyclic codes to achieve more compact keys compared to the original McEliece proposal.24
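To make the Learning With Errors problem from the lattice-based family above concrete, the toy sketch below builds a plain-integer LWE instance (no polynomial rings, modules, or NTT) with deliberately tiny, insecure parameters chosen only for illustration.

```python
# Toy (insecure) LWE instance: t = A·s + e (mod q).
# Plain-integer LWE with tiny parameters, purely to illustrate the structure
# behind ML-KEM/ML-DSA; real schemes use module lattices over polynomial rings.
import numpy as np

rng = np.random.default_rng(0)
q, n, m = 3329, 8, 16          # ML-KEM uses q = 3329; n and m here are toy sizes

A = rng.integers(0, q, size=(m, n))     # public uniform matrix
s = rng.integers(-2, 3, size=n)         # small secret vector
e = rng.integers(-2, 3, size=m)         # small error ("noise") vector
t = (A @ s + e) % q                     # public value

# Without e, the secret would be trivially recoverable by linear algebra;
# the small noise is what makes recovering s from (A, t) presumed hard.
print("public A shape:", A.shape)
print("public t:", t)
```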

2.2. Quantitative Benchmarking: A Comparative Analysis of Performance and Size

A critical aspect of the PQC transition is understanding that the new algorithms are not direct drop-in replacements for RSA and ECC in terms of performance characteristics. They introduce a new set of trade-offs between key size, signature/ciphertext size, and computational speed. While some PQC operations are computationally faster than their classical counterparts, they universally involve larger data sizes, which can have a significant impact on network protocols and resource-constrained devices.5

The selection of ML-KEM and ML-DSA as primary standards was heavily influenced by their excellent performance across a wide range of platforms.22 ML-KEM, for instance, was noted by NIST engineers as being "near the top (if not the top) in most benchmarks" among KEM candidates.22 Its computational core relies on efficient polynomial arithmetic, which can be heavily optimized with vector instructions like AVX2 on modern CPUs. Benchmarks show that AVX2 optimization can yield an average speedup of nearly 6x for Kyber operations, with decapsulation seeing gains of up to 6.65x.25 Similarly, ML-DSA offers fast signing and verification without requiring complex floating-point computations, making it easier to implement securely.22
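The polynomial arithmetic being optimized here is multiplication in the ring Z_q[X]/(X^n + 1). The sketch below spells out the schoolbook version of that negacyclic product with ML-KEM's modulus q = 3329; it is a conceptual illustration of the operation being accelerated, not the NTT-based, vectorized routine that real implementations use.

```python
# Schoolbook multiplication in Z_q[X]/(X^n + 1), the ring used by ML-KEM/ML-DSA.
# Real implementations replace this O(n^2) loop with the Number Theoretic
# Transform; this version only illustrates what is being computed.
def negacyclic_mul(a, b, q):
    n = len(a)
    assert len(b) == n
    res = [0] * n
    for i in range(n):
        for j in range(n):
            k = i + j
            if k < n:
                res[k] = (res[k] + a[i] * b[j]) % q
            else:
                # X^n = -1 in this ring, so the overflow term wraps with a sign flip.
                res[k - n] = (res[k - n] - a[i] * b[j]) % q
    return res

# Tiny example with ML-KEM's modulus q = 3329 (real polynomials have n = 256).
print(negacyclic_mul([1, 2, 3, 4], [5, 6, 7, 8], 3329))
```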

However, this computational efficiency comes with a trade-off in size. An ML-KEM-768 public key and ciphertext are each over 1 KB, and an ML-DSA-65 signature is over 3 KB.12 This is a substantial increase compared to the ~32-byte keys and ~64-byte signatures common with ECC. This size increase can impact protocol latency, particularly in bandwidth-constrained environments, and may require fragmentation in protocols with strict packet size limits, such as UDP.22 This reality underscores the importance of a portfolio approach. The FALCON signature scheme, for example, was also selected for standardization because its signatures are significantly smaller than Dilithium's (e.g., 666 bytes for Falcon-512 vs. 2,420 bytes for ML-DSA-44), making it a superior choice for applications where bandwidth is the primary constraint, despite its more complex implementation.22

SLH-DSA represents the other end of the performance spectrum. Its security is paramount, but its signatures are very large (e.g., 7,856 bytes for the smallest parameter set) and its signing process is orders of magnitude slower than ML-DSA.5 HQC, the code-based KEM, sits between the high performance of ML-KEM and the very large keys of older code-based schemes like Classic McEliece. While its keys are larger and its operations are slower than ML-KEM, it offers a valuable security alternative.10

The following tables provide a consolidated view of these trade-offs, presenting both a high-level comparative analysis and specific quantitative benchmarks.

Table 1: Comparative Analysis of NIST PQC Standards

Standard (Algorithm) | Cryptographic Family | Primary Use Case | Key/Signature Sizes (Level 3/AES-192 equiv.) | Relative Performance | Key Strengths | Key Weaknesses
--- | --- | --- | --- | --- | --- | ---
FIPS 203 (ML-KEM) | Lattice-Based (Module-LWE) | Key Establishment (TLS, VPN) | PK: 1,184 B, CT: 1,088 B 12 | Very Fast | High performance, small keys (relative to other PQC KEMs) | Newer security assumptions
HQC | Code-Based (QCSD) | Key Establishment (Backup) | PK: 4,489 B, CT: 8,978 B (HQC-192) | Slower than ML-KEM | Algorithmic diversity, long security history | Larger keys and slower than ML-KEM
FIPS 204 (ML-DSA) | Lattice-Based (Module-SIS) | General-Purpose Signatures | PK: 1,952 B, Sig: 3,293 B (ML-DSA-65) | Fast | Good all-around performance, ease of implementation | Larger signatures than FALCON
FIPS 205 (SLH-DSA) | Hash-Based | High-Assurance Signatures | PK: 48 B, Sig: 16,224 B (SLH-DSA-192s) 12 | Very Slow Signing, Fast Verification | Conservative security, minimal assumptions | Very large signatures, extremely slow signing
FALCON (FN-DSA) | Lattice-Based (NTRU/SIS) | Compact Signatures | PK: 1,793 B, Sig: 1,280 B (Falcon-1024, Level 5) 26 | Fast | Very compact signatures | Implementation complexity (floating-point math)

Table 2: Performance Benchmarks of PQC Algorithms (Intel CPU with AVX2)

Algorithm & Security Level | Operation | Execution Time (ms), Reference C | Execution Time (ms), AVX2 Optimized | AVX2 Speedup Factor
--- | --- | --- | --- | ---
ML-KEM-512 | Key Generation | 0.075 | 0.013 | 5.77x
ML-KEM-512 | Encapsulation | 0.089 | 0.016 | 5.56x
ML-KEM-512 | Decapsulation | 0.024 | 0.004 | 6.00x
ML-KEM-768 | Key Generation | 0.113 | 0.020 | 5.65x
ML-KEM-768 | Encapsulation | 0.134 | 0.024 | 5.58x
ML-KEM-768 | Decapsulation | 0.038 | 0.006 | 6.33x
ML-KEM-1024 | Key Generation | 0.165 | 0.028 | 5.89x
ML-KEM-1024 | Encapsulation | 0.197 | 0.034 | 5.79x
ML-KEM-1024 | Decapsulation | 0.055 | 0.008 | 6.88x
ML-DSA-44 | Key Generation | 0.165 | 0.034 | 4.85x
ML-DSA-44 | Signing | 0.354 | 0.063 | 5.62x
ML-DSA-44 | Verification | 0.124 | 0.027 | 4.59x
ML-DSA-65 | Key Generation | 0.280 | 0.054 | 5.19x
ML-DSA-65 | Signing | 0.612 | 0.108 | 5.67x
ML-DSA-65 | Verification | 0.210 | 0.045 | 4.67x
ML-DSA-87 | Key Generation | 0.430 | 0.078 | 5.51x
ML-DSA-87 | Signing | 0.930 | 0.160 | 5.81x
ML-DSA-87 | Verification | 0.310 | 0.066 | 4.70x
Note: Benchmark data synthesized from academic sources.25 Absolute times may vary by specific CPU, but relative performance and speedup factors are indicative.

2.3. Implementation Security: Side-Channel Attack Vectors and Countermeasures

While PQC algorithms may be mathematically secure, their physical implementations can leak information through side channels such as power consumption, electromagnetic emissions, or timing variations.21 These side-channel attacks (SCAs) can bypass the theoretical hardness of the underlying mathematical problems and extract secret keys from a device.

PQC algorithms present unique and significant challenges for side-channel protection. Unlike RSA or ECC, which are based on a few homogeneous operations, lattice-based schemes like Kyber and Dilithium involve a sequence of disparate steps: polynomial sampling, NTT, matrix-vector multiplication, and error correction procedures.29 Each of these steps can leak different information and may require its own specific countermeasure.30

The primary countermeasure against power analysis attacks is masking, where secret variables are split into multiple random "shares" that are processed independently.30 However, masking PQC is more complex than masking classical algorithms. PQC schemes often mix different types of arithmetic—for example, Boolean operations and arithmetic modulo a prime q. This necessitates costly and complex conversions between Boolean masking and arithmetic masking, a problem that is less prevalent in traditional ciphers.30 Furthermore, specific operations like the decompression function in Kyber or the rejection sampling in Dilithium require novel, tailored masking "gadgets" to be designed.30
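The sketch below illustrates the two masking styles that PQC implementations must mix: a first-order arithmetic masking of a coefficient modulo q and a Boolean (XOR) masking of the same value. The conversion between the two representations, deliberately left out here, is the expensive step referred to above; the modulus and the demo computation are illustrative only.

```python
# First-order masking sketch: the same secret coefficient held as
# arithmetic shares (mod q) and as Boolean (XOR) shares.
import secrets

q = 3329                      # ML-KEM's modulus, used here only as an example

def arithmetic_mask(x):
    r = secrets.randbelow(q)
    return (r, (x - r) % q)           # x == (share0 + share1) mod q

def boolean_mask(x, bits=12):
    r = secrets.randbits(bits)
    return (r, x ^ r)                 # x == share0 XOR share1

secret = 1234

# Linear operations can be done share-wise without ever recombining the secret:
a0, a1 = arithmetic_mask(secret)
b0, b1 = (a0 + 100) % q, a1           # adds 100 to the masked secret
assert (b0 + b1) % q == (secret + 100) % q

# Bitwise operations need the Boolean representation instead:
c0, c1 = boolean_mask(secret)
assert c0 ^ c1 == secret

# Converting between (a0, a1) and (c0, c1) without exposing `secret` is the
# costly arithmetic-to-Boolean (A2B/B2A) conversion discussed above.
```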

Research has demonstrated practical side-channel attacks against unprotected implementations of nearly all NIST PQC candidates, highlighting that secure implementation is a mandatory, non-trivial step for deployment, especially in hostile environments where an attacker may have physical access to a device (e.g., smart cards, IoT sensors).30 The development of efficient and verifiable countermeasures against SCAs remains a highly active and critical research frontier for PQC.

2.4. The Next Wave: A Look at FALCON and Other Candidates

While the first set of standards provides a robust foundation, the PQC landscape continues to evolve. NIST also selected FALCON for standardization as a digital signature algorithm, with a draft standard expected in late 2024.32 FALCON's primary advantage is its exceptionally compact signatures, which are significantly smaller than those of ML-DSA at equivalent security levels.22 This makes it highly attractive for use cases where bandwidth or storage is at a premium. However, its design is more complex, relying on NTRU lattices and a "fast Fourier sampling" technique that requires floating-point arithmetic, which can be more difficult to implement securely and efficiently, especially on constrained devices that lack a native floating-point unit.22

Recognizing that the current signature portfolio is heavily dominated by lattice-based schemes, NIST has also initiated a new call for proposals for additional digital signature algorithms to further enhance cryptographic diversity.19 This "on-ramp" process seeks schemes based on different mathematical foundations, such as code-based or multivariate quadratic systems, to provide future alternatives should vulnerabilities in structured lattices be discovered.34 This ongoing effort underscores that the PQC standardization process is a continuous, adaptive endeavor aimed at building a resilient and diverse cryptographic toolkit for the quantum era.

Section 3: The Migration Imperative: Strategies, Challenges, and Crypto-Agility

The standardization of PQC algorithms is not an end but a beginning. The transition from classical public-key cryptography to these new standards represents one of the most complex and far-reaching technological migrations in the history of computing. It is a multi-year, enterprise-wide undertaking that extends far beyond the technical realm of cryptography, touching on business risk, regulatory compliance, supply chain management, and strategic planning.

3.1. Principles of a Quantum-Safe Transition: Discovery, Risk Assessment, and Prioritization

There is broad consensus among government agencies and industry experts on a three-phase strategic framework for navigating the PQC migration.7 This approach transforms an overwhelming task into a manageable, risk-based program.

  1. Discover: The foundational step is to create a complete and accurate cryptographic inventory. Organizations must identify every system, application, protocol, and hardware device that uses public-key cryptography.7 This is a significant challenge in large, fragmented enterprises where cryptographic dependencies can be deeply embedded in legacy code, third-party libraries, and hardware modules like HSMs and TPMs.35 Without a comprehensive inventory, it is impossible to gauge the full scope of an organization's quantum risk.
  2. Assess: With an inventory in place, the next step is to conduct a risk assessment to prioritize systems for migration. This assessment is guided by Mosca's Theorem, which states that a system is at risk if the time it takes to migrate to a quantum-safe solution (X) plus the required security shelf-life of the data (Y) is greater than the time it will take for a CRQC to emerge (Z), or X + Y > Z.8 This formula highlights that systems protecting data with a long confidentiality requirement (e.g., decades for government archives or medical records) are the most urgent priorities, as they are most vulnerable to "Harvest Now, Decrypt Later" attacks.8 A small worked check of this inequality follows this list.
  3. Manage/Migrate: Based on the prioritized list, organizations can develop a phased migration roadmap.7 This involves engaging with vendors to understand their PQC roadmaps, conducting pilot projects to test the performance and interoperability of new algorithms in specific environments, and planning for the eventual rollout across the enterprise.37
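A minimal way to operationalize Mosca's inequality from step 2 is to compare X + Y against Z per asset class. The sketch below does exactly that for a few illustrative data categories; all of the numbers are made-up planning assumptions, not predictions.

```python
# Mosca's theorem as a triage helper: a system is at risk if X + Y > Z, where
# X = years needed to migrate, Y = years the data must stay confidential,
# Z = years until a cryptographically relevant quantum computer (CRQC).
# All numbers below are illustrative planning assumptions, not predictions.

Z_CRQC_YEARS = 15

assets = {
    #                            (X: migration time, Y: data shelf-life)
    "public web content":        (2, 1),
    "payment transactions":      (4, 7),
    "electronic health records": (5, 25),
    "government archives":       (6, 50),
}

for name, (x, y) in sorted(assets.items(), key=lambda kv: -(kv[1][0] + kv[1][1])):
    at_risk = x + y > Z_CRQC_YEARS
    print(f"{name:28s} X+Y = {x + y:2d} years -> {'AT RISK' if at_risk else 'ok'}")
```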

This entire process is not merely a technical upgrade but a fundamental business transformation. The scope and cost are substantial; the White House, for instance, estimates the migration will cost U.S. federal agencies over $7.1 billion by 2035.20 The low current rate of formal planning—a 2025 survey found that only 7% of U.S. federal agencies had a dedicated PQC transition plan—suggests a widespread underestimation of the scale and complexity of the task.20 Success requires treating the PQC transition as a strategic business risk, which necessitates executive sponsorship, a dedicated cross-functional project team involving IT, cybersecurity, and data management, and its integration into long-term budget and planning cycles.7

3.2. Industry Case Studies: Navigating PQC Migration in Finance, Automotive, and Healthcare

Different industries face unique challenges and priorities in their PQC migration journeys, shaped by their specific regulatory environments, technological ecosystems, and data sensitivity requirements.

  • Financial Services: The financial sector is under immense pressure to migrate due to the extremely long shelf-life of financial data and stringent regulatory frameworks like the Digital Operational Resilience Act (DORA) and PCI DSS 4.0.35 A primary challenge is the sector's deeply fragmented and often decades-old cryptographic infrastructure, which includes multiple, disparate Hardware Security Module (HSM) platforms and key management services.35 This makes even the initial "discover" phase a monumental task. Furthermore, there is a severe shortage of skilled personnel with PQC expertise, and the vendor ecosystem for PQC-enabled financial hardware and software is still maturing.35 The strategic focus is on securing high-value assets like transaction processing systems, digital asset management, and the cryptographic keys stored in HSMs, which represent a catastrophic single point of failure if compromised.39
  • Automotive Industry: The automotive sector faces a unique set of challenges driven by the long lifecycle of vehicles—a car manufactured today will likely still be on the road when a CRQC exists.41 The industry's complex supply chain and the use of deeply embedded, resource-constrained Electronic Control Units (ECUs) make upgrades exceptionally difficult.43 PQC algorithms, with their larger keys and higher computational demands, can overwhelm the limited processing power and memory of these ECUs.43 Migration priorities include securing safety-critical systems like over-the-air (OTA) firmware updates, secure boot processes, and Vehicle-to-Everything (V2X) communications, where a cryptographic failure could have life-threatening consequences.42
  • Healthcare: The primary driver for PQC migration in healthcare is the need to protect the long-term confidentiality of sensitive patient data, such as Electronic Health Records (EHRs).46 This data is a prime target for "Harvest Now, Decrypt Later" attacks. A significant challenge is securing the vast and growing ecosystem of connected medical devices (IoT), which are often resource-constrained and may lack updatable firmware.37 The migration strategy involves a risk-based approach, prioritizing the protection of large patient data repositories and ensuring that new medical devices are designed with PQC capabilities from the outset.46

3.3. The Centrality of Crypto-Agility and Hybrid Implementations

Given that the PQC landscape is still evolving—with new standards like HQC and FALCON yet to be finalized and the potential for future cryptanalytic breakthroughs—the single most important strategic principle for migration is crypto-agility. This is the architectural capability to replace cryptographic algorithms in protocols and applications with minimal disruption to system operations.5 It involves avoiding hard-coded cryptographic choices and designing systems with modular cryptographic components that can be easily updated or swapped.20

During the lengthy transition period, a hybrid implementation is the recommended best practice for key exchange.11 In a hybrid scheme, both a classical algorithm (e.g., X25519) and a PQC algorithm (e.g., ML-KEM) are used in parallel to establish a shared secret. The final secret key is derived from the outputs of both algorithms. This approach provides a robust hedge: if a flaw is discovered in the new PQC algorithm, security falls back to the well-understood classical algorithm. Conversely, against a quantum adversary, security relies on the PQC algorithm.11 This dual-algorithm approach ensures continued security against both classical and quantum threats while the new PQC standards mature and gain widespread confidence.
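As a sketch of what hybrid key establishment can look like in code, the example below combines an X25519 exchange (via the widely used cryptography package) with an ML-KEM encapsulation (via the liboqs-python bindings, whose availability and the "ML-KEM-768" identifier are assumptions) and derives the final key from both secrets with HKDF. It is a conceptual sketch of the pattern, not a drop-in replacement for a real protocol's key schedule.

```python
# Hybrid key establishment sketch: classical X25519 + post-quantum ML-KEM,
# with the final key derived from the concatenation of both shared secrets.
# Assumes `cryptography` and `liboqs-python` are installed and that the local
# liboqs build enables "ML-KEM-768".
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# --- classical component: X25519 ---
client_ecdh = X25519PrivateKey.generate()
server_ecdh = X25519PrivateKey.generate()
ecdh_secret = client_ecdh.exchange(server_ecdh.public_key())

# --- post-quantum component: ML-KEM-768 ---
with oqs.KeyEncapsulation("ML-KEM-768") as client_kem:
    kem_public_key = client_kem.generate_keypair()
    with oqs.KeyEncapsulation("ML-KEM-768") as server_kem:
        ciphertext, kem_secret_server = server_kem.encap_secret(kem_public_key)
    kem_secret_client = client_kem.decap_secret(ciphertext)
assert kem_secret_client == kem_secret_server

# --- combine: the session key depends on BOTH shared secrets ---
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
).derive(ecdh_secret + kem_secret_client)
print("derived 256-bit hybrid session key:", session_key.hex())
```

Because the session key is derived from the concatenation of both secrets, an attacker must break both X25519 and ML-KEM to recover it, which is the hedge described above.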


Part II: The Physical Frontier: Quantum Communication and the Future Internet

While Post-Quantum Cryptography provides a software-based defense against quantum computers, a parallel research frontier is exploring the use of quantum mechanics itself to build a new, physically secure communication infrastructure. This long-term vision, often termed the "quantum internet," relies on technologies like Quantum Key Distribution (QKD) and the yet-to-be-realized quantum repeater. This section examines the global state of QKD network deployments and the fundamental scientific hurdles that must be overcome to achieve scalable quantum communication.

Section 4: Quantum Key Distribution (QKD) Networks: Global State of Deployment

Quantum Key Distribution is a technology that leverages the principles of quantum physics to distribute symmetric cryptographic keys between two parties in a way that is, in principle, secure against any computational attack, including from a quantum computer.

4.1. Principles of QKD: From BB84 to Modern Protocols

The concept of QKD, first proposed in the BB84 protocol by Charles Bennett and Gilles Brassard in 1984, uses individual photons to encode bits of a key.50 The security of QKD stems from fundamental quantum principles: the act of an eavesdropper measuring an unknown quantum state (such as a photon's polarization) inevitably disturbs it in a detectable way.50 This allows the legitimate parties to detect the presence of an eavesdropper and discard the compromised key material. When successfully executed, QKD provides a shared secret key with information-theoretic security, meaning its security is guaranteed by the laws of physics, not by the presumed computational difficulty of a mathematical problem.52 This offers a fundamentally different and, theoretically, stronger security guarantee than the computational security provided by PQC.
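The toy simulation below captures the logic of BB84 as just described: random bit and basis choices, sifting of mismatched bases, and detection of an intercept-and-resend eavesdropper through an elevated error rate on a sampled subset of the sifted key. It models only the protocol logic, not photons or real hardware, and the photon count is arbitrary.

```python
# Toy BB84 simulation: qubits are modeled abstractly as (bit, basis) pairs.
# An intercept-and-resend eavesdropper guesses a basis, measures, and resends,
# which corrupts roughly 25% of the sifted key and is detected by sampling.
import random

def measure(bit, prep_basis, meas_basis):
    """Return the measured bit: correct if bases match, random otherwise."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def bb84(n_photons=2000, eavesdrop=False):
    alice_bits  = [random.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [random.choice("XZ") for _ in range(n_photons)]
    bob_bases   = [random.choice("XZ") for _ in range(n_photons)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:                      # Eve measures and resends
            e_basis = random.choice("XZ")
            bit = measure(bit, a_basis, e_basis)
            a_basis = e_basis              # the resent photon carries Eve's basis
        bob_bits.append(measure(bit, a_basis, b_basis))

    # Sifting: keep only positions where Alice's and Bob's bases agreed.
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    # Publicly compare a sample of the sifted bits to estimate the error rate.
    sample = sifted[: len(sifted) // 4]
    return sum(a != b for a, b in sample) / len(sample)

print(f"QBER without eavesdropper: {bb84(eavesdrop=False):.1%}")   # ~0%
print(f"QBER with eavesdropper:    {bb84(eavesdrop=True):.1%}")    # ~25%
```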

4.2. A Comparative Review of International Initiatives

The development and deployment of QKD networks have become an area of intense international research and strategic competition. Different nations and blocs have adopted markedly different approaches, reflecting their unique technological priorities, risk appetites, and geopolitical goals.

  • China: China has established itself as the undisputed global leader in the scale and maturity of its QKD infrastructure. It has successfully deployed the world's first integrated quantum communication network, which combines a terrestrial fiber backbone of over 700 optical fibers with satellite-to-ground links.53 This network, which includes a 2,000 km fiber link connecting major cities like Beijing and Shanghai, is integrated with the Micius quantum science satellite, launched in 2016.52 This integrated architecture has enabled demonstrations of intercontinental QKD between China and Austria (a distance of 7,500 km) and achieved a total network distance of over 4,600 km.52 The Micius satellite has achieved average satellite-to-ground secret key rates of 47.8 kilobits per second, a significant performance milestone.52 This massive investment is explicitly tied to national security, providing secure communication channels for government and defense applications.54
  • Europe (EuroQCI): The European Union, through the European Quantum Communication Infrastructure (EuroQCI) initiative, is pursuing a collaborative, continent-wide QKD network.57 Involving all 27 EU member states and the European Space Agency (ESA), the EuroQCI aims to build a secure communication layer to protect government institutions, critical infrastructure, and data centers.57 The architecture is hybrid, comprising a terrestrial segment built on national fiber networks linked across borders, and a space segment of QKD satellites.57 The initiative is a key pillar of the EU's strategy for achieving "digital sovereignty" and is supported by large-scale projects like OPENQKD, which aims to create an open testbed for QKD technologies, and the upcoming Eagle-1 prototype satellite, scheduled for launch in late 2025 or early 2026.57
  • United States: The U.S. has adopted a more cautious and research-centric approach. While the National Quantum Initiative (NQI) supports the development of quantum networking testbeds at national labs like Oak Ridge National Laboratory (ORNL) and Fermilab, there is no large-scale national deployment plan analogous to those in China or Europe.60 This cautious stance is heavily influenced by the official position of the National Security Agency (NSA), which has publicly stated that it does not recommend or support the use of QKD for securing National Security Systems.62 The NSA's critique focuses on QKD's practical limitations, including its high cost, requirement for special-purpose equipment and dedicated fiber links, its vulnerability to denial-of-service attacks, and the security risks associated with implementation flaws and the necessity of "trusted relays" to extend its range.62 The U.S. government's official strategy prioritizes the migration to PQC as a more cost-effective, scalable, and easily maintainable solution.62
  • Japan: Japan has been a long-standing pioneer in QKD research. The National Institute of Information and Communications Technology (NICT) inaugurated the "Tokyo QKD Network" in 2010, a metropolitan-area testbed connecting several research sites across Tokyo.64 Japan's efforts have focused on achieving long-term operational stability, developing network management and key relay technologies, and driving international standardization through bodies like the International Telecommunication Union (ITU-T), where it has led the development of the first QKD network recommendations (e.g., Y.3800).64 Recent research by companies like Toshiba is focused on developing control technology to scale up QKD networks and increase key distribution speeds through techniques like wavelength multiplexing.68

The divergent paths taken by these global powers indicate that the choice to invest heavily in QKD is not purely a technical one. It is a strategic decision influenced by national security doctrine, industrial policy goals, and differing assessments of the trade-offs between the physical implementation risks of QKD and the mathematical hardness assumptions of PQC.

Table 3: Global QKD Network Initiatives

Region/Initiative | Lead Entities | Primary Architecture | Reported Scale/Distance | Key Performance Metrics | Maturity Level / Status
--- | --- | --- | --- | --- | ---
China | USTC, CAS | Integrated Satellite-Fiber | 4,600 km total network distance 53 | Satellite-ground SKR: ~47.8 kbps 52 | Operational / Experimental
Europe (EuroQCI) | European Commission, ESA | Federated Terrestrial & Space | Pan-EU goal, >1,000 km fiber testbeds 59 | Eagle-1 satellite launch ~2026 57 | In Deployment
USA | DOE, NSF, NSA | Research Testbeds | ~35-mile Chicago network 56, ORNL-EPB network 61 | Focus on R&D, not large-scale key rates | Research / Skeptical
Japan | NICT, Toshiba | Metropolitan Fiber | Tokyo QKD Network (~100 km area) 67 | Focus on long-term stability, standardization 66 | Long-term Operation / Upgrading

4.3. Architectural Analysis: Terrestrial Fiber vs. Satellite-Based QKD

QKD networks are primarily deployed using two physical media, each with distinct advantages and limitations.

  • Terrestrial Fiber Networks: These networks leverage existing or dedicated optical fiber infrastructure to connect QKD nodes.52 They are well-suited for building secure communication links within a metropolitan area or between nearby cities, as demonstrated by the Tokyo, Cambridge, and various European testbeds.64 The primary limitation of fiber-based QKD is distance. Due to intrinsic photon absorption and scattering in glass fiber, the rate of successful key generation decreases exponentially with distance, practically limiting a single, un-relayed link to a few hundred kilometers.52 To span longer distances, terrestrial networks must employ a chain of trusted relays or trusted nodes. These nodes receive a key on one link, decrypt it, and then re-encrypt and transmit it on the next link. While this extends the network's reach, it introduces a significant security vulnerability: the trusted node itself must be physically and digitally secure, as it holds the key in plaintext.62
  • Satellite-Based QKD: Free-space optical links via satellites are the only currently viable technology for achieving intercontinental and truly global QKD coverage.52 A satellite in Low Earth Orbit (LEO) can establish a line-of-sight connection with ground stations thousands of kilometers apart.54 Since most of the signal path is through the vacuum of space, the signal loss is concentrated in the few kilometers of atmosphere it must traverse, making it far more efficient for long distances than fiber.74 China's Micius satellite has successfully demonstrated this capability, enabling secure communication over thousands of kilometers.52 However, satellite QKD also has limitations, including dependence on clear weather, limited communication windows during satellite passes, and the high cost of designing, launching, and operating the necessary space and ground infrastructure.56

The most powerful and resilient architecture, as pioneered by China, is an integrated network that combines the strengths of both, using the terrestrial fiber network for high-density regional coverage and the satellite network to act as a trusted relay connecting these regional networks over vast distances.52

4.4. Practical Viability and Security Limitations

Despite its promise of information-theoretic security, QKD is not a panacea and faces significant practical and security challenges that limit its widespread adoption.

  • Partial Solution: QKD is not a complete cryptographic solution. It is a protocol exclusively for distributing symmetric keys.62 It does not provide authentication, meaning the parties must have a way to verify that they are talking to the correct entity. This authentication step almost always requires classical digital signatures, which today means using PQC algorithms like ML-DSA.62
  • Specialized Infrastructure: QKD cannot be implemented as a software update on existing network equipment. It requires dedicated, specialized hardware, including single-photon sources and detectors, and often requires dedicated fiber optic channels, which is prohibitively expensive for most use cases.62
  • Implementation Vulnerabilities: The theoretical security of QKD is based on idealized physical models. Real-world hardware implementations are imperfect and can leak information in ways not accounted for in the theory. Numerous successful attacks on commercial QKD systems have been demonstrated by exploiting these implementation-specific side-channel vulnerabilities, proving that the security of a QKD system is highly dependent on its physical engineering, not just the laws of physics.62
  • Denial of Service: The very property that gives QKD its security—its sensitivity to eavesdropping—also makes it highly susceptible to denial-of-service attacks. An attacker can simply shine light into the fiber to disrupt the quantum state and prevent the parties from ever agreeing on a key.62

These limitations have led security agencies like the NSA to conclude that for most applications, PQC offers a more cost-effective, scalable, and easily maintained solution for achieving quantum resistance.62 The current consensus is that QKD is best suited for niche, high-security applications where the cost and complexity are justified, such as securing backbone links between highly sensitive government or financial data centers.63

Section 5: The Quantum Repeater Challenge: Overcoming the Distance Barrier

The fundamental limitation of all current QKD networks, whether terrestrial or satellite-based, is their reliance on trusted nodes to extend communication beyond a few hundred kilometers. A true, global quantum internet—one capable of transmitting quantum information between any two points on Earth without trusted intermediaries—requires a technology that does not yet exist in a practical form: the quantum repeater.

5.1. The Physics of Photon Loss: Why Repeaters are Non-Negotiable for a Quantum Internet

Direct transmission of quantum states via photons through optical fiber is subject to exponential signal loss. With a typical attenuation of 0.2 dB/km, the probability of a single photon successfully traversing a 1,000 km fiber is astronomically small (on the order of 10⁻²⁰).71 In classical communication, this problem is solved with optical amplifiers that boost the signal strength at regular intervals. However, the no-cloning theorem of quantum mechanics forbids the creation of an identical copy of an unknown quantum state.71 This means that a quantum bit, or qubit, cannot be simply copied and amplified like a classical bit, rendering classical repeaters useless for quantum communication.80
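The 10⁻²⁰ figure follows directly from the exponential loss law for fiber; the short calculation below reproduces it and shows why even modest distance increases are fatal for direct transmission. The 0.2 dB/km attenuation is the figure quoted above; everything else is basic arithmetic.

```python
# Photon survival probability in fiber: loss grows exponentially with distance.
# With attenuation alpha (dB/km), transmittance over L km is 10^(-alpha*L/10).
ALPHA_DB_PER_KM = 0.2

def transmittance(length_km, alpha=ALPHA_DB_PER_KM):
    return 10 ** (-alpha * length_km / 10)

for km in (50, 100, 500, 1000):
    print(f"{km:5d} km fiber: photon survival probability = {transmittance(km):.1e}")
# 1000 km gives 1e-20, which is why amplification-free direct links are capped
# at a few hundred kilometers and repeaters (not classical amplifiers) are needed.
```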

The quantum repeater is the proposed solution to this fundamental problem. Instead of amplifying the signal, a quantum repeater works by dividing a long-distance channel into shorter elementary links. It first generates entanglement between adjacent repeater nodes and then uses a process called entanglement swapping to "stitch" these short-distance entangled links together, ultimately creating a single entangled pair between the two distant endpoints without any photon ever having to traverse the full distance.71 The development of a functional quantum repeater is therefore the single most critical and formidable challenge on the path to a scalable quantum internet.78

5.2. Core Scientific and Engineering Obstacles

Building a practical quantum repeater is an immense scientific and engineering undertaking that pushes the boundaries of experimental physics. The primary obstacles are multifaceted and deeply interconnected.71

  • Entanglement Swapping and Purification: The core operation of a repeater, entanglement swapping, relies on a quantum measurement called a Bell State Measurement. With current linear optical techniques, this measurement is inherently probabilistic, with a maximum success rate of 50% even in an ideal, lossless system.78 Furthermore, every swapping operation, even when successful, introduces noise and degrades the fidelity (a measure of quality) of the resulting entangled state. To combat this degradation, protocols for entanglement purification are required, where multiple low-fidelity entangled pairs are consumed to distill a single pair of higher fidelity.71 Performing these nested purification and swapping operations with high fidelity and efficiency remains a major experimental challenge.78
  • Quantum Memories: Because entanglement generation and swapping are probabilistic, a repeater node must be able to store the quantum state of a successfully entangled link while it waits for an adjacent link to also become successfully entangled.71 This requires a quantum memory. The demands on these memories are extraordinary. They must have long coherence times (the duration for which a quantum state can be preserved) to outlast the multiple communication and processing attempts required across the network; for long-distance communication, coherence times on the order of seconds may be required.78 They must also offer high efficiency in both storing and retrieving photons, and allow for high-fidelity quantum gate operations to be performed on the stored qubits.81 Achieving all of these properties simultaneously in a single physical system is a frontier of materials science and quantum physics research.81
  • Photon Sources and Detectors: The entire repeater architecture relies on the ability to generate single photons or entangled photon pairs with high purity and efficiency, and to detect them with near-perfect reliability.80 Creating sources that produce photons on demand, at telecommunications wavelengths, and with the precise properties needed for efficient interaction with a quantum memory is a significant challenge.80 Moreover, efficiently coupling these photons into and out of optical fibers and memories requires advanced photonic integration and micro-optical engineering to minimize losses at every interface.80

5.3. A Review of Current Research and Experimental Progress

The development of quantum repeaters is currently at a very early stage of research, largely confined to laboratory settings. Progress is typically measured by demonstrations of individual components or the integration of a few components into elementary links. Research consortia, such as the German QR.X and QR.N projects, are focused on developing and optimizing the basic building blocks—from ultra-bright single-photon sources to 3D-printed micro-optics for efficient coupling—and demonstrating their operation in small-scale testbeds with two or three nodes.80

Projects funded by the U.S. National Science Foundation, such as the SCY-QNet project at Stony Brook University, aim to evolve existing QKD testbeds into more advanced networks incorporating heralded quantum memories and quantum processing units, with the goal of demonstrating robust entanglement over a 350 km fiber network.85 These efforts, while significant, highlight the current state of the art: building a handful of interconnected repeater nodes is a frontier research project. The technical and scientific chasm between these small-scale lab demonstrations and a reliable, scalable quantum repeater capable of underpinning a global quantum internet is vast. The language used by researchers themselves—describing the quantum internet as a "holy grail" and repeaters as an "enormous technical challenge"—underscores that this is a long-term scientific endeavor, not a technology on the cusp of deployment.78 For strategic planning purposes, it must be assumed that a global, trustless quantum communication network will not be a viable technological option for at least the next one to two decades.


Part III: The Algorithmic Frontier: Advanced Cryptographic Paradigms

While post-quantum cryptography addresses the security of data at rest and in transit, another major research frontier is developing cryptographic tools to protect data in use. These advanced paradigms—Homomorphic Encryption (HE), Zero-Knowledge Proofs (ZKP), and Secure Multi-Party Computation (MPC)—are not primarily about preventing eavesdropping but about enabling new forms of secure and verifiable computation on sensitive or private data. They represent a fundamental shift in what cryptography can achieve, moving from a tool for confidentiality to an enabler of trustworthy collaboration and computation.

Section 6: Privacy-Enhancing Cryptography: HE, ZKP, and MPC in Practice

This section explores the three most prominent families of Privacy-Enhancing Technologies (PETs), analyzing their core functionalities, performance trade-offs, and practical applications, particularly in the burgeoning field of privacy-preserving machine learning.

6.1. Computing on Encrypted Data: A Performance Review of Homomorphic Encryption (HE) Schemes

Homomorphic Encryption is a revolutionary form of encryption that allows computations to be performed directly on encrypted data (ciphertexts) without decrypting it first.86 An untrusted third party, such as a cloud server, can execute a function on a user's encrypted data and produce an encrypted result. When the user decrypts this result, it is identical to the result they would have obtained by performing the same function on their original, unencrypted data.87 This capability is the "holy grail" for secure outsourced computation, as it allows sensitive data to be processed by third parties without ever exposing the underlying private information.88

The main challenge in practical HE is managing the "noise" that is inherent in most schemes. Each homomorphic operation, particularly multiplication, increases the amount of noise in a ciphertext. If the noise grows too large, the ciphertext becomes corrupted and can no longer be correctly decrypted.86 This limitation led to the development of different categories of HE:

  • Partially Homomorphic Encryption (PHE): Supports an unlimited number of one type of operation (e.g., addition or multiplication) but not both.
  • Somewhat Homomorphic Encryption (SHE): Supports a limited number of both addition and multiplication operations.
  • Fully Homomorphic Encryption (FHE): Supports an unlimited number of arbitrary computations. This is achieved through a computationally expensive procedure called bootstrapping, which effectively "resets" the noise in a ciphertext, allowing further computations to be performed.89
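The partially homomorphic case is easy to see with textbook (unpadded) RSA, whose ciphertexts can be multiplied to obtain an encryption of the product of the plaintexts. The sketch below demonstrates this with deliberately tiny, insecure parameters; it illustrates only the PHE category above, not the lattice-based FHE schemes discussed next.

```python
# Partially homomorphic encryption illustrated with textbook RSA:
# Enc(m1) * Enc(m2) mod n  ==  Enc(m1 * m2 mod n).
# Tiny primes and no padding: insecure, for illustration only.
p, q = 61, 53
n = p * q                      # 3233
phi = (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)            # modular inverse of e (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 11
c_product = (enc(m1) * enc(m2)) % n       # multiply ciphertexts only
assert dec(c_product) == (m1 * m2) % n    # decrypts to the product of plaintexts
print("Dec(Enc(7) * Enc(11)) =", dec(c_product))
```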

Several families of FHE schemes have been developed, each with different performance characteristics and suitability for different types of computation. The primary libraries implementing these schemes include Microsoft SEAL, OpenFHE (which merged PALISADE and HElib), and TFHE-rs.88

  • BFV and BGV: These schemes are designed for exact integer arithmetic. They are highly efficient for computations involving integers and are often used in applications where precise results are required. They manage noise through a technique called modulus switching.90
  • CKKS (Cheon-Kim-Kim-Song): This scheme is designed for approximate arithmetic on real or complex numbers. It treats the noise as part of the approximation error, making it extremely efficient for applications like machine learning inference, where exact precision is not required.90 CKKS manages noise through a "rescaling" operation.90
  • TFHE (and FHEW): These schemes are optimized for Boolean circuits and bit-wise operations. They feature a very fast bootstrapping procedure, allowing for the evaluation of arbitrary functions with high multiplicative depth, but individual arithmetic operations can be slower than in BFV/CKKS.92

The choice of an HE scheme involves significant trade-offs. For machine learning inference on real-valued data, CKKS is often the most efficient due to its handling of approximate numbers and its ability to pack many values into a single ciphertext (SIMD processing).90 For exact integer arithmetic, BFV and BGV are typically faster.92 TFHE excels in applications requiring many non-arithmetic operations or very deep computational circuits where frequent bootstrapping is necessary.94 Despite significant performance improvements, FHE remains orders of magnitude slower than computation on unencrypted data, and its practical use is an active area of research focused on algorithmic optimization and hardware acceleration.86

6.2. Verifiable Computation: A Comparative Analysis of zk-SNARKs vs. zk-STARKs

Zero-Knowledge Proofs (ZKPs) are cryptographic protocols that enable a "prover" to convince a "verifier" that a statement is true, without revealing any information beyond the validity of the statement itself.95 This powerful concept is the foundation for verifiable computation, where a user can outsource a computation to an untrusted server and receive a result along with a succinct proof that the computation was performed correctly. This is particularly transformative for applications like blockchain scaling (ZK-Rollups) and verifiable machine learning.
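The interactive protocol below is a classical Schnorr-style proof of knowledge of a discrete logarithm. It is not a SNARK or STARK, but it conveys the core prover/verifier mechanics (commit, challenge, respond, verify) that the far more elaborate systems discussed next build upon; the tiny group parameters are illustrative and not secure.

```python
# Interactive Schnorr-style zero-knowledge proof: the prover convinces the
# verifier it knows x with y = g^x mod p, without revealing x.
# Small, insecure parameters for illustration only.
import secrets

p = 2 * 1019 + 1          # p = 2039 (prime), with prime subgroup order q = 1019
q = 1019
g = 4                     # generator of the order-q subgroup (4 = 2^2 mod p)

x = secrets.randbelow(q)  # prover's secret
y = pow(g, x, p)          # public value

# Round 1 (prover): commit to a random nonce.
r = secrets.randbelow(q)
t = pow(g, r, p)

# Round 2 (verifier): send a random challenge.
c = secrets.randbelow(q)

# Round 3 (prover): respond; the response leaks nothing about x on its own
# because r acts as a one-time pad modulo q.
s = (r + c * x) % q

# Verification: g^s should equal t * y^c (mod p).
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted: verifier learned nothing about x beyond y = g^x")
```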

Two main types of non-interactive ZKPs dominate the field: zk-SNARKs and zk-STARKs.

  • zk-SNARKs (Zero-Knowledge Succinct Non-Interactive Argument of Knowledge):
    • Strengths: The defining features of zk-SNARKs are their "succinctness": the proofs are extremely small (hundreds of bytes) and verification is very fast, taking essentially constant time regardless of the complexity of the original computation.95 This makes them highly efficient for on-chain verification in blockchain applications.97
    • Weaknesses: The primary drawback of most practical zk-SNARKs (like Groth16) is their requirement for a trusted setup.95 This is a one-time ceremony to generate a public parameter called the Common Reference String (CRS) or Structured Reference String (SRS).98 The generation of this CRS involves a secret (often called "toxic waste") which, if not properly destroyed, could be used to forge fake proofs.98 To mitigate this risk, trusted setups are performed as elaborate multi-party computation (MPC) ceremonies, where the trust assumption is that at least one participant acts honestly and destroys their share of the secret.98 Additionally, the cryptographic assumptions underlying most efficient zk-SNARKs (based on elliptic curve pairings) are not resistant to quantum computers.96
  • zk-STARKs (Zero-Knowledge Scalable Transparent Argument of Knowledge):
    • Strengths: zk-STARKs were developed to address the main weaknesses of zk-SNARKs. They are transparent, meaning they do not require a trusted setup; their public parameters are generated from public randomness.95 They are also quantum-resistant, as their security relies on collision-resistant hash functions rather than elliptic curve assumptions.96 Furthermore, their proving time scales quasi-linearly with the size of the computation and their verification time poly-logarithmically, making them more scalable than SNARKs for very large and complex computations.97
    • Weaknesses: The main trade-off is proof size. STARK proofs are significantly larger than SNARK proofs, often measured in kilobytes rather than bytes.96 This larger size increases communication overhead and can make on-chain verification more costly.97

The choice between SNARKs and STARKs is application-dependent. For applications where proof size is paramount and a trusted setup is acceptable (e.g., many current ZK-Rollups), SNARKs are often preferred. For applications where transparency and quantum resistance are critical, or for extremely large computations, STARKs are the superior choice.96

6.3. Secure Collaborative Computation: An Overview of Multi-Party Computation (MPC) Protocols and Frameworks

Secure Multi-Party Computation (MPC) enables a group of parties to jointly compute a function over their private inputs without revealing those inputs to each other.103 It effectively creates a virtual trusted third party using cryptography, allowing for collaborative data analysis while preserving input privacy.105

MPC protocols are typically built on one of two main techniques: secret sharing or garbled circuits. They are also categorized by their security guarantees and the assumptions they make about the number of potentially corrupt parties.

  • Security Models:
    • Semi-Honest (Passive) vs. Malicious (Active): In the semi-honest model, corrupt parties are assumed to follow the protocol but will try to learn information from the messages they receive. In the malicious model, corrupt parties can deviate arbitrarily from the protocol in an attempt to cheat.106 Maliciously secure protocols are much stronger but also more computationally expensive.
    • Honest vs. Dishonest Majority: Protocols for the honest-majority setting (where fewer than half the parties are corrupt) can often achieve higher efficiency and stronger security guarantees, up to guaranteed output delivery. Protocols for the dishonest-majority setting (where all but one party could be corrupt) typically provide only security with abort, in which honest parties either receive the correct output or learn that cheating occurred; they are more challenging to construct but are necessary for certain use cases.107
  • Key Protocol Families:
    • BGW and GMW: These are foundational protocols. BGW (Ben-Or, Goldwasser, Wigderson) is based on Shamir's Secret Sharing and is well-suited for arithmetic circuits in the honest-majority setting, offering information-theoretic security.108 GMW (Goldreich, Micali, Wigderson) is based on XOR secret sharing and oblivious transfer and is suited for Boolean circuits, typically in the dishonest-majority setting with computational security.108
    • SPDZ: This is a modern and highly influential family of protocols for the malicious, dishonest majority setting. It uses secret sharing combined with MACs (Message Authentication Codes) and a pre-processing phase where cryptographic material (e.g., "Beaver triples") is generated to make the online computation phase extremely fast.107

Frameworks like MP-SPDZ provide a versatile platform for implementing and benchmarking a wide array of MPC protocols, allowing developers to choose the protocol that best fits their application's specific security requirements and performance constraints.103 While MPC is computationally faster than FHE for many tasks, its primary overhead is communication, as protocols often require multiple rounds of interaction between the parties.108
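
To make the secret-sharing approach behind SPDZ-style protocols concrete, the following single-process simulation multiplies two additively shared values using a pre-generated Beaver triple. It models only the arithmetic: the networking, MACs, and secure preprocessing of a real framework such as MP-SPDZ are omitted, and the field size and party count are arbitrary choices.

```python
# Single-process simulation of Beaver-triple multiplication on additive secret shares.
# Illustrative only: real protocols (e.g., SPDZ) add networking, MACs and a secure offline phase.
import secrets

P = 2**61 - 1                     # prime field for the arithmetic
N_PARTIES = 3

def share(value):
    """Split a value into additive shares that sum to the value mod P."""
    shares = [secrets.randbelow(P) for _ in range(N_PARTIES - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Preprocessing (normally produced by an offline MPC protocol or trusted dealer):
a, b = secrets.randbelow(P), secrets.randbelow(P)
triple = (share(a), share(b), share(a * b % P))       # shares of a, b, and c = a*b

def beaver_multiply(x_shares, y_shares, triple):
    a_sh, b_sh, c_sh = triple
    # Each party locally computes its share of d = x - a and e = y - b; d and e are then opened.
    d = reconstruct([(x - ai) % P for x, ai in zip(x_shares, a_sh)])
    e = reconstruct([(y - bi) % P for y, bi in zip(y_shares, b_sh)])
    # x*y = c + d*b + e*a + d*e; the public d*e term is added by one designated party.
    z_shares = [(ci + d * bi + e * ai) % P for ci, ai, bi in zip(c_sh, a_sh, b_sh)]
    z_shares[0] = (z_shares[0] + d * e) % P
    return z_shares

x_shares, y_shares = share(6), share(7)
assert reconstruct(beaver_multiply(x_shares, y_shares, triple)) == 42
```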

Table 4: Comparative Analysis of Privacy-Enhancing Cryptographic Primitives

Primitive | Core Function | Key Variants | Primary Performance Overhead | Trust Assumptions | Example Application
Homomorphic Encryption (HE) | Compute on encrypted data | BFV/BGV (Integer), CKKS (Approximate), TFHE (Boolean) | Computation (bootstrapping) | Untrusted server | Outsourced medical data analysis 89
Zero-Knowledge Proofs (ZKP) | Prove correctness of a statement | zk-SNARK, zk-STARK, Bulletproofs | Proof generation time/size | Prover/verifier model (some require trusted setup) | Verifiable blockchain transactions (ZK-Rollups) 112
Secure Multi-Party Computation (MPC) | Jointly compute on secret inputs | Secret sharing-based (SPDZ), Garbled circuits-based (Yao, GMW) | Communication rounds/bandwidth | Set of N parties with security threshold (e.g., honest majority) | Joint fraud detection between banks 107

6.4. Applications in Verifiable and Privacy-Preserving Machine Learning (PPML)

The convergence of HE, ZKP, and MPC is creating a powerful toolkit for addressing the critical challenges of privacy and verifiability in machine learning and AI.113 As ML models become more pervasive and are trained on increasingly sensitive data, these cryptographic techniques are no longer theoretical curiosities but essential components for building trustworthy AI systems.

  • Privacy-Preserving Training: MPC and Federated Learning (FL) are used to train a shared model on data distributed among multiple parties without those parties having to share their raw data.113 HE can be used to further protect the model updates that are sent to a central server in FL, a technique known as Secure Federated Learning (a related secure-aggregation sketch follows this list).116
  • Privacy-Preserving Inference (ML-as-a-Service): HE and MPC allow a client to get a prediction from a cloud-hosted ML model without revealing their sensitive input data to the model owner.113 Conversely, ZKPs allow the model owner to prove that they used a specific, high-quality model for the inference without revealing the proprietary model itself.117
  • Verifiable ML: ZKPs are being used to create verifiable ML frameworks (ZKML).118 This allows a model provider to generate a cryptographic proof that an inference was computed correctly according to a publicly committed model. This is crucial for high-stakes applications in finance (e.g., verifiable credit scoring), healthcare (e.g., trustworthy diagnostic models), and autonomous systems, where users need to trust the integrity of the AI's output.117
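
As a simple illustration of the secure-aggregation idea used in privacy-preserving federated learning, the sketch below has each pair of clients agree on a random mask that one adds and the other subtracts; the server sees only masked updates, yet the masks cancel in the sum, so it still recovers the correct aggregate. This is a deliberately stripped-down sketch (no key agreement, dropout handling, or modular arithmetic), not the HE- or MPC-based constructions cited above.

```python
# Toy pairwise-masking secure aggregation: the server learns only the sum of the clients' updates.
import random

clients = {"A": [0.1, 0.2], "B": [0.3, -0.1], "C": [-0.2, 0.4]}   # local model updates
names = sorted(clients)

# Every pair (i, j) with i < j agrees on a random mask; i adds it, j subtracts it.
pair_masks = {
    (i, j): [random.uniform(-1, 1) for _ in range(2)]
    for i in names for j in names if i < j
}

def masked_update(name):
    update = list(clients[name])
    for (i, j), mask in pair_masks.items():
        sign = 1 if name == i else -1 if name == j else 0
        update = [u + sign * m for u, m in zip(update, mask)]
    return update                                       # looks random to the server on its own

received = [masked_update(n) for n in names]            # what the server actually sees
aggregate = [sum(vals) for vals in zip(*received)]      # masks cancel pairwise in the sum

true_sum = [sum(vals) for vals in zip(*clients.values())]
assert all(abs(a - t) < 1e-9 for a, t in zip(aggregate, true_sum))
```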

These applications demonstrate a paradigm shift where cryptography is used not just to protect data, but to enable new forms of secure and trustworthy collaboration and computation, which will be a defining feature of the next decade of digital infrastructure.

Section 7: Emerging and Unconventional Cryptographic Research

Beyond the major frontiers of PQC, QKD, and PETs, cryptographic research continues to explore a diverse range of novel concepts. This section examines several of these emerging areas, including the application of machine learning to cryptanalysis, the development of new primitives for specific applications like blockchains, and a critical assessment of more speculative approaches.

7.1. The Application of Machine Learning to Modern Cryptanalysis

Machine learning is a double-edged sword in cryptography. While it enables privacy-preserving computation as discussed previously, it also provides powerful new tools for cryptanalysis. The ability of neural networks to identify subtle patterns in vast datasets makes them well-suited for attacking cryptographic implementations.

Recent research has focused heavily on applying ML, particularly deep learning, to two main areas:

  • Cryptanalysis of Symmetric Ciphers: Neural networks are being trained as distinguishers in differential cryptanalysis. By feeding a network vast quantities of ciphertext pairs produced from plaintexts with a chosen difference, it can learn to distinguish the cipher's output from random data, identifying statistical biases that can be exploited in an attack (a data-generation sketch for such a distinguisher follows this list).121 While attacks on full-strength ciphers like AES remain out of reach, ML-based techniques have shown success against reduced-round or "toy" versions of ciphers like DES and AES, and have proven effective against certain lightweight ciphers.121
  • Side-Channel Attacks: This is where ML has had the most practical impact. Deep learning models, especially Convolutional Neural Networks (CNNs), are exceptionally effective at analyzing power or electromagnetic traces to recover secret keys from a device.125 They can automatically learn the features of a leakage signal, proving more robust against noise and countermeasures like masking compared to traditional statistical attacks like Correlation Power Analysis (CPA).125
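
The sketch below shows the data-generation step for such a neural distinguisher, in the spirit of published work on Speck32/64: ciphertext pairs produced from plaintexts with a fixed input difference are labeled 1, and pairs from unrelated plaintexts are labeled 0. Using independent random round keys instead of the real key schedule is a simplification, and the specific input difference is assumed to match the one commonly used in that line of work; a real experiment would follow the published pipeline and then train a network on the resulting dataset.

```python
# Generating labeled training data for a neural distinguisher against reduced-round Speck32/64.
# Simplification: independent random round keys instead of the real key schedule.
import random

MASK = 0xFFFF                                 # 16-bit words

def ror(x, r): return ((x >> r) | (x << (16 - r))) & MASK
def rol(x, r): return ((x << r) | (x >> (16 - r))) & MASK

def speck_rounds(x, y, round_keys):
    # Speck32 round: x = (x >>> 7) + y, x ^= k, y = (y <<< 2) ^ x
    for k in round_keys:
        x = ((ror(x, 7) + y) & MASK) ^ k
        y = rol(y, 2) ^ x
    return x, y

def make_sample(rounds=5, diff=(0x0040, 0x0000)):
    keys = [random.getrandbits(16) for _ in range(rounds)]
    x, y = random.getrandbits(16), random.getrandbits(16)
    label = random.getrandbits(1)
    if label:                                  # "real" pair: plaintexts differ by the fixed difference
        x2, y2 = x ^ diff[0], y ^ diff[1]
    else:                                      # "random" pair: unrelated plaintexts
        x2, y2 = random.getrandbits(16), random.getrandbits(16)
    c1 = speck_rounds(x, y, keys)
    c2 = speck_rounds(x2, y2, keys)
    return [*c1, *c2], label                   # a network would learn to separate the two classes

dataset = [make_sample() for _ in range(10_000)]
```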

This research frontier underscores that the security of future cryptographic systems will depend not only on their mathematical hardness but also on their resilience to increasingly sophisticated, AI-driven implementation attacks.

7.2. Verifiable Delay Functions (VDFs) and Their Role in Blockchain Consensus

The rise of new application domains like blockchain technology is driving the invention of entirely new cryptographic primitives tailored to solve specific problems. A prime example is the Verifiable Delay Function (VDF).127

A VDF is a function that is intentionally slow to compute but fast to verify.128 Crucially, its computation is inherently sequential, meaning it cannot be significantly sped up by using parallel processors.128 The core application of VDFs is the generation of trustworthy public randomness in decentralized systems.127 In many Proof-of-Stake blockchain protocols, a random number is needed to fairly select the next block producer. If this randomness can be predicted or manipulated by a powerful adversary (e.g., one with enough computing power to grind through candidate outcomes), they can bias the selection in their favor.130

A VDF solves this problem by taking a public input (e.g., a hash from a recent block) and forcing a time delay before the final random output is known.130 This delay is long enough to prevent any party from pre-computing outcomes and manipulating the process, but the result, once computed, can be quickly verified by everyone on the network.129 VDFs are a key component in the design of next-generation consensus protocols, including those proposed for Ethereum, and represent a new class of cryptography designed not for confidentiality or authentication, but for enforcing the passage of time in a trustless environment.128
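
A minimal sketch of a Wesolowski-style VDF over an RSA group is shown below: the evaluator performs T sequential squarings, while the verifier checks a short proof with a few cheap modular exponentiations. The modulus, delay parameter, and fixed challenge prime are toy values; in a real deployment the modulus must have unknown factorization (or a class group is used) and the challenge prime is derived by hashing the instance.

```python
# Toy Wesolowski-style verifiable delay function: y = x^(2^T) mod N with a short proof.
# Illustrative parameters: here N's factorization is obviously known and the challenge prime is fixed.
T = 1 << 16                                  # delay parameter: number of sequential squarings
N = 104729 * 1299709                         # toy modulus
L = 1_000_003                                # fixed challenge prime (normally hash-derived)

def evaluate(x):
    y = x % N
    for _ in range(T):                       # inherently sequential: each squaring needs the previous result
        y = (y * y) % N
    proof = pow(x, (1 << T) // L, N)         # pi = x^(2^T // l) mod N
    return y, proof

def verify(x, y, proof):
    r = pow(2, T, L)                         # r = 2^T mod l, cheap to compute
    return (pow(proof, L, N) * pow(x, r, N)) % N == y % N

x = 123456789
y, pi = evaluate(x)
assert verify(x, y, pi)                      # fast check; recomputing y itself would take T squarings
```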

7.3. Speculative Frontiers: Assessing the Viability of DNA and Chaotic Cryptography

The search for novel cryptographic foundations has led researchers to explore unconventional paradigms, though their practical viability remains highly speculative.

  • DNA Cryptography: This approach seeks to leverage the immense information density of DNA molecules and the complexity of biological processes for cryptography.133 Ideas range from using DNA for steganography (hiding messages in DNA strands) to performing computations on DNA molecules.133 The security is proposed to come from the "biological difficulty" of processes like DNA synthesis and sequencing.134 However, this field faces profound challenges that make it impractical for the foreseeable future. There is a lack of standard protocols, the biological processes are slow and extremely error-prone, and it is difficult to quantify the security in a rigorous way.134 Most current research is theoretical or involves simulations rather than practical biological implementations.133
  • Chaotic Cryptography: This field attempts to use the properties of mathematical chaotic systems—such as extreme sensitivity to initial conditions—to design encryption algorithms.136 The idea is that the chaotic evolution of a system can be used as a pseudorandom number generator for a stream cipher. However, the history of chaotic cryptography is fraught with broken schemes.136 A fundamental challenge is that the properties of chaos are defined over continuous real numbers, while cryptography must operate on finite, discrete sets of numbers. This discretization process often destroys the very chaotic properties that were meant to provide security, leading to systems with short periods or other predictable behaviors that can be easily exploited by cryptanalysis, as the small experiment below illustrates.137
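
The experiment below demonstrates the discretization problem: iterating the logistic map x -> r*x*(1-x) in k-bit fixed-point arithmetic quickly collapses the orbit into a short cycle, which in a stream cipher would mean a repeating keystream. The precision levels, seed, and the choice r = 3.99 are arbitrary values for the demonstration.

```python
# Why naive chaotic stream ciphers are fragile: discretizing the logistic map to k-bit
# fixed point collapses its orbits into short cycles (i.e., a repeating keystream).
def cycle_length(seed, bits, r_num=399, r_den=100, max_steps=1 << 20):
    scale = 1 << bits                       # fixed-point scale: x is stored as an integer in [0, scale)
    x = seed % scale
    seen = {}
    for step in range(max_steps):
        if x in seen:
            return step - seen[x]           # length of the cycle the orbit fell into
        seen[x] = step
        x = (r_num * x * (scale - x)) // (r_den * scale)   # discretized r*x*(1-x) with r = 3.99
    return None

for bits in (8, 12, 16, 20):
    print(bits, "bits ->", cycle_length(seed=12345, bits=bits))
# Cycle lengths are expected to be far shorter than 2**bits (random-mapping behavior),
# nowhere near the period required of a cryptographic keystream generator.
```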

While these areas represent intriguing intellectual explorations, they currently lack the rigorous security foundations and practical feasibility of mainstream cryptographic research.


Part IV: Strategic Outlook and Recommendations

The cryptographic frontiers explored in this report—from the immediate PQC migration to the long-term vision of a quantum internet and the rise of privacy-preserving computation—are not independent research tracks but interconnected components of a new security paradigm. Navigating this complex landscape requires a strategic, forward-looking approach that aligns technological adoption with realistic timelines and tailored organizational priorities.

Section 8: Navigating the Cryptologic Frontiers: A Strategic Roadmap

A coherent strategy for cryptographic modernization must be grounded in a clear understanding of the maturity and timelines of different technologies. This allows for the allocation of resources to address immediate threats while simultaneously investing in research and development to prepare for future capabilities.

8.1. Synthesizing the Timelines: Near-Term, Mid-Term, and Long-Term Developments

The evolution of the cryptographic landscape can be projected across three distinct time horizons:

  • Near-Term (0–5 years): The singular focus in this period must be on the Post-Quantum Cryptography migration. The primary activities for all organizations should be cryptographic discovery, risk assessment, and the development of a detailed migration roadmap. This includes creating a comprehensive inventory of all cryptographic assets, prioritizing systems based on data sensitivity and longevity, and initiating pilot programs to test the performance and interoperability of the new NIST standards (ML-KEM, ML-DSA, etc.) within specific enterprise environments. Achieving crypto-agility through architectural redesign and adopting hybrid implementations will be the key technical goals.
  • Mid-Term (5–15 years): This period will be characterized by the widespread deployment of PQC across global IT infrastructure, from cloud services to embedded devices. The vendor ecosystem for PQC-ready hardware and software will mature, and regulatory bodies will likely mandate PQC for critical sectors. Concurrently, Privacy-Enhancing Technologies (PETs) will move from niche applications to broader enterprise adoption as their performance improves and standardization efforts progress. We can expect to see early but impactful uses of HE, ZKP, and MPC in finance, healthcare, and AI. In the quantum networking space, regional and metropolitan QKD networks will continue to operate and expand as experimental and high-assurance platforms, but their use will remain limited to specialized applications.
  • Long-Term (15+ years): The long-term vision is contingent on fundamental scientific breakthroughs. The potential viability of practical quantum repeaters could emerge in this timeframe, paving the way for the first true intercontinental quantum networks and the beginnings of a quantum internet. The performance of PETs will likely have advanced to the point where they are integrated more seamlessly into mainstream computing architectures, enabling a wide range of privacy-preserving and verifiable applications by default.

8.2. Recommendations for Enterprise, Government, and Research Institutions

Different stakeholders must adopt tailored strategies to effectively navigate this transition.

  • For Enterprise (CISOs, CTOs, and Architects):
    1. Act Now on PQC: Do not wait for a cryptographically relevant quantum computer (CRQC) to appear. The "Harvest Now, Decrypt Later" threat is immediate. Initiate a formal PQC migration program with executive sponsorship and a cross-functional team.
    2. Prioritize Discovery and Agility: The first and most critical investment is in tools and processes to create a comprehensive cryptographic inventory. The primary architectural goal should be achieving crypto-agility to adapt to an evolving cryptographic landscape.
    3. Engage the Supply Chain: Scrutinize the PQC roadmaps of all hardware, software, and cloud vendors. PQC readiness must become a key criterion in procurement and vendor risk management.
    4. Invest in Talent: The demand for cryptographic expertise, particularly in PQC and PETs, will far outstrip supply. Invest in training existing staff and building internal competency.
  • For Government (Policymakers and National Security Agencies):
    1. Drive Migration Through Policy and Procurement: Use the government's purchasing power and regulatory authority to accelerate PQC adoption in critical infrastructure and the broader economy. Set clear migration deadlines, as seen with U.S. federal mandates.
    2. Maintain a Balanced R&D Portfolio: Continue to fund fundamental research across all frontiers. This includes supporting the development of PQC countermeasures (e.g., for side-channel attacks), investing in the long-term scientific challenge of quantum repeaters, and fostering the growth of the PET ecosystem.
    3. Adopt a Nuanced Stance on QKD: Recognize QKD as a strategic technology with potential for specific high-assurance use cases, but also acknowledge its practical limitations and security risks. Avoid framing it as a universal replacement for PQC.
    4. Foster International Standardization: Continue to support and lead in open, international standards bodies like NIST and ITU-T to ensure global interoperability and avoid a fragmented cryptographic world.
  • For Research Institutions (Academia and National Labs):
    1. Focus on the Hard Problems: Prioritize research on the most significant unsolved challenges. This includes developing more efficient and practical PETs, designing provably secure countermeasures for PQC implementations, and overcoming the fundamental physics and engineering hurdles of quantum memories and entanglement swapping for quantum repeaters.
    2. Develop Robust Benchmarking and Verification Tools: The community needs standardized tools and methodologies to fairly compare the performance of PQC and PET implementations and to formally verify their security against a wide range of attacks, including side-channel vulnerabilities.
    3. Promote Interdisciplinary Collaboration: The future of cryptography lies at the intersection of computer science, physics, mathematics, and engineering. Foster collaboration between these fields to co-design new cryptographic primitives and the systems that will use them.

8.3. Concluding Remarks on Open Problems and the Future of Secure Computation

The cryptographic landscape is more dynamic and promising than at any point in its history. The transition to Post-Quantum Cryptography is a defensive necessity, a monumental effort to re-establish our digital security on foundations resistant to the next generation of computational power. Yet, beyond this necessary defense, new cryptographic frontiers are opening up proactive possibilities that were once the realm of science fiction. The ability to compute on encrypted data, to verify computations without trust, and to collaborate on private data will fundamentally reshape industries that run on information.

However, significant open problems remain. The performance of PETs is still a major barrier to their widespread use. The security of PQC implementations in the face of physical attacks is a field of active and urgent research. And the dream of a global quantum internet remains contingent on solving some of the hardest problems in modern physics. The path forward requires a sustained, collaborative effort from researchers, engineers, and policymakers. By embracing a strategic, risk-managed approach to the near-term PQC migration while continuing to invest in the fundamental research that will build the future, we can navigate this complex transition and usher in a new era of secure and trustworthy computation.

Works cited

  1. Post-Quantum Cryptography, Explained - Booz Allen, accessed September 9, 2025, https://www.boozallen.com/insights/ai-research/post-quantum-cryptography-explained.html
  2. What Is Post-Quantum Cryptography? | NIST, accessed September 9, 2025, https://www.nist.gov/cybersecurity/what-post-quantum-cryptography
  3. Post-Quantum Cryptography | CSRC - NIST Computer Security Resource Center, accessed September 9, 2025, https://csrc.nist.gov/projects/post-quantum-cryptography
  4. Welcome to the post-quantum era: challenges and strategies for cybersecurity, accessed September 9, 2025, https://www.orangecyberdefense.com/global/blog/cybersecurity/welcome-to-the-post-quantum-era-challenges-and-strategies-for-cybersecurity
  5. Decoding NIST PQC Standards: What They Are, What's Final, and What's Next, accessed September 9, 2025, https://www.encryptionconsulting.com/decoding-nist-pqc-standards/
  6. Challenges for NIST PQC Adoption - Quantum Xchange, accessed September 9, 2025, https://quantumxc.com/featured/nist-pqc-adoption-challenges/
  7. From Risk to Readiness: A Strategic Approach to PQC Migration - GDIT, accessed September 9, 2025, https://www.gdit.com/perspectives/latest/from-risk-to-readiness-a-strategic-approach-to-pqc-migration/
  8. NIST Selects HQC as Fifth Algorithm for Post-Quantum Encryption: Why This Matters for Your Data Security - Arctiq, accessed September 9, 2025, https://arctiq.com/blog/nist-selects-hqc-as-fifth-algorithm-for-post-quantum-encryption-why-this-matters-for-your-data-security
  9. Preparing Payments for the Quantum Computing Disruption - Entrust, accessed September 9, 2025, https://www.entrust.com/blog/2025/01/the-post-quantum-era-demands-quantum-safe-payments
  10. Post-Quantum Cryptography Algorithms: NIST Selects HQC - SafeLogic, accessed September 9, 2025, https://www.safelogic.com/blog/nist-selects-hqc-as-fifth-pqc-algorithm
  11. NIST's first post-quantum standards - The Cloudflare Blog, accessed September 9, 2025, https://blog.cloudflare.com/nists-first-post-quantum-standards/
  12. NIST's final PQC standards are here – What you need to know - Utimaco, accessed September 9, 2025, https://utimaco.com/news/blog-posts/nists-final-pqc-standards-are-here-what-you-need-know
  13. Evaluating Post-Quantum Cryptographic Algorithms on Resource-Constrained Devices - arXiv, accessed September 9, 2025, https://arxiv.org/pdf/2507.08312
  14. NIST FIPS 203, 204, 205 Finalized | PQC Algorithms | CSA - Cloud Security Alliance, accessed September 9, 2025, https://cloudsecurityalliance.org/blog/2024/08/15/nist-fips-203-204-and-205-finalized-an-important-step-towards-a-quantum-safe-future
  15. Post-Quantum Cryptography FIPS Approved | CSRC - NIST Computer Security Resource Center - National Institute of Standards and Technology, accessed September 9, 2025, https://csrc.nist.gov/news/2024/postquantum-cryptography-fips-approved
  16. FIPS 203, Module-Lattice-Based Key-Encapsulation Mechanism Standard | CSRC, accessed September 9, 2025, https://csrc.nist.gov/pubs/fips/203/final
  17. What are FIPS 203, 204, and 205? - wolfSSL, accessed September 9, 2025, https://www.wolfssl.com/what-are-fips-203-204-and-205/
  18. HQC, accessed September 9, 2025, https://pqc-hqc.org/
  19. NIST advances post-quantum cryptography standardization, selects HQC algorithm to counter quantum threats - Industrial Cyber, accessed September 9, 2025, https://industrialcyber.co/nist/nist-advances-post-quantum-cryptography-standardization-selects-hqc-algorithm-to-counter-quantum-threats/
  20. NIST Outlines Strategies for Crypto Agility as PQC Migration Stalls, Available for Public Comment - The Quantum Insider, accessed September 9, 2025, https://thequantuminsider.com/2025/03/07/nist-outlines-strategies-for-crypto-agility-as-pqc-migration-stalls-available-for-public-comment/
  21. Inside NIST's PQC: Kyber, Dilithium, and SPHINCS+ - PostQuantum.com, accessed September 9, 2025, https://postquantum.com/post-quantum/nists-pqc-technical/
  22. Post-Quantum Cryptography (PQC) Introduction, accessed September 9, 2025, https://postquantum.com/post-quantum/post-quantum-cryptography-pqc/
  23. A Practical Performance Benchmark of Post-Quantum Cryptography Across Heterogeneous Computing Environments - MDPI, accessed September 9, 2025, https://www.mdpi.com/2410-387X/9/2/32
  24. Quantum-Safe Cryptography Standards: Forging an Unbreakable ..., accessed September 9, 2025, https://www.appsecengineer.com/blog/quantum-safe-cryptography-standards-forging-an-unbreakable-digital-fortress
  25. arxiv.org, accessed September 9, 2025, https://arxiv.org/html/2503.12952v1
  26. Falcon, accessed September 9, 2025, https://falcon-sign.info/
  27. A Look at Side Channel Attacks on Post-quantum Cryptography - SciELO México, accessed September 9, 2025, https://www.scielo.org.mx/scielo.php?script=sci_arttext&pid=S1405-55462024000401879
  28. Side-channel Attacks and Countermeasures for Post-Quantum Cryptography - People, accessed September 9, 2025, https://people-ece.vse.gmu.edu/coursewebpages/ECE/ECE646/F20/project/F18_presentations/Session_I/Session_I_Report_3.pdf
  29. Introduction to Side-Channel Security of NIST PQC Standards, accessed September 9, 2025, https://csrc.nist.gov/csrc/media/Projects/post-quantum-cryptography/documents/pqc-seminars/presentations/2-side-channel-security-saarinen-04042023.pdf
  30. The Challenge of Side-Channel Countermeasures on Post-Quantum Crypto - NIST Computer Security Resource Center, accessed September 9, 2025, https://csrc.nist.gov/csrc/media/Presentations/2022/the-challenge-of-side-channel-countermeasures-on-p/images-media/session2-zeitoun-challenge-of-side-channel-countermeasures-pqc2022.pdf
  31. Side-Channel Analysis of Lattice-Based Post-Quantum Cryptography: Exploiting Polynomial Multiplication | Request PDF - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/365146913_Side-Channel_Analysis_of_Lattice-Based_Post-Quantum_Cryptography_Exploiting_Polynomial_Multiplication
  32. NIST Releases First 3 Finalized Post-Quantum Encryption Standards, accessed September 9, 2025, https://www.nist.gov/news-events/news/2024/08/nist-releases-first-3-finalized-post-quantum-encryption-standards
  33. Benchmarking and Analysing NIST PQC Lattice-Based Signature Scheme Standards on the ARM Cortex M7, accessed September 9, 2025, https://csrc.nist.gov/csrc/media/Events/2022/fourth-pqc-standardization-conference/documents/papers/benchmarking-and-analysiing-nist-pqc-lattice-based-pqc2022.pdf
  34. Post-Quantum Cryptography (PQC) Standardization - 2025 Update, accessed September 9, 2025, https://postquantum.com/post-quantum/cryptography-pqc-nist/
  35. PQC Migration Challenges & Compliance Risks for Financial ..., accessed September 9, 2025, https://www.cryptomathic.com/blog/pqc-migration-challenges-compliance-risks-for-financial-institutions
  36. Untold Challenge of Post-Quantum Cryptography Migration - Fortanix, accessed September 9, 2025, https://www.fortanix.com/blog/untold-challenge-of-post-quantum-cryptography-migration
  37. 10 Enterprise Must-Haves for a Successful Post-Quantum Cryptography (PQC) Migration, accessed September 9, 2025, https://www.encryptionconsulting.com/must-haves-for-a-successful-pqc-migration/
  38. CIOs Must Prepare Their Organizations Today For Quantum Safe Cryptography - IBM, accessed September 9, 2025, https://www.ibm.com/think/insights/cios-must-prepare-their-organizations-today-for-quantum-safe-cryptography
  39. Post-Quantum Financial Infrastructure Framework (PQFIF) - SEC.gov, accessed September 9, 2025, https://www.sec.gov/files/cft-written-input-daniel-bruno-corvelo-costa-090325.pdf
  40. Quantinuum and Luna HSMs Protect Financial Services - Case Study - Thales, accessed September 9, 2025, https://cpl.thalesgroup.com/resources/encryption/quantinuum-luna-hsms-protect-financial-services-case-study
  41. Why Automakers Need to Prepare for the Quantum Security Challenge Now | by DuoKey, accessed September 9, 2025, https://medium.com/@duokey.com/why-automakers-need-to-prepare-for-the-quantum-security-challenge-now-cb023553873a
  42. Post-quantum cryptography for automotive systems | Request PDF - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/356130542_Post-quantum_cryptography_for_automotive_systems
  43. Post-Quantum Cryptography in Automotive - Apriorit, accessed September 9, 2025, https://www.apriorit.com/dev-blog/post-quantum-cryptography-in-automotive
  44. Post-Quantum Cryptography: Migration Challenges for Embedded Devices - NXP Semiconductors, accessed September 9, 2025, https://www.nxp.com/docs/en/white-paper/POSTQUANCOMPWPA4.pdf
  45. Impacts of Post-Quantum Cryptography on Automotive Security: A ..., accessed September 9, 2025, https://d-nb.info/1316435830/34
  46. Future- Proofing Electronic Health Records Against Quantum Computing Threats and Cyber, accessed September 9, 2025, https://ijcat.com/archieve/volume14/issue3/ijcatr14031008.pdf
  47. (PDF) Post-Quantum Healthcare: A Roadmap for Cybersecurity Resilience in Medical Data, accessed September 9, 2025, https://www.researchgate.net/publication/380647953_Post-Quantum_Healthcare_A_Roadmap_for_Cybersecurity_Resilience_in_Medical_Data
  48. The PQC Migration Handbook - TNO (Publications), accessed September 9, 2025, https://publications.tno.nl/publication/34643386/fXcPVHsX/TNO-2024-pqc-en.pdf
  49. Post-quantum cryptography (PQC) - Google Cloud, accessed September 9, 2025, https://cloud.google.com/security/resources/post-quantum-cryptography
  50. A Comparative Analysis of Quantum Cryptography Protocols: From BB84 to Device-Independent QKD | by Tedislava Vasileva | Aug, 2025 | Medium, accessed September 9, 2025, https://medium.com/@tedislava.vasileva/a-comparative-analysis-of-quantum-cryptography-protocols-from-bb84-to-device-independent-qkd-98c9be855e95
  51. Exploration of Evolving Quantum Key Distribution Network Architecture Using Model-Based Systems Engineering - arXiv, accessed September 9, 2025, https://arxiv.org/html/2508.15733v1
  52. A blueprint for large-scale quantum-network deployments | Request PDF - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/383701897_A_blueprint_for_large-scale_quantum-network_deployments
  53. China Establishes First Integrated Quantum Communication Network, accessed September 9, 2025, https://quantumzeitgeist.com/china-establishes-first-integrated-quantum-communication-network/
  54. Quantum Experiments at Space Scale - Wikipedia, accessed September 9, 2025, https://en.wikipedia.org/wiki/Quantum_Experiments_at_Space_Scale
  55. The state of U.S.-China quantum data security competition - Brookings Institution, accessed September 9, 2025, https://www.brookings.edu/articles/the-state-of-u-s-china-quantum-data-security-competition/
  56. Recent Progress in Quantum Key Distribution Network Deployments and Standards, accessed September 9, 2025, https://www.researchgate.net/publication/366434266_Recent_Progress_in_Quantum_Key_Distribution_Network_Deployments_and_Standards
  57. European Quantum Communication Infrastructure - EuroQCI ..., accessed September 9, 2025, https://digital-strategy.ec.europa.eu/en/policies/european-quantum-communication-infrastructure-euroqci
  58. ESA and European Commission to build quantum-secure space communications network, accessed September 9, 2025, https://www.esa.int/Applications/Connectivity_and_Secure_Communications/ESA_and_European_Commission_to_build_quantum-secure_space_communications_network
  59. Open European Quantum Key Distribution Testbed | OPENQKD | Project | Fact Sheet | H2020 - CORDIS, accessed September 9, 2025, https://cordis.europa.eu/project/id/857156
  60. Quantum Networking: Findings and Recommendations for Growing American Leadership, accessed September 9, 2025, https://www.quantum.gov/wp-content/uploads/2024/09/NQIAC-Report-Quantum-Networking.pdf
  61. Holding the quantum keys, innovation helps keep the power grid safe | ORNL, accessed September 9, 2025, https://www.ornl.gov/news/holding-quantum-keys-innovation-helps-keep-power-grid-safe
  62. Quantum Key Distribution (QKD) and Quantum Cryptography QC - National Security Agency, accessed September 9, 2025, https://www.nsa.gov/Cybersecurity/Quantum-Key-Distribution-QKD-and-Quantum-Cryptography-QC/
  63. Are Enterprises Ready for Quantum-Safe Cybersecurity? - arXiv, accessed September 9, 2025, https://arxiv.org/html/2509.01731v1
  64. The Tokyo QKD Network - The Project UQCC (Updating Quantum ..., accessed September 9, 2025, http://www.uqcc.org/QKDnetwork/
  65. Quantum ICT Advanced Development Center | Inauguration of the Tokyo QKD Network | NICT-National Institute of Information and Communications Technology, accessed September 9, 2025, https://www.nict.go.jp/en/quantum/topics/20101014-1.html
  66. Quantum Key Distribution Technology Promotion Committee, accessed September 9, 2025, https://qforum.org/en/committees/quantum-key-distribution
  67. Quantum Cryptography and Physical Layer Cryptography | National Institute of Information and Communications Technology - NICT, accessed September 9, 2025, https://www.nict.go.jp/en/quantum/about/crypt/english.html
  68. Toshiba Develops Large-Scale Quantum Key Distribution Network Control Technology and High-Speed Quantum Key Distribution Technology Toward Realizing Global-Scale Quantum Cryptography Communications -Expanding the scope of secure communications services by scaling up and accelerating quantum cryptography communications- | Corporate Laboratory (Komukai region) | Toshiba, accessed September 9, 2025, https://www.global.toshiba/ww/technology/corporate/rdc/rd/topics/24/2409-02.html
  69. arXiv:2503.21186v1 [quant-ph] 27 Mar 2025, accessed September 9, 2025, https://arxiv.org/pdf/2503.21186
  70. Quantum Network Goes the Distance Using Existing Telecom Infrastructure, accessed September 9, 2025, https://thequantuminsider.com/2025/04/24/quantum-network-goes-the-distance-using-existing-telecom-infrastructure/
  71. Quantum repeaters based on atomic ensembles and linear optics | Rev. Mod. Phys., accessed September 9, 2025, https://link.aps.org/doi/10.1103/RevModPhys.83.33
  72. Towards large-scale quantum key distribution network and its applications - ITU, accessed September 9, 2025, https://www.itu.int/en/ITU-T/Workshops-and-Seminars/2019060507/Documents/Hao_Qin_Presentation.pdf
  73. Space-Based Quantum Key Distribution: A Deep Dive Into QKD's Market Map And Competitive Landscape, accessed September 9, 2025, https://thequantuminsider.com/2025/03/05/space-based-quantum-key-distribution-a-deep-dive-into-qkds-market-map-and-competitive-landscape/
  74. Micius quantum experiments in space | Rev. Mod. Phys. - Physical Review Link Manager, accessed September 9, 2025, https://link.aps.org/doi/10.1103/RevModPhys.94.035001
  75. Eurasian-scale experimental satellite-based quantum key ..., accessed September 9, 2025, https://opg.optica.org/oe/abstract.cfm?uri=oe-32-7-11964
  76. Finite-Resource Performance of Small-Satellite-Based Quantum-Key-Distribution Missions, accessed September 9, 2025, https://link.aps.org/doi/10.1103/PRXQuantum.5.030101
  77. Why the NIST & NSA's Stance on Quantum Cryptography is Wrong | HEQA Security, accessed September 9, 2025, https://heqa-sec.com/blog/the-peculiar-stance-of-nist-and-the-nsa-on-quantum-cryptography-and-why-theyre-wrong/
  78. Quantum repeaters: From quantum networks to the quantum internet ..., accessed September 9, 2025, https://link.aps.org/doi/10.1103/RevModPhys.95.045006
  79. Quantum repeaters and their role in information technology | Argonne National Laboratory, accessed September 9, 2025, https://www.anl.gov/article/quantum-repeaters-and-their-role-in-information-technology
  80. Quantum repeaters for secure communication | News | Jan 27, 2025 ..., accessed September 9, 2025, https://www.fmq.uni-stuttgart.de/barz-group/news/Quantum-repeaters-for-secure-communication/
  81. Engineering Challenges in All-photonic Quantum Repeaters - arXiv, accessed September 9, 2025, https://arxiv.org/html/2405.09876v1
  82. Complete analysis of a realistic fiber-based quantum repeater scheme - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/395098639_Complete_analysis_of_a_realistic_fiber-based_quantum_repeater_scheme
  83. Quantum Memories and Repeaters: Challenges - tec@gov, accessed September 9, 2025, https://tec.gov.in/pdf/QC/Dr.%20Nixon%20Patel_1_2.pdf
  84. Experimental nested purification for a linear optical quantum repeater - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/320171276_Experimental_nested_purification_for_a_linear_optical_quantum_repeater
  85. Stony Brook University-Led Team Receives $4M NSF Grant to Develop 10-Node Quantum Network, accessed September 9, 2025, https://news.stonybrook.edu/newsroom/stony-brook-university-led-team-receives-4m-nsf-grant-to-develop-10-node-quantum-network/
  86. Performance comparison of homomorphic encryption scheme implementations, accessed September 9, 2025, https://www.etran.rs/2021/zbornik/Papers/104_RTI_2.5.pdf
  87. HOMOMORPHIC ENCRYPTION: A COMPREHENSIVE REVIEW - Journal of Emerging Technologies and Innovative Research, accessed September 9, 2025, https://www.jetir.org/papers/JETIR2308672.pdf
  88. A Comparison of the Homomorphic Encryption Libraries HElib, SEAL and FV-NFLlib - Prometheus, accessed September 9, 2025, https://www.h2020prometheus.eu/sites/default/files/2022-06/AguilarMelchor2019_Chapter_AComparisonOfTheHomomorphicEnc.pdf
  89. A systematic review of homomorphic encryption and its contributions in healthcare industry, accessed September 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC9062639/
  90. Performance Comparison of Homomorphic ... - GitHub Pages, accessed September 9, 2025, https://proceedings-of-deim.github.io/DEIM2023/5b-9-2.pdf
  91. A Comparative Study of Homomorphic Encryption Schemes Using Microsoft SEAL, accessed September 9, 2025, https://www.researchgate.net/publication/356642138_A_Comparative_Study_of_Homomorphic_Encryption_Schemes_Using_Microsoft_SEAL
  92. Comparison of FHE Schemes and Libraries for Efficient Cryptographic Processing - International Conference on Computing, Networking and Communications, accessed September 9, 2025, http://www.conf-icnc.org/2024/papers/p584-tsuji.pdf
  93. Faster homomorphic comparison operations for BGV and BFV - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/351159408_Faster_homomorphic_comparison_operations_for_BGV_and_BFV
  94. [Resource Topic] 2025/1460: A Performance Comparison of the Homomorphic Encryption Schemes CKKS and TFHE - Ask Cryptography, accessed September 9, 2025, https://askcryp.to/t/resource-topic-2025-1460-a-performance-comparison-of-the-homomorphic-encryption-schemes-ckks-and-tfhe/24779
  95. Decoding ZK-SNARK VS STARK: An In-Depth Comparative Analysis - Calibraint, accessed September 9, 2025, https://www.calibraint.com/blog/zk-snark-vs-stark-differences-comparison
  96. Comparing ZK-SNARKs & ZK-STARKs: Key Distinctions In Blockchain Privacy Protocols, accessed September 9, 2025, https://hacken.io/discover/zk-snark-vs-zk-stark/
  97. zk-STARK vs zk-SNARK : An In-Depth Comparative Analysis - QuillAudits, accessed September 9, 2025, https://www.quillaudits.com/blog/ethereum/zk-starks-vs-zk-snarks
  98. Diving into the zk-SNARKs Setup Phase | by Daniel Benarroch | QEDIT | Medium, accessed September 9, 2025, https://medium.com/qed-it/diving-into-the-snarks-setup-phase-b7660242a0d7
  99. Trusted Setup - Fundamentals of Zero-Knowledge Proofs (ZKPs) - Blockchain and Smart Contract Development Courses - Cyfrin Updraft, accessed September 9, 2025, https://updraft.cyfrin.io/courses/fundamentals-of-zero-knowledge-proofs/fundamentals/trusted-setup
  100. Full Guide to Understanding zk-SNARKs and zk-STARKS - Cyfrin, accessed September 9, 2025, https://www.cyfrin.io/blog/a-full-comparison-what-are-zk-snarks-and-zk-starks
  101. Benchmarking ZKP Development Frameworks: the Pantheon of ZKP - Ethereum Research, accessed September 9, 2025, https://ethresear.ch/t/benchmarking-zkp-development-frameworks-the-pantheon-of-zkp/14943
  102. zk-SNARKs vs. Zk-STARKs vs. BulletProofs? (Updated) - Ethereum Stack Exchange, accessed September 9, 2025, https://ethereum.stackexchange.com/questions/59145/zk-snarks-vs-zk-starks-vs-bulletproofs-updated
  103. Performance Comparison of Two Generic MPC-frameworks with Symmetric Ciphers - SciTePress, accessed September 9, 2025, https://www.scitepress.org/Papers/2020/98317/98317.pdf
  104. Secure multi-party computation - Wikipedia, accessed September 9, 2025, https://en.wikipedia.org/wiki/Secure_multi-party_computation
  105. MP-SPDZ – A Versatile Framework for Multi-Party Computation, accessed September 9, 2025, https://www.cwi.nl/documents/195228/presentation%20Marcel%20Keller.pdf
  106. MOTION – A Framework for Mixed-Protocol Multi-Party Computation - Cryptography and Privacy Engineering, accessed September 9, 2025, https://encrypto.de/papers/BDST22.pdf
  107. Extending the Security of SPDZ with Fairness - Privacy Enhancing Technologies Symposium, accessed September 9, 2025, https://petsymposium.org/popets/2024/popets-2024-0053.pdf
  108. Secure Multiparty Computation, accessed September 9, 2025, https://hajji.org/en/crypto/secure-multiparty-computation
  109. ZKMPC: Publicly Auditable MPC for general-purpose computations - Ethereum Research, accessed September 9, 2025, https://ethresear.ch/t/zkmpc-publicly-auditable-mpc-for-general-purpose-computations/20956
  110. data61/MP-SPDZ: Versatile framework for multi-party ... - GitHub, accessed September 9, 2025, https://github.com/data61/MP-SPDZ
  111. MP-SPDZ documentation - Read the Docs, accessed September 9, 2025, https://mp-spdz.readthedocs.io/_/downloads/en/latest/pdf/
  112. Zero-knowledge proofs explained: zk-SNARKs vs zk-STARKs - LimeChain, accessed September 9, 2025, https://limechain.tech/blog/zero-knowledge-proofs-explained
  113. Privacy-Preserving Machine Learning Using Cryptography, accessed September 9, 2025, https://www.researchgate.net/publication/359813046_Privacy-Preserving_Machine_Learning_Using_Cryptography
  114. Private and Verifiable Computation - Dimitris Mouris, accessed September 9, 2025, https://jimouris.github.io/publications/mouris2024thesis.pdf
  115. A Review of Privacy Enhancement Methods for Federated Learning in Healthcare Systems - PMC - PubMed Central, accessed September 9, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC10418741/
  116. arXiv:2501.06953v1 [cs.CR] 12 Jan 2025, accessed September 9, 2025, https://arxiv.org/pdf/2501.06953
  117. Checks and balances: Machine learning and zero-knowledge proofs - a16z crypto, accessed September 9, 2025, https://a16zcrypto.com/posts/article/checks-and-balances-machine-learning-and-zero-knowledge-proofs/
  118. A Survey of Zero-Knowledge Proof Based Verifiable Machine Learning - arXiv, accessed September 9, 2025, https://arxiv.org/pdf/2502.18535
  119. 5 Use Cases for Zero-Knowledge Proofs in Machine Learning ..., accessed September 9, 2025, https://coinmarketcap.com/academy/article/5-use-cases-for-zero-knowledge-proofs-in-machine-learning
  120. Zero-knowledge Proof Meets Machine Learning in Verifiability: A Survey - SciSpace, accessed September 9, 2025, https://scispace.com/pdf/zero-knowledge-proof-meets-machine-learning-in-verifiability-2t1p4poh3l.pdf
  121. Recent Advances in Machine Learning for Differential Cryptanalysis - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/375634453_Recent_Advances_in_Machine_Learning_for_Differential_Cryptanalysis
  122. Recent Advances in Machine Learning for Differential Cryptanalysis - Springer Professional, accessed September 9, 2025, https://www.springerprofessional.de/en/recent-advances-in-machine-learning-for-differential-cryptanalys/26298622
  123. Application of Machine Learning in Cryptanalysis Concerning ..., accessed September 9, 2025, https://www.researchgate.net/publication/353021047_Application_of_Machine_Learning_in_Cryptanalysis_Concerning_Algorithms_from_Symmetric_Cryptography
  124. E. Bellini, A. Hambitzer, M. Rossi A SURVEY ON MACHINE LEARNING APPLIED TO SYMMETRIC CRYPTANALYSIS - Seminario Matematico - Politecnico di Torino, accessed September 9, 2025, https://seminariomatematico.polito.it/rendiconti/80-2/Bellini.pdf
  125. (PDF) Advance attacks on AES: A comprehensive review of side channel, fault injection, machine learning and quantum techniques - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/391274324_Advance_attacks_on_AES_A_comprehensive_review_of_side_channel_fault_injection_machine_learning_and_quantum_techniques
  126. A machine learning approach against a masked AES - CARDIS Conferences, accessed September 9, 2025, https://cardis.org/cardis2013/proceedings/CARDIS2013_5.pdf
  127. Implementation Study of Cost-Effective Verification for Pietrzak's Verifiable Delay Function in Ethereum Smart Contracts - arXiv, accessed September 9, 2025, https://arxiv.org/html/2405.06498v3
  128. Verifiable Delay Functions (VDFs) - Crypto.com, accessed September 9, 2025, https://crypto.com/glossary/verifiable-delay-functions-vdfs
  129. Verifiable Delay Functions. A brief and gentle introduction | by Ramses Fernandez - Medium, accessed September 9, 2025, https://medium.com/rootstock-tech-blog/verifiable-delay-functions-8eb6390c5f4
  130. Verifiable Delay Functions - Emperor, accessed September 9, 2025, https://crypto.mirror.xyz/eoHWA_mpUkJUK49Eujdd8Uq-jFDb1M8DhR5qeAvitjA
  131. Ethereum 2.0 - MIH VC, accessed September 9, 2025, https://www.mih.vc/ethereum-2-0/
  132. Verifiable Delay Functions (VDFs): A Deep Dive into Sequential ..., accessed September 9, 2025, https://tokenminds.co/blog/knowledge-base/verifiable-delay-functions
  133. A comparative review on symmetric and asymmetric DNA-based cryptography, accessed September 9, 2025, https://www.researchgate.net/publication/346723295_A_comparative_review_on_symmetric_and_asymmetric_DNA-based_cryptography
  134. (PDF) A Review of DNA Cryptography - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/387464895_A_Review_of_DNA_Cryptography
  135. DNA based Cryptography: An Overview and Analysis - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/269098843_DNA_based_Cryptography_An_Overview_and_Analysis
  136. Chaotic cryptology - Wikipedia, accessed September 9, 2025, https://en.wikipedia.org/wiki/Chaotic_cryptology
  137. (PDF) Chaos-based cryptography: A brief overview - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/3432338_Chaos-based_cryptography_A_brief_overview
  138. Cryptanalysis of an Image Encryption Algorithm Using DNA Coding and Chaos - MDPI, accessed September 9, 2025, https://www.mdpi.com/1099-4300/27/1/40
  139. On Cryptanalysis Techniques in Chaos-based Cryptography - ResearchGate, accessed September 9, 2025, https://www.researchgate.net/publication/386305360_On_Cryptanalysis_Techniques_in_Chaos-based_Cryptography