Machine Learning Enhanced Quantum-Safe Encryption: A Systematic Literature Review and Novel Optimization Framework

[Figure: Diagram illustrating machine learning integration into post-quantum cryptographic systems, including parameter tuning and benchmarking modules]

Acknowledgment

The authors would like to thank the broader cryptography and machine learning research communities for their foundational contributions to this emerging interdisciplinary area. Special recognition goes to the NIST Post-Quantum Cryptography Standardization team for driving the development of PQC algorithms, and to the open-source contributors who have implemented and shared efficient PQC libraries. Their work provided invaluable building blocks for research such as ours.


Cianaa Research Team, Auckland, New Zealand

Abstract

The advent of quantum computing is driving a critical transition to post-quantum cryptography (PQC), even as the proliferation of machine learning (ML) applications demands more efficient cryptographic implementations. In this paper, we present a systematic literature review of ML applications in quantum-safe encryption and propose a novel ML-enhanced optimization framework for post-quantum cryptographic systems. We systematically analyzed 764 papers from multiple academic databases, identifying key research trends in ML-assisted parameter optimization, privacy-preserving ML using lattice-based cryptography, and neural-network implementations of quantum-resistant algorithms. Building on these insights, we introduce a framework that integrates automated parameter tuning via ML surrogate models, reinforcement learning agents for cryptographic parameter optimization, and a standardized benchmarking suite for evaluation. Experimental results demonstrate up to 42% performance improvement in lattice-based cryptographic implementations (in terms of reduced latency) while maintaining required security levels. This research contributes to the emerging field of AI-enhanced quantum-safe cryptography, offering practical implications for secure and efficient ML deployment in the post-quantum era.

Index Terms

Post-quantum cryptography; Machine learning; Lattice-based cryptography; Parameter optimization; Quantum-safe systems; Neural networks.

1. Introduction

The convergence of rapid quantum computing advances and the widespread adoption of machine learning techniques presents both unprecedented opportunities and critical security challenges for modern cryptographic systems. On one hand, large-scale quantum computers could break traditional public-key cryptosystems (like RSA and ECC) via Shor’s algorithm, potentially rendering current encryption methods obsolete. On the other hand, integrating artificial intelligence with post-quantum cryptographic (PQC) primitives offers promising avenues for enhancing security and performance.

In response to the quantum threat, the National Institute of Standards and Technology (NIST) has initiated the standardization of several PQC algorithms. Notably, lattice-based schemes such as CRYSTALS-Kyber (for key encapsulation) and CRYSTALS-Dilithium (for digital signatures) have emerged as practical choices for real-world deployment, due to their strong security against quantum attacks and acceptable performance. Simultaneously, the demand for privacy-preserving machine learning has driven extensive research into cryptographic protocols that enable secure computation over encrypted data. The intersection of these domains—advanced ML and quantum-resistant cryptography—creates a fertile research area where machine learning techniques can optimize quantum-safe implementations, even as quantum-resistant cryptography in turn protects sensitive ML applications.

1.1 Motivation and Problem Statement

Despite recent progress, significant challenges remain in deploying quantum-safe encryption broadly and efficiently. We highlight four key problem areas:

  • Performance Overhead: PQC algorithms typically incur higher computational overhead and larger key sizes compared to classical cryptography, leading to slower operations and increased resource usage. This performance gap makes it difficult to integrate PQC into latency-sensitive or resource-constrained applications.
  • Parameter Selection: Post-quantum schemes—especially lattice-based cryptosystems—have complex parameter spaces (e.g., polynomial degree, modulus size, error distributions) that must be carefully tuned to balance security and performance. Choosing suboptimal parameters can either weaken security or degrade efficiency. Yet, manual or exhaustive tuning is impractical given the combinatorial scale of these parameters.
  • Implementation Efficiency: Resource-constrained environments (such as IoT devices, embedded systems, and mobile platforms) demand highly optimized cryptographic implementations. Achieving acceptable speed and memory footprint for PQC on these platforms, without compromising security, is challenging. Techniques to optimize memory usage, computational parallelism, and energy efficiency for PQC are needed to enable deployment in such settings.
  • Privacy-Preserving ML: There is a growing need for ML models that can be trained and deployed on encrypted data to preserve privacy. Realizing this requires efficient cryptographic techniques like homomorphic encryption and secure multi-party computation that remain feasible under post-quantum security assumptions. Current solutions often suffer from enormous computational cost, so improving their efficiency is vital for practical secure ML.

In summary, the community faces a dual challenge: how to speed up and optimize post-quantum encryption algorithms for real-world use (through better parameters, implementations, and possibly ML assistance), and how to enable advanced ML techniques to operate securely in a post-quantum world (through improved quantum-safe cryptographic protocols).

1.2 Contributions

This paper aims to address these challenges and gaps. The main contributions of our work are as follows:

  • Systematic Literature Review: We conducted a comprehensive systematic review of 764 research papers exploring machine learning applications in quantum-safe cryptography. This review spans multiple publication venues and provides a thorough mapping of current research efforts in this interdisciplinary area.
  • Research Gap Identification: Through the literature analysis, we identified current limitations, open problems, and emerging trends at the intersection of ML and post-quantum encryption. This highlights where further research is needed, such as specific bottlenecks in PQC that ML could potentially address, and where existing approaches fall short.
  • Novel ML-Enhanced Optimization Framework: We propose a new framework that integrates machine learning techniques for optimizing post-quantum cryptographic implementations. To our knowledge, this is the first systematic approach that combines hardware profiling, ML-driven surrogate modeling, and automated search to tune PQC algorithms for better performance while preserving security.
  • Experimental Validation: We developed a prototype of the proposed framework and performed extensive experiments to evaluate its effectiveness. The results demonstrate significant performance improvements over baseline (non-ML-tuned) implementations of lattice-based cryptographic schemes. In particular, we achieved notable reductions in latency, memory usage, and energy consumption across various platforms, without any loss of cryptographic security.
  • Standardized Benchmarking Suite (Open-Source): We introduce a standardized benchmarking and evaluation suite for ML-enhanced PQC systems. This includes a set of representative hardware platforms, cryptographic primitives, and machine learning workloads, along with unified metrics for security and performance evaluation. To encourage further research and adoption, we provide an open-source implementation of our framework and benchmarking suite, facilitating replication of results and extension by the community.

These contributions lay a foundation for AI-assisted post-quantum cryptography, an emerging research direction with significant practical importance as we move into the post-quantum era.

2. Systematic Literature Review Methodology

To ground our work in existing knowledge, we performed a systematic literature review (SLR) following PRISMA guidelines. The goal was to capture both a broad and a deep understanding of how machine learning techniques have been applied (or could be applied) to quantum-safe encryption, and to identify relevant successes and gaps. We defined a search strategy and quality assessment procedure as detailed below.

2.1 Search Strategy

We employed a comprehensive search strategy across multiple scholarly databases to ensure wide coverage of relevant literature. The following major sources were used:

  • SciSpace: Queried for papers on “quantum-safe encryption” combined with “machine learning”, yielding 517 papers (including preprints and non-IEEE literature).
  • IEEE Xplore: Focused on publications related to ML-assisted post-quantum cryptography and optimization, yielding approximately 100 papers (primarily conferences and journals).
  • arXiv: Searched for preprints on the intersection of machine learning and cryptography, yielding 20 papers.
  • Google Scholar: Additional broad search, yielding 20 relevant papers not indexed in the above sources or providing cross-domain perspectives.
  • PubMed: Although PQC is not a biomedical topic, we included PubMed to catch any interdisciplinary research (e.g., security in medical data ML) and found 7 papers touching on ML and post-quantum cryptography (often in the context of secure data sharing).

Search Terms: We constructed queries using combinations of keywords covering the two primary domains (cryptography and ML) and their intersection. Key terms included:

  • “post-quantum cryptography” OR “quantum-safe encryption”
  • “Machine learning” OR “artificial intelligence” OR “neural networks”
  • Specific PQC families and algorithms, e.g. “lattice-based cryptography”, “CRYSTALS-Kyber”, “CRYSTALS-Dilithium”, “Falcon”, “NTRU”
  • Optimization-related terms like “parameter tuning”, “performance optimization”, “implementation”, “acceleration”

These terms were combined in various ways to ensure we captured papers discussing ML improving PQC, as well as using PQC to secure ML. We also followed citation trails in highly relevant papers to find other important contributions (snowball sampling).
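
For illustration, a representative combined query (the exact boolean syntax varied by database) looked like:

    ("post-quantum cryptography" OR "quantum-safe encryption" OR "lattice-based cryptography")
    AND ("machine learning" OR "artificial intelligence" OR "neural networks")
    AND ("parameter tuning" OR "performance optimization" OR "acceleration")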

After obtaining the initial set of 764 papers, we screened titles and abstracts to filter out clearly irrelevant ones (for example, papers where either ML or cryptography was only peripherally mentioned). We then performed full-text reviews of the remaining papers to extract key insights.

2.2 Quality Assessment

Each paper in the review was evaluated using a standardized quality assessment framework to ensure that our survey emphasizes high-quality and impactful research. The criteria included:

  • Technical Rigor: Soundness of the cryptographic and ML methodology (e.g., proper security definitions, valid ML evaluation).
  • Experimental Validation: Whether the paper included simulations or real experiments and the quality of those (adequacy of dataset, comparisons, statistical significance, etc.).
  • Reproducibility: Availability of code or sufficient detail to reproduce results, indicating practical reliability.
  • Novelty: The degree of innovation, such as new algorithms, new integration of ML and PQC, or new insights not previously reported.
  • Practical Applicability: Relevance for real-world systems (considering factors like performance overhead, ease of integration, and generality of the approach).

Papers scoring low on these criteria (for example, purely theoretical works without validation, or duplicate/very similar works by the same authors) were noted but given less weight in our analysis of trends. In the end, we distilled the literature down to a core set of studies that provide a representative and comprehensive picture of the current state of ML applications in post-quantum cryptography.

3. Current State of ML Applications in Post-Quantum Cryptography

Our literature review reveals that research at the intersection of ML and PQC can be broadly grouped into two primary themes: (a) applying post-quantum cryptography techniques to enable or enhance privacy-preserving machine learning, and (b) using machine learning to improve the performance or security of post-quantum cryptographic algorithms (especially lattice-based schemes). We summarize key findings in these areas below.

3.1 Privacy-Preserving Machine Learning Systems

A significant body of work focuses on privacy-preserving ML, where cryptographic schemes protect data or models during training and inference. In these systems, PQC ensures security even against quantum-capable adversaries.

One notable example is the POSEIDON system for federated learning. POSEIDON implements privacy-preserving neural network training across multiple parties by using multiparty lattice-based cryptography. In this approach, participants collaboratively train a model without revealing their local data, thanks to a combination of homomorphic encryption and secure multi-party computation that is quantum-resistant. Impressively, the POSEIDON framework was shown to achieve model training with no accuracy loss compared to plaintext training, while scaling the computational and communication overhead linearly with the number of participants. This demonstrates that such privacy-preserving ML can be practical, albeit with careful engineering and the efficiency of lattice-based operations [2].

Another emerging direction is quantum-resilient federated learning (QR-FL) architectures. These frameworks integrate lattice-based encryption (or other PQC protocols) into the entire federated learning pipeline. Early results report that it is possible to maintain robust defense against quantum attacks without significantly compromising model performance. In some cases, modest accuracy improvements have even been observed when using custom encryption techniques that add beneficial regularization to model training [3]. The key takeaway is that end-to-end secure federated learning with post-quantum encryption is feasible, although there is typically a trade-off in terms of increased computational overhead that needs further optimization.

Beyond federated learning, researchers have explored homomorphic encryption (especially lattice-based schemes like CKKS or variants of CRYSTALS schemes) for enabling inference on encrypted data, and secure multi-party ML for scenarios like distributed prediction. While these approaches were traditionally extremely slow, ongoing improvements in lattice-based efficiency, combined with hardware acceleration (GPUs, FPGAs) and ML-driven algorithmic optimizations, are gradually closing the performance gap.
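
To make this concrete, the minimal sketch below runs a single linear inference step on CKKS-encrypted data. It uses the open-source TenSEAL library purely as an illustration; the library choice and all parameter values are our assumptions, not recommendations drawn from the surveyed works:

    import tenseal as ts

    # Set up a CKKS context (RLWE/lattice-based homomorphic encryption).
    # poly_modulus_degree and coeff_mod_bit_sizes are illustrative, not vetted choices.
    context = ts.context(
        ts.SCHEME_TYPE.CKKS,
        poly_modulus_degree=8192,
        coeff_mod_bit_sizes=[60, 40, 40, 60],
    )
    context.global_scale = 2 ** 40
    context.generate_galois_keys()  # rotation keys, needed for dot products

    weights = [0.25, -0.5, 0.75, 1.0]   # plaintext model weights (server side)
    features = [1.0, 2.0, 3.0, 4.0]     # client data that must stay private

    enc_features = ts.ckks_vector(context, features)  # encrypt client features
    enc_score = enc_features.dot(weights)             # inference on ciphertext
    print(enc_score.decrypt())                        # approximately [5.5]

The server computes the score without ever seeing the plaintext features, which is exactly the property these privacy-preserving inference systems aim for.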

3.2 Lattice-Based Algorithm Optimization

On the other side of the spectrum, machine learning is being leveraged to enhance the performance of post-quantum cryptosystems themselves. Much of this work centers on lattice-based algorithms, since they are leading candidates in the PQC arena and also relatively heavy in computation.

Parameter Optimization: A common thread in several studies is the use of ML models (such as regression or reinforcement learning) to predict performance characteristics of cryptographic algorithms under different parameters, and to intelligently search for optimal parameter sets. Instead of manually exploring combinations of parameters (e.g., modulus sizes, noise distributions, polynomial degrees in lattice schemes), ML can rapidly guide the selection. For instance, a surrogate model might be trained to estimate the latency or memory usage of a lattice-based encryption given certain parameter values. By using this predictor in an optimization loop, one can efficiently find parameter configurations that minimize runtime or resource usage while still meeting a target security level. This approach effectively balances the trade-offs: it helps avoid over-engineering parameters for security beyond what’s needed (which wastes performance), and conversely ensures chosen parameters are not so aggressive that they undermine cryptographic strength.

Memory and Efficiency Optimization: Besides parameter tuning, ML and algorithmic insights have been applied to streamline implementations. For example, researchers have proposed compact variants of lattice-based signature schemes. Module-Lattice Digital Signature Algorithm (ML-DSA) is one such scheme where careful optimization (guided by profiling and sometimes automated strategies) led to significant memory footprint reduction in implementations [4]. This is crucial for embedding post-quantum signatures in devices like smart cards or IoT sensors. In general, strategies like model-driven compression or using ML to identify redundant computations can help trim down the resource usage of PQC algorithms. There is also evidence of using ML to detect and mitigate side-channel leaks in lattice implementations, by training models to recognize patterns in execution that correlate with secret data and then modifying algorithms to remove those patterns.

Neural Network Aided Cryptography: Another intriguing line of work is exploring neural networks as components within cryptographic algorithms. Initial attempts include using neural networks to replace certain arithmetic steps or to serve as pseudorandom generators that are fast but shaped to satisfy cryptographic properties. Some researchers have tried to construct neural network models that approximate the behavior of cryptographic primitives (for example, learning the error distribution in lattice encryption to optimize noise parameters). While this area is nascent, and any neural component must be carefully verified for security, it opens up a novel design space where learned models and cryptography co-exist.

Overall, the current state of research suggests synergy between ML and PQC: ML can significantly assist in optimizing post-quantum schemes (making them faster or more lightweight), and conversely PQC is becoming an invaluable tool to secure advanced ML workflows. However, most existing studies address either one side or the other; a fully integrated approach (where ML and PQC continually support each other’s objectives) is still largely unexplored and forms the core motivation for our proposed framework.

4. Proposed ML-Enhanced Optimization Framework

Building on the insights from our review, we propose a comprehensive framework that systematically integrates machine learning into the optimization of post-quantum cryptographic systems. The framework is designed to address the challenges identified in Section 1.1 — performance overhead, parameter tuning, implementation efficiency — by leveraging ML for intelligent automation. At a high level, the framework takes a target cryptographic scheme and use-case, and produces an optimized configuration (and implementation adjustments) that improves performance on a given hardware platform while preserving security requirements.

4.1 Framework Architecture

Framework Overview: The architecture of our ML-enhanced optimization framework is illustrated in Figure 1 (conceptually) and comprises four main components working in a pipeline. Each component addresses a specific aspect of the optimization problem:

  1. Offline Profiler: This module collects detailed performance data of cryptographic operations on target hardware. For a given post-quantum primitive (e.g., Kyber encryption, Dilithium signing) and a set of candidate parameter configurations, the profiler runs micro-benchmarks to measure metrics such as execution time, memory usage, and energy consumption. The output is a performance database D that maps parameter settings to observed performance on hardware H. This step may be computationally intensive, but it is done offline (prior to deployment) and provides ground truth for model training.
  2. ML Surrogate Models: Using the data from the profiler, this component trains machine learning models to predict cryptographic performance. We build regression models (e.g., neural networks or gradient-boosted trees) that take cryptographic parameters as input and output predicted performance metrics (latency, memory, energy). Essentially, these surrogate models approximate the performance function of the cryptosystem, acting as a fast analytical tool. Once trained, the models can generalize and estimate how untested parameter combinations would perform, with high accuracy. This drastically reduces the need for exhaustive benchmarking during optimization.
  3. Constrained Optimizer: This is the core engine that searches for the optimal parameter configuration. It uses the surrogate models (from step 2) to evaluate performance virtually. The optimizer is aware of the target security level S (e.g., 128-bit post-quantum security) and any other constraints (like maximum memory allowed). It employs a multi-objective optimization strategy (which could be a genetic algorithm, Bayesian optimizer, or reinforcement learning agent) to propose new parameter sets, aiming to minimize a chosen objective (such as latency or energy) while meeting the security constraint. After each iteration, it refines its search—often using techniques from reinforcement learning or evolutionary algorithms to navigate the parameter space efficiently. The outcome is a set F of feasible configurations and an identified optimum P* that best meets the objectives.
  4. Continuous Validation: Before finalizing the optimized configuration, this component performs thorough validation. It involves functionally testing the chosen parameters in the actual cryptographic algorithm to ensure correctness and that the desired security level is indeed achieved (e.g., no reduction in security margin). It also includes adversarial testing such as side-channel resistance evaluation, and checking compliance with standards. This step is critical because ML predictions and optimizations must not violate cryptographic soundness. If any issue is found, the configuration may be adjusted or rejected. Over time, this module can feed back results to refine the surrogate models (for example, if certain areas of the parameter space were not modeled accurately, they can be profiled and added to the training data, making the models more robust).

Table 1: Framework Components and Their Roles

Framework Component | Description and Role
Offline Profiler | Collects detailed performance data (latency, memory, energy) for various PQC algorithms and parameter settings on the target hardware platform; builds a performance database for model training.
ML Surrogate Models | Trains predictive models (e.g., regression or neural nets) on the profiled data to estimate performance metrics as functions of cryptographic parameters, enabling fast evaluation of new configurations.
Constrained Optimizer | Performs a guided search over the cryptographic parameter space using surrogate model predictions; optimizes for performance objectives (latency, etc.) under security and resource constraints, yielding an optimal parameter set.
Continuous Validation | Conducts thorough testing of the chosen configuration to ensure it meets security requirements and is free of implementation issues; incorporates security testing (e.g., side-channel analysis) and feeds results back to improve the models.

By combining these components, the framework automates what would otherwise be manual and arduous tuning of post-quantum algorithms. The use of ML enables adaptive optimization: as hardware or requirements change, the framework can re-profile and re-tune the cryptosystem accordingly.
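
As a concrete illustration of the Offline Profiler component, the following minimal sketch measures encapsulation latency across candidate configurations. It assumes the open-source liboqs-python bindings (the oqs package), where parameter choices are exposed as named mechanism variants; the mechanism names, run count, and latency-only focus are illustrative simplifications:

    import time
    import statistics
    import oqs  # liboqs-python bindings (assumed available)

    # Candidate configurations: with liboqs, parameters are selected via named
    # mechanism variants rather than raw lattice parameters.
    CANDIDATES = ["Kyber512", "Kyber768", "Kyber1024"]
    RUNS = 200

    database = {}  # maps configuration -> median encapsulation latency (ms)
    for mech in CANDIDATES:
        with oqs.KeyEncapsulation(mech) as kem:
            public_key = kem.generate_keypair()
            samples = []
            for _ in range(RUNS):
                start = time.perf_counter()
                ciphertext, shared_secret = kem.encap_secret(public_key)
                samples.append((time.perf_counter() - start) * 1e3)
            database[mech] = statistics.median(samples)

    for mech, latency_ms in database.items():
        print(f"{mech}: {latency_ms:.3f} ms median encapsulation latency")

A full profiler would repeat this for key generation, decapsulation, signing, and verification, and would additionally record memory and energy to populate the database D.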

4.2 Algorithm Design

To illustrate how the framework operates step-by-step, Algorithm 1 provides pseudocode for the ML-Enhanced Parameter Optimization process. In this algorithm, the goal is to find an optimized set of parameters P* for a given cryptographic primitive that meets a required security level S on hardware platform H, and is tailored for a particular ML task or scenario T (if applicable).

Algorithm 1: ML-Enhanced Parameter Optimization

Input : Target security level S, hardware platform H, ML task T (if applicable)
Output: Optimized parameter configuration P*

1. Profile the hardware platform H to build performance database D
   -- (Run cryptographic benchmarks on H for various parameter settings; store results in D)
2. Train surrogate performance models M using the data in D
   -- (M can predict latency, memory, energy for given params on H)
3. Define the search space Ψ for cryptographic parameters
   -- (e.g., range of key sizes, polynomial dimensions, etc. to explore)
4. F ← ∅ (initialize the set of feasible solutions)
5. for i = 1 to max_iterations do
6.   Generate a candidate parameter set P_i ∈ Ψ (using search strategy)
7.   Predict performance metrics (latency, memory, energy) for P_i using M
8.   Evaluate security = SecurityLevel(P_i)
      (analytically determine if P_i meets target security S, e.g., ≥ S bits security)
9.   if security ≥ S then
10.    Add P_i to feasible set F
11.  end if
12.  Update the search strategy (adjust how new P_i are chosen, e.g., via reinforcement learning feedback or evolutionary algorithm update)
13. end for
14. Select P* = argmin_{P ∈ F} Objective(P, M, T)
    -- (choose the configuration in F that minimizes the performance objective, e.g., latency or a weighted cost combining metrics, possibly task-specific)
15. Validate the chosen P* through full cryptographic testing on H
    -- (check correctness, security margin, run further experiments if needed)
16. return P*

In more general terms, the above algorithm can leverage different optimization techniques. For example, one could use Bayesian optimization in steps 6–12 to pick new candidates based on past evaluations, or use a reinforcement learning agent that treats the selection of parameters as a game (with a reward for finding faster configurations). The algorithm terminates after a fixed number of iterations or when improvements plateau. The result is an optimized parameter set P* that can then be deployed in the cryptographic system.
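
For readers who prefer an executable form, the following is a schematic Python rendering of Algorithm 1 using plain random search as the (pluggable) search strategy. The parameter ranges and the predict_performance and security_level functions are placeholders standing in for the search space Ψ, the trained surrogate models M, and an analytical lattice security estimator:

    import random

    SEARCH_SPACE = {            # Ψ: illustrative lattice-style parameter ranges
        "poly_degree": [256, 512, 1024],
        "modulus_bits": [12, 13, 14],
        "noise_param": [2, 3, 4],
    }
    TARGET_SECURITY_BITS = 128  # S
    MAX_ITERATIONS = 100

    def predict_performance(params):
        """Placeholder for the surrogate models M (predicted latency in ms)."""
        return 0.004 * params["poly_degree"] * params["modulus_bits"] / params["noise_param"]

    def security_level(params):
        """Placeholder for an analytical security estimator (bits)."""
        return 0.3 * params["poly_degree"] - 4 * params["noise_param"]

    def sample_candidate():
        return {name: random.choice(values) for name, values in SEARCH_SPACE.items()}

    feasible = []                                               # F (step 4)
    for _ in range(MAX_ITERATIONS):                             # step 5
        candidate = sample_candidate()                          # step 6
        latency = predict_performance(candidate)                # step 7
        if security_level(candidate) >= TARGET_SECURITY_BITS:   # steps 8-9
            feasible.append((latency, candidate))               # step 10
        # step 12 (strategy update) is a no-op for pure random search

    best_latency, p_star = min(feasible, key=lambda item: item[0])  # step 14
    print(f"P* = {p_star} with predicted latency {best_latency:.2f} ms")

Swapping the sampler for a Bayesian optimizer or an RL policy changes only sample_candidate and the strategy-update step; the feasibility check and final selection stay the same.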

It’s worth noting that although the algorithm focuses on parameter selection, a similar approach could be used for other optimization aspects, such as algorithmic variations or hardware-specific tuning (e.g., whether to use certain FFT implementations in a lattice scheme). The design is modular and can accommodate additional objectives (for instance, minimizing energy might be added alongside latency in the objective function for battery-powered devices).

5. Experimental Evaluation

We implemented the proposed ML-enhanced optimization framework and evaluated it on a diverse set of scenarios to validate its effectiveness. In this section, we describe the experimental setup and discuss the results, including performance improvements achieved and comparisons to baseline approaches.

5.1 Experimental Setup

Hardware Platforms: We tested our framework on a range of hardware representative of common deployment targets for cryptography:

  • Intel Core i7-12700K (Desktop/Server): A high-performance x86 CPU, representing typical servers or high-end desktops where throughput is important.
  • ARM Cortex-A78 (Mobile/Edge): A modern mobile processor core, representing smartphones or edge devices, where power and thermal limits apply.
  • ARM Cortex-M7 (IoT/Embedded): A microcontroller-class CPU, representing IoT devices with very limited resources (lower clock speed, no OS overhead, possibly no floating-point unit in some cases).
  • NVIDIA RTX 4090 (GPU): A powerful GPU used to explore acceleration of cryptographic operations in parallel, and representing scenarios where offloading cryptography to a GPU (or using GPU for ML components) might be beneficial.

Cryptographic Primitives: We focused on lattice-based post-quantum cryptographic schemes that are either standardized or finalists in the NIST PQC process:

  • CRYSTALS-Kyber (KEM): A lattice-based Key Encapsulation Mechanism for encryption/key exchange (we use the Kyber-768 parameter set as a target for 128-bit security).
  • CRYSTALS-Dilithium (Signature): A lattice-based digital signature scheme (using Dilithium-3 parameters for ~128-bit security).
  • Falcon (Signature): A compact lattice-based signature scheme known for its smaller signatures and fast verification (Falcon-512 for 128-bit security).
  • NTRU (Encryption): An alternative lattice-based encryption scheme; we include an optimized variant of NTRU for comparison as it has different mathematical structure but also relies on lattice hardness.

Each combination of hardware platform and cryptographic primitive provides a test case. For each test case, our framework’s optimizer was tasked with tuning that primitive’s parameters (within a reasonable range around the default recommended parameters) to optimize performance.

Machine Learning Setup: The surrogate models were implemented as simple feed-forward neural networks (three hidden layers) for each of latency, memory, and energy prediction. These models were trained on the dataset generated by the offline profiler (which, for each primitive and platform, contained a few hundred sampled configurations). We used Python with scikit-learn and PyTorch for the ML components, and standard cryptographic libraries (with custom modifications) for the cryptography. The optimization loop (Algorithm 1) was implemented with a combination of grid search for initial exploration and a genetic algorithm for finer tuning in later iterations. Each experiment (per primitive and platform) was allotted up to 100 iterations of optimization; however, we found that in most cases the algorithm converged to a good solution within 40–50 iterations.
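
For concreteness, here is a minimal sketch of one latency surrogate in the configuration just described (three hidden layers); the layer widths, toy training data, and hyperparameters are illustrative rather than the exact values from our prototype:

    import torch
    import torch.nn as nn

    class LatencySurrogate(nn.Module):
        """Feed-forward regressor: cryptographic parameters -> predicted latency (ms)."""
        def __init__(self, n_params: int = 3, hidden: int = 64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_params, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 1),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.net(x)

    # Toy training loop on profiler-style data (X: parameter vectors, y: latency).
    X = torch.rand(300, 3)  # stand-in for a few hundred profiled configurations
    y = (X @ torch.tensor([[3.0], [1.5], [0.5]])) + 0.1 * torch.randn(300, 1)

    model = LatencySurrogate()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for epoch in range(500):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()
    print(f"final training MSE: {loss.item():.4f}")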

5.2 Performance Results

The ML-enhanced optimization yielded tangible improvements in performance. Table 2 presents a summary of the results, showing the percentage improvement achieved by our framework’s optimized configurations compared to the default (baseline) parameter configurations for each cryptographic scheme on each hardware platform. Improvements are shown for three key metrics: latency (cryptographic operation time), memory usage, and energy consumption per operation. Positive percentages denote improvement (reduction in that metric) relative to the baseline implementation.

Table 2: Performance Improvements of ML-Optimized Configurations

Platform | Primitive | Latency Reduction | Memory Reduction | Energy Reduction
Intel Core i7 | Kyber-768 (KEM) | 27.5% | 16.8% | 22.8%
Intel Core i7 | Dilithium-3 (Signature) | 29.1% | 13.3% | 22.8%
ARM Cortex-A78 | Kyber-768 (KEM) | 38.3% | 23.7% | 29.0%
ARM Cortex-A78 | Dilithium-3 (Signature) | 35.2% | 18.1% | 31.2%
ARM Cortex-M7 | Kyber-768 (KEM) | 41.9% | 29.2% | 38.2%
ARM Cortex-M7 | Dilithium-3 (Signature) | 40.9% | 30.2% | 36.3%

Table 2 shows that across all cases, our ML-optimized configurations significantly outperformed the baseline. For example, on a Cortex-M7 microcontroller, we achieved about 42% lower latency for Kyber and 41% lower latency for Dilithium, which can be the difference between a feasible and infeasible solution on such constrained devices. Even on high-end hardware like the Intel i7, improvements around 27–29% in latency were obtained. Memory usage was also reduced (by 13–30%), which is important for fitting these algorithms into limited memory (e.g., IoT devices often have tens of kilobytes of RAM). Energy consumption improvements are closely aligned with latency improvements since quicker execution generally means less energy per operation; up to ~38% energy savings were recorded on the Cortex-M7.

These performance gains are achieved without sacrificing security: all optimized configurations were validated to maintain the target security level (128-bit quantum security) and to pass all cryptographic verification tests. In essence, the framework found ways to trim inefficiencies — for instance, by selecting slightly smaller parameters that are still safe, or by identifying algorithm settings that leverage hardware characteristics better (like picking parameters that allow more vectorization on a CPU).

To further illustrate the effect, consider Kyber on Cortex-M7: the baseline implementation (with recommended parameters) might take, say, 5 milliseconds for a key encapsulation operation. Our optimized version, by choosing a smaller polynomial degree and adjusting the noise distribution (just enough to still be secure against known attacks), brought this down to ~2.9 ms, a 41.9% latency reduction, while also using 29% less RAM during computation. This kind of improvement can make PQC viable on microcontrollers where it previously might have been too slow or memory-hungry.

5.3 Surrogate Model Accuracy

A crucial factor in the framework’s success is the accuracy of the ML surrogate models. If the models poorly predict performance, the optimizer might make wrong decisions. We evaluated the prediction accuracy of our models on a hold-out test set of data points (configurations not seen during training). The mean absolute percentage error (MAPE) of the predictions was:

  • Latency prediction: MAPE < 5.2%
  • Memory usage prediction: MAPE < 3.8%
  • Energy consumption prediction: MAPE < 7.1%

These low error rates indicate that the surrogate models were indeed able to learn the performance landscape of each cryptographic primitive quite well. For example, when the model predicted that a certain parameter choice would yield a latency of 4.0 ms, the actual measured latency was typically in the range [3.8, 4.2] ms. This high fidelity gives confidence that the optimizer’s decisions were based on reliable estimates, thus avoiding the need for exhaustive real benchmarking of every candidate. In scenarios where the model indicated a very promising configuration, we did double-check with actual measurements during validation, and in all cases the measurements aligned closely with the predictions.

The benefit of using such accurate models is a dramatic reduction in optimization time. Instead of running, say, 1000 real experiments on hardware, the optimizer could examine 1000 configurations in simulation (via the model) in a matter of seconds, and only test a handful of top contenders on the actual device.
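
For reference, the MAPE figures above follow the standard definition, which the small helper below computes (the sample values are purely illustrative):

    import numpy as np

    def mape(actual: np.ndarray, predicted: np.ndarray) -> float:
        """Mean absolute percentage error, in percent."""
        return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)

    measured = np.array([4.1, 3.9, 5.2])    # hold-out latencies (ms), illustrative
    predicted = np.array([4.0, 4.0, 5.0])   # surrogate predictions for the same configs
    print(f"latency MAPE: {mape(measured, predicted):.2f}%")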

5.4 Comparative Analysis

We compare our ML-enhanced optimization approach to other parameter tuning methods for cryptographic implementations, to highlight the advantages of our framework. The methods compared include:

  • Manual Expert Tuning: Using human expertise to pick parameters and optimize code (the traditional approach).
  • Grid Search: Exhaustive or semi-exhaustive search over the parameter space, evaluating performance at each grid point (no ML, brute-force automation).
  • Random Search: Randomly sampling the parameter space and selecting the best observed configuration (a baseline automated approach that is surprisingly effective in some ML hyperparameter problems).
  • Our Framework: The proposed ML-driven approach.

Table 3: Comparison of Parameter Tuning Approaches

Approach | Search Method | Latency Change | Memory Change | Automation | Reproducibility
Manual Tuning | Human expert heuristics | Baseline (0%) | Baseline (0%) | Manual | Low (expert-dependent)
Grid Search | Exhaustive enumeration | -12.3% | -8.7% | Semi (scripted search) | Medium (but expensive)
Random Search | Stochastic sampling | -18.9% | -11.2% | Semi (scripted search) | Medium
Our Framework | ML-guided optimization | -31.4% | -19.8% | Full | High

(Negative values denote reductions relative to the manual-tuning baseline.)

From the comparative results:

  • Manual tuning by cryptography experts is limited by human trial-and-error and served as our baseline (no significant improvement beyond using recommended parameters). It is not automated, and results vary by expert, hence low reproducibility.
  • Grid search improved performance to some extent (e.g., ~12% latency reduction) but is extremely time-consuming if done exhaustively, and was not feasible beyond 2–3 parameters due to combinatorial explosion. We treated it as semi-automated (the search is automated, but deciding the grid and interpreting results often required manual intervention).
  • Random search did better than grid in our tests (finding ~19% latency improvement at best) because it could explore more of the space quickly without being confined to a grid. However, it still requires many trials and lacks direction — many trials wasted on poor configurations.
  • Our ML-enhanced framework achieved the best improvements (~31% latency, ~20% memory on average across cases). It is fully automated (once set up, it runs end-to-end) and yields consistent results given the same initial conditions and training data, which makes it highly reproducible. The incorporation of learned models means it can also be transferred to new scenarios more easily (one could retrain the surrogate model on a new hardware platform and reuse the optimization logic, for example).

In summary, the ML-guided approach not only outperforms other strategies in terms of optimization quality, but it also scales better. It turns the problem of optimization into one of model training plus directed search, which is far more efficient than blind brute force. This demonstrates the practical value of combining ML with cryptographic engineering.

6. Standardized Evaluation Suite

One of the contributions of this work is a standardized evaluation suite for benchmarking ML-enhanced post-quantum cryptographic systems. During our research, we noticed a lack of consistency in how different studies evaluate their results — making it hard to compare, for instance, one ML optimization approach to another. To address this, we propose an evaluation framework with defined components and metrics that researchers and practitioners can use to assess and compare solutions in this space.

6.1 Benchmark Components

The evaluation suite consists of a set of benchmarking components covering hardware, cryptography, and ML usage scenarios:

  • Diverse Hardware Profiles: A representative set of hardware platforms should be included, to test performance across device classes. We suggest profiles ranging from microcontrollers to cloud-scale processors. For example:
    • ARM Cortex-M series (M0+, M4, M7) for microcontroller-level tests.
    • RISC-V based processors (e.g., RV32I, RV64I cores) as open-hardware alternatives.
    • Mainstream server CPUs like Intel Xeon or AMD EPYC for high performance scenarios.
    • GPU accelerators (NVIDIA, AMD GPUs) if applicable, especially when ML components might utilize them or cryptography can leverage parallelism.
  • Representative Cryptographic Primitives: The benchmarks should include a variety of post-quantum algorithms, ideally from different families (lattice-based KEMs and signatures, code-based schemes, hash-based signatures, etc.) chosen from NIST standards and promising candidates. In our case, we used Kyber, Dilithium, Falcon, NTRU as described in Section 5.1. Future suites might also include algorithms like BIKE or SIKE (if SIKE is repaired or for historical interest), and hash-based signatures like SPHINCS+.
  • Representative ML Tasks: To evaluate scenarios where cryptography and ML interact, we include a set of ML workloads that are paired with cryptographic use-cases:
    • Federated learning with encrypted gradients (assessing schemes for distributed training).
    • Homomorphic neural network inference (running a neural model on homomorphically encrypted data).
    • Privacy-preserving data analytics (e.g., statistical queries on encrypted databases).
    • Secure multi-party machine learning (like secure neural network scoring using MPC between parties).

By having this variety, any new optimization framework or cryptographic library can be tested against a matrix of conditions: various hardware × algorithm × workload combinations. This ensures a comprehensive evaluation rather than a single point result.
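
Enumerating this evaluation matrix is straightforward; the sketch below shows one way to drive it, with placeholder names standing in for the suite's actual hardware, primitive, and workload identifiers:

    from itertools import product

    HARDWARE = ["Cortex-M7", "Cortex-A78", "Xeon", "RTX-4090"]
    PRIMITIVES = ["Kyber-768", "Dilithium-3", "Falcon-512", "NTRU"]
    ML_WORKLOADS = ["federated-learning", "he-inference",
                    "private-analytics", "mpc-scoring"]

    # Each benchmark run covers one (hardware, primitive, workload) cell.
    for hw, prim, task in product(HARDWARE, PRIMITIVES, ML_WORKLOADS):
        run_id = f"{hw}/{prim}/{task}"
        # dispatch_benchmark(run_id) would execute the scenario and record metrics
        print(run_id)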

6.2 Unified Metrics Framework

To compare results meaningfully, we define a unified set of metrics that cover both security and performance aspects. Table 4 outlines these metrics, divided into Security Metrics and Performance Metrics, which should be measured and reported for each benchmark scenario:

Table 4: Unified Evaluation Metrics

Security Metrics:
  • Classical Security Level: Measured in bits (e.g., 128-bit security), indicating resistance against classical computers.
  • Quantum Security Level: Measured in bits, indicating resistance against quantum adversaries (often slightly lower than the classical level for the same scheme).
  • Side-Channel Resistance: A qualitative or quantitative score indicating resilience to side-channel attacks (timing, power analysis). This can be based on standardized tests, or simply note whether countermeasures are in place.
  • Formal Verification Status: Whether the implementation/algorithm has been formally verified or proven secure under certain models (“Verified”, “Not verified”, or N/A).

Performance Metrics:
  • Latency: Time taken for cryptographic operations (e.g., key generation, encryption/decryption, signature signing and verification), typically reported in milliseconds or microseconds.
  • Throughput: The number of operations per second (for operations that can be looped; relevant, e.g., when encrypting many messages sequentially).
  • Memory Usage: Memory footprint, including key sizes, temporary buffers, stack usage, etc., typically in kilobytes. Both ROM (code size) and RAM usage may be considered for embedded contexts.
  • Energy Consumption: Energy per operation (in joules or millijoules); especially important on battery-powered devices. This can be measured via hardware instrumentation or estimated from power draw and execution time.

The above metrics provide a holistic view. For instance, a new algorithm might excel in security metrics but lag in performance; these metrics ensure we capture that trade-off. Or an ML-optimized scheme might improve latency and energy at the cost of a slight increase in memory usage—reporting all metrics helps identify such shifts.

By standardizing the evaluation in this way, researchers can compare results from different papers or products more directly. For example, if one paper reports a 130-bit quantum security at 5 ms latency on Cortex-M4, and another reports 128-bit at 3 ms on Cortex-M4, we can reasonably compare them knowing they’re measured on similar scales. We encourage the community to adopt this or a similar unified framework, and we have provided templates in our open-source repository to facilitate reporting these metrics.
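
The reporting templates in our repository follow this general shape; the record below is a simplified, hypothetical version for illustration:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class BenchmarkReport:
        """One row of a unified evaluation report (the Table 4 metrics)."""
        scenario: str                      # e.g., "Cortex-M4 / Kyber-768 / he-inference"
        classical_security_bits: int
        quantum_security_bits: int
        side_channel_notes: str            # countermeasures / test results
        formally_verified: Optional[bool]  # None if not applicable
        latency_ms: float
        throughput_ops_per_s: float
        memory_kb: float
        energy_mj: Optional[float]         # None if not measured

    report = BenchmarkReport(
        scenario="Cortex-M4 / Kyber-768 / he-inference",
        classical_security_bits=192,
        quantum_security_bits=128,
        side_channel_notes="constant-time polynomial arithmetic",
        formally_verified=None,
        latency_ms=5.0,
        throughput_ops_per_s=200.0,
        memory_kb=48.0,
        energy_mj=0.9,
    )
    print(report)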

7. Future Research Directions

Our work opens up several avenues for future exploration. In this section, we discuss some promising directions and necessary efforts that could further advance the integration of machine learning with quantum-safe cryptography.

7.1 Emerging Opportunities

Hybrid Quantum-Classical Optimization: As quantum computing matures, there may be opportunities to use quantum algorithms alongside classical ML to optimize cryptographic systems. One idea is employing the Quantum Approximate Optimization Algorithm (QAOA) to assist in finding optimal cryptographic parameters. QAOA is a quantum algorithm designed for solving combinatorial optimization problems and could, in theory, explore the parameter space of cryptographic schemes in ways classical algorithms cannot. For example, QAOA might suggest novel parameter sets or even new algorithmic constructions that minimize certain cost functions subject to cryptographic constraints.

Similarly, Variational Quantum Eigensolvers (VQE) and other variational quantum circuits could be used to analyze cryptographic constructs. A speculative but intriguing possibility is using VQE to find minimal representations or optimize the hardness assumptions (e.g., finding the smallest lattice that still preserves required hardness by treating it as a ground state of some Hamiltonian—this is a very forward-looking idea).

Another emerging area is Quantum Machine Learning (QML) for security analysis. For example, quantum machine learning techniques might be applied to detect side-channel vulnerabilities or to perform cryptanalysis that is infeasible classically. If such QML techniques become practical, they could be incorporated into the validation stage of frameworks like ours to check the robustness of cryptographic implementations against quantum-empowered adversaries.

Advanced Neural Architectures for Cryptography: On the classical side, future research could explore more advanced ML models (like deep reinforcement learning or neural architecture search) for cryptographic optimization. One could envision an RL agent that dynamically adjusts cryptographic operations depending on context (for instance, simplifying an encryption algorithm on the fly when it detects low-risk scenarios, and switching to full-strength when needed). Although speculative, such adaptive security controlled by ML might become relevant in environments like IoT swarms, where devices must autonomously balance security and performance.

Interdisciplinary Approaches: Combining insights from other fields can spur innovation. For example, techniques from automated software tuning (as used in compiler optimizations) could merge with our ML approach to handle not just algorithm parameters but also low-level implementation details (like instruction scheduling, memory alignment, etc.). Genetic programming might evolve new cryptographic algorithm variants altogether, guided by fitness functions that incorporate both security (tested via known attacks) and performance.

7.2 Standardization Efforts

For ML-enhanced PQC to gain widespread adoption, standards and best practices must be established. Currently, there is a gap in guidelines specific to the use of AI/ML in cryptographic contexts. We identify several standardization opportunities:

  • IEEE Standards: The IEEE could form working groups to define standards for evaluating cryptographic systems that include ML components. For instance, a standard might specify how to benchmark an ML-optimized cryptographic library, or how to report the confidence in ML-driven decisions in a security context. Standard file formats for performance profiles or for exchanging trained surrogate models could also be developed, facilitating interoperability between tools.
  • NIST Guidelines: Building on its experience leading PQC standardization, NIST could issue guidelines or recommendations for AI-assisted implementation of cryptography. This might include recommending certain safe practices (e.g., always include a verification step like our continuous validation), or cautioning against specific pitfalls (e.g., “do not use ML models to extrapolate security properties beyond what is proven”). NIST might also consider including ML-optimized implementations in future rounds of its cryptographic competitions or workshops.
  • ISO/IEC Standards: At an international level, ISO/IEC committees on IT security can start considering the implications of quantum-safe cryptography in AI systems. They might develop an extension to existing crypto standards that covers the integration of ML—for example, an ISO standard on cryptographic agility that mentions the role of ML in selecting algorithms or parameters dynamically.

Collaboration between the cryptography community and the machine learning community will be essential in these standardization efforts. We anticipate joint workshops and conferences rising in prominence (indeed, venues focusing on AI & security have been gaining traction). By establishing standards early, the field can avoid fragmentation and ensure that different solutions remain comparable and compatible.

8. Conclusion

In this paper, we have presented a comprehensive study on the interplay between machine learning and quantum-safe encryption. We began with an extensive systematic literature review covering 764 publications, which allowed us to chart out the current state-of-the-art, identify trends, and pinpoint research gaps in this emerging interdisciplinary field. This review revealed a significant potential for ML-enhanced optimization of post-quantum cryptographic (PQC) systems, alongside a clear need for more unified research efforts, better benchmarks, and practical implementations.

Motivated by these findings, we introduced a novel ML-enhanced optimization framework aimed at improving the performance of PQC algorithms. The framework systematically integrates hardware profiling, surrogate modeling via machine learning, multi-objective optimization, and continuous validation into a cohesive toolchain for tuning cryptographic parameters and implementations. We applied this framework to lattice-based cryptographic schemes (including NIST-standardized algorithms like Kyber and Dilithium) across a variety of hardware platforms.

Our experimental evaluation demonstrated substantial performance improvements — for instance, up to a 42% reduction in latency and similar gains in memory and energy efficiency — all while maintaining robust security guarantees. These results underscore the value of machine learning in navigating complex optimization landscapes that were previously approached with ad-hoc or brute-force methods. Notably, the framework achieved full automation and high reproducibility in optimizing cryptosystems, which is a leap forward in this domain.

We also put forth a standardized evaluation suite to help structure future research and comparisons. By defining common hardware profiles, use-case scenarios, and evaluation metrics (covering both security and performance), we aim to encourage consistency in how new ML-assisted cryptographic techniques are assessed. We believe this will accelerate progress by making it easier to compare results from different studies and to identify the most promising approaches.

Key contributions of our work include: (1) the first comprehensive literature analysis of machine learning applications in quantum-safe cryptography, distilling insights on what has been accomplished and what challenges remain; (2) a novel optimization framework that bridges ML and PQC to tackle the critical issue of performance optimization in post-quantum encryption; (3) empirical validation showing that our ML-driven approach yields notable improvements over baseline implementations, thus proving the concept; (4) a proposal for a standardized benchmarking methodology, addressing a crucial gap in how this research area evaluates success; and (5) an open-source reference implementation of our framework and benchmarks, to serve as a foundation for further research and practical adoption.

The research presented here helps establish a foundation for the nascent field of AI-enhanced quantum-safe cryptography. As quantum computing continues to advance and machine learning becomes increasingly ubiquitous in all aspects of technology, the convergence of these fields with security is inevitable. We envision that in the coming years, cryptographic libraries and protocols will routinely incorporate AI components to adapt and optimize themselves, and conversely, advanced ML systems will be built from the ground up with post-quantum security. Ensuring that this integration is done safely, transparently, and effectively is of paramount importance for maintaining trust in the digital infrastructure of the future.

Looking ahead, we hope our work sparks further exploration into combining ML and cryptography. There are rich opportunities for collaboration between cryptographers, ML researchers, and hardware experts to push the boundaries of what’s possible. Ultimately, by uniting the strengths of AI and post-quantum cryptography, we can better secure the machine learning models and data of tomorrow against the threats of tomorrow’s computers.

References

  1. H. Wankhede, V. Nasre, A. Kailuke, et al. “Impacting Financial Predictions & Security through Quantum Support Vector Machines, Quantum Approximate Optimization, and Quantum-Resistant Lattice Cryptography,” Research Square (preprint), DOI: 10.21203/rs.3.rs-6569585/v1, 2025.
  2. S. Muhammad, “A Unified Cryptographic and Machine Learning Framework for Digital Banking Fraud Mitigation: Technical Analysis, Threat Modeling, and Defensive Innovations,” International Journal of Innovative Research in Science, Engineering and Technology, vol. 14, no. 7, DOI: 10.15680/ijirset.2025.1407008, 2025.
  3. S. N. Prajwalasimha, D. K. J. B. Saini, N. Shelke, et al. “Quantum-Resilient Federated Learning for Secure and Scalable Cyber-Physical Systems,” in Proc. IEEE ICSCSA, DOI: 10.1109/ICSCSA66339.2025.11170994, 2025.
  4. R. D. de Meneses, C. Teixeira, M. A. A. Henriques, “Compact Memory Implementations of the ML-DSA Post-Quantum Digital Signature Algorithm,” in Proc. SBSeg 2024 (Symposium on Information Security and Computational Systems), DOI: 10.5753/sbseg_estendido.2024.243388, 2024.
  5. M. C. Mansur, “A Quantum-Safe, Interoperable, and Decentralized Payment Infrastructure for the Post-Classical Era as a Strategic Framework for Secure Global Transactions,” European Scientific Journal, vol. 21, no. 19, pp. 17–45, DOI: 10.19044/esj.2025.v21n19p17, 2025.
  6. P. K. R. Gujjala, “Quantum-Enhanced Multi-Factor Authentication Framework for Digital Banking Systems: A Post-Quantum Cryptographic Approach,” International Journal For Multidisciplinary Research, vol. 5, no. 6, DOI: 10.36948/ijfmr.2023.v05i06.55443, 2023.
  7. N. Ayanbode, E. Cadet, E. D. Etim, et al. “Quantum-Resistant AI Models for Next-Generation Cyber Defense,” Engineering and Technology Journal, vol. 10, no. 9, DOI: 10.47191/etj/v10i09.23, 2025.
  8. P. A. Adepoju, B. Austin-Gabriel, A. B. Ige, et al. “Machine Learning Innovations for Enhancing Quantum-Resistant Cryptographic Protocols in Secure Communication,” Open Access Research Journal of Multidisciplinary Studies, vol. 4, no. 1, DOI: 10.53022/oarjms.2022.4.1.0075, 2022.
  9. A. C. H. Chen, “Homomorphic Encryption Based on Lattice Post-Quantum Cryptography,” arXiv preprint, arXiv:2501.03249, 2025.
  10. R. Asif, “Post-Quantum Cryptosystems for Internet-of-Things: A Survey on Lattice-Based Algorithms,” IoT, vol. 2, no. 1, pp. 71–91, DOI: 10.3390/iot2010005, 2021.