Paper 2023/019

Autoencoder-enabled Model Portability for Reducing Hyperparameter Tuning Efforts in Side-channel Analysis

Marina Krček, Delft University of Technology
Guilherme Perin, Leiden University
Abstract

Hyperparameter tuning represents one of the main challenges in deep learning-based profiling side-channel analysis. For each new side-channel dataset, the typical procedure is to find a profiling model by applying hyperparameter tuning from scratch. The main reason is that side-channel measurements from various targets contain different underlying leakage distributions. Consequently, the same profiling model hyperparameters are usually not equally efficient for other targets. This paper considers autoencoders for dimensionality reduction to verify whether encoded datasets from different targets enable the portability of profiling models and architectures. Successful portability reduces the hyperparameter tuning effort, as profiling model tuning is eliminated for the new dataset, and tuning autoencoders is simpler. We first search for the best autoencoder for each dataset and the best profiling model when the encoded dataset becomes the training set. Our results show no significant difference in tuning effort between original and encoded traces, meaning that the encoded data reliably represents the original data. Next, we verify how portable the best profiling model is across different datasets. Our results show that tuning autoencoders enables and improves portability while reducing the effort in hyperparameter search for profiling models. Lastly, we present a transfer learning case where dimensionality reduction might be necessary if the model is tuned for a dataset with fewer features than the new dataset. In this case, tuning of the profiling model is eliminated and training time is reduced.
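
The sketch below illustrates the general idea of the abstract: an autoencoder compresses traces to a fixed latent size so that a profiling model tuned for that latent size can be reused on another dataset. This is a minimal illustration, not the authors' code; the trace length, latent size, layer widths, and training settings are illustrative assumptions.

    # Minimal sketch (assumed parameters): MLP autoencoder for trace compression.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, Model

    trace_len, latent_dim = 700, 100                       # assumed feature counts
    X = np.random.rand(1000, trace_len).astype("float32")  # placeholder traces

    inp = layers.Input(shape=(trace_len,))
    h = layers.Dense(400, activation="elu")(inp)
    code = layers.Dense(latent_dim, activation="elu", name="code")(h)
    h = layers.Dense(400, activation="elu")(code)
    out = layers.Dense(trace_len, activation="linear")(h)

    autoencoder = Model(inp, out)   # trained to reconstruct the traces
    encoder = Model(inp, code)      # used afterwards to produce encoded traces

    autoencoder.compile(optimizer="adam", loss="mse")
    autoencoder.fit(X, X, epochs=10, batch_size=128, verbose=0)

    # The encoded traces become the training set for a (reused) profiling model.
    X_encoded = encoder.predict(X, verbose=0)
    print(X_encoded.shape)  # (1000, latent_dim)

Under this setup, only the autoencoder is tuned per dataset, while the profiling model architecture operating on the latent_dim-sized inputs can be carried over between datasets.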

Metadata
Available format(s)
PDF
Category
Attacks and cryptanalysis
Publication info
Preprint.
Keywords
Side-channel Analysis, Autoencoders, Preprocessing, Hyperparameter Tuning, Portability, Transfer Learning
Contact author(s)
m krcek @ tudelft nl
g perin @ liacs leidenuniv nl
History
2023-07-20: revised
2023-01-05: received
Short URL
https://ia.cr/2023/019
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2023/019,
      author = {Marina Krček and Guilherme Perin},
      title = {Autoencoder-enabled Model Portability for Reducing Hyperparameter Tuning Efforts in Side-channel Analysis},
      howpublished = {Cryptology ePrint Archive, Paper 2023/019},
      year = {2023},
      note = {\url{https://eprint.iacr.org/2023/019}},
      url = {https://eprint.iacr.org/2023/019}
}