Paper 2023/006

Exploring multi-task learning in the context of masked AES implementations

Thomas Marquet, Universität Klagenfurt
Elisabeth Oswald, Universität Klagenfurt, University of Birmingham
Abstract

Deep learning is highly effective at breaking masked implementations, even when the attacker does not assume knowledge of the masks. However, recent works have pointed out a significant challenge: overcoming the initial learning plateau. This paper discusses the advantages of multi-task learning for breaking through the initial plateau consistently. We investigate different ways of applying multi-task learning against masked AES implementations (via the ASCAD-r, ASCAD-v2, and CHESCTF-2023 datasets) under the assumption that the attacker cannot access the masks during training. We offer evidence that multi-task learning significantly increases both the consistency of convergence and the performance of deep neural networks. Our work provides a wide range of experiments to understand the benefits of multi-task strategies over the current single-task state of the art. Furthermore, such strategies reach novel milestones against protected implementations: we propose models that defeat all masks of the affine masking on ASCAD-v2 for the first time.
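The core idea of multi-task learning here, a shared network trunk trained jointly on several mask-related labels, can be illustrated with a toy sketch. The snippet below is a hypothetical minimal example in plain NumPy, not the paper's actual architecture or datasets: two softmax heads, standing in for two mask-related tasks (e.g. a masked intermediate and a mask share), share one fixed random projection of the "traces" and are trained jointly on the sum of their cross-entropy losses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (illustrative names, not the paper's data):
# X ~ "traces", y_task1/y_task2 ~ labels for two leakage-related tasks.
n, d, k = 256, 32, 4                   # traces, features, classes per task
X = rng.normal(size=(n, d))
y_task1 = rng.integers(0, k, size=n)   # e.g. masked S-box output
y_task2 = rng.integers(0, k, size=n)   # e.g. mask share

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def xent(p, y):
    # mean cross-entropy of predicted distributions p against labels y
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

# Shared "trunk": here just a fixed random projection with a nonlinearity.
# Each task gets its own linear head; both heads are trained jointly on
# the summed loss, which is the essence of hard-parameter-sharing MTL.
W_shared = rng.normal(size=(d, 16)) / np.sqrt(d)
H = np.tanh(X @ W_shared)
W1 = np.zeros((16, k))
W2 = np.zeros((16, k))

lr = 0.5
losses = []
for _ in range(200):
    p1, p2 = softmax(H @ W1), softmax(H @ W2)
    losses.append(xent(p1, y_task1) + xent(p2, y_task2))
    # gradient of softmax cross-entropy w.r.t. logits is (p - onehot(y)) / n
    g1 = p1.copy(); g1[np.arange(n), y_task1] -= 1
    g2 = p2.copy(); g2[np.arange(n), y_task2] -= 1
    W1 -= lr * H.T @ g1 / n
    W2 -= lr * H.T @ g2 / n

print(f"joint loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Because both heads read the same shared representation, gradient signal from either task shapes features usable by the other, which is the mechanism the paper credits for escaping the initial learning plateau.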

Metadata
Available format(s)
PDF
Category
Attacks and cryptanalysis
Publication info
Published elsewhere. COSADE 2024
Keywords
Side Channel Attacks, Masking, Deep Learning, Multi-task Learning
Contact author(s)
thomas marquet @ aau at
elisabeth oswald @ aau at
History
2024-02-06: last of 4 revisions
2023-01-02: received
Short URL
https://ia.cr/2023/006
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2023/006,
      author = {Thomas Marquet and Elisabeth Oswald},
      title = {Exploring multi-task learning in the context of masked AES implementations},
      howpublished = {Cryptology ePrint Archive, Paper 2023/006},
      year = {2023},
      note = {\url{https://eprint.iacr.org/2023/006}},
      url = {https://eprint.iacr.org/2023/006}
}