Paper 2021/1546

Improving Deep Learning Networks for Profiled Side-Channel Analysis Using Performance Improvement Techniques

Damien Robissout, Lilian Bossuet, Amaury Habrard, and Vincent Grosso


The use of deep learning techniques to perform side-channel analysis has attracted the attention of many researchers, as such techniques achieve good attack performance. Unfortunately, the understanding of the neural networks used to perform side-channel attacks is not very advanced yet. In this paper, we contribute in this direction by studying the impact of some particular deep learning techniques on side-channel attack problems. More precisely, we focus on three existing techniques not yet used in the side-channel context: batch normalization, dropout, and weight decay. By adequately combining these techniques for our problem, we show that it is possible to improve the attack performance, i.e., reduce the number of traces needed to recover the secret, by more than 55%. Additionally, these techniques provide a gain of more than 34% in training time. We also show that an architecture trained with such techniques is able to perform attacks efficiently even in the context of desynchronized traces.
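The three techniques named in the abstract are standard deep learning tools. As a rough, framework-free illustration of what each one does (a minimal NumPy sketch with toy values, not the authors' architecture or code), they can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize each feature across the batch, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def dropout(x, p=0.5, training=True):
    # Inverted dropout: zero activations with probability p during
    # training and rescale the survivors; identity at inference time.
    if not training:
        return x
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

def weight_decay_step(w, grad, lr=0.01, wd=1e-4):
    # SGD update with L2 weight decay: the wd term pulls weights
    # toward zero, penalizing large weights.
    return w - lr * (grad + wd * w)

# Toy batch: 4 "trace" feature vectors with 3 features each.
x = rng.normal(size=(4, 3))
h = batch_norm(x)
h = dropout(h, p=0.25)
w = weight_decay_step(np.ones(3), grad=np.zeros(3))
```

In a real profiled attack these operations would be layers and an optimizer setting inside a full network; the sketch only shows the per-batch arithmetic each technique performs.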

Category
Secret-key cryptography
Publication info
Published elsewhere. ACM Journal on Emerging Technologies in Computing Systems, Volume 17, Issue 3
Keywords
Profiled side-channel attacks · Metrics · Deep Learning · Underfitting · Overfitting
Contact author(s)
damien robissout @ univ-st-etienne fr
History
2021-11-29: received
License
Creative Commons Attribution


@misc{cryptoeprint:2021/1546,
      author = {Damien Robissout and Lilian Bossuet and Amaury Habrard and Vincent Grosso},
      title = {Improving Deep Learning Networks for Profiled Side-Channel Analysis Using Performance Improvement Techniques},
      howpublished = {Cryptology ePrint Archive, Paper 2021/1546},
      year = {2021},
      doi = {10.1145/3453162},
}