Paper 2020/977

On the Influence of Optimizers in Deep Learning-based Side-channel Analysis

Guilherme Perin and Stjepan Picek


Deep learning-based side-channel analysis represents a powerful and easy-to-deploy option for profiled side-channel attacks. A detailed tuning phase is often required to reach good performance, where one first needs to select relevant hyperparameters and then tune them. A common selection for the tuning phase is hyperparameters connected with the neural network architecture, while those influencing the training process are less explored. In this work, we concentrate on the optimizer hyperparameter, and we show that this hyperparameter has a significant role in the attack performance. Our results show that common choices of optimizers (Adam and RMSprop) indeed work well, but they easily overfit, which means that we must use short training phases, small profiling models, and explicit regularization. On the other hand, SGD-type optimizers work well on average (slower convergence and less overfitting), but only if momentum is used. Finally, our results show that Adagrad represents a strong option in scenarios with longer training phases or larger profiling models.
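The optimizers compared above differ only in their parameter-update rules. As a minimal sketch of those rules, the following plain-Python snippet contrasts SGD (with optional momentum) and Adagrad on a toy quadratic loss f(w) = w²; the loss, learning rates, and step counts are illustrative assumptions, not the paper's experimental setup.

```python
import math

def grad(w):
    # Gradient of the toy loss f(w) = w^2 (illustrative, not the paper's setup).
    return 2.0 * w

def sgd(w0, lr=0.1, momentum=0.0, steps=50):
    # SGD with optional momentum: a velocity term accumulates past gradients,
    # which is what makes SGD-type optimizers converge in practice.
    w, v = w0, 0.0
    for _ in range(steps):
        v = momentum * v - lr * grad(w)
        w = w + v
    return w

def adagrad(w0, lr=0.5, eps=1e-8, steps=50):
    # Adagrad: each step is scaled by the accumulated squared gradients,
    # so the effective learning rate decays as training progresses.
    w, g2 = w0, 0.0
    for _ in range(steps):
        g = grad(w)
        g2 += g * g
        w -= lr * g / (math.sqrt(g2) + eps)
    return w
```

On this toy problem all three variants drive w toward the minimum at 0; the differences the paper studies (overfitting behavior, convergence speed) only appear on real profiling models, which this sketch does not reproduce.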

Publication info
Preprint. MINOR revision.
Side-channel Analysis, Profiled Attacks, Neural Networks, Optimizers
Contact author(s)
guilhermeperin7@gmail.com
picek.stjepan@gmail.com
2020-08-18: received
Creative Commons Attribution


@misc{cryptoeprint:2020/977,
      author = {Guilherme Perin and Stjepan Picek},
      title = {On the Influence of Optimizers in Deep Learning-based Side-channel Analysis},
      howpublished = {Cryptology ePrint Archive, Paper 2020/977},
      year = {2020},
      url = {https://eprint.iacr.org/2020/977}
}