This is version 20200818:082945 of this paper.

Paper 2020/977

On the Influence of Optimizers in Deep Learning-based Side-channel Analysis

Guilherme Perin and Stjepan Picek

Abstract

Deep learning-based side-channel analysis represents a powerful and easy-to-deploy option for profiled side-channel attacks. Reaching good performance often requires a detailed tuning phase, where one first selects the relevant hyperparameters and then tunes them. Hyperparameters connected with the neural network architecture are a common choice for tuning, while those influencing the training process are less explored. In this work, we concentrate on the optimizer hyperparameter, and we show that it plays a significant role in attack performance. Our results show that common choices of optimizers (Adam and RMSprop) indeed work well, but they overfit easily, which means we must use short training phases, small profiled models, and explicit regularization. On the other hand, SGD-type optimizers work well on average (slower convergence and less overfitting), but only if momentum is used. Finally, our results show that Adagrad is a strong option in scenarios with longer training phases or larger profiled models.
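To make the comparison concrete, the sketch below implements the update rules of the three optimizer families the abstract contrasts (SGD with momentum, Adagrad, and Adam) in plain NumPy and runs them on a toy quadratic loss. This is a minimal illustration of the generic update equations, not the paper's experimental setup; all function names, learning rates, and the toy loss are assumptions chosen for the example.

```python
import numpy as np

def sgd_momentum(grad_fn, w, lr=0.1, beta=0.9, steps=100):
    # SGD with momentum: a velocity term accumulates past gradients,
    # which is the ingredient the paper finds essential for SGD to work.
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v + grad_fn(w)
        w = w - lr * v
    return w

def adagrad(grad_fn, w, lr=0.5, eps=1e-8, steps=100):
    # Adagrad: per-parameter step sizes shrink with the accumulated
    # squared gradients, giving a naturally decaying learning rate.
    s = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        s = s + g * g
        w = w - lr * g / (np.sqrt(s) + eps)
    return w

def adam(grad_fn, w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=100):
    # Adam: bias-corrected exponential moving averages of the gradient
    # (first moment) and its square (second moment).
    m, v = np.zeros_like(w), np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# Toy quadratic loss L(w) = 0.5 * ||w||^2, so grad L(w) = w.
grad = lambda w: w
w0 = np.array([2.0, -3.0])
for opt in (sgd_momentum, adagrad, adam):
    print(opt.__name__, np.linalg.norm(opt(grad, w0)))
```

On a real profiled attack one would instead pass these choices as the optimizer of a Keras/PyTorch model; the point of the toy run is only that all three rules drive the parameters toward the minimum, while differing in how aggressively they adapt the step size.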

Metadata
Available format(s)
PDF
Category
Applications
Publication info
Preprint. MINOR revision.
Keywords
Side-channel Analysis, Profiled Attacks, Neural Networks, Optimizers
Contact author(s)
guilhermeperin7 @ gmail com
picek stjepan @ gmail com
History
2020-08-18: received
Short URL
https://ia.cr/2020/977
License
Creative Commons Attribution
CC BY