Paper 2021/249

NeuroSCA: Evolving Activation Functions for Side-channel Analysis

Karlo Knezevic, Juraj Fulir, Domagoj Jakobovic, and Stjepan Picek

Abstract

The choice of activation function can have a significant effect on the performance of a neural network. Although researchers continue to develop novel activation functions, the Rectified Linear Unit (ReLU) remains the most common choice in practice. This paper shows that evolutionary algorithms can discover new activation functions for side-channel analysis (SCA) that outperform ReLU. Using Genetic Programming (GP), we define and explore candidate activation functions (neuroevolution). To the best of our knowledge, this is the first attempt to develop custom activation functions for SCA. Experiments on the ASCAD database show that this approach is highly effective compared to state-of-the-art neural network architectures. While optimal performance is achieved when activation functions are evolved for a particular task, we also observe that the evolved activation functions generalize, maintaining high performance across different SCA scenarios.
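For intuition, below is a minimal Python sketch of the kind of GP loop the abstract describes: candidate activation functions are represented as expression trees over a primitive set, varied by subtree mutation, and selected by fitness. The primitive set, GP parameters, and the toy ReLU-similarity fitness are illustrative assumptions only; the paper's actual objective would train a profiling network with each candidate activation and score it on an SCA metric (e.g., guessing entropy on ASCAD traces).

import math
import random

# Primitive set (name, arity, implementation) and terminals; chosen here
# purely for illustration, not taken from the paper.
PRIMITIVES = [
    ("add", 2, lambda a, b: a + b),
    ("mul", 2, lambda a, b: a * b),
    ("max", 2, max),
    ("tanh", 1, math.tanh),
    ("relu", 1, lambda a: max(0.0, a)),
]
TERMINALS = ["x", 1.0, -1.0, 0.5]

def random_tree(depth=3):
    # Grow a random expression tree over the primitive set.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    name, arity, _ = random.choice(PRIMITIVES)
    return (name,) + tuple(random_tree(depth - 1) for _ in range(arity))

def evaluate(tree, x):
    # Evaluate a candidate activation function at scalar input x.
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    name, *children = tree
    fn = next(f for n, _, f in PRIMITIVES if n == name)
    return fn(*(evaluate(c, x) for c in children))

def mutate(tree, depth=2):
    # Subtree mutation: replace a random subtree with a freshly grown one.
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)
    name, *children = tree
    i = random.randrange(len(children))
    children[i] = mutate(children[i], depth)
    return (name, *children)

def fitness(tree):
    # Toy stand-in fitness: similarity to ReLU on a probe grid, just to make
    # the loop runnable end to end. In the paper's setting, each candidate
    # would instead be plugged into an SCA neural network and scored on an
    # attack metric such as guessing entropy.
    xs = [i / 10.0 for i in range(-50, 51)]
    try:
        ys = [evaluate(tree, x) for x in xs]
    except (OverflowError, ValueError):
        return float("-inf")
    return -sum(abs(y - max(0.0, x)) for x, y in zip(xs, ys)) / len(xs)

def evolve(pop_size=20, generations=10):
    # Simple (mu + lambda)-style loop: keep the better half, refill by mutation.
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return pop[0]

print("best candidate:", evolve())

The tree representation keeps variation operators trivial to implement; a real run would add crossover, bloat control, and the expensive network-training fitness sketched in the comments.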

Metadata
Available format(s)
PDF
Category
Implementation
Publication info
Preprint. MINOR revision.
Keywords
Activation functions, Multilayer perceptron, Convolutional neural network, Side-channel analysis, Evolutionary algorithms, Neuroevolution
Contact author(s)
picek stjepan @ gmail com
domagoj jakobovic @ fer hr
History
2021-03-28: last of 2 revisions
2021-03-02: received
Short URL
https://ia.cr/2021/249
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2021/249,
      author = {Karlo Knezevic and Juraj Fulir and Domagoj Jakobovic and Stjepan Picek},
      title = {{NeuroSCA}: Evolving Activation Functions for Side-channel Analysis},
      howpublished = {Cryptology {ePrint} Archive, Paper 2021/249},
      year = {2021},
      url = {https://eprint.iacr.org/2021/249}
}