Cryptology ePrint Archive: Report 2021/249

NeuroSCA: Evolving Activation Functions for Side-channel Analysis

Karlo Knezevic and Juraj Fulir and Domagoj Jakobovic and Stjepan Picek

Abstract: The choice of activation function can significantly affect the performance of a neural network. Although researchers have proposed numerous novel activation functions, the Rectified Linear Unit (ReLU) remains the most common one in practice. This paper shows that evolutionary algorithms can discover new activation functions for side-channel analysis (SCA) that outperform ReLU. Using Genetic Programming (GP), we define and explore candidate activation functions (neuroevolution). To the best of our knowledge, this is the first attempt to develop custom activation functions for SCA. Experiments on the ASCAD database show that this approach is highly effective compared with state-of-the-art neural network architectures. While optimal performance is achieved when activation functions are evolved for a particular task, we also observe that the evolved activation functions generalize well and maintain high performance across different SCA scenarios.
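
As a rough illustration of the neuroevolution idea summarized above (not the authors' implementation), the Python sketch below evolves symbolic activation functions as GP expression trees with the DEAP library. The fitness function here is only a toy numerical proxy so the example runs end to end; in the paper's setting it would train a profiling model (MLP/CNN) on ASCAD traces with the candidate activation and score it with an SCA metric such as guessing entropy. All names, primitives, and parameters below are illustrative assumptions.

    # Minimal sketch, assuming DEAP and NumPy are available; not the paper's code.
    import operator
    import random

    import numpy as np
    from deap import algorithms, base, creator, gp, tools

    random.seed(42)

    # Candidate activation functions are expressions over a single input x.
    pset = gp.PrimitiveSet("ACT", 1)
    pset.addPrimitive(operator.add, 2)
    pset.addPrimitive(operator.mul, 2)
    pset.addPrimitive(np.tanh, 1)
    pset.addPrimitive(np.maximum, 2)   # lets ReLU-like shapes max(0, x) emerge
    pset.addTerminal(0.0)
    pset.addTerminal(1.0)
    pset.renameArguments(ARG0="x")

    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    creator.create("Individual", gp.PrimitiveTree, fitness=creator.FitnessMax)

    toolbox = base.Toolbox()
    toolbox.register("expr", gp.genHalfAndHalf, pset=pset, min_=1, max_=3)
    toolbox.register("individual", tools.initIterate, creator.Individual, toolbox.expr)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)
    toolbox.register("compile", gp.compile, pset=pset)

    def evaluate(individual):
        # Toy stand-in fitness: reject numerically ill-behaved expressions.
        # In the paper's setting, this would build a network using the compiled
        # function as its hidden-layer activation, train it on profiling traces,
        # and return the attack metric to be maximized.
        act = toolbox.compile(expr=individual)
        x = np.linspace(-5.0, 5.0, 101)
        y = np.asarray(act(x), dtype=float)
        if y.shape != x.shape or not np.all(np.isfinite(y)):
            return (0.0,)
        return (float(np.std(y)),)  # placeholder score only

    toolbox.register("evaluate", evaluate)
    toolbox.register("select", tools.selTournament, tournsize=3)
    toolbox.register("mate", gp.cxOnePoint)
    toolbox.register("expr_mut", gp.genFull, min_=0, max_=2)
    toolbox.register("mutate", gp.mutUniform, expr=toolbox.expr_mut, pset=pset)

    pop = toolbox.population(n=20)
    hof = tools.HallOfFame(1)
    algorithms.eaSimple(pop, toolbox, cxpb=0.8, mutpb=0.2, ngen=10,
                        halloffame=hof, verbose=False)
    print("best activation expression:", hof[0])

In such a setup, the best evolved expression tree is compiled into a callable and plugged into a network's hidden layers in place of ReLU; the population sizes, primitives, and GP operators above are placeholders, not the configuration used in the paper.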

Category / Keywords: implementation / Activation functions, Multilayer perceptron, Convolutional neural network, Side-channel analysis, Evolutionary algorithms, Neuroevolution

Date: received 2 Mar 2021, last revised 28 Mar 2021

Contact author: picek stjepan at gmail com, domagoj jakobovic@fer hr

Available format(s): PDF | BibTeX Citation

Version: 20210328:064718 (All versions of this report)

Short URL: ia.cr/2021/249
