Paper 2021/827

TransNet: Shift Invariant Transformer Network for Side Channel Analysis

Suvadeep Hajra
Sayandeep Saha
Manaar Alam
Debdeep Mukhopadhyay

Deep learning (DL) has revolutionized Side Channel Analysis (SCA) in recent years. One of the major advantages of DL in the context of SCA is that it can automatically handle masking and desynchronization countermeasures, even when both are applied simultaneously to a cryptographic implementation. However, the success of an attack depends strongly on the DL model used. Traditionally, Convolutional Neural Networks (CNNs) have been utilized for this purpose. This work proposes using a Transformer Network (TN) to attack implementations protected by masking and desynchronization. Our choice is motivated by the fact that a TN is good at capturing dependencies among distant points of interest in a power trace. Furthermore, we show that a TN can be made shift-invariant, an important property for handling desynchronized traces. Experimental validation on several public datasets establishes that our proposed TN-based model, called TransNet, outperforms the present state of the art on several occasions. Specifically, TransNet outperforms the other methods by a wide margin when the traces are highly desynchronized. Additionally, TransNet attacks implementations with desynchronized traces effectively even when it is trained on synchronized traces. The TensorFlow implementation of TransNet is available at
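To illustrate the shift-invariance property mentioned in the abstract, the following is a minimal NumPy sketch (not the authors' TransNet architecture, whose exact mechanism may differ): single-head self-attention without absolute positional encodings is permutation-equivariant, so a circular shift of the input trace permutes the attention outputs in the same way, and averaging over the time axis then yields an identical pooled representation. All names here (`self_attention`, the toy weights) are illustrative assumptions.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    # Single-head scaled dot-product self-attention with NO positional
    # encoding: the output at each position depends only on pairwise
    # content similarities, so shifting the input shifts the output.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)       # row-wise softmax
    return attn @ v

rng = np.random.default_rng(0)
d = 8
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

trace = rng.standard_normal((100, d))      # toy "power trace", 100 points
shifted = np.roll(trace, 17, axis=0)       # circular desynchronization

# Average pooling over the time axis discards the shift entirely.
out = self_attention(trace, Wq, Wk, Wv).mean(axis=0)
out_shifted = self_attention(shifted, Wq, Wk, Wv).mean(axis=0)
print(np.allclose(out, out_shifted))       # identical pooled features
```

A CNN with large strides only approximates this behavior, whereas attention followed by global pooling is exactly invariant to such shifts; this is the intuition behind preferring a TN for heavily desynchronized traces.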

Category: Attacks and cryptanalysis
Keywords: side channel analysis, masking countermeasure, transformer network
Contact author(s): suvadeep hajra @ gmail com
History: 2021-06-21: received; 2022-06-01: last of 2 revisions
License: Creative Commons Attribution


@misc{cryptoeprint:2021/827,
      author = {Suvadeep Hajra and Sayandeep Saha and Manaar Alam and Debdeep Mukhopadhyay},
      title = {TransNet: Shift Invariant Transformer Network for Side Channel Analysis},
      howpublished = {Cryptology ePrint Archive, Paper 2021/827},
      year = {2021},
      note = {\url{}},
      url = {}
}