Paper 2021/555

Neural-Network-Based Modeling Attacks on XOR Arbiter PUFs Revisited

Nils Wisiol
Bipana Thapaliya
Khalid T. Mursi
Jean-Pierre Seifert
Yu Zhuang
Abstract

By revisiting, improving, and extending recent neural-network-based modeling attacks on XOR Arbiter PUFs from the literature, we show that XOR Arbiter PUFs, (XOR) Feed-Forward Arbiter PUFs, and Interpose PUFs can be attacked faster, up to larger security parameters, and with fewer challenge-response pairs than previously known, both on simulated and on silicon data. To support our claim, we discuss the differences and similarities of recently proposed modeling attacks and offer a fair comparison of their performance by implementing all of them in the popular machine learning framework Keras and measuring them against the well-studied Logistic Regression attack. Our findings show that neural-network-based modeling attacks have the potential to outperform traditional modeling attacks on PUFs and must hence become part of the standard toolbox for PUF security analysis; the code and discussion in this paper can serve as a basis for extending our results to PUF designs beyond the scope of this work.
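To illustrate the kind of attack the abstract refers to, the following is a minimal sketch of a neural-network-based modeling attack on a simulated XOR Arbiter PUF using Keras. It is not the authors' code: the PUF is simulated with the standard additive delay model, and the network width, optimizer, number of challenge-response pairs (CRPs), and training schedule are illustrative assumptions only.

```python
# Minimal sketch: simulate a k-XOR Arbiter PUF (additive delay model) and
# attack it with a small Keras MLP trained on transformed challenges.
import numpy as np
import tensorflow as tf

n, k, N = 64, 4, 200_000          # stages, XOR size, number of CRPs (assumed values)
rng = np.random.default_rng(1)

# Simulated delay weights of the k underlying Arbiter PUFs (plus one bias term each).
W = rng.standard_normal((k, n + 1))

# Random challenges and their parity feature transform Phi_j = prod_{l >= j} (1 - 2 c_l).
C = rng.integers(0, 2, (N, n))
Phi = np.cumprod(1 - 2 * C[:, ::-1], axis=1)[:, ::-1]
Phi = np.hstack([Phi, np.ones((N, 1))])

# XOR Arbiter PUF response: product of the signs of the k delay differences, mapped to {0, 1}.
r = (np.prod(np.sign(Phi @ W.T), axis=1) + 1) / 2

# Small multilayer perceptron on the transformed challenges (architecture is an assumption).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2 ** k, activation="tanh", input_shape=(n + 1,)),
    tf.keras.layers.Dense(2 ** k, activation="tanh"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(Phi[:150_000], r[:150_000], epochs=30, batch_size=1000, verbose=0)

# Accuracy on held-out CRPs indicates how well the trained model clones the PUF.
print(model.evaluate(Phi[150_000:], r[150_000:], verbose=0))
```

A Logistic Regression baseline in the same setting would replace the hidden layers with a single sigmoid unit per underlying Arbiter PUF (or use the classical LR attack directly on the transformed challenges); the paper compares such approaches against the neural-network attacks.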

Note: Added analysis of Feed-Forward Arbiter PUF and variants.

Metadata
Available format(s)
PDF
Category
Foundations
Publication info
Preprint.
Keywords
Physical Unclonable Function, Strong PUFs, Machine Learning, Modeling Attacks, XOR Arbiter PUF
Contact author(s)
nwisiol@gmail.com
History
2022-06-20: revised
2021-04-28: received
Short URL
https://ia.cr/2021/555
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2021/555,
      author = {Nils Wisiol and Bipana Thapaliya and Khalid T. Mursi and Jean-Pierre Seifert and Yu Zhuang},
      title = {Neural-Network-Based Modeling Attacks on XOR Arbiter PUFs Revisited},
      howpublished = {Cryptology ePrint Archive, Paper 2021/555},
      year = {2021},
      note = {\url{https://eprint.iacr.org/2021/555}},
      url = {https://eprint.iacr.org/2021/555}
}