Paper 2023/592

Blockchain Large Language Models

Yu Gai, University of California, Berkeley
Liyi Zhou, Imperial College London
Kaihua Qin, Imperial College London
Dawn Song, University of California, Berkeley
Arthur Gervais, University College London
Abstract

This paper presents a dynamic, real-time approach to detecting anomalous blockchain transactions. The proposed tool, BlockGPT, generates tracing representations of blockchain activity and trains a large language model from scratch to act as a real-time Intrusion Detection System. Unlike traditional methods, BlockGPT is designed to offer an unrestricted search space and does not rely on predefined rules or patterns, enabling it to detect a broader range of anomalies. We demonstrate the effectiveness of BlockGPT through its use as an anomaly detection tool for Ethereum transactions. In our experiments, it effectively identifies abnormal transactions among a dataset of 68M transactions and achieves a batched throughput of 2284 transactions per second on average. Our results show that BlockGPT identifies abnormal transactions by ranking 49 out of 124 attacks among the top-3 most abnormal transactions interacting with their victim contracts. This work contributes to the field of blockchain transaction analysis by introducing a custom data encoding compatible with the transformer architecture, a domain-specific tokenization technique, and a tree encoding method specifically crafted for the Ethereum Virtual Machine (EVM) trace representation.
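To make the ranking idea in the abstract concrete, the sketch below shows one plausible way a language model trained over tokenized EVM traces could score transactions and surface the least predictable (highest-loss) ones as the most abnormal. This is a minimal illustration of the general technique, not the authors' BlockGPT implementation: the class and function names (TraceLM, trace_loss, rank_transactions), the toy trace vocabulary, and the example transactions are all assumptions introduced here for demonstration.

    # Hypothetical sketch: rank transactions by the loss a causal language model
    # assigns to their tokenized EVM traces (higher loss = more abnormal).
    import torch
    import torch.nn as nn

    class TraceLM(nn.Module):
        """Tiny causal transformer over EVM-trace tokens (illustrative stand-in)."""
        def __init__(self, vocab_size: int, d_model: int = 128, n_layers: int = 2, n_heads: int = 4):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                               dim_feedforward=4 * d_model,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, n_layers)
            self.lm_head = nn.Linear(d_model, vocab_size)

        def forward(self, tokens: torch.Tensor) -> torch.Tensor:
            # Causal mask so each position only attends to earlier trace tokens.
            mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
            hidden = self.encoder(self.embed(tokens), mask=mask)
            return self.lm_head(hidden)

    def trace_loss(model: nn.Module, tokens: torch.Tensor) -> float:
        """Average next-token cross-entropy of one tokenized trace (lower = more 'normal')."""
        with torch.no_grad():
            logits = model(tokens.unsqueeze(0))
        return nn.functional.cross_entropy(logits[0, :-1], tokens[1:]).item()

    def rank_transactions(model: nn.Module, traces: dict[str, torch.Tensor]) -> list[tuple[str, float]]:
        """Rank transactions by model loss, most abnormal first."""
        scores = {tx: trace_loss(model, toks) for tx, toks in traces.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    if __name__ == "__main__":
        # Toy vocabulary of trace tokens (opcodes plus call-tree markers); purely illustrative.
        vocab = {"<call>": 0, "</call>": 1, "SLOAD": 2, "SSTORE": 3, "CALLVALUE": 4, "JUMPI": 5}
        model = TraceLM(vocab_size=len(vocab))
        model.eval()  # in practice the model would first be trained on historical traces

        traces = {
            "0xabc...": torch.tensor([0, 4, 2, 5, 3, 1]),
            "0xdef...": torch.tensor([0, 2, 2, 2, 3, 3, 3, 1]),
        }
        for tx, loss in rank_transactions(model, traces):
            print(f"{tx}  loss={loss:.3f}")

In this sketch, transactions interacting with a given contract would be ranked by their loss, so an attack whose trace deviates from learned behavior tends to appear near the top, analogous to the top-3 ranking result reported in the abstract.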

Metadata
Available format(s)
PDF
Category
Applications
Publication info
Preprint.
Keywords
blockchain, large language model, intrusion detection, intrusion prevention
Contact author(s)
lzhou1110 @ gmail com
History
2023-04-29: last of 3 revisions
2023-04-25: received
Short URL
https://ia.cr/2023/592
License
Creative Commons Attribution-NonCommercial-NoDerivs (CC BY-NC-ND)

BibTeX

@misc{cryptoeprint:2023/592,
      author = {Yu Gai and Liyi Zhou and Kaihua Qin and Dawn Song and Arthur Gervais},
      title = {Blockchain Large Language Models},
      howpublished = {Cryptology ePrint Archive, Paper 2023/592},
      year = {2023},
      note = {\url{https://eprint.iacr.org/2023/592}},
      url = {https://eprint.iacr.org/2023/592}
}