This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
IGT4ETH: An Isotropic Pre-trained Graph Transformer for Ethereum Account Classification
Citations: 0
Authors: 4
Year: 2026
Abstract
Pre-trained language models (PLMs) have shown strong potential in Ethereum account modeling and fraud detection. However, existing approaches often overlook the graph-structured nature of transaction networks. In addition, they struggle with the long-tail distribution of account activity, resulting in anisotropic embedding spaces and poor representation quality for low-frequency accounts. In this paper, we present IGT4ETH, a pre-trained Graph Transformer with an isotropy-enhancing post-processing step, which explicitly models transaction topologies and mitigates representational anisotropy for Ethereum account classification. IGT4ETH improves structural representation by incorporating structural centrality and role embeddings into an Edge-augmented Graph Transformer, effectively capturing both topological and interaction patterns in transaction graphs. To further mitigate embedding anisotropy, we systematically evaluate various post-processing techniques. Among them, we adopt the Conceptor Negation (CN) method, which uses matrix conceptors to softly suppress latent features dominated by high-frequency words, alongside a modified Focal-InfoNCE loss that enhances directional uniformity and representation balance. Extensive experiments on four real-world Ethereum account classification tasks, covering phishing, exchange, mining, and ICO-wallet accounts, demonstrate that IGT4ETH consistently outperforms state-of-the-art PLM-based baselines in classification performance.
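As a hedged illustration of the CN post-processing mentioned in the abstract, the sketch below applies a negated matrix conceptor to a collection of embeddings, following the standard Conceptor Negation formulation (correlation matrix R, conceptor C = R(R + α⁻²I)⁻¹, post-processing with I − C). The function name, the aperture value α = 2, and the placeholder embeddings are illustrative assumptions and are not taken from the IGT4ETH paper.

import numpy as np

def conceptor_negation(embeddings, alpha=2.0):
    """Post-process embeddings with a negated matrix conceptor.

    embeddings: (n_samples, dim) array, e.g. account embeddings.
    alpha: aperture parameter; larger values suppress dominant directions more strongly.
    """
    n, d = embeddings.shape
    # Correlation matrix of the embedding collection.
    R = embeddings.T @ embeddings / n
    # Conceptor matrix capturing the dominant (high-variance) directions.
    C = R @ np.linalg.inv(R + alpha ** -2 * np.eye(d))
    # The negated conceptor (I - C) softly damps those directions,
    # which is intended to make the embedding space more isotropic.
    return embeddings @ (np.eye(d) - C)

# Illustrative usage with placeholder embeddings.
X = np.random.randn(1000, 128)
X_isotropic = conceptor_negation(X)

In this formulation the suppression is soft: directions with large variance are scaled down rather than removed outright, with α controlling how aggressively the dominant directions are damped.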
Related Works
Detecting Functionality-Specific Vulnerabilities via Retrieving Individual Functionality-Equivalent APIs in Open-Source Repositories
2025 · 16,058 citations
A tutorial on spectral clustering
2007 · 10,081 citations
Authoritative sources in a hyperlinked environment
1999 · 8,989 citations
The Graph Neural Network Model
2008 · 8,943 citations
A Comprehensive Survey on Graph Neural Networks
2020 · 8,666 citations