📄
Abstract - Avoiding Over-smoothing in Social Media Rumor Detection with Pre-trained Propagation Tree Transformer
Deep learning techniques for rumor detection typically use Graph Neural Networks (GNNs) to analyze relations between posts. These methods, however, falter due to over-smoothing when processing rumor propagation structures, leading to declining performance. Our investigation reveals that over-smoothing is intrinsically tied to the structural characteristics of rumor propagation trees, in which the majority of nodes are 1-level nodes (direct replies to the source post). Furthermore, GNNs struggle to capture long-range dependencies within these trees. To circumvent these challenges, we propose a Pre-Trained Propagation Tree Transformer (P2T3) method based on a pure Transformer architecture. It extracts all conversation chains from a tree structure following the propagation direction of replies, uses token-wise embedding to infuse connection information and introduce the necessary inductive bias, and pre-trains on large-scale unlabeled datasets. Experiments indicate that P2T3 surpasses previous state-of-the-art methods on multiple benchmark datasets and performs well under few-shot conditions. P2T3 not only avoids the over-smoothing issue inherent in GNNs but also points toward large-model or unified multi-modal schemes for future social media research.
Avoiding Over-smoothing in Social Media Rumor Detection with Pre-trained Propagation Tree Transformer
1️⃣ One-sentence summary
This paper proposes a new method called P2T3, which uses a pure Transformer architecture to analyze rumor propagation structures on social media for rumor detection. It effectively resolves the over-smoothing and long-range dependency problems that traditional Graph Neural Networks encounter on such data, and achieves better performance on multiple datasets.
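The core preprocessing step described in the abstract, flattening a propagation tree into conversation chains along the direction of replies, can be sketched as follows. This is a minimal illustration, not the paper's code; the dict-based tree representation and function name are assumptions for the example.

```python
def extract_chains(children, root):
    """Return all root-to-leaf conversation chains in a reply tree.

    children: dict mapping a post id to the list of posts that reply to it
              (hypothetical input format, for illustration only).
    """
    chains = []

    def dfs(node, path):
        path = path + [node]
        kids = children.get(node, [])
        if not kids:               # leaf reached: one complete chain
            chains.append(path)
        for kid in kids:
            dfs(kid, path)

    dfs(root, [])
    return chains

# Example: source post "p0" with two reply branches.
tree = {"p0": ["p1", "p2"], "p1": ["p3"]}
print(extract_chains(tree, "p0"))  # [['p0', 'p1', 'p3'], ['p0', 'p2']]
```

Each chain preserves the reply order from the source post outward, so a sequence model like a Transformer can consume it directly without the message-passing steps that cause over-smoothing in GNNs.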