Tags: Deep Learning, Graph, Has Code, LSH, Supervised
This study introduces a novel transformer model optimized for large-scale
point cloud processing in scientific domains such as high-energy physics (HEP)
and astrophysics. Addressing the limitations of graph neural networks and
standard transformers, our model integrates local inductive bias and achieves
near-linear complexity with hardware-friendly regular operations. One
contribution of this work is the quantitative analysis of the error-complexity
tradeoff of various sparsification techniques for building efficient
transformers. Our findings highlight the superiority of using
locality-sensitive hashing (LSH), especially OR & AND-construction LSH, in
kernel approximation for large-scale point cloud data with local inductive
bias. Based on these findings, we propose the LSH-based Efficient Point
Transformer (HEPT), which combines E2LSH with OR & AND constructions and is
built upon generalized kernelized attention.
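For intuition, below is a minimal NumPy sketch of E2LSH with OR & AND
constructions for bucketing nearby points. It is illustrative only, not the
HEPT implementation; the names (`e2lsh_codes`, `n_or`, `n_and`, `r`) are
assumptions of this sketch.

```python
import numpy as np

def e2lsh_codes(points, n_or=3, n_and=4, r=1.0, seed=0):
    """Bucket points with E2LSH (illustrative sketch, not HEPT's code).

    AND-construction: each table concatenates n_and random-projection
    hashes, so a bucket match requires all of them to agree.
    OR-construction: points are candidate neighbors if they collide
    in ANY of the n_or tables.
    """
    rng = np.random.default_rng(seed)
    n, d = points.shape
    codes = np.empty((n_or, n), dtype=np.int64)
    for t in range(n_or):
        a = rng.standard_normal((d, n_and))   # random projection directions
        b = rng.uniform(0.0, r, size=n_and)   # random offsets in [0, r)
        h = np.floor((points @ a + b) / r).astype(np.int64)
        # Fuse the n_and per-hash bucket ids into one table code
        # (int64 overflow wraps silently, which is fine for hashing).
        mix = rng.integers(1, 2**31, size=n_and)
        codes[t] = (h * mix).sum(axis=1)
    return codes

# Toy usage: group 3-D points by bucket in each table; restricting
# attention to within-bucket pairs is what yields near-linear cost.
pts = np.random.default_rng(1).standard_normal((1000, 3))
codes = e2lsh_codes(pts)
buckets = {}
for t in range(codes.shape[0]):
    for i, c in enumerate(codes[t]):
        buckets.setdefault((t, int(c)), []).append(i)
```

Raising `n_and` makes each bucket stricter (fewer false collisions), while
raising `n_or` recovers recall via extra tables; trading these off against
the number of buckets is the error-complexity tradeoff discussed above.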