Poster "long-range dependencies" Papers
18 papers found
AMR-Transformer: Enabling Efficient Long-range Interaction for Complex Neural Fluid Simulation
Zeyi Xu, Jinfan Liu, Kuangxu Chen et al.
A Multimodal BiMamba Network with Test-Time Adaptation for Emotion Recognition Based on Physiological Signals
Ziyu Jia, Tingyu Du, Zhengyu Tian et al.
BeyondMix: Leveraging Structural Priors and Long-Range Dependencies for Domain-Invariant LiDAR Segmentation
Yujia Chen, Rui Sun, Wangkai Li et al.
DUALFormer: Dual Graph Transformer
Jiaming Zhuo, Yuwei Liu, Yintong Lu et al.
DuSA: Fast and Accurate Dual-Stage Sparse Attention Mechanism Accelerating Both Training and Inference
Chong Wu, Jiawang Cao, Renjie Xu et al.
Global-Aware Monocular Semantic Scene Completion with State Space Models
Shijie Li, Zhongyao Cheng, Rong Li et al.
GLoRa: A Benchmark to Evaluate the Ability to Learn Long-Range Dependencies in Graphs
Dongzhuoran Zhou, Evgeny Kharlamov, Egor Kostylev
Learning Long Range Dependencies on Graphs via Random Walks
Dexiong Chen, Till Schulz, Karsten Borgwardt
L-MTP: Leap Multi-Token Prediction Beyond Adjacent Context for Large Language Models
Xiaohao Liu, Xiaobo Xia, Weixiang Zhao et al.
MambaIC: State Space Models for High-Performance Learned Image Compression
Fanhu Zeng, Hao Tang, Yihua Shao et al.
MODEM: A Morton-Order Degradation Estimation Mechanism for Adverse Weather Image Recovery
Hainuo Wang, Qiming Hu, Xiaojie Guo
MVSMamba: Multi-View Stereo with State Space Model
Jianfei Jiang, Qiankun Liu, Hongyuan Liu et al.
P-SpikeSSM: Harnessing Probabilistic Spiking State Space Models for Long-Range Dependency Tasks
Malyaban Bal, Abhronil Sengupta
RAT: Bridging RNN Efficiency and Attention Accuracy via Chunk-based Sequence Modeling
Xiuying Wei, Anunay Yadav, Razvan Pascanu et al.
Sketch-Augmented Features Improve Learning Long-Range Dependencies in Graph Neural Networks
Ryien Hosseini, Filippo Simini, Venkatram Vishwanath et al.
VORTA: Efficient Video Diffusion via Routing Sparse Attention
Wenhao Sun, Rong-Cheng Tu, Yifu Ding et al.
WaLRUS: Wavelets for Long range Representation Using State Space Methods
Hossein Babaei, Mel White, Sina Alemohammad et al.
Viewing Transformers Through the Lens of Long Convolutions Layers
Itamar Zimerman, Lior Wolf