"large-scale pre-training" Papers
12 papers found
ARKit LabelMaker: A New Scale for Indoor 3D Scene Understanding
Guangda Ji, Silvan Weder, Francis Engelmann et al.
CVPR 2025 · arXiv:2410.13924 · 6 citations
Large-scale Pre-training for Grounded Video Caption Generation
Evangelos Kazakos, Cordelia Schmid, Josef Sivic
ICCV 2025 · arXiv:2503.10781 · 3 citations
Revisiting MAE Pre-training for 3D Medical Image Segmentation
Tassilo Wald, Constantin Ulrich, Stanislav Lukyanenko et al.
CVPR 2025 (highlight) · arXiv:2410.23132 · 16 citations
SiMHand: Mining Similar Hands for Large-Scale 3D Hand Pose Pre-training
Nie Lin, Takehiko Ohkawa, Yifei Huang et al.
ICLR 2025 · arXiv:2502.15251 · 4 citations
StelLA: Subspace Learning in Low-rank Adaptation using Stiefel Manifold
Zhizhong Li, Sina Sajadmanesh, Jingtao Li et al.
NeurIPS 2025 (spotlight) · arXiv:2510.01938 · 4 citations
THD-BAR: Topology Hierarchical Derived Brain Autoregressive Modeling for EEG Generic Representations
Wenchao Yang, Weidong Yan, Wenkang Liu et al.
NeurIPS 2025 (oral) · arXiv:2511.13733 · 1 citation
This Time is Different: An Observability Perspective on Time Series Foundation Models
Ben Cohen, Emaad Khwaja, Youssef Doubli et al.
NeurIPS 2025 · arXiv:2505.14766 · 13 citations
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
Xiaoming Shi, Shiyu Wang, Yuqi Nie et al.
ICLR 2025 · arXiv:2409.16040 · 194 citations
Your ViT is Secretly an Image Segmentation Model
Tommie Kerssies, Niccolò Cavagnero, Alexander Hermans et al.
CVPR 2025 (highlight) · arXiv:2503.19108 · 26 citations
Scalable Pre-training of Large Autoregressive Image Models
Alaaeldin Ali, Michal Klein, Shuangfei Zhai et al.
ICML 2024
Timer: Generative Pre-trained Transformers Are Large Time Series Models
Yong Liu, Haoran Zhang, Chenyu Li et al.
ICML 2024 · arXiv:2402.02368 · 148 citations
Unified Training of Universal Time Series Forecasting Transformers
Gerald Woo, Chenghao Liu, Akshat Kumar et al.
ICML 2024 · arXiv:2402.02592 · 428 citations