"generalization performance" Papers
23 papers found
Data Scaling Laws in Imitation Learning for Robotic Manipulation
Fanqi Lin, Yingdong Hu, Pingyue Sheng et al.
Detecting Adversarial Data Using Perturbation Forgery
Qian Wang, Chen Li, Yuchen Luo et al.
Epistemic Uncertainty for Generated Image Detection
Jun Nie, Yonggang Zhang, Tongliang Liu et al.
From Style to Facts: Mapping the Boundaries of Knowledge Injection with Finetuning
Eric Zhao, Pranjal Awasthi, Nika Haghtalab
How Memory in Optimization Algorithms Implicitly Modifies the Loss
Matias Cattaneo, Boris Shigida
Optimizing importance weighting in the presence of sub-population shifts
Floris Holstege, Bram Wouters, Noud Giersbergen et al.
Symbolic regression via MDLformer-guided search: from minimizing prediction error to minimizing description length
Zihan Yu, Jingtao Ding, Yong Li et al.
The Computational Advantage of Depth in Learning High-Dimensional Hierarchical Targets
Yatin Dandi, Luca Pesce, Lenka Zdeborová et al.
Weight matrices compression based on PDB model in deep neural networks
Xiaoling Wu, Junpeng Zhu, Zeng Li
Achieving Margin Maximization Exponentially Fast via Progressive Norm Rescaling
Mingze Wang, Zeping Min, Lei Wu
Deep Fusion: Efficient Network Training via Pre-trained Initializations
Hanna Mazzawi, Xavi Gonzalvo, Michael Wunder et al.
Foster Adaptivity and Balance in Learning with Noisy Labels
Mengmeng Sheng, Zeren Sun, Tao Chen et al.
From Inverse Optimization to Feasibility to ERM
Saurabh Mishra, Anant Raj, Sharan Vaswani
INViT: A Generalizable Routing Problem Solver with Invariant Nested View Transformer
Han Fang, Zhihao Song, Paul Weng et al.
Lookbehind-SAM: k steps back, 1 step forward
Gonçalo Mordido, Pranshu Malviya, Aristide Baratin et al.
OGNI-DC: Robust Depth Completion with Optimization-Guided Neural Iterations
Yiming Zuo, Jia Deng
On Harmonizing Implicit Subpopulations
Feng Hong, Jiangchao Yao, Yueming Lyu et al.
Pi-DUAL: Using privileged information to distinguish clean from noisy labels
Ke Wang, Guillermo Ortiz-Jimenez, Rodolphe Jenatton et al.
Provable Benefits of Local Steps in Heterogeneous Federated Learning for Neural Networks: A Feature Learning Perspective
Yajie Bao, Michael Crawshaw, Mingrui Liu
Stable Unlearnable Example: Enhancing the Robustness of Unlearnable Examples via Stable Error-Minimizing Noise
Yixin Liu, Kaidi Xu, Xun Chen et al.
TEA: Test-time Energy Adaptation
Yige Yuan, Bingbing Xu, Liang Hou et al.
Two-stage LLM Fine-tuning with Less Specialization and More Generalization
Yihan Wang, Si Si, Daliang Li et al.
Weisfeiler-Leman at the margin: When more expressivity matters
Billy Franks, Christopher Morris, Ameya Velingker et al.