"distribution shifts" Papers
53 papers found • Page 1 of 2
A Multimodal BiMamba Network with Test-Time Adaptation for Emotion Recognition Based on Physiological Signals
Ziyu Jia, Tingyu Du, Zhengyu Tian et al.
Breakthrough Sensor-Limited Single View: Towards Implicit Temporal Dynamics for Time Series Domain Adaptation
Mingyang Liu, Xinyang Chen, Xiucheng Li et al.
Bridging Critical Gaps in Convergent Learning: How Representational Alignment Evolves Across Layers, Training, and Distribution Shifts
Chaitanya Kapoor, Sudhanshu Srivastava, Meenakshi Khosla
CONDA: Adaptive Concept Bottleneck for Foundation Models Under Distribution Shifts
Jihye Choi, Jayaram Raghuram, Yixuan Li et al.
Conformal Prediction under Lévy-Prokhorov Distribution Shifts: Robustness to Local and Global Perturbations
Liviu Aolaritei, Julie Zhu, Oliver Wang et al.
COUNTS: Benchmarking Object Detectors and Multimodal Large Language Models under Distribution Shifts
Jiansheng Li, Xingxuan Zhang, Hao Zou et al.
D2SA: Dual-Stage Distribution and Slice Adaptation for Efficient Test-Time Adaptation in MRI Reconstruction
Lipei Zhang, Rui Sun, Zhongying Deng et al.
Directional Gradient Projection for Robust Fine-Tuning of Foundation Models
Chengyue Huang, Junjiao Tian, Brisa Maneechotesuwan et al.
Enhancing Deep Batch Active Learning for Regression with Imperfect Data Guided Selection
Yinjie Min, Furong Xu, Xinyao Li et al.
Exploring the Noise Robustness of Online Conformal Prediction
HuaJun Xi, Kangdao Liu, Hao Zeng et al.
Going Beyond Static: Understanding Shifts with Time-Series Attribution
Jiashuo Liu, Nabeel Seedat, Peng Cui et al.
Matcha: Mitigating Graph Structure Shifts with Test-Time Adaptation
Wenxuan Bao, Zhichen Zeng, Zhining Liu et al.
MINGLE: Mixture of Null-Space Gated Low-Rank Experts for Test-Time Continual Model Merging
Zihuan Qiu, Yi Xu, Chiyuan He et al.
Mint: A Simple Test-Time Adaptation of Vision-Language Models against Common Corruptions
Wenxuan Bao, Ruxi Deng, Jingrui He
OCRT: Boosting Foundation Models in the Open World with Object-Concept-Relation Triad
Luyao Tang, Chaoqi Chen, Yuxuan Yuan et al.
Quantifying Uncertainty in the Presence of Distribution Shifts
Yuli Slavutsky, David Blei
RA-TTA: Retrieval-Augmented Test-Time Adaptation for Vision-Language Models
Youngjun Lee, Doyoung Kim, Junhyeok Kang et al.
Rethinking Fair Representation Learning for Performance-Sensitive Tasks
Charles Jones, Fabio De Sousa Ribeiro, Mélanie Roschewitz et al.
Rethinking Graph Prompts: Unraveling the Power of Data Manipulation in Graph Neural Networks
Chenyi Zi, Bowen Liu, Xiangguo Sun et al.
TimeEmb: A Lightweight Static-Dynamic Disentanglement Framework for Time Series Forecasting
Mingyuan Xia, Chunxu Zhang, Zijian Zhang et al.
Tracing the Roots: Leveraging Temporal Dynamics in Diffusion Trajectories for Origin Attribution
Andreas Floros, Seyed-Mohsen Moosavi-Dezfooli, Pier Luigi Dragotti
TS-RAG: Retrieval-Augmented Generation based Time Series Foundation Models are Stronger Zero-Shot Forecaster
Kanghui Ning, Zijie Pan, Yu Liu et al.
Uncertainty-Informed Meta Pseudo Labeling for Surrogate Modeling with Limited Labeled Data
Xingyu Ren, Pengwei Liu, Pengkai Wang et al.
Universal generalization guarantees for Wasserstein distributionally robust models
Tam Le, Jerome Malick
An Empirical Study Into What Matters for Calibrating Vision-Language Models
Weijie Tu, Weijian Deng, Dylan Campbell et al.
CLAP: Isolating Content from Style through Contrastive Learning with Augmented Prompts
Yichao Cai, Yuhang Liu, Zhen Zhang et al.
COALA: A Practical and Vision-Centric Federated Learning Platform
Weiming Zhuang, Jian Xu, Chen Chen et al.
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts
Yuzheng Wang, Dingkang Yang, Zhaoyu Chen et al.
Density-Softmax: Efficient Test-time Model for Uncertainty Estimation and Robustness under Distribution Shifts
Ha Manh Bui, Anqi Liu
Disentangled Graph Self-supervised Learning for Out-of-Distribution Generalization
Haoyang Li, Xin Wang, Zeyang Zhang et al.
DQ-LoRe: Dual Queries with Low Rank Approximation Re-ranking for In-Context Learning
Jing Xiong, Zixuan Li, Chuanyang Zheng et al.
Efficient Diffusion-Driven Corruption Editor for Test-Time Adaptation
Yeongtak Oh, Jonghyun Lee, Jooyoung Choi et al.
Efficient Test-Time Adaptation of Vision-Language Models
Adilbek Karmanov, Dayan Guan, Shijian Lu et al.
Feature Contamination: Neural Networks Learn Uncorrelated Features and Fail to Generalize
Tianren Zhang, Chujie Zhao, Guanyu Chen et al.
FedRC: Tackling Diverse Distribution Shifts Challenge in Federated Learning by Robust Clustering
Yongxin Guo, Xiaoying Tang, Tao Lin
Few-shot Adaptation to Distribution Shifts By Mixing Source and Target Embeddings
Yihao Xue, Ali Payani, Yu Yang et al.
Graph Invariant Learning with Subgraph Co-mixup for Out-of-Distribution Generalization
Tianrui Jia, Haoyang Li, Cheng Yang et al.
Graph Structure Extrapolation for Out-of-Distribution Generalization
Xiner Li, Shurui Gui, Youzhi Luo et al.
How Do Nonlinear Transformers Learn and Generalize in In-Context Learning?
Hongkang Li, Meng Wang, Songtao Lu et al.
Improving Out-of-Distribution Generalization in Graphs via Hierarchical Semantic Environments
Yinhua Piao, Sangseon Lee, Yijingxiu Lu et al.
IW-GAE: Importance weighted group accuracy estimation for improved calibration and model selection in unsupervised domain adaptation
Taejong Joo, Diego Klabjan
Learning by Erasing: Conditional Entropy Based Transferable Out-of-Distribution Detection
Meng Xing, Zhiyong Feng, Yong Su et al.
Learning to Intervene on Concept Bottlenecks
David Steinmann, Wolfgang Stammer, Felix Friedrich et al.
Measuring Stochastic Data Complexity with Boltzmann Influence Functions
Nathan Ng, Roger Grosse, Marzyeh Ghassemi
MedBN: Robust Test-Time Adaptation against Malicious Test Samples
Hyejin Park, Jeongyeon Hwang, Sunung Mun et al.
Multiply Robust Estimation for Local Distribution Shifts with Multiple Domains
Steven Wilkins-Reeves, Xu Chen, Qi Ma et al.
Online Adaptive Anomaly Thresholding with Confidence Sequences
Sophia Sun, Abishek Sankararaman, Balakrishnan Narayanaswamy
RetroOOD: Understanding Out-of-Distribution Generalization in Retrosynthesis Prediction
Yemin Yu, Luotian Yuan, Ying Wei et al.
Selective Mixup Helps with Distribution Shifts, But Not (Only) because of Mixup
Damien Teney, Jindong Wang, Ehsan Abbasnejad
Statistical Inference Under Constrained Selection Bias
Santiago Cortes-Gomez, Mateo Dulce Rubio, Carlos Miguel Patiño et al.