Huaxiu Yao
Affiliation: University of North Carolina at Chapel Hill
19 papers · 622 total citations

Papers (19)
HALC: Object Hallucination Reduction via Adaptive Focal-Contrast Decoding
ICML 2024 · arXiv · 142 citations

Wild-Time: A Benchmark of in-the-Wild Distribution Shift over Time
NeurIPS 2022 · arXiv · 108 citations

C-Mixup: Improving Generalization in Regression
NeurIPS 2022 · arXiv · 87 citations

Multimodal Representation Learning by Alternating Unimodal Adaptation
CVPR 2024 · arXiv · 80 citations

Meta-learning with an Adaptive Task Scheduler
NeurIPS 2021 · arXiv · 53 citations

Conformal Prediction for Deep Classifier via Label Ranking
ICML 2024 · arXiv · 43 citations

ReAgent-V: A Reward-Driven Multi-Agent Framework for Video Understanding
NeurIPS 2025 · arXiv · 29 citations

Online Structured Meta-learning
NeurIPS 2020 · arXiv · 28 citations

MMedPO: Aligning Medical Vision-Language Models with Clinical-Aware Multimodal Preference Optimization
ICML 2025 · arXiv · 18 citations

Improving Domain Generalization with Domain Relations
ICLR 2024 · arXiv · 17 citations

Weak-for-Strong: Training Weak Meta-Agent to Harness Strong Executors
COLM 2025 · arXiv · 11 citations

Generating Chain-of-Thoughts with a Pairwise-Comparison Approach to Searching for the Most Promising Intermediate Thought
ICML 2024 · arXiv · 4 citations

FactTest: Factuality Testing in Large Language Models with Finite-Sample and Distribution-Free Guarantees
ICML 2025 · arXiv · 2 citations

An Iterative Self-Learning Framework for Medical Domain Generalization
NeurIPS 2023 · 0 citations

GRASP: Navigating Retrosynthetic Planning with Goal-driven Policy
NeurIPS 2022 · 0 citations

Meta-Learning with Neural Bandit Scheduler
NeurIPS 2023 · 0 citations

Position: TrustLLM: Trustworthiness in Large Language Models
ICML 2024 · 0 citations

Functionally Regionalized Knowledge Transfer for Low-resource Drug Discovery
NeurIPS 2021 · 0 citations

One Meta-tuned Transformer is What You Need for Few-shot Learning
ICML 2024 · 0 citations