Yaodong Yu

Affiliation: EECS, UC Berkeley
13 papers, 582 total citations

Papers (13)
1. Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction. NeurIPS 2020 (arXiv). 234 citations.
2. White-Box Transformers via Sparse Rate Reduction. NeurIPS 2023 (arXiv). 124 citations.
3. Boundary Thickness and Robustness in Learning Models. NeurIPS 2020 (arXiv). 47 citations.
4. Robust Calibration with Multi-domain Temperature Scaling. NeurIPS 2022 (arXiv). 44 citations.
5. TCT: Convexifying Federated Learning using Bootstrapped Neural Tangent Kernels. NeurIPS 2022 (arXiv). 33 citations.
6. Token Statistics Transformer: Linear-Time Attention via Variational Rate Reduction. ICLR 2025 (arXiv). 28 citations.
7. ViP: A Differentially Private Foundation Model for Computer Vision. ICML 2024 (arXiv). 18 citations.
8. Masked Completion via Structured Diffusion with White-Box Transformers. ICLR 2024 (arXiv). 16 citations.
9. What You See is What You Get: Principled Deep Learning via Distributional Generalization. NeurIPS 2022 (arXiv). 11 citations.
10. A Global Geometric Analysis of Maximal Coding Rate Reduction. ICML 2024 (arXiv). 11 citations.
11. Differentially Private Representation Learning via Image Captioning. ICML 2024 (arXiv). 7 citations.
12. Adventurer: Optimizing Vision Mamba Architecture Designs for Efficiency. CVPR 2025 (arXiv). 5 citations.
13. Attention-Only Transformers via Unrolled Subspace Denoising. ICML 2025 (arXiv). 4 citations.