Marco Mondelli
14 papers · 257 total citations

Papers (14)
Global Convergence of Deep Networks with One Wide Layer Followed by Pyramidal Topology
NeurIPS 2020 · arXiv · 78 citations

Deep Neural Collapse Is Provably Optimal for the Deep Unconstrained Features Model
NeurIPS 2023 · arXiv · 39 citations

Memorization and Optimization in Deep Neural Networks with Minimum Over-parameterization
NeurIPS 2022 · arXiv · 36 citations

PCA Initialization for Approximate Message Passing in Rotationally Invariant Models
NeurIPS 2021 · arXiv · 21 citations

The price of ignorance: how much does it cost to forget noise structure in low-rank matrix estimation?
NeurIPS 2022 · arXiv · 19 citations

High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws
ICLR 2025 · arXiv · 13 citations

Wide Neural Networks Trained with Weight Decay Provably Exhibit Neural Collapse
ICLR 2025 · arXiv · 10 citations

When Are Solutions Connected in Deep Networks?
NeurIPS 2021 · arXiv · 10 citations

How Spurious Features are Memorized: Precise Analysis for Random and NTK Features
ICML 2024 · arXiv · 9 citations

Spurious Correlations in High Dimensional Regression: The Roles of Regularization, Simplicity Bias and Over-Parameterization
ICML 2025 · arXiv · 6 citations

Towards Understanding the Word Sensitivity of Attention Layers: A Study via Random Features
ICML 2024 · arXiv · 6 citations

Test-Time Training Provably Improves Transformers as In-context Learners
ICML 2025 · arXiv · 5 citations

Neural Collapse is Globally Optimal in Deep Regularized ResNets and Transformers
NeurIPS 2025 · arXiv · 4 citations

Compression of Structured Data with Autoencoders: Provable Benefit of Nonlinearities and Depth
ICML 2024 · arXiv · 1 citation