Chengting Yu
4 papers · 19 total citations

Papers (4)
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks
CVPR 2025 · arXiv · 10 citations

Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement
CVPR 2025 · arXiv · 5 citations

Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment
ICML 2025 · arXiv · 4 citations

Enhanced Self-Distillation Framework for Efficient Spiking Neural Network Training
NeurIPS 2025 · arXiv · 0 citations