"self-attention analysis" Papers
2 papers found
How Smooth Is Attention?
Valérie Castin, Pierre Ablin, Gabriel Peyré
ICML 2024 · arXiv:2312.14820
29 citations
OPERA: Alleviating Hallucination in Multi-Modal Large Language Models via Over-Trust Penalty and Retrospection-Allocation
Qidong Huang, Xiaoyi Dong, Pan Zhang et al.
CVPR 2024 (highlight) · arXiv:2311.17911
385 citations