"low-rank adaptation" Papers

86 papers found • Page 1 of 2

Accurate and Efficient Low-Rank Model Merging in Core Space

Aniello Panariello, Daniel Marczak, Simone Magistri et al.

NEURIPS 2025 • arXiv:2509.17786
3 citations

AuroRA: Breaking Low-Rank Bottleneck of LoRA with Nonlinear Mapping

Haonan Dong, Wenhao Zhu, Guojie Song et al.

NEURIPS 2025 (spotlight) • arXiv:2505.18738
2 citations

A Wander Through the Multimodal Landscape: Efficient Transfer Learning via Low-rank Sequence Multimodal Adapter

Zirun Guo, Xize Cheng, Yangyang Wu et al.

AAAI 2025 (paper) • arXiv:2412.08979
3 citations

Cached Multi-Lora Composition for Multi-Concept Image Generation

Xiandong Zou, Mingzhu Shen, Christos-Savvas Bouganis et al.

ICLR 2025 • arXiv:2502.04923
6 citations

CL-LoRA: Continual Low-Rank Adaptation for Rehearsal-Free Class-Incremental Learning

Jiangpeng He, Zhihao Duan, Fengqing Zhu

CVPR 2025 • arXiv:2505.24816
8 citations

Continual Multiple Instance Learning with Enhanced Localization for Histopathological Whole Slide Image Analysis

Byung Hyun Lee, Wongi Jeong, Woojae Han et al.

ICCV 2025 • arXiv:2507.02395
2 citations

Continuous Subspace Optimization for Continual Learning

Quan Cheng, Yuanyu Wan, Lingyu Wu et al.

NEURIPS 2025 • arXiv:2505.11816
1 citation

Contrastive Test-Time Composition of Multiple LoRA Models for Image Generation

Tuna Meral, Enis Simsar, Federico Tombari et al.

ICCV 2025 (highlight) • arXiv:2403.19776
5 citations

Correlated Low-Rank Adaptation for ConvNets

Wu Ran, Weijia Zhang, ShuYang Pang et al.

NEURIPS 2025

dEBORA: Efficient Bilevel Optimization-based low-Rank Adaptation

Emanuele Zangrando, Sara Venturini, Francesco Rinaldi et al.

ICLR 2025

Decoupling Angles and Strength in Low-rank Adaptation

Massimo Bini, Leander Girrbach, Zeynep Akata

ICLR 2025 • arXiv:2503.18225
8 citations

DEFT: Decompositional Efficient Fine-Tuning for Text-to-Image Models

Komal Kumar, Rao Anwer, Fahad Shahbaz Khan et al.

NEURIPS 2025 • arXiv:2509.22793

Densely Connected Parameter-Efficient Tuning for Referring Image Segmentation

Jiaqi Huang, Zunnan Xu, Ting Liu et al.

AAAI 2025 (paper) • arXiv:2501.08580
21 citations

Drag-and-Drop LLMs: Zero-Shot Prompt-to-Weights

Zhiyuan Liang, Dongwen Tang, Yuhao Zhou et al.

NEURIPS 2025 • arXiv:2506.16406
3 citations

Ensembles of Low-Rank Expert Adapters

Yinghao Li, Vianne Gao, Chao Zhang et al.

ICLR 2025 • arXiv:2502.00089
6 citations

Evidential Knowledge Distillation

Liangyu Xiang, Junyu Gao, Changsheng Xu

ICCV 2025 • arXiv:2507.18366
1 citation

Federated Residual Low-Rank Adaption of Large Language Models

Yunlu Yan, Chun-Mei Feng, Wangmeng Zuo et al.

ICLR 2025
8 citations

GraLoRA: Granular Low-Rank Adaptation for Parameter-Efficient Fine-Tuning

Yeonjoon Jung, Daehyun Ahn, Hyungjun Kim et al.

NEURIPS 2025 (spotlight) • arXiv:2505.20355
2 citations

HMVLM: Human Motion-Vision-Language Model via MoE LoRA

Lei Hu, Yongjing Ye, Shihong Xia

NEURIPS 2025

Image Referenced Sketch Colorization Based on Animation Creation Workflow

Dingkun Yan, Xinrui Wang, Zhuoru Li et al.

CVPR 2025 • arXiv:2502.19937
3 citations

Latent Space Factorization in LoRA

Shashi Kumar, Yacouba Kaloga, John Mitros et al.

NEURIPS 2025 • arXiv:2510.19640

LoRA3D: Low-Rank Self-Calibration of 3D Geometric Foundation Models

Ziqi Lu, Heng Yang, Danfei Xu et al.

ICLR 2025 • arXiv:2412.07746
11 citations

LoRA Done RITE: Robust Invariant Transformation Equilibration for LoRA Optimization

Jui-Nan Yen, Si Si, Zhao Meng et al.

ICLR 2025 • arXiv:2410.20625
16 citations

LoRA-EnVar: Parameter-Efficient Hybrid Ensemble Variational Assimilation for Weather Forecasting

Yi Xiao, Hang Fan, Kun Chen et al.

NEURIPS 2025 (oral)

LoRA-FAIR: Federated LoRA Fine-Tuning with Aggregation and Initialization Refinement

Jieming Bian, Lei Wang, Letian Zhang et al.

ICCV 2025 • arXiv:2411.14961
18 citations

LoRA Learns Less and Forgets Less

Jonathan Frankle, Jose Javier Gonzalez Ortiz, Cody Blakeney et al.

ICLR 2025 • arXiv:2405.09673
252 citations

LoRA-Pro: Are Low-Rank Adapters Properly Optimized?

Zhengbo Wang, Jian Liang, Ran He et al.

ICLR 2025 • arXiv:2407.18242
54 citations

LoRA Subtraction for Drift-Resistant Space in Exemplar-Free Continual Learning

Xuan Liu, Xiaobin Chang

CVPR 2025 • arXiv:2503.18985
9 citations

LoRA vs Full Fine-tuning: An Illusion of Equivalence

Reece Shuttleworth, Jacob Andreas, Antonio Torralba et al.

NEURIPS 2025 • arXiv:2410.21228
70 citations

LoRA-X: Bridging Foundation Models with Training-Free Cross-Model Adaptation

Farzad Farhadzadeh, Debasmit Das, Shubhankar Borse et al.

ICLR 2025 • arXiv:2501.16559
6 citations

LuxDiT: Lighting Estimation with Video Diffusion Transformer

Ruofan Liang, Kai He, Zan Gojcic et al.

NEURIPS 2025 • arXiv:2509.03680
5 citations

Magical: Medical Lay Language Generation via Semantic Invariance and Layperson-tailored Adaptation

Weibin Liao, Tianlong Wang, Yinghao Zhu et al.

NEURIPS 2025 • arXiv:2508.08730
1 citation

Merging LoRAs like Playing LEGO: Pushing the Modularity of LoRA to Extremes Through Rank-Wise Clustering

Ziyu Zhao, Tao Shen, Didi Zhu et al.

ICLR 2025 • arXiv:2409.16167
35 citations

MeteoRA: Multiple-tasks Embedded LoRA for Large Language Models

Jingwei Xu, Junyu Lai, Yunpeng Huang

ICLR 2025 • arXiv:2405.13053
13 citations

MokA: Multimodal Low-Rank Adaptation for MLLMs

Yake Wei, Yu Miao, Dongzhan Zhou et al.

NEURIPS 2025 (oral) • arXiv:2506.05191
1 citation

MoORE: SVD-based Model MoE-ization for Conflict- and Oblivion-Resistant Multi-Task Adaptation

Shen Yuan, Yin Zheng, Taifeng Wang et al.

NEURIPS 2025 • arXiv:2506.14436
1 citation

Multi-Task Vehicle Routing Solver via Mixture of Specialized Experts under State-Decomposable MDP

Yuxin Pan, Zhiguang Cao, Chengyang GU et al.

NEURIPS 2025 • arXiv:2510.21453

On Large Language Model Continual Unlearning

Chongyang Gao, Lixu Wang, Kaize Ding et al.

ICLR 2025 • arXiv:2407.10223
30 citations

PaCA: Partial Connection Adaptation for Efficient Fine-Tuning

Sunghyeon Woo, Sol Namkung, SunWoo Lee et al.

ICLR 2025 • arXiv:2503.01905
3 citations

Parameter Efficient Fine-tuning via Explained Variance Adaptation

Fabian Paischer, Lukas Hauzenberger, Thomas Schmied et al.

NEURIPS 2025 • arXiv:2410.07170
6 citations

PLAN: Proactive Low-Rank Allocation for Continual Learning

Xiequn Wang, Zhan Zhuang, Yu Zhang

ICCV 2025 • arXiv:2510.21188

PointLoRA: Low-Rank Adaptation with Token Selection for Point Cloud Learning

Song Wang, Xiaolu Liu, Lingdong Kong et al.

CVPR 2025 • arXiv:2504.16023
4 citations

PoLAR: Polar-Decomposed Low-Rank Adapter Representation

Kai Lion, Liang Zhang, Bingcong Li et al.

NEURIPS 2025 • arXiv:2506.03133
5 citations

Provable Meta-Learning with Low-Rank Adaptations

Jacob Block, Sundararajan Srinivasan, Liam Collins et al.

NEURIPS 2025 • arXiv:2410.22264

RAD: Region-Aware Diffusion Models for Image Inpainting

Sora Kim, Sungho Suh, Minsik Lee

CVPR 2025 • arXiv:2412.09191
8 citations

RaSA: Rank-Sharing Low-Rank Adaptation

Zhiwei He, Zhaopeng Tu, Xing Wang et al.

ICLR 2025 • arXiv:2503.12576
5 citations

Ravan: Multi-Head Low-Rank Adaptation for Federated Fine-Tuning

Arian Raje, Baris Askin, Divyansh Jhunjhunwala et al.

NEURIPS 2025 • arXiv:2506.05568
3 citations

RILQ: Rank-Insensitive LoRA-Based Quantization Error Compensation for Boosting 2-Bit Large Language Model Accuracy

Geonho Lee, Janghwan Lee, Sukjin Hong et al.

AAAI 2025 (paper) • arXiv:2412.01129
5 citations

Robust Federated Finetuning of LLMs via Alternating Optimization of LoRA

Shuangyi Chen, Yuanxin Guo, Yue Ju et al.

NEURIPS 2025 • arXiv:2502.01755
7 citations

SD-LoRA: Scalable Decoupled Low-Rank Adaptation for Class Incremental Learning

Yichen Wu, Hongming Piao, Long-Kai Huang et al.

ICLR 2025 • arXiv:2501.13198
34 citations