Spectrum-Aware Parameter Efficient Fine-Tuning for Diffusion Models

X Zhang, S Wen, L Han, F Juefei-Xu… - arXiv preprint arXiv:2405.21050, 2024 - arxiv.org
Adapting large-scale pre-trained generative models in a parameter-efficient manner is gaining traction. Traditional methods such as low-rank adaptation (LoRA) achieve parameter efficiency by imposing constraints, but may not be optimal for tasks requiring high representation capacity. We propose a novel spectrum-aware adaptation framework for generative models. Our method adjusts both the singular values and the singular basis vectors of pretrained weights. Using the Kronecker product and efficient Stiefel optimizers, we achieve parameter-efficient adaptation of orthogonal matrices. We introduce Spectral Orthogonal Decomposition Adaptation (SODA), which balances computational efficiency and representation capacity. Extensive evaluations on text-to-image diffusion models demonstrate SODA's effectiveness, offering a spectrum-aware alternative to existing fine-tuning methods.
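The core idea described in the abstract can be illustrated with a minimal numpy sketch. This is not the authors' implementation: it assumes the adaptation takes the form of a learnable shift on the singular values plus small orthogonal rotations of the singular bases, with each rotation built as a Kronecker product of two smaller orthogonal factors (here produced by a Cayley map; the paper uses Stiefel optimizers instead). All names (`delta_S`, `R_u`, `R_v`) are hypothetical.

```python
import numpy as np

def cayley(M):
    """Map an arbitrary square matrix to an orthogonal one
    via the Cayley transform of its skew-symmetric part."""
    A = M - M.T                      # skew-symmetric
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) + A, np.eye(n) - A)

rng = np.random.default_rng(0)

# Pretrained weight and its spectral decomposition
W = rng.standard_normal((8, 8))
U, S, Vt = np.linalg.svd(W)

# Hypothetical spectrum-aware update:
# (1) tune the spectrum directly,
# (2) rotate the singular bases with orthogonal matrices that are
#     parameter-efficient Kronecker products of small factors (4x4 kron 2x2 = 8x8)
delta_S = 0.1 * rng.standard_normal(8)
R_u = np.kron(cayley(rng.standard_normal((4, 4))),
              cayley(rng.standard_normal((2, 2))))
R_v = np.kron(cayley(rng.standard_normal((4, 4))),
              cayley(rng.standard_normal((2, 2))))

# Adapted weight: rotated bases, shifted singular values
W_adapted = (U @ R_u) @ np.diag(S + delta_S) @ (R_v.T @ Vt)

# The rotated basis stays orthogonal, so the update remains on the Stiefel manifold
assert np.allclose((U @ R_u).T @ (U @ R_u), np.eye(8), atol=1e-8)
```

The Kronecker structure is what makes this parameter-efficient: an n x n orthogonal rotation is parameterized by two factors with roughly n parameters in total rather than n squared.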