Exploring the Essence of LoRA: A Low-Rank Projection of Full Gradients
Article Title: FLORA: Low-Rank Adapters Are Secretly Gradient Compressors
Article Link: https://arxiv.org/pdf/2402.03293

This paper not only introduces a brand-new high-rank efficient fine-tuning algorithm but also offers a deep interpretation of the essence of LoRA: LoRA is a low-rank projection of full gradients. In fact, this perspective is not surprising; we can see traces of this viewpoint in …
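The "LoRA is a low-rank projection of full gradients" view follows directly from the chain rule: if the adapted weight is \(W = W_0 + BA\), then the gradients of the factors are \(\partial L/\partial B = G A^\top\) and \(\partial L/\partial A = B^\top G\), where \(G = \partial L/\partial W\) is the full-weight gradient. A minimal NumPy sketch (the linear loss and all variable names here are illustrative assumptions, not from the paper) verifies this numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out, r = 4, 6, 5, 2

X = rng.normal(size=(n, d_in))
W0 = rng.normal(size=(d_in, d_out))   # frozen pretrained weight
B = rng.normal(size=(d_in, r))        # LoRA down-projection factor
A = rng.normal(size=(r, d_out))       # LoRA up-projection factor
C = rng.normal(size=(n, d_out))       # fixed matrix defining a toy loss

# Toy loss L = <X (W0 + B A), C>; the full-weight gradient dL/dW is G = X^T C.
G = X.T @ C

# Chain rule: the LoRA factors only ever receive low-rank projections of G.
dB = G @ A.T          # dL/dB = (dL/dW) A^T
dA = B.T @ G          # dL/dA = B^T (dL/dW)

# Finite-difference check of one entry of dB (the loss is linear in B,
# so the difference quotient matches the gradient up to float error).
eps = 1e-6
i, j = 1, 0
Bp = B.copy()
Bp[i, j] += eps
L0 = np.sum(X @ (W0 + B @ A) * C)
L1 = np.sum(X @ (W0 + Bp @ A) * C)
assert abs((L1 - L0) / eps - dB[i, j]) < 1e-4
```

Because `dB` and `dA` are formed only from `G @ A.T` and `B.T @ G`, training the adapters never needs more than a rank-`r` view of the full gradient, which is exactly the "gradient compressor" reading.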