LoRA+MoE: A Historical Interpretation of the Combination of Low-Rank Matrices and Multi-Task Learning

Kaggle Competition Guide | Author: Elvin Loves to Ask, excerpted from Zhai Ma

This article introduces several works that combine LoRA and MoE, which we hope readers will find helpful.

1. MoV and MoLoRA

Paper: 2023 | …