Efficient Transformer: SparseViT, Reassessing Activation Sparsity in High-Resolution ViT

Hi everyone, I am Lite. Recently, I shared Parts 1 through 19 of the Efficient Large Model Full-Stack Technology series, covering large model quantization and fine-tuning, efficient LLM inference, quantum computing, generative AI acceleration, and more. The content links are as follows: Efficient … Read more

Efficient Transformer for TinyML: Long-Short Distance Attention

Hi everyone, I am Lite. Recently, I shared Parts 1 through 19 of the Efficient Large Model Full-Stack Technology series, covering large model quantization and fine-tuning, efficient LLM inference, quantum computing, generative AI acceleration, and more. The content links are as follows: Efficient Large Model Full-Stack Technology … Read more

LoRA: Low-Rank Adaptation of Large Language Models

Paper Title: LoRA: Low-Rank Adaptation of Large Language Models. Authors: Edward J. Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, Weizhu Chen. Compiled by: Kuang Ji. Reviewed by: Los. Abstract: This article introduces LoRA, a new approach to adapting large models that sharply reduces the number of trainable parameters needed for … Read more
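
For context on the method this article summarizes, here is a minimal PyTorch-style sketch of LoRA's low-rank update; the class name, dimensions, and hyperparameters are illustrative, not the paper's released code. The pretrained weight W is frozen, and only two small rank-r matrices A and B are trained, so the effective weight becomes W + (alpha/r)·BA.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update (illustrative)."""

    def __init__(self, d_in: int, d_out: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        # Stand-in for a pretrained projection; its weights stay frozen.
        self.base = nn.Linear(d_in, d_out)
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        # Low-rank factors: A is Gaussian-initialized, B starts at zero,
        # so training begins from the unmodified pretrained behavior.
        self.A = nn.Parameter(torch.randn(r, d_in) * 0.01)
        self.B = nn.Parameter(torch.zeros(d_out, r))
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen base path plus the scaled low-rank delta: x @ A^T @ B^T.
        return self.base(x) + self.scale * (x @ self.A.T) @ self.B.T

# Only A and B are trained: 2 * r * d parameters instead of d * d.
layer = LoRALinear(d_in=768, d_out=768, r=8)
y = layer(torch.randn(4, 768))
```

Because B is zero-initialized, the adapter contributes nothing at the start of fine-tuning, and the trained A and B can later be merged into the base weight, so inference incurs no extra latency, a property the paper emphasizes.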

How to Build a High-Precision Rotary Transformer Simulator with Fault Injection Features?

Because rotary transformers (resolvers) maintain excellent reliability and high-precision performance over long periods in harsh environments, they are widely used in … Read more