The field of artificial intelligence (AI) has seen growing interest in large models. As the name suggests, large models are deep learning models with vast numbers of parameters and highly complex structures. Their emergence has not only driven breakthroughs in AI technology but has also brought revolutionary changes to a wide range of industries.
Experiencing large models on the iTOP-RK3588 development board
To let users experience RKLLM more quickly, Beijing Xunwei has upgraded the NPU driver in the Linux kernel source code to the latest version, 0.9.6, as detailed below:
By default, the RKLLM dynamic library is integrated into the Ubuntu and Debian systems, so users can directly copy the converted RKLLM large language models and inference programs to the board for testing. For detailed steps on RKLLM model conversion and testing, please refer to the NPU manual.
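To illustrate the conversion step, here is a minimal sketch (Python, run inside the RKLLM-Toolkit environment described in Chapter 8) that converts a Hugging Face format model into an .rkllm file for the RK3588. The model path, quantization settings, and output filename are placeholders, and parameter names can differ between RKLLM-Toolkit releases, so treat Chapter 8 of the NPU manual as the authoritative reference.

    # Sketch only: convert a Hugging Face model to .rkllm with RKLLM-Toolkit.
    # Paths, model choice, and quantization settings are placeholders.
    from rkllm.api import RKLLM

    llm = RKLLM()

    # Load a Hugging Face format model (placeholder path)
    ret = llm.load_huggingface(model='./Qwen-1_8B-Chat')
    assert ret == 0, 'model load failed'

    # Quantize and build for the RK3588 NPU (example settings)
    ret = llm.build(do_quantization=True, quantized_dtype='w8a8',
                    target_platform='rk3588')
    assert ret == 0, 'build failed'

    # Export the converted model as an .rkllm file
    ret = llm.export_rkllm('./qwen-1_8b-w8a8.rkllm')
    assert ret == 0, 'export failed'

After conversion, the .rkllm model and the compiled inference program are copied to the development board and run there, following sections 8.4 and 8.5 of the manual.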
Update Content
■《iTOP-RK3588 Development Board NPU User Manual》v1.1
Added Chapter 8 on RKLLM large language model testing
■ How to obtain: Contact customer service to join the RK3588 after-sales group
■ Purchase link:
https://m.tb.cn/h.5EZ0ev1LMbQdsiI?tk=rrl8WmlLtmr
Tutorial Directory
Chapter 1 Hello! NPU
1.1 The Birth of NPU!
1.2 Getting to Know RKNPU
Chapter 2 Preparing the RKNPU Development Environment
2.1 Development Environment
2.2 Software Architecture
2.3 SDK Description
Chapter 3 Getting NPU Running
3.1 Using NPU in Linux
3.1.1 Setting Up Cross Compiler
3.1.2 Modifying Compiler Tool Path
3.1.3 Updating RKNN Model
3.1.4 Compiling Demo
3.1.5 Running Demo on Development Board
3.2 Using NPU in Android
3.2.1 Downloading Required Tools for Compilation
3.2.2 Modifying Compiler Tool Path
3.2.3 Updating RKNN Model
3.2.4 Compiling Demo
3.2.5 Running Demo on Development Board
Chapter 4 Experiencing RKNN_DEMO
4.1 rknn_ssd_demo Experience
4.2 rknn_api_demo Experience
4.3 rknn_multiple_input_demo Experience
Chapter 5 Model Conversion
5.1 RKNN-Toolkit2 Introduction
5.2 Setting Up RKNN-Toolkit2 Environment
5.2.1 Installing Miniconda
5.2.2 Creating RKNN Virtual Environment
5.2.3 Installing Pycharm
5.2.4 Configuring Pycharm
5.3 Using RKNN-Toolkit2 Tools
5.3.1 Running Model in Emulator
5.3.2 Running Model on RK3588 Development Board
Chapter 6 Other Model Conversions
6.1 Using TensorFlow Framework
6.2 Using Caffe Framework
6.3 Using TFLite Framework
6.4 Using ONNX Framework
6.5 Using Darknet Framework
6.6 Using PyTorch Framework
Chapter 7 Using RKNN-Toolkit-lite2
7.1 Main Features Description
7.2 Steps to Set Up Environment
7.2.1 Installing Miniconda
7.2.2 Creating RKNN Virtual Environment
7.2.3 Installing the RKNN-Toolkit-lite2 Package
7.2.4 Installing OpenCV
7.3 Running Test Program
Chapter 8 RKLLM Large Language Model Testing
8.1 Introduction to RKLLM-Toolkit
8.2 Setting Up RKLLM-Toolkit Environment
8.2.1 Installing Miniconda
8.2.2 Creating RKLLM Virtual Environment
8.3 Large Language Model Conversion
8.4 Compiling Inference Program
8.5 Running Tests on Development Board
Video Showcase
■ Follow the “Xunwei Electronics” WeChat public account for tutorials, materials, industry insights, and product resources.
■ For more information about Xunwei, please contact us:
Sales Engineer: 010-8527-0708 Extension 1
Technical Support: 010-8527-0708 Extension 2
Custom Solutions: 010-8527-0708 Extension 3
■ Real-time technical support:
9:00–12:00 and 13:30–17:30 (Monday to Saturday)
Technical Group Chat【824412014】
END
Xunwei Electronics
Making learning easier and development simpler
【RK3588 Adaptation to Android 13】 【Hongmeng Learning Development Tutorial】
【RKNPU2 Three Major Projects in Practice】 【RK3568 Driver Guide Issue 14】