Ultra Lightweight! AI Model Compiler MegCC Goes Open Source, Shrinking Inference Engine Size
The community already offers a number of mobile deep learning inference frameworks (such as NCNN and MNN), which make it much easier to deploy deep learning models on mobile devices. However, these frameworks share a common problem: as they continue to iterate and optimize for performance, their runtime libraries steadily grow in size.