Available on Google Play. Subscribe to the news Telegram channel. Discuss this repo, frameworks, and neural networks on mobile in the Telegram group.
Frameworks:
- MNN by Alibaba (releases)
- NCNN by Tencent (releases)
- TFLite by Google (releases)
- TFMobile by Google (releases)
- PyTorch by Facebook (releases)
- OpenCV DNN (releases)
- onnxruntime by Microsoft (how to build)
- [?] Mace by Xiaomi
- Tengine Lite by OPEN AI LAB
- TNN by Tencent
- NeoML by ABBYY
- [?] SNPE by Qualcomm
- HiAI by Huawei
- NeuroPilot SDK by MediaTek
- Paddle-Lite by Baidu
- Samsung Neural SDK (if they approve my request)
(versions are specified in corresponding *Framework classes)
Questionable/other:
- huawei-noah/bolt (not very popular?)
- JDAI-CV/dabnn (binary networks)
Features:
- Compare inference results between frameworks and against a desktop reference (see the sketch after this list)
- Visualize progress/results
- Publish to Google Play
- Collect results on a backend
- Website with aggregated results
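To make the comparison feature concrete, here is a minimal Kotlin sketch of one way to compare outputs: compute the maximum absolute element-wise difference between a framework's output tensor and a desktop reference. The function names, the flat FloatArray representation, and the tolerance are illustrative assumptions, not the project's actual API.

```kotlin
// Hypothetical helper: compares two flattened output tensors element-wise.
// The real project may instead compare top-k labels or use a different metric.
fun maxAbsDifference(expected: FloatArray, actual: FloatArray): Float {
    require(expected.size == actual.size) { "Output shapes differ" }
    var maxDiff = 0f
    for (i in expected.indices) {
        val diff = kotlin.math.abs(expected[i] - actual[i])
        if (diff > maxDiff) maxDiff = diff
    }
    return maxDiff
}

// Example: flag a framework whose output drifts too far from the desktop reference.
fun isCloseToReference(reference: FloatArray, frameworkOutput: FloatArray, tolerance: Float = 1e-3f): Boolean =
    maxAbsDifference(reference, frameworkOutput) <= tolerance
```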
Models:
All models are floating point
Supported ABIs: armeabi-v7a, arm64-v8a. Some frameworks (e.g. TFLite) also support x86 and x86_64, but do such devices still exist in 2020?
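For reference, limiting the packaged native libraries to these ABIs is usually done with `abiFilters` in the Android Gradle configuration. The sketch below uses the Gradle Kotlin DSL; the project's own build files are Groovy `build.gradle`, so treat it as an illustration rather than the actual config.

```kotlin
// build.gradle.kts fragment (assumption: not this project's actual build file)
android {
    defaultConfig {
        ndk {
            // Package only the ABIs the benchmark targets.
            abiFilters += listOf("armeabi-v7a", "arm64-v8a")
        }
    }
}
```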
A detailed explanation of how to convert a model for each framework is available here.
There is a repo with Docker images containing prebuilt converters and other possibly necessary tools: gordinmitya/docker_that_framework
- Thanks to Rohithkvsp/OnnxRuntimeAndorid for sample code showing how to use onnxruntime with NNAPI!
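As a rough illustration (not this project's actual code), enabling the NNAPI execution provider through the onnxruntime Android/Java API can look roughly like this; `addNnapi()` is provided by recent onnxruntime-android releases, and the model path is a placeholder:

```kotlin
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession

// Sketch: create an onnxruntime session that offloads supported ops to NNAPI.
fun createNnapiSession(modelPath: String): OrtSession {
    val env = OrtEnvironment.getEnvironment()
    val options = OrtSession.SessionOptions().apply {
        addNnapi() // enable the NNAPI execution provider; CPU remains the fallback
    }
    return env.createSession(modelPath, options)
}
```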
Qualcomm prohibits redistribution of their libraries, so you have to register on their site and download them yourself. ¯\_(ツ)_/¯
- Register and download the zip from developer.qualcomm.com;
- Copy `android/snpe-release.aar` from the archive into `snpe/libs`.

OR compile without SNPE:
- Remove `, ':snpe'` from `settings.gradle`;
- Remove `implementation project(path: ':snpe')` from `app/build.gradle`;
- Remove any mentions of SNPE in MainActivity.kt.
The project itself and the code inside `ru.gordinmitya.*` packages are under the MIT license, as stated in the LICENSE file.
Code inside other packages (e.g. `org.opencv.*`) and some C++ code may be under other licenses.
ImageNet samples were taken from Kaggle.