zetton-inference

English | 中文

Table of Contents

Introduction

zetton-inference is an open source package for deep learning inference. It is part of Project Zetton.

Major features
  • Modular Design: zetton-inference is designed to be modular, so new inference nodes can be added to the package with little effort (a brief usage sketch follows this list).

  • Multiple Framework Support: zetton-inference supports multiple deep learning inference frameworks, such as ONNX, TensorRT, RKNN, and OpenVINO.

  • High Efficiency: zetton-inference is designed to be highly efficient, so inference nodes can be deployed easily to GPU servers or embedded devices.

  • State-of-the-art Algorithms: zetton-inference provides state-of-the-art algorithms for tasks such as object detection and object tracking.
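
As a rough illustration of how a modular, backend-agnostic inference node might be used from C++, here is a minimal sketch. The headers, the RuntimeOption type, the YOLOv5Detector class, and the UseTensorRT/Predict calls are assumptions made purely for illustration and may not match the real API; please follow get_started.md for actual usage.

```cpp
// Hypothetical usage sketch -- all zetton-inference names below are assumed
// for illustration and may differ from the real API (see get_started.md).
#include <iostream>

#include <opencv2/opencv.hpp>

#include "zetton/inference/runtime_option.h"   // assumed header
#include "zetton/inference/yolov5_detector.h"  // assumed header

int main() {
  // Select an inference backend at runtime (e.g. TensorRT instead of ONNX).
  zetton::inference::RuntimeOption option;      // assumed type
  option.model_file = "yolov5s.onnx";           // assumed field
  option.UseTensorRT();                         // assumed backend switch

  // Construct the detection node; thanks to the modular design, other
  // detectors (YOLOX, YOLOv7, ...) would plug in the same way.
  zetton::inference::YOLOv5Detector detector(option);  // assumed class

  cv::Mat image = cv::imread("demo.jpg");
  auto result = detector.Predict(image);        // assumed entry point
  std::cout << "detections: " << result.boxes.size() << std::endl;
  return 0;
}
```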

What's New

  • (2022-09-19) TensorRT-based inference nodes were moved to zetton-inference-tensorrt.
  • (2022-10-08) This repo was converted into a pure CMake package.

Please refer to changelog.md for details and release history.

For compatibility changes between different versions of zetton-inference, please refer to compatibility.md.

Installation

Please refer to Installation for installation instructions.

Getting Started

Please see get_started.md for the basic usage of zetton-inference.

Overview of Benchmark and Model Zoo

| Task      | Model     | ONNX | TensorRT | RKNN | OpenVINO |
| --------- | --------- | ---- | -------- | ---- | -------- |
| Detection | YOLOv5    |      |          |      |          |
| Detection | YOLOX     |      |          |      |          |
| Detection | YOLOv7    |      |          |      |          |
| Tracking  | DeepSORT  | /    | /        | /    | /        |
| Tracking  | ByteTrack | /    | /        | /    | /        |
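
For the tracking entries above, detections from one of the detection models are typically fed to the tracker frame by frame. The sketch below outlines such a pipeline; the ByteTracker class and the Predict/Update calls are hypothetical names used only to illustrate the flow, not the confirmed API.

```cpp
// Hypothetical detection + tracking pipeline -- ByteTracker, Update, and the
// headers below are illustrative names, not the confirmed zetton-inference API.
#include <opencv2/opencv.hpp>

#include "zetton/inference/byte_tracker.h"      // assumed header
#include "zetton/inference/yolov5_detector.h"   // assumed header

int main() {
  zetton::inference::YOLOv5Detector detector;   // assumed default setup
  zetton::inference::ByteTracker tracker;       // assumed class

  cv::VideoCapture cap("demo.mp4");
  cv::Mat frame;
  while (cap.read(frame)) {
    // Detect objects in the current frame, then associate them with tracks.
    auto detections = detector.Predict(frame);  // assumed entry point
    auto tracks = tracker.Update(detections);   // assumed entry point
    // ... draw or publish `tracks` here ...
  }
  return 0;
}
```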

FAQ

Please refer to FAQ for frequently asked questions.

Contributing

We appreciate all contributions to improve zetton-inference. Please refer to CONTRIBUTING.md for the contributing guidelines.

Acknowledgement

We appreciate all the contributors who implement their methods or add new features, as well as users who give valuable feedback. We hope that the package and benchmark can serve the growing research and production community by providing a flexible toolkit to deploy models.

License

  • For academic use, this project is licensed under the 2-clause BSD License; please see the LICENSE file for details.
  • For commercial use, please contact Yusu Pan.

Related Projects