English | 简体中文

Tengine


Introduction

Tengine is developed by OPEN AI LAB. The project meets the demand for fast, efficient deployment of deep-learning neural-network models on embedded devices. To achieve cross-platform deployment across AIoT applications, it reconstructs the original Tengine project in C and aggressively trims the framework for the limited resources of embedded devices. It also adopts a completely separated front-end/back-end design, which allows it to be ported and deployed onto CPU, GPU, NPU, and other heterogeneous computing units rapidly and conveniently. At the same time, it remains compatible with the original Tengine API and tmfile model format, reducing the cost of evaluation and migration.

The core code of Tengine Lite consists of 4 modules:

  • device: NN operator back-end module; currently provides CPU code, with GPU and NPU reference code to be open-sourced gradually;
  • scheduler: core component of the framework, covering NNIR, computational graphs, hardware resources, and the scheduling and execution of the model serializer;
  • operator: NN operator front-end module, which handles registration and initialization of NN operators;
  • serializer: model decoder, which decodes the binary tmfile format into serialized model parameters.
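Under this front-end/back-end split, a typical inference pass touches each module in turn: the serializer decodes the tmfile, the scheduler builds and runs the graph, and the device back-end executes the operators. A minimal sketch with the Tengine C API is shown below; the model file name and input shape are placeholders, and error handling is abbreviated.

```c
/* Illustrative sketch of the Tengine C API inference flow.
 * Assumes the Tengine library and headers are installed;
 * "mobilenet.tmfile" and the 1x3x224x224 shape are placeholders. */
#include <stdio.h>
#include "tengine/c_api.h"

int main(void)
{
    if (init_tengine() != 0)                  /* initialize the library */
        return -1;

    /* serializer: decode the binary tmfile into the in-memory graph */
    graph_t graph = create_graph(NULL, "tengine", "mobilenet.tmfile");
    if (graph == NULL)
        return -1;

    /* bind an input buffer to the graph's first input tensor */
    static float input_data[1 * 3 * 224 * 224];
    tensor_t input = get_graph_input_tensor(graph, 0, 0);
    set_tensor_buffer(input, input_data, sizeof(input_data));

    prerun_graph(graph);   /* scheduler allocates resources on the device back-end */
    run_graph(graph, 1);   /* execute the computational graph (blocking) */

    postrun_graph(graph);
    destroy_graph(graph);
    release_tengine();
    return 0;
}
```

The same call sequence works unchanged across back-ends; selecting a GPU or NPU device only changes how the graph is created, not the run loop.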

Architecture

Tengine Architecture

How to use

Compile

Example

  • The examples directory provides basic classification and detection use cases, which are updated continuously according to the needs raised in issues.

Model Zoo

Model Convert tool

  • Pre-compiled version: a pre-compiled model convert tool is provided for Linux;
  • Online convert tool: based on WebAssembly (models are converted locally in the browser; no private data is uploaded);
  • Source compilation: refer to the Tengine-Convert-Tools project; users can build the convert tool themselves.
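As a rough sketch of how the source-built convert tool is typically invoked (flag names follow the Tengine-Convert-Tools documentation; the file names below are placeholders):

```shell
# Convert a Caffe model to tmfile; -f names the source framework,
# -p/-m the source files, -o the output. Paths are placeholders.
./tm_convert_tool -f caffe \
    -p mobilenet.prototxt \
    -m mobilenet.caffemodel \
    -o mobilenet.tmfile
```

The resulting tmfile is what the serializer module decodes at load time.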

Speed assessment

  • Benchmark: a basic network speed-assessment tool; pull requests are welcome.

NPU Plugin

  • TIM-VX: VeriSilicon NPU user manual.

AutoKernel Plugin

Container

Roadmap

Acknowledgement

Tengine Lite drew ideas from, and was developed based on, the following projects:

License

FAQ

Tech Forum

About

Tengine is a lite, high-performance, modular inference engine for embedded devices, written in C (and 6 more languages) and licensed under Apache-2.0.