SIG-DLLite-Micro


Note: This SIG follows the conventions described in the README of the OpenHarmony PMC governance charter.

SIG Work Goals and Scope

Work Goals

DLLite-Micro is a lightweight AI inference framework that supports deep-model inference on mini- and small-system devices running OpenHarmony OS. It offers developers a clear, easy-to-use northbound API, lowering the barrier to deploying deep models on device. It can plug into multiple base inference frameworks, which in turn target different underlying hardware. At present, DLLite-Micro supports only the MindSpore Lite for IoT inference framework; other base frameworks will be added over time, and developers can enable them as needed.

Work Scope

  • Model inference: accepts a user-supplied model and, given correct calls from the user, performs model loading, execution, and unloading;
  • Sample projects: provides sample projects for basic use cases as references for developers;
  • Ecosystem expansion: open-sources the framework factory module and guides third-party device and chip vendors in integrating with the framework.
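The framework factory module mentioned above can be pictured as a registry that maps a backend identifier to a creation function, which is how additional base inference frameworks would be plugged in. The sketch below is purely illustrative; FrameworkFactory, InferFramework, and the registration key are hypothetical names, not the real DLLite-Micro types:

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>

// Illustrative stand-in for a base inference framework adapter.
struct InferFramework {
    virtual ~InferFramework() = default;
    virtual std::string Name() const = 0;
};

struct MindSporeLiteIoT : InferFramework {
    std::string Name() const override { return "mindspore_lite_for_iot"; }
};

// Minimal factory: vendors register a creator, callers select a backend by key.
class FrameworkFactory {
public:
    using CreateFn = std::function<std::unique_ptr<InferFramework>()>;

    void Register(const std::string& key, CreateFn fn)
    {
        creators_[key] = std::move(fn);
    }

    // Returns nullptr for an unknown backend key.
    std::unique_ptr<InferFramework> Create(const std::string& key) const
    {
        auto it = creators_.find(key);
        return it == creators_.end() ? nullptr : it->second();
    }

private:
    std::map<std::string, CreateFn> creators_;
};
```

A vendor integrating a new chip would register one creator; application code only ever sees the InferFramework interface.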

Code Repositories

SIG Members

Leader

Committers

Meetings

Contact (optional)


AI Subsystem · DLLite-Micro

Introduction

DLLite-Micro is a lightweight AI inference framework that supports deep-model inference on mini- and small-system devices running OpenHarmony OS. It offers developers a clear, easy-to-use northbound API, lowering the barrier to deploying deep-learning models on device. It can plug into multiple base inference frameworks, which in turn target different underlying hardware. At present, DLLite-Micro supports only the MindSpore Lite for IoT inference framework; other base frameworks will be added over time, and developers can enable them as needed.

Figure 1 DLLite-Micro framework architecture

Directory Structure

/foundation/ai/dllite-micro           # Main directory of the DLLite-Micro framework
├── interfaces
│   └── kits
│       └── interpreter               # External APIs of DLLite-Micro
├── samples                           # DLLite-Micro application demos
│   ├── app                           # App samples
│   └── model                         # Model compilation samples
├── services
│   ├── inferframework                # Adaptation module for base inference frameworks
│   ├── interpreter                   # Base inference module
│   └── third_party                   # Third-party dependencies
│       └── mindspore_lite            # mindspore_lite dependency
├── test                              # Module tests
│   └── unittest                      # Unit tests
│       └── common                    # Common test cases

Usage

Supported language: C++

Operating system: OpenHarmony

Architecture differences: The model conversion tool provided by MindSpore Lite merges the model structure and weights on the ARM32M platform and does not generate a separate weight file, so on ARM32M DLLite-Micro reads the model with structure and weights merged, and ModelConfig.weightSeparateFlag_ must be set to false. On the ARM32A platform, DLLite-Micro reads the model with structure and weights separated, and ModelConfig.weightSeparateFlag_ must be set to true.
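The flag choice can be expressed as a small helper. Note that the ModelConfig below is a minimal stand-in defined here only for illustration; the real definition lives in the DLLite-Micro interpreter headers, and MakeConfigFor is a hypothetical name:

```cpp
// Minimal stand-in for the real ModelConfig, defined only for this sketch.
struct ModelConfig {
    bool weightSeparateFlag_ = false;
};

// ARM32M: model structure and weights merged into one file -> flag is false.
// ARM32A: weights kept in a separate file                  -> flag is true.
ModelConfig MakeConfigFor(bool isArm32A)
{
    ModelConfig cfg;
    cfg.weightSeparateFlag_ = isArm32A;
    return cfg;
}
```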

External APIs

DLLite-Micro external APIs

Development Steps

  1. Build the DLLite-Micro framework

    Lightweight AI inference engine framework module. Source path: /foundation/ai/dllite_micro/services

    Build as follows:

    Set the build root:

    hb set -root dir //OpenHarmony root directory

    Set the build product (use the arrow keys and Enter to select):

    hb set -p

    Add the DLLite-Micro component

    Modify /build/lite/components/ai.json to add the DLLite-Micro configuration. The ai.json snippet below shows the result; the new configuration sits between "##start##" and "##end##" ("##start##" and "##end##" only mark the position; delete these two lines after adding the configuration):

    {
      "components": [
        {
          "component": "ai_engine",
          "description": "AI engine framework.",
          "optional": "true",
          "dirs": [
            "foundation/ai/engine"
          ],
          "targets": [
            "//foundation/ai/engine/services:ai"
          ],
          "rom": "130KB",
          "ram": "~337KB",
          "output": [
            "ai_server",
            "ai_communication_adapter.a"
          ],
          "adapted_kernel": [
            "liteos_a",
            "linux"
          ],
          "features": [],
          "deps": {
            "third_party": [
              "bounds_checking_function",
              "iniparser"
            ],
            "kernel_special": {},
            "board_special": {},
            "components": [
              "hilog",
              "utils_base",
              "ipc_lite",
              "samgr_lite"
            ]
          }
        },
    ##start##
        {
          "component": "ai_dllite_micro",
          "description": "DLLite-micro framework.",
          "optional": "true",
          "dirs": [
            "foundation/ai/dllite_micro"
          ],
          "targets": [
            "//foundation/ai/dllite_micro/services:ai_dllite_micro"
          ],
          "rom": "",
          "ram": "",
          "output": [
            "libdlliteclient.so",
            "libdlliteclient_mslite_for_iot.so"
          ],
          "adapted_kernel": ["liteos_a"],
          "features": [],
          "deps": {
            "third_party": [],
            "components": []
          }
        }
    ##end##
      ]
    }

    Modify the board configuration file

    Modify vendor/hisilicon/hispark_taurus/config.json to add an entry for the DLLite-Micro component. The snippet below shows the ai subsystem configuration; the new entry sits between "##start##" and "##end##" ("##start##" and "##end##" only mark the position; delete these two lines after adding the entry):

        {
          "subsystem": "ai",
          "components": [
            { "component": "ai_engine", "features":[] },
    ##start##
            { "component": "ai_dllite_micro", "features": [] }
    ##end##
          ]
        },

    Run the build

    hb build -f //build the whole repository
    or hb build dllite_micro //build only the dllite_micro component

    Note: For setting up the build environment, see the OpenHarmony Getting Started guide; for installing and using the hb tool, see the lightweight build and packaging component documentation.

  2. Model compilation

    The framework currently loads only inference models compiled into dynamic libraries, so the model dynamic library must be built in the way each base framework requires. The model compilation process for each framework is described below.

    Compiling a model with MindSpore 1.2:

    For inference with the MindSpore 1.2 framework, refer to the sample in /foundation/ai/dllite_micro/samples/model/mnist. The model build directory is laid out as follows:

    /dllite_micro/samples/model/mnist    # MNIST classification model build example
    ├── include                          # Headers
    │   ├── nnacl                        # nnacl operator headers
    │   ├── wrapper
    │   └── mindspore_adapter.h
    ├── lib                              # Dependency libraries
    │   ├── libmindspore-lite.a          # MindSpore Lite operator library
    │   └── libwrapper.a                 # MindSpore Lite interface library
    ├── src                              # Sources
    │   ├── micro                        # Inference source code generated by codegen
    │   └── mindspore_adapter.cpp        # Wraps MindSpore Lite and exposes the external API
    └── BUILD.gn                         # GN build file
    1. Obtain the software package of the matching version from the MindSpore open-source website (download the package that fits your environment);

    2. Models from non-MindSpore frameworks must first be converted to the ms format with the converter tool provided in the package (ms-format models can skip this step); for converter usage, see the model conversion guide;

    3. Use codegen to convert the ms model into C/C++ model code. The code is generated under the src directory; copy it into /foundation/ai/dllite_micro/samples/model/mnist/src/micro;

    4. In /foundation/ai/dllite_micro/samples/model/mnist/src/micro, rename model.h to mmodel.h and change #include "model.h" in session.cc to #include "mmodel.h", to avoid confusion with /foundation/ai/dllite_micro/services/third_party/mindspore_lite/include/model.h;

      Note: Also copy the net.bin model weights from the src directory; later steps load the weights for inference.

    5. Download the MindSpore r1.2 package for OpenHarmony and copy the operator library and interface library (inference/lib/libmindspore-lite.a and tools/lib/libwrapper.a) into /foundation/ai/dllite_micro/samples/model/mnist/lib;

    6. Copy the inference framework headers from the MindSpore Lite package (tools/codegen/nnacl/ and tools/codegen/wrapper/) into /foundation/ai/dllite_micro/samples/model/mnist/include;

    7. Modify /build/lite/components/ai.json to add the model build configuration. The ai.json snippet below shows the result; the new configuration sits between "##start##" and "##end##" ("##start##" and "##end##" only mark the position; delete these two lines after adding the configuration):

        {
          "component": "ai_dllite_micro",
          "description": "DLLite-Micro framework.",
          "optional": "true",      
          "dirs": [
            "foundation/ai/dllite_micro"
          ],
          "targets": [
            "//foundation/ai/dllite_micro/services:ai_dllite_micro",
        ##start##
            "//foundation/ai/dllite_micro/samples:dllite_micro_sample_model"
        ##end##
          ],
          "rom": "",
          "ram": "",
          "output": [
            "libdlliteclient.so",
            "libdlliteclient_mslite_for_iot.so"
          ],
          "adapted_kernel": [ "liteos_a" ],
          "features": [],
          "deps": {
            "components": [],
            "third_party": []
          }
        },
    8. Build dllite-micro; the generated model dynamic library is at /usr/lib/libmnist.so;

      Note: See the MindSpore open-source website for downloading and using the MindSpore model conversion and code generation tools.

  3. Sample development (based on the MNIST classification demo)

    The /foundation/ai/dllite-micro/samples/app/mnist directory provides a sample program that calls the DLLite-Micro APIs to load the model dynamic library and model weights and run inference.

    Create an instance

    static int CreateInterpreter()
    {
        // RegisterFeature
        g_featureConfig.featureName = FEATURE_NAME;
        featureInterpreter = FeatureInterpreter::RegisterFeature(g_featureConfig);
        if (featureInterpreter.get() == nullptr) {
            std::cout << "RegisterFeature failed" << std::endl;
            return -1;
        }
    
        // CreateModelInterpreter
        modelConfig.inferFrameworkType_ = InferFrameworkType::MINDSPORE;
        modelInterpreter = featureInterpreter->CreateModelInterpreter(modelConfig);
        if (modelInterpreter == nullptr) {
            std::cout << "CreateModelInterpreter failed" << std::endl;
            return -1;
        }
        return 0;
    }

    Run the inference flow

    static int ModelInference()
    {
        // Load
        ReturnCode returnCode = modelInterpreter->Load();
        if (returnCode != ReturnCode::SUCCESS) {
            std::cout << "Load failed" << std::endl;
            return -1;
        }
    
        // GetTensors
        returnCode = modelInterpreter->GetTensors(inputs, IOFlag::INPUT);
        if (returnCode != ReturnCode::SUCCESS) {
            std::cout << "GetTensors inputs failed" << std::endl;
            return -1;
        }
        returnCode = modelInterpreter->GetTensors(outputs, IOFlag::OUTPUT);
        if (returnCode != ReturnCode::SUCCESS) {
            std::cout << "GetTensors outputs failed" << std::endl;
            return -1;
        }
        SetInputTensor(inputs);
    
        // Invoke
        WarmUp(modelInterpreter);
        for (int i = 0; i < INVOKE_LOOP_COUNT; ++i) {
            returnCode = modelInterpreter->Invoke(inputs, outputs);
            if (returnCode != ReturnCode::SUCCESS) {
                std::cout << "Invoke failed" << std::endl;
                return -1;
            }
        }
        
        returnCode = modelInterpreter->GetTensors(outputs, IOFlag::OUTPUT);
        PrintTensors(inputs, outputs);
    
        // Unload
        returnCode = modelInterpreter->Unload();
        if (returnCode != ReturnCode::SUCCESS) {
            std::cout << "Unload failed" << std::endl;
            return -1;
        }
    
        return 0;
    }

    Destroy the instance

    static int DestroyInterpreter()
    {
        // DestroyModelInterpreter
        ReturnCode returnCode = featureInterpreter->DestroyModelInterpreter(modelInterpreter);
        if (returnCode != ReturnCode::SUCCESS) {
            std::cout << "DestroyModelInterpreter failed" << std::endl;
            return -1;
        }
    
        // UnregisterFeature
        returnCode = FeatureInterpreter::UnregisterFeature(featureInterpreter);
        if (returnCode != ReturnCode::SUCCESS) {
            std::cout << "UnregisterFeature failed" << std::endl;
            return -1;
        }
    
        return 0;
    }
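    Taken together, the three helpers follow a fixed lifecycle: RegisterFeature → CreateModelInterpreter → Load → Invoke → Unload → DestroyModelInterpreter → UnregisterFeature. The schematic below models that ordering and the sample's check-ReturnCode-and-return error pattern with stub types; everything here is illustrative, not the real DLLite-Micro API:

```cpp
#include <string>
#include <vector>

enum class ReturnCode { SUCCESS, FAILED };

// Records each stage that ran, so the ordering can be inspected afterwards.
static std::vector<std::string> g_trace;

static ReturnCode RunStage(const std::string& name, bool ok)
{
    g_trace.push_back(name);
    return ok ? ReturnCode::SUCCESS : ReturnCode::FAILED;
}

// Mirrors the sample's pattern: run stages in order, stop at the first failure.
static int RunPipeline()
{
    for (const char* stage : {"RegisterFeature", "CreateModelInterpreter",
                              "Load", "Invoke", "Unload",
                              "DestroyModelInterpreter", "UnregisterFeature"}) {
        if (RunStage(stage, /*ok=*/true) != ReturnCode::SUCCESS) {
            return -1;  // bail out early, as each sample function does
        }
    }
    return 0;
}
```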

    Build the sample program

    As shown below, add the dllite_micro_sample configuration to ai.json ("##start##" and "##end##" only mark the position; delete these two lines after adding the configuration):

        {
          "component": "ai_dllite_micro",
          "description": "DLLite-Micro framework.",
          "optional": "true",      
          "dirs": [
            "foundation/ai/dllite_micro"
          ],
          "targets": [
            "//foundation/ai/dllite_micro/services:ai_dllite_micro",
            "//foundation/ai/dllite_micro/samples:dllite_micro_sample_model",
        ##start##
            "//foundation/ai/dllite_micro/samples:dllite_micro_sample"
        ##end##
          ],
          "rom": "",
          "ram": "",
          "output": [
            "libdlliteclient.so",
            "libdlliteclient_mslite_for_iot.so"
          ],
          "adapted_kernel": [ "liteos_a" ],
          "features": [],
          "deps": {
            "components": [],
            "third_party": []
          }
        },

    The sample program needs two files: the model dynamic library and the model weights. Add the following rule to /foundation/ai/dllite-micro/samples/app/mnist/BUILD.gn so that, at build time, the model weight file generated by MindSpore Lite is copied to /storage/data/ on the OpenHarmony system.

    copy("sample_model") {
      sources = ["//foundation/ai/dllite_micro/samples/model/mnist/src/micro/net.bin"]
      outputs = ["$root_out_dir/data/dllite_micro_mnist.bin"]
    }

    Build the dllite-micro component; the generated sample program is at /bin/dllite_micro_mnist_sample.bin. On the OpenHarmony system, run the application with:

    cd /bin
    ./dllite_micro_mnist_sample.bin /usr/lib/libmnist.so /storage/data/dllite_micro_mnist.bin

Repositories Involved

License: Apache-2.0
