
DELTA - A DEep learning Language Technology plAtform

What is DELTA?

DELTA is a deep learning based end-to-end natural language and speech processing platform. DELTA aims to provide easy and fast experiences for using, deploying, and developing natural language processing and speech models for both academia and industry use cases. DELTA is mainly implemented using TensorFlow and Python 3.

For details of DELTA, please refer to this paper.

What can DELTA do?

DELTA has been used to develop several state-of-the-art algorithms for publications and to deliver production services to millions of users. It helps you train, develop, and deploy NLP and/or speech models, featuring:

  • Easy-to-use
    • One command to train NLP and speech models, including:
      • NLP: text classification, named entity recognition, question answering, text summarization, etc.
      • Speech: speech recognition, speaker verification, emotion recognition, etc.
    • Use configuration files to easily tune parameters and network structures
  • Easy-to-deploy
    • What you see in training is what you get in serving: all data processing and feature extraction are integrated into the model graph
    • Uniform I/O interfaces, with no changes needed for new models
  • Easy-to-develop
    • Easily build state-of-the-art models using modularized components
    • All modules are reliable and fully-tested

Table of Contents

  • Installation
  • Quick Start
  • Benchmarks
  • FAQ
  • Contributing
  • References
  • License
  • Acknowledgement

Installation

We provide several approaches to install DELTA:

Install from pip

We provide pip install support for the NLP version of DELTA.

Note: You can still install DELTA from source for both NLP and speech tasks.

We recommend creating a conda or virtualenv environment and installing DELTA with pip inside it. For example:

conda create -n delta-pip-py3.6 python=3.6
conda activate delta-pip-py3.6

Please install TensorFlow 2.x if it is not already installed on your system:

pip install tensorflow

Then, install DELTA with the following command:

pip install delta-nlp

After installing DELTA, you can follow this example to train NLP models or develop new ones: A Text Classification Usage Example for pip users.
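
As a quick sanity check after installation, you can try importing the package (a minimal sketch; it assumes the delta-nlp distribution is importable as the delta module, matching the source layout referenced elsewhere in this README):

# Verify that DELTA can be imported in the active environment
python3 -c "import delta; print('DELTA imported successfully')"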

Install from Source Code

To install from the source code, we use conda to install the required packages. Please install conda if you do not have it on your system.

We provide two options to install DELTA: the nlp version and the full version. The nlp version has minimal requirements and installs only NLP-related packages:

# Run the installation script for NLP version, with CPU or GPU.
cd tools
./install/install-delta.sh nlp [cpu|gpu]

Note: Users from mainland China may need to set up conda mirror sources; see ./tools/install/install-delta.sh for details.

If you want to use both NLP and speech packages, you can install the full version. The full version requires the Kaldi library, which can be pre-installed or installed using our installation script.

cd tools
# If you have installed Kaldi
KALDI=/your/path/to/Kaldi ./install/install-delta.sh full [cpu|gpu]
# If you have not installed Kaldi, use the following command
# ./install/install-delta.sh full [cpu|gpu]

To verify the installation, run:

# Activate conda environment
conda activate delta-py3.6-tf2.3.0
# Or use the following command if your conda version is < 4.6
# source activate delta-py3.6-tf2.3.0

# Add DELTA environment
source env.sh

# Generate mock data for text classification.
pushd egs/mock_text_cls_data/text_cls/v1
./run.sh
popd

# Train the model
python3 delta/main.py --cmd train_and_eval --config egs/mock_text_cls_data/text_cls/v1/config/han-cls.yml

Manual installation

For advanced installation, for full-version users, or for more details, please refer to manual installation.

Install from Docker

For Docker users, we provide images with DELTA installed. Please refer to docker installation.

Quick Start

Existing Examples

DELTA organizes many commonly used tasks as examples in the egs directory. Each example is an NLP or speech task on a public dataset, and we provide the whole pipeline including data processing, model training, evaluation, and deployment.

You can simply use the run.sh under each directory to prepare the dataset, and then train or evaluate a model. For example, the following commands download the CONLL2003 dataset and then train and evaluate a BLSTM-CRF model for NER:

pushd ./egs/conll2003/seq_label/v1/
./run.sh
popd
python3 delta/main.py --cmd train --config egs/conll2003/seq_label/v1/config/seq-label.yml
python3 delta/main.py --cmd eval --config egs/conll2003/seq_label/v1/config/seq-label.yml
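
After training and evaluation finish, the same configuration file can be reused with the other pipeline modes described in the Modeling section below, for example (a sketch using the export_model mode described there):

# Export the trained checkpoint to a SavedModel for deployment
python3 delta/main.py --cmd export_model --config egs/conll2003/seq_label/v1/config/seq-label.yml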

Modeling

There are several modes to start a DELTA pipeline:

  • train_and_eval
  • train
  • eval
  • infer
  • export_model

Note: Before running any command, make sure you have sourced env.sh in the current shell or script.

You can use train_and_eval to start the model training and evaluation:

python3 delta/main.py --cmd train_and_eval --config <your configuration file>.yml

This is equivalent to:

python3 delta/main.py --cmd train --config <your configuration file>.yml 
python3 delta/main.py --cmd eval --config <your configuration file>.yml 

For evaluation, you need to prepare a data file with both features and labels. If you only want to run inference on features without labels, use the infer mode:

python3 delta/main.py --cmd infer --config <your configuration file>.yml 

When training is done, you can export a model checkpoint to a SavedModel:

python3 delta/main.py --cmd export_model --config <your configuration file>.yml 
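
To check what was exported, TensorFlow's saved_model_cli can inspect the result (a sketch; <export_dir> is a placeholder for the export directory produced by the command above):

# List the signatures, inputs, and outputs of the exported SavedModel
saved_model_cli show --dir <export_dir> --all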

Deployment

For model deployment, we provide many tools in the DELTA-NN package. The deployment scripts are organized under the ./dpl directory, and the workflow is as follows (a consolidated command sketch follows the list):

  • Pull the zh794390558/delta:deltann-cpu-py3 Docker image; we test deployment in this environment.
  • Download the third-party packages with cd tools && make deltann.
  • Put the SavedModel and a configured model.yaml into dpl/model.
  • Use dpl/run.sh to convert the model to the deployment formats and compile the libraries.
  • All compiled TensorFlow and DELTA-NN libraries are placed in dpl/lib.
  • Everything needed for deployment is collected under the dpl/output directory.
  • Test, benchmark, or serve the model inside the Docker container.
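
A consolidated sketch of the steps above (the copied paths are placeholders, and the exact run.sh invocation may differ in your checkout, so treat dpl/README.md as authoritative):

# Pull the tested deployment image
docker pull zh794390558/delta:deltann-cpu-py3

# Fetch the third-party packages needed by DELTA-NN
cd tools && make deltann && cd ..

# Copy your exported SavedModel and a configured model.yaml into dpl/model (placeholder paths)
cp -r /path/to/saved_model dpl/model/
cp /path/to/model.yaml dpl/model/

# Convert the model, compile the libraries into dpl/lib, and collect deployment artifacts in dpl/output
bash dpl/run.sh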

For more information, please see dpl/README.md.

Graph Compiler

The graph compiler uses TensorFlow Grappler; see gcompiler/README.md.

Benchmarks

In DELTA, we provide experimental results for each task on public datasets as benchmarks. For each task, we compare our implementation with a similar model chosen from a highly-cited publication. You can reproduce the experimental results using the scripts and configuration in the ./egs directory. For more details, please refer to released models.

NLP tasks

| Task | Model | DataSet | Metric | DELTA | Baseline | Baseline reference |
| --- | --- | --- | --- | --- | --- | --- |
| Sentence Classification | CNN | TREC | Acc | 92.2 | 91.2 | Kim (2014) |
| Document Classification | HAN | Yahoo Answer | Acc | 75.1 | 75.8 | Yang et al. (2016) |
| Named Entity Recognition | BiLSTM-CRF | CoNLL 2003 | F1 | 84.6 | 84.7 | Huang et al. (2015) |
| Intent Detection (joint) | BiLSTM-CRF-Attention | ATIS | Acc | 97.4 | 98.2 | Liu and Lane (2016) |
| Slots Filling (joint) | BiLSTM-CRF-Attention | ATIS | F1 | 95.2 | 95.9 | Liu and Lane (2016) |
| Natural Language Inference | LSTM | SNLI | Acc | 80.7 | 80.6 | Bowman et al. (2016) |
| Summarization | Seq2seq-LSTM | CNN/Daily Mail | RougeL | 27.3 | 28.1 | See et al. (2017) |
| Pretrain-NER | ELMO | CoNLL 2003 | F1 | 92.2 | 92.2 | Peters et al. (2018) |
| Pretrain-NER | BERT | CoNLL 2003 | F1 | 94.6 | 94.9 | Devlin et al. (2019) |

Speech tasks

| Task | Model | DataSet | Metric | DELTA | Baseline | Baseline reference |
| --- | --- | --- | --- | --- | --- | --- |
| Speech recognition | CTC | HKUST | CER | 36.49 | 38.67 | Miao et al. (2016) |
| Speaker verification | TDNN | VoxCeleb | EER | 3.028 | 3.138 | Kaldi |
| Emotion recognition | RNN-mean pool | IEMOCAP | Acc | 59.44 | 56.90 | Mirsamadi et al. (2017) |

FAQ

See FAQ for more information.

Contributing

Any contribution is welcome. All issues and pull requests are highly appreciated! For more details, please refer to the contribution guide.

References

Please cite this paper when referencing DELTA.

@ARTICLE{delta,
       author = {{Han}, Kun and {Chen}, Junwen and {Zhang}, Hui and {Xu}, Haiyang and
         {Peng}, Yiping and {Wang}, Yun and {Ding}, Ning and {Deng}, Hui and
         {Gao}, Yonghu and {Guo}, Tingwei and {Zhang}, Yi and {He}, Yahao and
         {Ma}, Baochang and {Zhou}, Yulong and {Zhang}, Kangli and {Liu}, Chao and
         {Lyu}, Ying and {Wang}, Chenxi and {Gong}, Cheng and {Wang}, Yunbo and
         {Zou}, Wei and {Song}, Hui and {Li}, Xiangang},
       title = "{DELTA: A DEep learning based Language Technology plAtform}",
       journal = {arXiv e-prints},
       year = "2019",
       url = {https://arxiv.org/abs/1908.01853},
}

License

The DELTA platform is licensed under the terms of the Apache license. See LICENSE for more information.

Acknowledgement

The DELTA platform depends on many open source repos. See References for more information.

