KnowledgeGraphCourse

A systematic course on knowledge graphs for graduate students, interested researchers, and engineers.

"Knowledge Graph", a graduate course at Southeast University
Time: Spring 2019 (late February to mid-May)
Fridays, 2:00-4:30 p.m.
Location: Room Y205, Jizhong Building, Jiulonghu Campus, Southeast University
Questions / discussion / suggestions: email pwang AT seu.edu.cn

Course Contents

Lecture 1: Introduction to Knowledge Graphs (2019-3-1, 2019-3-8)

1.1 Origins and development of knowledge graphs
1.2 Knowledge graphs vs. deep learning
1.3 Knowledge graphs vs. relational databases vs. traditional expert systems
1.4 The essence and core value of knowledge graphs
1.5 The knowledge graph technology stack
1.6 Representative knowledge graphs
1.7 Knowledge graph application scenarios
Slides: partA partB partC

Lecture 2: Knowledge Representation (2019-3-15)

2.1 Concepts of knowledge representation
2.2 Knowledge representation methods (see the RDF sketch below)

  • Semantic networks
  • Production systems
  • Frame systems
  • Conceptual graphs
  • Formal concept analysis
  • Description logics
  • Ontologies
  • Ontology languages
  • Statistical representation learning

Slides: partA
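
To make the triple-based view of knowledge representation concrete, here is a minimal illustrative sketch (not part of the course slides) that encodes a tiny semantic network as RDF triples with the rdflib Python library; the ex: namespace and the facts about Nanjing are invented for the example.

```python
# Illustrative only: a tiny semantic network expressed as RDF triples with rdflib.
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/kg/")   # hypothetical namespace for this sketch

g = Graph()
g.bind("ex", EX)

# Schema-level (RDFS) statements: City is a subclass of Place.
g.add((EX.City, RDF.type, RDFS.Class))
g.add((EX.City, RDFS.subClassOf, EX.Place))

# Instance-level statements: Nanjing is a City located in China.
g.add((EX.Nanjing, RDF.type, EX.City))
g.add((EX.Nanjing, EX.locatedIn, EX.China))
g.add((EX.Nanjing, RDFS.label, Literal("南京", lang="zh")))

# Serialize the graph in Turtle, one common exchange syntax for ontologies and data.
print(g.serialize(format="turtle"))
```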

Lecture 3: Knowledge Modeling (2019-3-15, 2019-3-22)

3.1 Ontologies
3.2 Knowledge modeling methods

  • Ontology engineering
  • Ontology learning
  • Knowledge modeling tools
  • Hands-on knowledge modeling

Slides: partA

Lecture 4: Knowledge Extraction Basics: Problems and Methods (2019-3-22)

4.1 Knowledge extraction scenarios
4.2 Knowledge extraction challenges
4.3 Knowledge extraction from structured data (see the sketch below)
4.4 Knowledge extraction from semi-structured data
4.5 Knowledge extraction from unstructured data

Slides: partA
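
As a minimal illustration of topic 4.3 (extraction from structured data), the sketch below maps rows of a hypothetical CSV table to (subject, predicate, object) triples; the file name and column names are made up.

```python
# Illustrative sketch: turning a structured (CSV) source into triples.
import csv

def csv_to_triples(path):
    """Map each row of a table with columns person/birthplace/employer to triples."""
    triples = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            subj = row["person"]
            triples.append((subj, "bornIn", row["birthplace"]))
            triples.append((subj, "worksFor", row["employer"]))
    return triples

if __name__ == "__main__":
    for t in csv_to_triples("people.csv"):   # hypothetical input file
        print(t)
```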

Lecture 5: Knowledge Extraction: Data Collection (2019-3-29)

5.1 Data collection principles and techniques (see the crawler sketch below)

  • How web crawlers work
  • Requests and responses
  • Multi-threaded parallel crawling
  • Coping with anti-crawling mechanisms

5.2 Data collection in practice

  • Hands-on crawling of encyclopedias, forums, social networks, etc.

Slides: partA
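
The sketch below shows the core request/response loop of a small multi-threaded crawler using the requests library and a thread pool; the seed URLs and User-Agent header are placeholders, and a real crawler would add politeness (robots.txt, rate limiting) and error handling.

```python
# Illustrative multi-threaded crawler sketch; the URLs below are placeholders.
from concurrent.futures import ThreadPoolExecutor
import requests

SEED_URLS = [
    "https://example.org/page1",
    "https://example.org/page2",
]
# Sending a browser-like User-Agent is one simple response to anti-crawling checks.
HEADERS = {"User-Agent": "Mozilla/5.0 (course-demo-crawler)"}

def fetch(url):
    """Issue one HTTP GET request and return the URL, status code, and page size."""
    resp = requests.get(url, headers=HEADERS, timeout=10)
    return url, resp.status_code, len(resp.text)

if __name__ == "__main__":
    # Fetch several pages in parallel with a small thread pool.
    with ThreadPoolExecutor(max_workers=4) as pool:
        for url, status, size in pool.map(fetch, SEED_URLS):
            print(url, status, size)
```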

Lecture 6: Knowledge Extraction: Entity Recognition (2019-3-29)

6.1 Basic concepts of entity recognition
6.2 Rule- and dictionary-based entity recognition (see the sketch below)
6.3 Machine learning-based entity recognition
6.4 Deep learning-based entity recognition
6.5 Semi-supervised entity recognition
6.6 Transfer learning-based entity recognition
6.7 Pre-training-based entity recognition

Slides: partA
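
As a minimal example of topic 6.2, the sketch below tags entities with a hand-built gazetteer plus one regular-expression rule for dates; the entity dictionary is invented for illustration.

```python
# Illustrative rule- and dictionary-based NER sketch; the gazetteer is made up.
import re

GAZETTEER = {
    "Southeast University": "ORG",
    "Nanjing": "LOC",
}
DATE_PATTERN = re.compile(r"\b\d{4}-\d{1,2}-\d{1,2}\b")   # a simple rule for dates

def recognize(text):
    """Return (mention, type, start offset) for every dictionary or rule match."""
    mentions = []
    for name, etype in GAZETTEER.items():
        for m in re.finditer(re.escape(name), text):
            mentions.append((name, etype, m.start()))
    for m in DATE_PATTERN.finditer(text):
        mentions.append((m.group(), "DATE", m.start()))
    return sorted(mentions, key=lambda x: x[2])

print(recognize("Southeast University in Nanjing offers this course starting 2019-3-1."))
```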

Lecture 7: Knowledge Extraction: Relation Extraction (2019-4-19, 2019-4-26)

7.1 Basic concepts of relations
7.2 Semantic relations (see the pattern-based sketch below)
7.3 Features for relation extraction
7.4 Relation extraction datasets
7.5 Supervised relation extraction
7.6 Unsupervised relation extraction
7.7 Distantly supervised relation extraction
7.8 Deep/reinforcement learning-based relation extraction

Slides: partA
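
To illustrate how lexical patterns capture one kind of semantic relation, the sketch below extracts is-a (hyponymy) triples with a Hearst-style "X such as Y" pattern, in the spirit of the Hearst (1992) reading in Appendix A; the example sentence is invented.

```python
# Illustrative pattern-based relation extraction: "X such as Y1, Y2 and Y3" -> (Yi, is-a, X).
import re

HEARST = re.compile(r"(.+?) such as (.+)", re.I)

def extract_isa(sentence):
    m = HEARST.search(sentence)
    if not m:
        return []
    hypernym = m.group(1).strip()
    items = re.split(r",\s*|\s+and\s+", m.group(2).rstrip(". "))
    return [(h.strip(), "is-a", hypernym) for h in items if h.strip()]

print(extract_isa("Graph databases such as Neo4j, JanusGraph and HyperGraphDB."))
```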

Lecture 8: Knowledge Extraction: Event Extraction (2019-3-29)

8.1 Basic concepts of event extraction
8.2 Rule- and template-based event extraction
8.3 Machine learning-based event extraction
8.4 Deep learning-based event extraction
8.5 Knowledge base-based event extraction
8.6 Reinforcement learning-based event extraction

Slides: partA

Lecture 9: Knowledge Fusion (2019-4-28)

9.1 Knowledge heterogeneity
9.2 Ontology matching
9.3 Match extraction and match tuning
9.4 Entity matching (see the sketch below)
9.5 Large-scale entity matching
9.6 Knowledge fusion application examples

Slides: partA
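
The sketch below illustrates the core of entity matching at scale: a cheap blocking key (the lowercased first token) limits the number of comparisons, and a character-trigram Jaccard similarity scores the remaining candidate pairs; both record lists and the threshold are invented.

```python
# Illustrative entity matching sketch: blocking + character-trigram Jaccard similarity.
def trigrams(s):
    s = s.lower()
    return {s[i:i + 3] for i in range(len(s) - 2)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a or b) else 0.0

left  = ["Southeast University", "Nanjing Univ.", "IBM Corporation"]
right = ["southeast univ.", "Nanjing University", "International Business Machines"]

# Blocking: only compare records that share the same lowercased first token.
blocks = {}
for r in right:
    blocks.setdefault(r.split()[0].lower(), []).append(r)

for l in left:
    for r in blocks.get(l.split()[0].lower(), []):
        score = jaccard(trigrams(l), trigrams(r))
        if score > 0.2:                      # threshold chosen arbitrarily for the demo
            print(f"candidate match: {l!r} ~ {r!r} (Jaccard = {score:.2f})")
```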

Lecture 10: Knowledge Graph Representation Learning (2019-5-5)

10.1 Concepts of knowledge representation learning
10.2 Distance-based representation learning models
10.3 Translation-based representation learning models (see the TransE sketch below)
10.4 Semantics-based representation learning models
10.5 Representation learning models fusing multi-source information
10.6 Evaluation of knowledge graph representation learning models
10.7 Frontiers and open challenges in knowledge graph representation learning

Slides: partA
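
To make the translation-based models concrete, here is a toy TransE-style sketch in NumPy: the score of a triple (h, r, t) is -||h + r - t||, and embeddings are updated with a margin loss against a randomly corrupted tail. The entities, dimensions, and hyperparameters are invented, and real implementations add normalization and minibatching.

```python
# Toy TransE sketch: f(h, r, t) = -||h + r - t||, margin loss with corrupted tails.
import numpy as np

rng = np.random.default_rng(0)
entities = ["Nanjing", "China", "Beijing"]
relations = ["locatedIn"]
dim, margin, lr = 16, 1.0, 0.05

E = {e: rng.normal(scale=0.1, size=dim) for e in entities}
R = {r: rng.normal(scale=0.1, size=dim) for r in relations}

def energy(h, r, t):
    """Distance ||h + r - t||; smaller means a more plausible triple."""
    return np.linalg.norm(E[h] + R[r] - E[t])

triples = [("Nanjing", "locatedIn", "China")]
for epoch in range(200):
    for h, r, t in triples:
        t_neg = rng.choice([e for e in entities if e != t])    # corrupt the tail
        pos, neg = energy(h, r, t), energy(h, r, t_neg)
        if pos + margin > neg:                                  # margin violated: SGD step
            g_pos = (E[h] + R[r] - E[t]) / (pos + 1e-9)
            g_neg = (E[h] + R[r] - E[t_neg]) / (neg + 1e-9)
            E[h] -= lr * (g_pos - g_neg)
            R[r] -= lr * (g_pos - g_neg)
            E[t] += lr * g_pos
            E[t_neg] -= lr * g_neg

print("score(Nanjing, locatedIn, China)  :", -energy("Nanjing", "locatedIn", "China"))
print("score(Nanjing, locatedIn, Beijing):", -energy("Nanjing", "locatedIn", "Beijing"))
```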

Lecture 11: Knowledge Storage (2019-5-10)

11.1 Concepts of knowledge storage
11.2 Graph database management systems, models, and query languages
11.3 RDF database management systems, models, and query languages (see the SPARQL sketch below)
11.4 Knowledge storage on top of relational databases

Slides: partA
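
The sketch below stores a few triples in an in-memory rdflib graph and answers a SPARQL query over them; a production system would instead use one of the RDF stores or graph databases surveyed in this lecture, but the query language is the same. The data reuses the invented ex: namespace from the Lecture 2 sketch.

```python
# Illustrative SPARQL query over an in-memory rdflib graph (toy data only).
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/kg/")
g = Graph()
g.add((EX.Nanjing, RDF.type, EX.City))
g.add((EX.Beijing, RDF.type, EX.City))
g.add((EX.Nanjing, EX.locatedIn, EX.China))
g.add((EX.Beijing, EX.locatedIn, EX.China))

# SPARQL: find every city located in China.
query = """
PREFIX ex: <http://example.org/kg/>
SELECT ?city WHERE {
    ?city a ex:City ;
          ex:locatedIn ex:China .
}
"""
for row in g.query(query):
    print(row.city)
```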

Lecture 12: Knowledge-Based Question Answering (2019-5-10)

12.1 Fundamentals of question answering
12.2 Question understanding
12.3 Question solving
12.4 Template-based knowledge base question answering (see the sketch below)
12.5 Semantic parsing-based knowledge base question answering
12.6 Deep learning-based knowledge base question answering
12.7 IBM Watson: principles and technology
12.8 Microsoft XiaoIce: principles and technology

Slides: partA
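
As a minimal illustration of template-based KBQA (topic 12.4), the sketch below matches a question against hand-written patterns and fills a corresponding SPARQL template; the question patterns, predicates, and ex: namespace are all invented.

```python
# Illustrative template-based KBQA sketch: question pattern -> SPARQL template.
import re

TEMPLATES = [
    # "Where is X located?" -> query the (hypothetical) ex:locatedIn predicate
    (re.compile(r"where is (?P<ent>.+) located\?", re.I),
     "SELECT ?place WHERE {{ ex:{ent} ex:locatedIn ?place . }}"),
    # "Who founded X?" -> query the (hypothetical) ex:founder predicate
    (re.compile(r"who founded (?P<ent>.+)\?", re.I),
     "SELECT ?person WHERE {{ ex:{ent} ex:founder ?person . }}"),
]

def question_to_sparql(question):
    """Return a SPARQL query string for the first matching template, else None."""
    for pattern, template in TEMPLATES:
        m = pattern.match(question.strip())
        if m:
            return template.format(ent=m.group("ent").strip().replace(" ", "_"))
    return None

print(question_to_sparql("Where is Nanjing located?"))
print(question_to_sparql("Who founded Southeast University?"))
```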

Lecture 13: Entity Linking (2019-5-17)

13.1 Basic concepts of entity linking (see the sketch below)
13.2 Entity linking with probabilistic generative models
13.3 Entity linking with topic models
13.4 Graph-based entity linking
13.5 Deep learning-based entity linking
13.6 Unsupervised entity linking

Slides: partA
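
The sketch below shows the two basic steps of entity linking: candidate generation from an alias dictionary, then disambiguation by word overlap between the mention context and a short knowledge-base description. The alias table and descriptions are invented.

```python
# Illustrative entity linking sketch: alias-based candidates + context-overlap ranking.
ALIASES = {
    "apple": ["Apple_Inc", "Apple_(fruit)"],
    "watson": ["IBM_Watson", "Watson_(crater)"],
}
DESCRIPTIONS = {
    "Apple_Inc": "american technology company iphone mac",
    "Apple_(fruit)": "sweet edible fruit of the apple tree",
    "IBM_Watson": "question answering computer system built by ibm",
    "Watson_(crater)": "lunar impact crater on the far side of the moon",
}

def link(mention, context):
    """Pick the candidate whose description shares the most words with the context."""
    candidates = ALIASES.get(mention.lower(), [])
    ctx = set(context.lower().split())
    scored = [(len(ctx & set(DESCRIPTIONS[c].split())), c) for c in candidates]
    return max(scored)[1] if scored else None

print(link("Watson", "IBM built a question answering system for Jeopardy"))
print(link("apple", "I ate a sweet fruit from the tree"))
```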

Lecture 14: Knowledge Reasoning (2019-5-17)

14.1 Basic concepts of knowledge reasoning
14.2 Logic-based knowledge reasoning (see the forward-chaining sketch below)
14.3 Statistical learning-based knowledge reasoning
14.4 Graph-based knowledge reasoning
14.5 Neural network-based knowledge reasoning
14.6 Hybrid knowledge reasoning methods

Slides: partA
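
As a minimal example of logic-based reasoning (topic 14.2), the sketch below performs forward chaining with a single Horn rule, the transitivity of locatedIn, until no new facts can be derived; the starting facts are invented.

```python
# Illustrative forward chaining: locatedIn(x, y) and locatedIn(y, z) => locatedIn(x, z).
facts = {
    ("Jiulonghu_Campus", "locatedIn", "Nanjing"),
    ("Nanjing", "locatedIn", "Jiangsu"),
    ("Jiangsu", "locatedIn", "China"),
}

def forward_chain(facts):
    facts = set(facts)
    while True:
        derived = {
            (x, "locatedIn", z)
            for (x, p1, y1) in facts if p1 == "locatedIn"
            for (y2, p2, z) in facts if p2 == "locatedIn" and y1 == y2
        } - facts
        if not derived:            # fixpoint: the rule produces nothing new
            return facts
        facts |= derived

for triple in sorted(forward_chain(facts)):
    print(triple)
```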

Appendix A: Selected Classic Readings

Knowledge Graph Construction

  1. Dong X, Gabrilovich E, Heitz G, et al. Knowledge vault: A web-scale approach to probabilistic knowledge fusion. KDD2014: 601-610.
  2. Suchanek F M, Kasneci G, Weikum G. Yago: a core of semantic knowledge. WWW2007: 697-706.
  3. Hoffart J, Suchanek F M, Berberich K, et al. YAGO2: A spatially and temporally enhanced knowledge base from Wikipedia. Artificial Intelligence, 2013, 194: 28-61.
  4. Navigli R, Ponzetto S P. BabelNet: The automatic construction, evaluation and application of a wide-coverage multilingual semantic network. Artificial Intelligence, 2012, 193: 217-250.
  5. Auer S, Bizer C, Kobilarov G, et al. Dbpedia: A nucleus for a web of open data. ISWC2007: 722-735.
  6. Mitchell T, Cohen W, Hruschka E, et al. Never-ending learning. Communications of the ACM, 2018, 61(5): 103-115. earlier work

Knowledge Representation and Modeling

  1. Sowa J F. Knowledge representation: logical, philosophical, and computational foundations. 1999.
  2. Noy N F, McGuinness D L. Ontology Development 101: A Guide to Creating Your First Ontology. another version

Knowledge Extraction

  • Information extraction
  1. Etzioni O, Cafarella M, Downey D, et al. Web-scale information extraction in KnowItAll (preliminary results). WWW2004: 100-110.
  2. Banko M, Cafarella M J, Soderland S, et al. Open information extraction from the web. IJCAI2007, 7: 2670-2676.
  3. Sarawagi S. Information extraction. Foundations and Trends® in Databases, 2008, 1(3): 261-377.
  4. Fader A, Soderland S, Etzioni O. Identifying relations for open information extraction. EMNLP2011: 1535-1545.
  5. Fan J, Kalyanpur A, Gondek D C, et al. Automatic knowledge extraction from documents. IBM Journal of Research and Development, 2012, 56(3.4): 5:1-5:10.
  6. Hearst M A. Automatic acquisition of hyponyms from large text corpora. ACL1992: 539-545.
  • Named entity recognition
  1. Nadeau D, Sekine S. A survey of named entity recognition and classification. Lingvisticae Investigationes, 2007, 30(1): 3-26.
  2. Lample G, Ballesteros M, Subramanian S, et al. Neural architectures for named entity recognition. NAACL-HLT 2016.
  3. Huang Z, Xu W, Yu K. Bidirectional LSTM-CRF models for sequence tagging. arXiv preprint arXiv:1508.01991, 2015.
  4. Alhelbawy A, Gaizauskas R. Graph ranking for collective named entity disambiguation. ACL2014, 2: 75-80.
  5. Florian R, Ittycheriah A, Jing H, et al. Named entity recognition through classifier combination. HLT-NAACL2003: 168-171.
  6. Chiu J P C, Nichols E. Named entity recognition with bidirectional LSTM-CNNs. Transactions of the Association for Computational Linguistics, 2016, 4: 357-370.
  7. Nothman J, Ringland N, Radford W, et al. Learning multilingual named entity recognition from Wikipedia. Artificial Intelligence, 2013, 194: 151-175.
  8. Santos C N, Guimaraes V. Boosting named entity recognition with neural character embeddings. Proceedings of NEWS 2015 The Fifth Named Entities Workshop, 2015.
  9. Chiticariu L, Krishnamurthy R, Li Y, et al. Domain adaptation of rule-based annotators for named-entity recognition tasks. EMNLP2010: 1002-1012.
  10. Shaalan K. A survey of arabic named entity recognition and classification. Computational Linguistics, 2014, 40(2): 469-510.
  11. Speck R, Ngomo A C N. Ensemble learning for named entity recognition. ISWC2014:519-534.
  12. Habibi M, Weber L, Neves M, et al. Deep learning with word embeddings improves biomedical named entity recognition. Bioinformatics, 2017, 33(14): i37-i48.
  • Relation extraction
  1. Wang C, Kalyanpur A, Fan J, et al. Relation extraction and scoring in DeepQA. IBM Journal of Research and Development, 2012, 56(3.4): 9:1-9:12.

  2. Socher R, Huval B, Manning C D, et al. Semantic compositionality through recursive matrix-vector spaces[C]//EMNLP, 2012: 1201-1211.

  3. Liu C Y, Sun W B, Chao W H, et al. Convolution neural network for relation extraction[C]//International Conference on Advanced Data Mining and Applications. Springer, Berlin, Heidelberg, 2013: 231-242.

  4. Zeng D, Liu K, Lai S, et al. Relation classification via convolutional deep neural network. COLING2014: 2335-2344.

  5. Santos C N, Xiang B, Zhou B. Classifying relations by ranking with convolutional neural networks. ACL2015.

  6. Zeng D, Liu K, Chen Y, et al. Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks[C]//Emnlp. 2015: 1753-1762.

  7. Miwa M , Bansal M . End-to-end Relation Extraction using LSTMs on Sequences and Tree Structures[J]. ACL, 2016: 1105–1116.

  8. Zhou P, Shi W, Tian J, et al. Attention-based bidirectional long short-term memory networks for relation classification[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). 2016, 2: 207-212.

  9. Lin Y, Shen S, Liu Z, et al. Neural relation extraction with selective attention over instances[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2016, 1: 2124-2133.

  10. Cai R, Zhang X, Wang H. Bidirectional recurrent convolutional neural network for relation classification[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2016, 1: 756-765.

  11. Wang L, Cao Z, De Melo G, et al. Relation classification via multi-level attention CNNs. ACL2016: 1298-1307.

  12. Lin Y, Liu Z, Sun M. Neural relation extraction with multi-lingual attention[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017: 34-43.

  13. Huang Y Y, Wang W Y. Deep residual learning for weakly-supervised relation extraction. EMNLP2017: 1803-1807.

  14. Ji G, Liu K, He S, et al. Distant supervision for relation extraction with sentence-level attention and entity descriptions[C]//Thirty-First AAAI Conference on Artificial Intelligence. 2017.

  15. Wu Y, Bamman D, Russell S. Adversarial training for relation extraction[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. 2017: 1778-1783.

  16. Ren X, Wu Z, He W, et al. Cotype: Joint extraction of typed entities and relations with knowledge bases[C]//Proceedings of the 26th International Conference on World Wide Web. International World Wide Web Conferences Steering Committee, 2017: 1015-1024.

  • Event extraction
  1. Chen Y, Xu L, Liu K, et al. Event extraction via dynamic multi-pooling convolutional neural networks. ACL2015, 1: 167-176.
  2. Nguyen T H, Grishman R. Event detection and domain adaptation with convolutional neural networks. ACL2015, 2: 365-371.
  3. Hogenboom F, Frasincar F, Kaymak U, et al. An overview of event extraction from text. DeRiVE2011.
  4. Narasimhan K, Yala A, Barzilay R. Improving information extraction by acquiring external evidence with reinforcement learning. EMNLP2016.
  5. Nguyen T H, Cho K, Grishman R. Joint event extraction via recurrent neural networks. NAACL2016: 300-309.

Knowledge Fusion

  1. Shvaiko P, Euzenat J. Ontology matching: state of the art and future challenges. IEEE Transactions on knowledge and data engineering, 2013, 25(1): 158-176.
  2. Noy N F, Musen M A. Algorithm and tool for automated ontology merging and alignment. AAAI2000.
  3. Do H H, Rahm E. COMA: a system for flexible combination of schema matching approaches.VLDB2002: 610-621.
  4. Doan A H, Madhavan J, Domingos P, et al. Learning to map between ontologies on the semantic web. WWW2002: 662-673.
  5. Ehrig M, Staab S. QOM–quick ontology mapping. ISWC2004: 683-697.
  6. Qu Y, Hu W, Cheng G. Constructing virtual documents for ontology matching. WWW2006: 23-31.
  7. Li J, Tang J, Li Y, et al. RiMOM: A dynamic multistrategy ontology alignment framework. IEEE Transactions on Knowledge and data Engineering, 2009, 21(8): 1218-1232.
  8. Mao M, Peng Y, Spring M. An adaptive ontology mapping approach with neural network based constraint satisfaction. Journal of Web Semantics, 2010, 8(1): 14-25.
  9. Hu W, Qu Y, Cheng G. Matching large ontologies: A divide-and-conquer approach. Data & Knowledge Engineering, 2008, 67(1): 140-160.
  10. Papadakis G, Ioannou E, Palpanas T, et al. A blocking framework for entity resolution in highly heterogeneous information spaces. IEEE Transactions on Knowledge and Data Engineering, 2013, 25(12): 2665-2682.
  11. Wang P, Zhou Y, Xu B. Matching large ontologies based on reduction anchors. Twenty-Second International Joint Conference on Artificial Intelligence. 2011.
  12. Niu X, Rong S, Wang H, et al. An effective rule miner for instance matching in a web of data. CIKM2012: 1085-1094.
  13. Li J, Wang Z, Zhang X, et al. Large scale instance matching via multiple indexes and candidate selection. Knowledge-Based Systems, 2013, 50: 112-120.
  14. Hu W, Chen J, Qu Y. A self-training approach for resolving object coreference on the semantic web. WWW2011: 87-96.
  15. Tang J, Fong A C M, Wang B, et al. A unified probabilistic framework for name disambiguation in digital library. IEEE Transactions on Knowledge and Data Engineering, 2012, 24(6): 975-987.
  16. Zhang Y, Zhang F, Yao P, et al. Name Disambiguation in AMiner: Clustering, Maintenance, and Human in the Loop. KDD2018: 1002-1011.
  17. Ngomo A C N, Auer S. LIMES—a time-efficient approach for large-scale link discovery on the web of data. IJCAI2011.

Knowledge Graph Embedding

  • ---Review---
  1. Wang Q, Mao Z, Wang B, et al. Knowledge graph embedding: A survey of approaches and applications. IEEE Transactions on Knowledge and Data Engineering, 2017, 29(12): 2724-2743.
  2. Liu Z, Sun M, Lin Y, et al. Knowledge representation learning: a review. Journal of Computer Research and Development, 2016, 53(2): 247-261. (in Chinese)
  • ---Basic Models---
  1. Turian J, Ratinov L, Bengio Y. Word representations: A simple and general method for semi-supervised learning. Proceedings of the 48th annual meeting of the association for computational linguistics. Association for Computational Linguistics, 2010: 384-394. (one-hot)
  2. Bordes A, Glorot X, Weston J, et al. Joint learning of words and meaning representations for open-text semantic parsing. Artificial Intelligence and Statistics. 2012: 127-135. (UM)
  3. Bordes A, Weston J, Collobert R, et al. Learning structured embeddings of knowledge bases. AAAI. 2011. (SE)
  4. Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality. NIPS2013: 3111-3119.
  • ---Translation-based Models(Basic Models)---
  1. Bordes A, Usunier N, Garcia-Duran A, et al. Translating embeddings for modeling multi-relational data. NIPS2013: 2787-2795.(TransE)
  2. Wang Z, Zhang J, Feng J, et al. Knowledge graph embedding by translating on hyperplanes. AAAI2014.(TransH)
  3. Lin Y, Liu Z, Sun M, et al. Learning entity and relation embeddings for knowledge graph completion. AAAI2015.(TransR/CTransR)
  4. Ji G, He S, Xu L, et al. Knowledge graph embedding via dynamic mapping matrix. ACL2015: 687-696. (TransD)
  5. Ji G, Liu K, He S, et al. Knowledge graph completion with adaptive sparse transfer matrix. AAAI. 2016. (TranSparse)
  • ---Translation-based Models(Translation Requirements Relaxing)---
  1. Fan M, Zhou Q, Chang E, et al. Transition-based knowledge graph embedding with relational mapping properties. Proceedings of the 28th Pacific Asia Conference on Language, Information and Computing. 2014. (TransM)
  2. Xiao H, Huang M, Zhu X. From one point to a manifold: Knowledge graph embedding for precise link prediction. arXiv preprint arXiv:1512.04792, 2015. (ManifoldE)
  3. Feng J, Huang M, Wang M, et al. Knowledge graph embedding by flexible translation. Fifteenth International Conference on the Principles of Knowledge Representation and Reasoning. 2016. (TransF)
  4. Xiao H, Huang M, Hao Y, et al. TransA: An adaptive approach for knowledge graph embedding. arXiv preprint arXiv:1509.05490, 2015. (TransA)
  • ---Translation-based Models(Gaussian Distribution Models)---
  1. He S, Liu K, Ji G, et al. Learning to represent knowledge graphs with gaussian embedding. Proceedings of the 24th ACM International on Conference on Information and Knowledge Management. ACM, 2015: 623-632. (KB2E)
  2. Xiao H, Huang M, Zhu X. TransG: A generative model for knowledge graph embedding. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2016, 1: 2316-2325. (TransG)
  • ---Semantic Matching Models(Matrix Factorization Models)---
  1. Jenatton R, Roux N L, Bordes A, et al. A latent factor model for highly multi-relational data. NIPS. 2012: 3167-3175. (LFM)
  2. Nickel M, Tresp V, Kriegel H P. A Three-Way Model for Collective Learning on Multi-Relational Data. ICML. 2011, 11: 809-816. (RESCAL)
  3. Yang B, Yih W, He X, et al. Embedding entities and relations for learning and inference in knowledge bases. arXiv preprint arXiv:1412.6575, 2014. (DistMult)
  4. Nickel M, Rosasco L, Poggio T. Holographic embeddings of knowledge graphs. AAAI. 2016. (HolE)
  5. Trouillon T, Welbl J, Riedel S, et al. Complex embeddings for simple link prediction. International Conference on Machine Learning. 2016: 2071-2080. (ComplEx)
  6. Liu H, Wu Y, Yang Y. Analogical inference for multi-relational embeddings. Proceedings of the 34th International Conference on Machine Learning-Volume 70. JMLR. org, 2017: 2168-2178. (ANALOGY)
  • ---Semantic Matching Models(Neural Network Models)---
  1. Socher R, Chen D, Manning C D, et al. Reasoning with neural tensor networks for knowledge base completion. NIPS. 2013: 926-934. (SLM)
  2. Bordes A, Glorot X, Weston J, et al. A semantic matching energy function for learning with multi-relational data. Machine Learning, 2014, 94(2): 233-259. (SME)
  3. Socher R, Chen D, Manning C D, et al. Reasoning with neural tensor networks for knowledge base completion. NIPS. 2013: 926-934. (NTN)
  4. Dong X, Gabrilovich E, Heitz G, et al. Knowledge vault: A web-scale approach to probabilistic knowledge fusion. Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2014: 601-610. (MLP)
  5. Liu Q, Jiang H, Evdokimov A, et al. Probabilistic reasoning via deep learning: Neural association models. arXiv preprint arXiv:1603.07704, 2016. (NAM)
  6. Dettmers T, Minervini P, Stenetorp P, et al. Convolutional 2d knowledge graph embeddings. AAAI. 2018. (ConvE)
  • ---Multi-source Information Fusion Models(Entity Type)---
  1. Guo S, Wang Q, Wang B, et al. Semantically smooth knowledge graph embedding. Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). 2015, 1: 84-94. (SSE)
  2. Xie R, Liu Z, Sun M. Representation Learning of Knowledge Graphs with Hierarchical Types. IJCAI. 2016: 2965-2971. (TKRL)
  • ---Multi-source Information Fusion Models(Relation Paths)---
  1. Lin Y, Liu Z, Luan H, et al. Modeling relation paths for representation learning of knowledge bases. arXiv preprint arXiv:1506.00379, 2015. (PTransE)
  2. Dong X, Gabrilovich E, Heitz G, et al. Knowledge vault: A web-scale approach to probabilistic knowledge fusion. Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2014: 601-610. (MLP+PRA)
  3. Nickel M, Jiang X, Tresp V. Reducing the rank in relational factorization models by including observable patterns. NIPS. 2014: 1179-1187. (PRA+RESCAL)
  • ---Multi-source Information Fusion Models(Textual Descriptions)---
  1. Socher R, Chen D, Manning C D, et al. Reasoning with neural tensor networks for knowledge base completion. NIPS. 2013: 926-934. (NTN)
  2. Xie R, Liu Z, Jia J, et al. Representation learning of knowledge graphs with entity descriptions. AAAI. 2016. (DKRL)
  3. Xiao H, Huang M, Meng L, et al. SSP: semantic space projection for knowledge graph embedding with text descriptions. AAAI. 2017. (SSP)
  4. Wang Z, Li J Z. Text-Enhanced Representation Learning for Knowledge Graph. IJCAI. 2016: 1293-1299. (TEKE)
  5. Wang Z, Zhang J, Feng J, et al. Knowledge graph and text jointly embedding. EMNLP. 2014: 1591-1601.
  • ---Multi-source Information Fusion Models(Logical Rules)---
  1. Wang Q, Wang B, Guo L. Knowledge base completion using embeddings and rules. IJCAI. 2015.
  2. Guo S, Wang Q, Wang L, et al. Jointly embedding knowledge graphs and logical rules. EMNLP. 2016: 192-202. (KALE)
  3. Guo S, Wang Q, Wang L, et al. Knowledge graph embedding with iterative guidance from soft rules. AAAI. 2018. (RUGE)
  4. Ding B, Wang Q, Wang B, et al. Improving knowledge graph embedding using simple constraints. arXiv preprint arXiv:1805.02408, 2018.
  • ---Multi-source Information Fusion Models(Entity Attributes)---
  1. Nickel M, Tresp V, Kriegel H P. Factorizing yago: scalable machine learning for linked data. Proceedings of the 21st international conference on World Wide Web. ACM, 2012: 271-280.
  • ---Multi-source Information Fusion Models(Temporal Information)---
  1. Jiang T, Liu T, Ge T, et al. Encoding temporal information for time-aware link prediction. Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2016: 2350-2354.
  • ---Multi-source Information Fusion Models(Graph Structure)---
  1. Feng J, Huang M, Yang Y. GAKE: graph aware knowledge embedding. COLING. 2016: 641-651. (GAKE)

Knowledge Reasoning / Knowledge Mining

  1. Nickel M, Tresp V, Kriegel H P. A Three-Way Model for Collective Learning on Multi-Relational Data. ICML2011: 809-816.
  2. Socher R, Chen D, Manning C D, et al. Reasoning with neural tensor networks for knowledge base completion. NIPS2013: 926-934.
  3. Lao N, Cohen W W. Relational retrieval using a combination of path-constrained random walks. Machine learning, 2010, 81(1): 53-67.
  4. Lin Y, Liu Z, Luan H, et al. Modeling relation paths for representation learning of knowledge bases. EMNLP2015.
  5. Gardner M, Talukdar P, Krishnamurthy J, et al. Incorporating vector space similarity in random walk inference over knowledge bases. EMNLP2014: 397-406.
  6. Xiong W, Hoang T, Wang W Y. DeepPath: A Reinforcement Learning Method for Knowledge Graph Reasoning. EMNLP2017:564-573.
  7. Socher R , Chen D , Manning C D , et al. Reasoning With Neural Tensor Networks for Knowledge Base Completion[C]// International Conference on Neural Information Processing Systems. Curran Associates Inc. 2013.
  8. Shi B , Weninger T . ProjE: Embedding Projection for Knowledge Graph Completion[J]. 2016.
  9. Shi B , Weninger T . Open-World Knowledge Graph Completion[J]. 2017.
  10. Schlichtkrull M , Kipf T N , Bloem P , et al. Modeling Relational Data with Graph Convolutional Networks[J]. 2017.
  11. Zhu H, Xie R, Liu Z, et al. Iterative Entity Alignment via Joint Knowledge Embeddings[C]//International Joint Conference on Artificial Intelligence. AAAI Press, 2017. (IPTransE)
  12. Das R , Neelakantan A , Belanger D , et al. Chains of Reasoning over Entities, Relations, and Text using Recurrent Neural Networks[J]. 2016.
  13. Shen Y , Huang P S , Chang M W , et al. Modeling Large-Scale Structured Relationships with Shared Memory for Knowledge Base Completion[J]. 2016.
  14. Graves A, Wayne G, Reynolds M, et al. Hybrid computing using a neural network with dynamic external memory[J]. Nature, 2016, 538: 471-476.
  15. Yang F , Yang Z , Cohen W W . Differentiable Learning of Logical Rules for Knowledge Base Reasoning[J]. 2017.

Entity Linking

  1. Zhang W, Su J, Tan C L, et al. Entity linking leveraging: automatically generated annotation[C]// Proceedings of the 23rd International Conference on Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2010: 1290-1298.
  2. Anastácio I, Martins B, Calado P. Supervised learning for linking named entities to knowledge base entries[C]// Proceedings of TAC. Gaithersburg: NIST, 2011: 1-12.
  3. Francis-Landau M, Durrett G, Klein D. Capturing semantic similarity for entity linking with convolutional neural networks[C] Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2016: 1256-1261.
  4. Sun Y, Lin L, Tang D, et al. Modeling mention, context and entity with neural networks for entity disambiguation// Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence. California: IJCAI, 2015: 1333-1339.
  5. Han X, Sun L, Zhao J. Collective entity linking in web text: a graph-based method[C]// Proceedings of the 34th International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM Press, 2011: 765-774.
  6. Rao D, McNamee P, Dredze M. Entity linking: Finding extracted entities in a knowledge base// Multi-source, Multilingual Information Extraction and Summarization. Berlin: Springer, 2013:93-115.
  7. Guo Z, Barbosa D. Robust entity linking via random walks[C]//Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management. New York:ACM Press, 2014: 499-508.

Knowledge Storage / Knowledge Querying

  1. Bornea M A, Dolby J, Kementsietsidis A, et al. Building an efficient RDF store over a relational database. SIGMOD2013: 121-132.
  2. Huang J, Abadi D J, Ren K. Scalable SPARQL querying of large RDF graphs. Proceedings of the VLDB Endowment, 2011, 4(11): 1123-1134.
  3. Zou L, Özsu M T, Chen L, et al. gStore: a graph-based SPARQL query engine. The VLDB Journal—The International Journal on Very Large Data Bases, 2014, 23(4): 565-590.
  4. Wilkinson K, Sayers C, Kuno H, et al. Efficient RDF storage and retrieval in Jena2[C]//Proceedings of the First International Conference on Semantic Web and Databases. CEUR-WS. org, 2003: 120-139.
  5. Zou L, Mo J, Chen L, et al. gStore: answering SPARQL queries via subgraph matching[J]. Proceedings of the VLDB Endowment, 2011, 4(8): 482-493.
  6. Das S, Agrawal D, El Abbadi A. G-store: a scalable data store for transactional multi key access in the cloud[C]//Proceedings of the 1st ACM symposium on Cloud computing. ACM, 2010: 163-174.
  7. Ma L, Su Z, Pan Y, et al. RStar: an RDF storage and query system for enterprise resource management[C]//Proceedings of the thirteenth ACM international conference on Information and knowledge management. ACM, 2004: 484-491.
  8. Zeng K, Yang J, Wang H, et al. A distributed graph engine for web scale RDF data[C]//Proceedings of the VLDB Endowment. VLDB Endowment, 2013, 6(4): 265-276.
  9. Sakr S, Al-Naymat G. Relational processing of RDF queries: a survey[J]. ACM SIGMOD Record, 2010, 38(4): 23-28.
  10. Harris S, Shadbolt N. SPARQL query processing with conventional relational database systems[C]//International Conference on Web Information Systems Engineering. Springer, Berlin, Heidelberg, 2005: 235-244.
  11. Angles R. A comparison of current graph database models[C]//2012 IEEE 28th International Conference on Data Engineering Workshops. IEEE, 2012: 171-177.
  12. Miller J J. Graph database applications and concepts with Neo4j[C]//Proceedings of the Southern Association for Information Systems Conference, Atlanta, GA, USA. 2013, 2324(S 36).
  13. Iordanov B. HyperGraphDB: a generalized graph database[C]//International conference on web-age information management. Springer, Berlin, Heidelberg, 2010: 25-36.
  14. Sun J, Jin Q. Scalable rdf store based on hbase and mapreduce[C]//2010 3rd international conference on advanced computer theory and engineering (ICACTE). IEEE, 2010, 1: V1-633-V1-636.
  15. Weiss C, Karras P, Bernstein A. Hexastore: sextuple indexing for semantic web data management[J]. Proceedings of the VLDB Endowment, 2008, 1(1): 1008-1019.
  16. Neumann T, Weikum G. The RDF-3X engine for scalable management of RDF data[J]. The VLDB Journal—The International Journal on Very Large Data Bases, 2010, 19(1): 91-113.

Human-Computer Interaction

  1. Ferrucci D A. Introduction to "this is watson". IBM Journal of Research and Development, 2012, 56(3.4): 1:1-1:15.
  2. Lally A, Prager J M, McCord M C, et al. Question analysis: How Watson reads a clue. IBM Journal of Research and Development, 2012, 56(3.4): 2:1-2:14.
  3. Zhou H, Young T, Huang M, et al. Commonsense Knowledge Aware Conversation Generation with Graph Attention. IJCAI. 2018: 4623-4629.
  4. Zhu Y, Zhang C, Ré C, et al. Building a large-scale multimodal knowledge base system for answering visual queries. arXiv:1507.05670, 2015.
  5. Auli M, Galley M, Quirk C, et al. Joint language and translation modeling with recurrent neural networks. EMNLP2013:1044–1054.
  6. Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.
  7. Cho K, Van Merriënboer B, Gulcehre C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation. EMNLP2014.
  8. Chung J, Gulcehre C, Cho K H, et al. Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv preprint arXiv:1412.3555, 2014.
  9. Graves A. Generating sequences with recurrent neural networks. arXiv preprint arXiv:1308.0850, 2013.

Appendix B: Selected Recent Papers (within the past year)

  1. Bhatia S, Dwivedi P, Kaur A. That’s Interesting, Tell Me More! Finding Descriptive Support Passages for Knowledge Graph Relationships. ISWC2018: 250-267. (Best Paper)
  2. Soulet A, Giacometti A, Markhoff B, et al. Representativeness of Knowledge Bases with the Generalized Benford’s Law. ISWC2018: 374-390.
  3. Wang M, Wang R, Liu J, et al. Towards Empty Answers in SPARQL: Approximating Querying with RDF Embedding. ISWC2018: 513-529.
  4. Salas J, Hogan A. Canonicalisation of monotone SPARQL queries. ISWC2018: 600-616. (Best Student Paper)
  5. Pertsas V, Constantopoulos P, Androutsopoulos I. Ontology Driven Extraction of Research Processes. ISWC2018:162-178.
  6. Saeedi A, Peukert E, Rahm E. Using link features for entity clustering in knowledge graphs. ESWC2018: 576-592. (Best Paper)
  7. Schlichtkrull M, Kipf T N, Bloem P, et al. Modeling relational data with graph convolutional networks. ESWC2018: 593-607. (Best Student Paper)
  8. Zafar H, Napolitano G, Lehmann J. Formal Query Generation for Question Answering over Knowledge Bases. ESWC2018: 714-728.
  9. Zhou L, Gao J, Li D, et al. The Design and Implementation of XiaoIce, an Empathetic Social Chatbot. arXiv preprint arXiv:1812.08989, 2018.
  10. Dasgupta S S, Ray S N, Talukdar P. HyTE: Hyperplane-based Temporally aware Knowledge Graph Embedding. EMNLP2018: 2001-2011.
  11. Dubey M, Banerjee D, Chaudhuri D, et al. EARL: Joint entity and relation linking for question answering over knowledge graphs. ISWC2018: 108-126.
  12. Chen M, Tian Y, Chang K W, et al. Co-training embeddings of knowledge graphs and entity descriptions for cross-lingual entity alignment. IJCAI2018.
  13. Janke D, Staab S, Thimm M. Impact analysis of data placement strategies on query efforts in distributed rdf stores. Journal of Web Semantics, 2018, 50: 21-48.
  14. Han X, Zhu H, Yu P, et al. FewRel: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation. EMNLP2018.
  15. Hou Y, Liu Y, Che W, et al. Sequence-to-Sequence Data Augmentation for Dialogue Language Understanding. ACL2018: 1234-1245.
  16. Tran V K, Nguyen L M. Adversarial Domain Adaptation for Variational Neural Language Generation in Dialogue Systems. COLING2018: 1205-1217.
  17. Zhang W, Cui Y, Wang Y, et al. Context-Sensitive Generation of Open-Domain Conversational Responses. COLING2018: 2437-2447.
  18. Shi W, Yu Z. Sentiment Adaptive End-to-End Dialog Systems. ACL2018, 1: 1509-1519.
  19. Zhang S, Dinan E, Urbanek J, et al. Personalizing Dialogue Agents: I have a dog, do you have pets too? ACL2018, 1: 2204-2213.
  20. Wei Z, Liu Q, Peng B, et al. Task-oriented dialogue system for automatic diagnosis. ACL2018, 2: 201-207.
  21. Park S, Kim D, Oh A. Conversation Model Fine-Tuning for Classifying Client Utterances in Counseling Dialogues. NAACL2019.
  22. Ruder S. Neural Transfer Learning for Natural Language Processing. PhD Thesis. National University of Ireland, 2019.
  • ---Named Entity Recognition (ACL)---
  1. Parvez M R, Chakraborty S, Ray B, et al. Building language models for text with named entities. arXiv preprint arXiv:1805.04836, 2018.
  2. Lin Y, Yang S, Stoyanov V, et al. A multi-lingual multi-task architecture for low-resource sequence labeling. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2018, 1: 799-809.
  3. Xu H, Liu B, Shu L, et al. Double embeddings and cnn-based sequence labeling for aspect extraction. arXiv preprint arXiv:1805.04601, 2018.
  4. Ye Z X, Ling Z H. Hybrid semi-markov crf for neural sequence labeling. arXiv preprint arXiv:1805.03838, 2018.
  5. Yang J, Zhang Y. Ncrf++: An open-source neural sequence labeling toolkit. arXiv preprint arXiv:1806.05626, 2018.
  • ---Named Entity Recognition (NAACL)---
  1. Ju M, Miwa M, Ananiadou S. A neural layered model for nested named entity recognition. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers). 2018, 1: 1446-1459.
  2. Wang Z, Qu Y, Chen L, et al. Label-aware double transfer learning for cross-specialty medical named entity recognition. NAACL2018.
  3. Moon S, Neves L, Carvalho V. Multimodal named entity recognition for short social media posts. NAACL2018.
  4. Katiyar A, Cardie C. Nested named entity recognition revisited. NAACL2018: 861-871.
  • ---Named Entity Recognition (EMNLP)---
  1. Cao P, Chen Y, Liu K, et al. Adversarial Transfer Learning for Chinese Named Entity Recognition with Self-Attention Mechanism.EMNLP2018: 182-192.
  2. Xie J, Yang Z, Neubig G, et al. Neural cross-lingual named entity recognition with minimal resources. EMNLP2018.
  3. Lin B Y, Lu W. Neural adaptation layers for cross-domain named entity recognition. EMNLP2018.
  4. Shang J, Liu L, Ren X, et al. Learning Named Entity Tagger using Domain-Specific Dictionary. EMNLP2018.
  5. Greenberg N, Bansal T, Verga P, et al. Marginal Likelihood Training of BiLSTM-CRF for Biomedical Named Entity Recognition from Disjoint Label Sets. EMNLP2018: 2824-2829.
  6. Sohrab M G, Miwa M. Deep Exhaustive Model for Nested Named Entity Recognition.EMNLP2018: 2843-2849.
  7. Yu X, Mayhew S, Sammons M, et al. On the Strength of Character Language Models for Multilingual Named Entity Recognition. EMNLP2018.
  • ---Named Entity Recognition (COLING)---
  1. Mai K, Pham T H, Nguyen M T, et al. An empirical study on fine-grained named entity recognition. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 711-722.
  2. Nagesh A, Surdeanu M. An Exploration of Three Lightly-supervised Representation Learning Approaches for Named Entity Classification. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 2312-2324.
  3. Bhutani N, Qian K, Li Y, et al. Exploiting Structure in Representation of Named Entities using Active Learning. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 687-699.
  4. Yadav V, Bethard S. A survey on recent advances in named entity recognition from deep learning models. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 2145-2158.
  5. Güngör O, Üsküdarlı S, Güngör T. Improving Named Entity Recognition by Jointly Learning to Disambiguate Morphological Tags. arXiv preprint arXiv:1807.06683, 2018.
  6. Chen L, Moschitti A. Learning to Progressively Recognize New Named Entities with Sequence to Sequence Models. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 2181-2191.
  7. Ghaddar A, Langlais P. Robust lexical features for improved neural network named-entity recognition. COLING2018.
  • ---Event Extraction (ACL)---
  1. Choubey P K, Huang R. Improving Event Coreference Resolution by Modeling Correlations between Event Coreference Chains and Document Topic Structures.Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2018, 1: 485-495.
  2. Lin H, Lu Y, Han X, et al. Nugget Proposal Networks for Chinese Event Detection. ACL2018.
  3. Huang L, Ji H, Cho K, et al. Zero-shot transfer learning for event extraction. ACL2017.
  4. Hong Y, Zhou W, Zhang J, et al. Self-regulation: Employing a Generative Adversarial Network to Improve Event Detection. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2018, 1: 515-526.
  5. Zhao Y, Jin X, Wang Y, et al. Document embedding enhanced event detection with hierarchical and supervised attention. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). 2018, 2: 414-419.
  6. Yang H, Chen Y, Liu K, et al. DCFEE: A Document-level Chinese Financial Event Extraction System based on Automatically Labeled Training Data. ACL2018, System Demonstrations, 2018: 50-55.
  • ---Event Extraction (NAACL)---
  1. Ferguson J, Lockard C, Weld D S, et al. Semi-Supervised Event Extraction with Paraphrase Clusters. NAACL2018.
  • ---Event Extraction (EMNLP)---
  1. Orr J W, Tadepalli P, Fern X. Event Detection with Neural Networks: A Rigorous Empirical Evaluation. EMNLP2018.
  2. Liu S, Cheng R, Yu X, et al. Exploiting Contextual Information via Dynamic Memory Network for Event Detection. EMNLP2018.
  3. Liu X, Luo Z, Huang H. Jointly multiple events extraction via attention-based graph information aggregation. EMNLP2018.
  4. Chen Y, Yang H, Liu K, et al. Collective Event Detection via a Hierarchical and Bias Tagging Networks with Gated Multi-level Attention Mechanisms. EMNLP2018: 1267-1276.
  5. Lu W, Nguyen T H. Similar but not the Same: Word Sense Disambiguation Improves Event Detection via Neural Representation Matching. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018: 4822-4828.
  • ---Event Extraction (COLING)---
  1. Araki J, Mitamura T. Open-Domain Event Detection using Distant Supervision. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 878-891.
  2. Muis A O, Otani N, Vyas N, et al. Low-resource Cross-lingual Event Type Detection via Distant Supervision with Minimal Effort. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 70-82.
  3. Kazeminejad G, Bonial C, Brown S W, et al. Automatically Extracting Qualia Relations for the Rich Event Ontology. Proceedings of the 27th International Conference on Computational Linguistics. 2018: 2644-2652.
  4. Liu Z, Mitamura T, Hovy E. Graph-Based Decoding for Event Sequencing and Coreference Resolution. COLING2018.
  • ---Relation Extraction---
  1. Su Y, Liu H, Yavuz S, et al. Global relation embedding for relation extraction, NAACL2018:820-830.
  2. Zeng X, He S, Liu K, et al. Large scaled relation extraction with reinforcement learning, AAAI2018.
  3. Liu T, Zhang X, Zhou W, et al. Neural relation extraction via inner-sentence noise reduction and transfer learning, EMNLP2018:2195-2204.
  4. Wang S, Zhang Y, Che W, et al. Joint Extraction of Entities and Relations Based on a Novel Graph Scheme, IJCAI2018: 4461-4467.
  5. Feng J, Huang M, Zhao L, et al. Reinforcement learning for relation classification from noisy data, AAAI2018.
  6. He Z, Chen W, Li Z, et al. SEE: Syntax-aware entity embedding for neural relation extraction, AAAI2018.
  7. Vashishth S , Joshi R , Prayaga S S , et al. RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information. ACL2018.
  8. Tan Z, Zhao X, Wang W, et al. Jointly Extracting Multiple Triplets with Multilayer Translation Constraints. AAAI2018.
  9. Takanobu R, Zhang T, Liu J, et al. A Hierarchical Framework for Relation Extraction with Reinforcement Learning. AAAI2019.
  • ---Knowledge Storage---
  1. Davoudian A, Chen L, Liu M. A survey on NoSQL stores[J]. ACM Computing Surveys (CSUR), 2018, 51(2): 40.
  2. Wylot M, Hauswirth M, Cudré-Mauroux P, et al. RDF data storage and query processing schemes: A survey[J]. ACM Computing Surveys (CSUR), 2018, 51(4): 84.
  3. Zeng L, Zou L. Redesign of the gStore system[J]. Frontiers of Computer Science, 2018, 12(4): 623-641.
  4. Zhang X, Zhang M, Peng P, et al. A Scalable Sparse Matrix-Based Join for SPARQL Query Processing[C]//International Conference on Database Systems for Advanced Applications. Springer, Cham, 2019: 510-514.
  5. Libkin L, Reutter J L, Soto A, et al. TriAL: A navigational algebra for RDF triplestores[J]. ACM Transactions on Database Systems (TODS), 2018, 43(1): 5.
  6. Elzein N M, Majid M A, Hashem I A T, et al. Managing big RDF data in clouds: Challenges, opportunities, and solutions[J]. Sustainable Cities and Society, 2018, 39: 375-386.
  • ---Knowledge Reasoning---
  1. Lin, Xi Victoria, Richard Socher, and Caiming Xiong. Multi-hop knowledge graph reasoning with reward shaping. arXiv preprint arXiv:1808.10568 (2018).
  2. Zhang, Y., Dai, H., Kozareva, Z., Smola, A. J., & Song, L. (2018, April). Variational reasoning for question answering with knowledge graph. In Thirty-Second AAAI Conference on Artificial Intelligence.
  3. Gu, L., Xia, Y., Yuan, X., Wang, C., & Jiao, J. (2018). Research on the model for tobacco disease prevention and control based on case-based reasoning and knowledge graph. Filomat, 32(5).
  4. Trivedi, R., Dai, H., Wang, Y., & Song, L. (2017, August). Know-evolve: Deep temporal reasoning for dynamic knowledge graphs. In Proceedings of the 34th International Conference on Machine Learning-Volume 70 (pp. 3462-3471). JMLR.org.
  5. Hamilton, W., Bajaj, P., Zitnik, M., Jurafsky, D., & Leskovec, J. (2018). Embedding logical queries on knowledge graphs. In Advances in Neural Information Processing Systems (pp. 2026-2037).
  • ---Entity Linking---
  1. Sil, A., Kundu, G., Florian, R., & Hamza, W. (2018, April). Neural cross-lingual entity linking. In Thirty-Second AAAI Conference on Artificial Intelligence.
  2. Chen, H., Wei, B., Liu, Y., Li, Y., Yu, J., & Zhu, W. (2018). Bilinear joint learning of word and entity embeddings for Entity Linking. Neurocomputing, 294, 12-18.
  3. Raiman, J. R., & Raiman, O. M. (2018, April). DeepType: multilingual entity linking by neural type system evolution. In Thirty-Second AAAI Conference on Artificial Intelligence.
  4. Kundu, G., Sil, A., Florian, R., & Hamza, W. (2018). Neural cross-lingual coreference resolution and its application to entity linking. arXiv preprint arXiv:1806.10201.
  5. Kilias, T., Löser, A., Gers, F. A., Koopmanschap, R., Zhang, Y., & Kersten, M. (2018). Idel: In-database entity linking with neural embeddings. arXiv preprint arXiv:1803.04884.
  6. Cao, Y., Hou, L., Li, J., & Liu, Z. (2018). Neural collective entity linking. arXiv preprint arXiv:1811.08603.
