學術產出 (Academic Output) - Theses


題名 (Title) 從歷史文件到社會人際脈動:基於歷時性文本進行時序知識圖譜構建
From Historical Documents to Social Interpersonal Networks: Temporal Knowledge Graph Construction based on Diachronic Documents
作者 (Author) 李婕瑜
Lee, Chieh-Yu
貢獻者 (Contributor) 黃瀚萱
Huang, Hen-Hsen
李婕瑜
Lee, Chieh-Yu
關鍵詞 (Keywords) 自然語言處理
知識圖譜
時序知識圖譜
人際關係抽取
鏈結預測
數位人文
Natural language processing
Knowledge graph
Temporal knowledge graph
Interpersonal relations extraction
Link prediction
Digital humanities
日期 (Date) 2022
上傳時間 (Upload time) 5-Oct-2022 09:15:43 (UTC+8)
摘要 (Abstract) 公眾人物的社交網絡,由許多對社會具有高度影響力之人所組成;透過捕捉人物間的關係變化,對觀察特定時間點的社會情勢是不可或缺的。利用動態社交網絡可進一步判斷隨時間變化的人物關係,提供一個嶄新視角梳理社會脈動。

然而,從非結構化文字到時序圖譜,須經過多樣任務。在架構上,首先由實體辨識取得篇章內的人物,再進行關係抽取;在文本預處理上,本文提出一種階層式句子壓縮,用於協助關係抽取模型從長文檔中提取人與人之間的關係屬性。考量關係提取錯誤之可能性及文本未提及之關係,本文提出一種圖譜校正方式,優化關係提取模型所提取出的歷年元組。最後,利用歷年事實元組建構時序知識圖譜,本文改善節點間的訊息傳播層數,並加入文本相關資訊,輔助圖譜預測下個時間單位人物節點間的關係。

本研究旨在建立一種從歷史文件、書信構建時序知識圖譜的架構,可用於分析、預測動態人際關係,並應用於政治、歷史等跨領域學科,協助達到更具效率且精準的研究。
The social network of public figures carries rich information about the interpersonal relationships among influential people in a society. The temporal social network can further depict how their relationships change over time and provides a new perspective on the dynamics of a society.

This work demonstrates a novel system for temporal social network construction from textual data such as historical documents. A hierarchical sentence compression method is proposed to support extracting interpersonal relationships among characters from long documents. To account for relation-extraction errors and for relations not mentioned in the documents, a graph correction method is then applied to optimize the extracted tuples. Furthermore, we use the resulting yearly facts to construct a temporal knowledge graph and predict the relationships between characters in the next time unit. We adjust the number of hops used in message aggregation and incorporate textual information to improve the precision of relationship prediction.
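For illustration only, the data flow described in this abstract (entity pairs → sentence compression → relation extraction → graph correction → yearly facts) could be sketched roughly as follows. The function names, the rule-based stand-ins for the neural models, and the toy documents are all hypothetical, not the thesis's actual components or data.

```python
from collections import Counter, defaultdict
from itertools import combinations

# Toy documents as (year, text). In the thesis these would be long
# historical documents or letters; here they are short stand-ins.
DOCS = [
    (1925, "Alice wrote to Bob about the treaty. Alice praised Bob."),
    (1925, "Bob criticized Alice in a public letter."),
    (1926, "Alice and Bob co-authored a petition."),
]

PEOPLE = {"Alice", "Bob"}  # stand-in for the output of entity recognition


def compress(text, pair):
    """Sentence compression reduced to its simplest possible form:
    keep only sentences that mention at least one entity of the pair."""
    sents = [s.strip() for s in text.split(".") if s.strip()]
    return [s for s in sents if any(p in s for p in pair)]


def extract_relation(sentences):
    """Placeholder relation classifier (a neural model in the thesis)."""
    text = " ".join(sentences).lower()
    if "criticized" in text:
        return "rival"
    if "praised" in text or "co-authored" in text:
        return "ally"
    return "unknown"


def build_yearly_triples(docs):
    """Extract (head, relation, tail, year) tuples per document."""
    triples = []
    for year, text in docs:
        for head, tail in combinations(sorted(PEOPLE), 2):
            sents = compress(text, (head, tail))
            if sents:
                triples.append((head, extract_relation(sents), tail, year))
    return triples


def correct_graph(triples):
    """Toy graph correction: within each (pair, year), keep the majority
    relation so conflicting extractions collapse into a single edge."""
    votes = defaultdict(Counter)
    for head, rel, tail, year in triples:
        votes[(head, tail, year)][rel] += 1
    return [(h, c.most_common(1)[0][0], t, y) for (h, t, y), c in votes.items()]


if __name__ == "__main__":
    raw = build_yearly_triples(DOCS)
    for fact in correct_graph(raw):
        print(fact)  # temporal facts that seed the knowledge graph
```

The point of the sketch is the data flow: documents are reduced per entity pair, relations are classified per year, and conflicting extractions are reconciled before the temporal graph is built.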

The purpose of this study is to establish a framework for constructing a temporal knowledge graph from historical documents, which can be used to analyze and predict dynamic interpersonal relationships and to support interdisciplinary research in fields such as politics and history.
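The link-prediction step mentioned above (adjusting the number of message-propagation hops and fusing textual information) can likewise be sketched generically. The snippet below is a minimal PyTorch illustration using simple mean aggregation over one graph snapshot and a DistMult-style scorer; the class name, the fusion scheme, and the hop count are illustrative assumptions, not the thesis's exact architecture.

```python
import torch
import torch.nn as nn


class TemporalLinkPredictor(nn.Module):
    """Generic sketch: K-hop neighborhood aggregation over one snapshot,
    followed by a DistMult-style score fused with a text-derived feature."""

    def __init__(self, n_entities, n_relations, dim=64, text_dim=32, hops=2):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        self.text_proj = nn.Linear(text_dim, dim)  # fuses document features
        self.hops = hops                           # adjustable propagation depth

    def aggregate(self, adj):
        """Mean-aggregate neighbor embeddings for `hops` rounds.
        `adj` is a dense (n_entities x n_entities) adjacency matrix of the
        current time snapshot."""
        h = self.ent.weight
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        for _ in range(self.hops):
            h = 0.5 * h + 0.5 * (adj @ h) / deg
        return h

    def score(self, h, heads, rels, tails, text_feats):
        """DistMult score <h_head, r, h_tail> plus a text-derived term."""
        s = h[heads] * self.rel(rels) * h[tails]
        return s.sum(-1) + (h[heads] * self.text_proj(text_feats)).sum(-1)


if __name__ == "__main__":
    torch.manual_seed(0)
    n_ent, n_rel = 5, 3
    model = TemporalLinkPredictor(n_ent, n_rel, hops=2)
    adj = torch.zeros(n_ent, n_ent)
    adj[0, 1] = adj[1, 0] = 1.0  # an observed edge in this snapshot
    h = model.aggregate(adj)
    scores = model.score(
        h,
        heads=torch.tensor([0]),
        rels=torch.tensor([1]),
        tails=torch.tensor([2]),
        text_feats=torch.randn(1, 32),
    )
    print(scores)  # higher score = more plausible relation at the next step
```

Raising or lowering `hops` controls how far information travels between person nodes before scoring, which is the knob the abstract refers to when it mentions adjusting the number of hops in aggregation.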
描述 (Description) 碩士 (Master's thesis)
國立政治大學 (National Chengchi University)
資訊科學系 (Department of Computer Science)
109753133
資料來源 (Source) http://thesis.lib.nccu.edu.tw/record/#G0109753133
資料類型 (Type) thesis
URI http://nccur.lib.nccu.edu.tw/handle/140.119/142127
目錄 (Table of contents) 致謝 (Acknowledgements)
中文摘要 (Chinese Abstract)
Abstract
目錄 (Table of Contents)
表目錄 (List of Tables)
圖目錄 (List of Figures)
第一章 緒論 (Chapter 1: Introduction)
第一節 研究背景 (1.1 Research Background)
第二節 研究動機 (1.2 Research Motivation)
第三節 研究目的 (1.3 Research Objectives)
第二章 文獻探討 (Chapter 2: Literature Review)
第一節 句子壓縮 (2.1 Sentence Compression)
第二節 關係抽取 (2.2 Relation Extraction)
第三節 時序知識圖譜 (2.3 Temporal Knowledge Graphs)
第四節 知識圖譜校正 (2.4 Knowledge Graph Correction)
第三章 研究方法 (Chapter 3: Methodology)
第一節 模型整體架構 (3.1 Overall Model Architecture)
第二節 句子壓縮任務 (3.2 Sentence Compression Task)
第三節 關係抽取模型架構 (3.3 Relation Extraction Model Architecture)
第四節 動態關係預測模型架構 (3.4 Dynamic Relation Prediction Model Architecture)
第五節 圖譜校正任務 (3.5 Graph Correction Task)
第四章 實驗配置與結果 (Chapter 4: Experimental Setup and Results)
第一節 資料集 (4.1 Datasets)
第二節 實驗評估指標 (4.2 Evaluation Metrics)
第三節 模型和超參數設置 (4.3 Models and Hyperparameter Settings)
第四節 模型效能 (4.4 Model Performance)
一、句子壓縮效能 (4.4.1 Sentence Compression Performance)
二、關係提取模型 (4.4.2 Relation Extraction Model)
三、未來關係預測 (4.4.3 Future Relation Prediction)
四、未來關係預測與文字資訊融合 (4.4.4 Future Relation Prediction with Textual Information Fusion)
五、圖譜更正任務 (4.4.5 Graph Correction Task)
六、節點資訊傳播步數 (4.4.6 Number of Message-Propagation Hops)
第五章 結論 (Chapter 5: Conclusion)
參考文獻 (References)
檔案格式 (Format) application/pdf, 1591253 bytes
DOI 10.6814/NCCU202201527