Title: 基於圖形卷積神經網路之異質性圖譜表示法學習 (Heterogeneous Graph Embedding Based on Graph Convolutional Neural Networks)
Author: 蘇裕勝 (Su, Yu-Sheng)
Advisor: 蔡銘峰 (Tsai, Ming-Feng)
Keywords: Representation Learning; Graph Convolutional Networks; Recommendation; Link Prediction; Network Embedding; GNN
Date: 2019
Uploaded: 3-Oct-2019 17:18:08 (UTC+8)

Abstract (translated from the Chinese): In recent years, the sheer volume of data has made storing it, and using it for analysis, knowledge-base management, and recommendation, a very challenging job. Information network embedding, which effectively projects different kinds of nodes and relations into a low-dimensional space, has therefore become a very active field; recently the idea of graph neural networks (GNNs) has also been brought into network embedding and applied to tasks such as classification and recommendation. This thesis proposes a heterogeneous information network embedding framework: we first learn node representations to serve as features, then construct homogeneous network graphs and train GraphSAGE on them so that all the nodes we need are projected into a single space for link prediction and recommendation. For link prediction, our graph-construction method lets us combine multiple kinds of node features during training, which effectively improves the F1-score. For recommendation, our graph construction takes more high-order information into account, which improves the recommender's MAP, Recall, and Hit ratio.
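The pipeline the abstract sketches — pretrained node representations fed through GraphSAGE-style mean aggregation, with candidate links scored in the resulting embedding space — can be illustrated with a minimal NumPy sketch. Everything below (shapes, random weights, the dot-product link scorer) is an illustrative assumption, not the thesis's actual implementation:

```python
import numpy as np

def sage_mean_layer(H, neighbors, W_self, W_neigh):
    """One GraphSAGE-style mean-aggregation layer (illustrative).

    H:         (n, d_in) current node embeddings
    neighbors: dict mapping node id -> list of neighbor ids
    W_self:    (d_in, d_out) weight applied to the node's own embedding
    W_neigh:   (d_in, d_out) weight applied to the aggregated neighborhood
    """
    n = H.shape[0]
    out = np.zeros((n, W_self.shape[1]))
    for v in range(n):
        nbrs = neighbors.get(v, [])
        # Mean-aggregate neighbor embeddings (zeros if the node is isolated).
        agg = H[nbrs].mean(axis=0) if nbrs else np.zeros(H.shape[1])
        out[v] = np.maximum(H[v] @ W_self + agg @ W_neigh, 0.0)  # ReLU
    # L2-normalize each row, as in the original GraphSAGE formulation.
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.maximum(norms, 1e-12)

def link_score(Z, u, v):
    """Score a candidate edge (u, v) by the dot product of its endpoints."""
    return float(Z[u] @ Z[v])

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))        # 4 nodes with 8-dim pretrained features
neighbors = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
W_self = rng.normal(size=(8, 5))
W_neigh = rng.normal(size=(8, 5))
Z = sage_mean_layer(H, neighbors, W_self, W_neigh)
print(Z.shape, link_score(Z, 0, 1))
```

In practice the input features would come from a pretrained embedding method (DeepWalk, LINE, HPE, etc.), several such layers would be stacked, and the weights would be trained against a link-prediction or recommendation loss rather than left random.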
Abstract (English): In recent years, information network embedding has become popular because these techniques can encode information into low-dimensional representations, even for a graph/network with multiple types of nodes and relations. In addition, graph neural networks (GNNs) have also shown their effectiveness in learning large-scale node representations for node classification. In this paper, therefore, we propose a framework based on heterogeneous network embedding and the idea of graph neural networks. In our framework, we first generate node representations with various network embedding methods. Then, we split a homogeneous network graph into subgraphs and concatenate the learned node representations into the same embedding space. After that, we apply a GNN variant, GraphSAGE, to generate representations for the tasks of link prediction and recommendation. In our experiments, the results on both tasks show the effectiveness of the proposed framework.

References:
[1] P. W. Battaglia, J. B. Hamrick, V. Bapst, A. Sanchez-Gonzalez, V. F. Zambaldi, M. Malinowski, A. Tacchetti, D. Raposo, A. Santoro, R. Faulkner, Ç. Gülçehre, F. Song, A. J. Ballard, J. Gilmer, G. E. Dahl, A. Vaswani, K. Allen, C. Nash, V. Langston, C. Dyer, N. Heess, D. Wierstra, P. Kohli, M. Botvinick, O. Vinyals, Y. Li, and R. Pascanu. Relational inductive biases, deep learning, and graph networks. CoRR, abs/1806.01261, 2018.
[2] R. Burke. Hybrid recommender systems: Survey and experiments. User Modeling and User-Adapted Interaction, 12(4):331–370, Nov 2002.
[3] Y. Dong, N. V. Chawla, and A. Swami. metapath2vec: Scalable representation learning for heterogeneous networks. In KDD '17, pages 135–144. ACM, 2017.
[4] A. Grover and J. Leskovec. node2vec: Scalable feature learning for networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '16, pages 855–864, New York, NY, USA, 2016. ACM.
[5] W. L. Hamilton, R. Ying, and J. Leskovec. Inductive representation learning on large graphs. In NIPS, 2017.
[6] G. E. Hinton. Learning distributed representations of concepts. In Proceedings of the Eighth Annual Conference of the Cognitive Science Society, volume 1, page 12, Amherst, MA, 1986.
[7] T. N. Kipf and M. Welling. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907, 2016.
[8] T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean. Distributed representations of words and phrases and their compositionality. In Proceedings of the 26th International Conference on Neural Information Processing Systems - Volume 2, NIPS '13, pages 3111–3119, USA, 2013. Curran Associates Inc.
[9] M. J. Pazzani and D. Billsus. Content-based recommendation systems. In The Adaptive Web, pages 325–341. Springer-Verlag, Berlin, Heidelberg, 2007.
[10] B. Perozzi, R. Al-Rfou, and S. Skiena. DeepWalk: Online learning of social representations. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '14, pages 701–710, New York, NY, USA, 2014. ACM.
[11] J. Qiu, Y. Dong, H. Ma, J. Li, K. Wang, and J. Tang. Network embedding as matrix factorization: Unifying DeepWalk, LINE, PTE, and node2vec. In Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining, WSDM '18, pages 459–467, New York, NY, USA, 2018. ACM.
[12] P. Resnick, N. Iacovou, M. Suchak, P. Bergstrom, and J. Riedl. GroupLens: An open architecture for collaborative filtering of netnews. In Proceedings of the 1994 ACM Conference on Computer Supported Cooperative Work, CSCW '94, pages 175–186, New York, NY, USA, 1994. ACM.
[13] B. Sarwar, G. Karypis, J. Konstan, and J. Riedl. Item-based collaborative filtering recommendation algorithms. In Proceedings of the 10th International Conference on World Wide Web, WWW '01, pages 285–295, New York, NY, USA, 2001. ACM.
[14] C. Shi and P. S. Yu. Heterogeneous Information Network Analysis and Applications. Springer Publishing Company, Incorporated, 1st edition, 2017.
[15] J. Tang, M. Qu, M. Wang, M. Zhang, J. Yan, and Q. Mei. LINE: Large-scale information network embedding. In Proceedings of the 24th International Conference on World Wide Web, WWW '15, pages 1067–1077, Republic and Canton of Geneva, Switzerland, 2015. International World Wide Web Conferences Steering Committee.
[16] P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio. Graph attention networks. In International Conference on Learning Representations, 2018. Accepted as poster.
[17] K. Xu, W. Hu, J. Leskovec, and S. Jegelka. How powerful are graph neural networks? In International Conference on Learning Representations, 2019.
[18] J. Zhou, G. Cui, Z. Zhang, C. Yang, Z. Liu, and M. Sun. Graph neural networks: A review of methods and applications. CoRR, abs/1812.08434, 2018.

Description: Master's thesis
National Chengchi University
Department of Computer Science
Student ID: 106753004
Source: http://thesis.lib.nccu.edu.tw/record/#G0106753004
Type: thesis
Identifier: G0106753004
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/126582

Table of Contents:
Acknowledgements
Abstract (Chinese)
Abstract (English)
Chapter 1: Introduction
  1.1 Preface
  1.2 Research Objectives
Chapter 2: Related Work
  2.1 Network Representation Learning
    2.1.1 Neural-network-based methods
    2.1.2 Matrix-factorization-based methods
    2.1.3 Graph-neural-network-based methods
    2.1.4 Graph-attention-based methods
  2.2 Heterogeneous Information Networks
  2.3 Recommender Systems
    2.3.1 Content-based recommendation
    2.3.2 Collaborative-filtering-based recommendation
    2.3.3 Hybrid recommendation
Chapter 3: Methodology
  3.1 Definition of the Heterogeneous Hub Network
  3.2 Hub-Node Representations
    3.2.1 One-Hot Encoding
    3.2.2 DeepWalk
    3.2.3 LINE: Large-scale Information Network Embedding
    3.2.4 HPE: Heterogeneous Preference Embedding
  3.3 Graph Construction
    3.3.1 Graph construction for link prediction
    3.3.2 Graph construction for recommendation
  3.4 Model Training
  3.5 Objective and Loss Function
Chapter 4: Experimental Results and Discussion
  4.1 Datasets
  4.2 Experimental Setup
  4.3 Evaluation Metrics
  4.4 Results
    4.4.1 Link-prediction F1-score
    4.4.2 Recommendation precision, recall, and hit ratio
Chapter 5: Conclusion
Chapter 6: References

Format: application/pdf, 1341987 bytes
DOI: 10.6814/NCCU201901186
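The abstract reports link prediction in F1-score and recommendation in MAP, Recall, and Hit ratio. For reference, the ranking metrics can be computed per user as below, using standard textbook definitions; the exact cutoffs and averaging protocol used in the thesis are not specified in this record, so these are assumptions:

```python
def recall_at_k(ranked, relevant, k):
    """Fraction of the relevant items that appear in the top-k of the ranking."""
    hits = len(set(ranked[:k]) & set(relevant))
    return hits / len(relevant) if relevant else 0.0

def hit_ratio_at_k(ranked, relevant, k):
    """1.0 if at least one relevant item appears in the top-k, else 0.0."""
    return 1.0 if set(ranked[:k]) & set(relevant) else 0.0

def average_precision(ranked, relevant):
    """Mean of precision@i over the ranks i where a relevant item appears.

    MAP is this value averaged over all users/queries.
    """
    relevant = set(relevant)
    hits, score = 0, 0.0
    for i, item in enumerate(ranked, start=1):
        if item in relevant:
            hits += 1
            score += hits / i
    return score / len(relevant) if relevant else 0.0

ranked = ["b", "a", "d", "c"]   # model's ranking for one user (illustrative)
relevant = ["a", "c"]           # ground-truth items for that user
print(recall_at_k(ranked, relevant, 2),     # -> 0.5 ("a" is in the top-2)
      hit_ratio_at_k(ranked, relevant, 2),  # -> 1.0
      average_precision(ranked, relevant))  # -> 0.5
```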