
Title: 基於圖形卷積神經網路之異質性圖譜表示法學習
Heterogeneous Graph Embedding Based on Graph Convolutional Neural Networks
Author: Su, Yu-Sheng (蘇裕勝)
Advisor: Tsai, Ming-Feng (蔡銘峰)
Keywords: Representation learning; Graph convolutional neural networks; Network embedding; GNN; Link prediction; Recommendation
Date: 2019
Upload time: 3-Oct-2019 17:18:08 (UTC+8)
Abstract: In recent years, with the enormous growth of data, how to store such data and how to analyze it, manage knowledge bases, and make recommendations have become very challenging tasks. Information network embedding, which can effectively project different kinds of nodes and relations into a low-dimensional space, has therefore become a very popular research area, and the idea of graph neural networks (GNNs) has recently been brought into network embedding as well, with applications to tasks such as classification and recommendation. This thesis proposes a heterogeneous information network embedding framework: we first generate node representations with embedding methods and use them as features, and then, by constructing homogeneous graphs and training with GraphSAGE, we project all the nodes we need into the same space for link prediction and recommendation. For link prediction, our graph-construction method allows multiple kinds of node features to be embedded together and trained jointly, which effectively improves the link-prediction F1-score. For recommendation, our graph construction takes more high-order information into account, which in turn improves the recommender system's MAP, Recall, and Hit Ratio.
In recent years, information network embedding has become popular because these techniques can encode the information of a graph or network into low-dimensional representations, even when the network contains multiple types of nodes and relations. In addition, graph neural networks (GNNs) have also shown their effectiveness in learning large-scale node representations for node classification. In this thesis, we therefore propose a framework that combines heterogeneous network embedding with the idea of graph neural networks. In our framework, we first generate node representations with various network embedding methods. Then, we split a homogeneous network graph into subgraphs and concatenate the learned node representations so that they lie in the same embedding space. After that, we apply a GNN variant, GraphSAGE, to generate representations for the tasks of link prediction and recommendation. In our experiments, the results on both link prediction and recommendation show the effectiveness of the proposed framework.
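
To make the pipeline described above concrete, the following is a minimal sketch, assuming PyTorch and PyTorch Geometric, of how pretrained node embeddings (e.g., from DeepWalk, LINE, or HPE) can be used as input features to a GraphSAGE encoder and scored for link prediction with a dot-product decoder. It is an illustration under those assumptions, not the thesis's actual implementation; the class, variable, and dimension choices here are hypothetical.

import torch
import torch.nn.functional as F
from torch_geometric.nn import SAGEConv

class SAGELinkPredictor(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.conv1 = SAGEConv(in_dim, hidden_dim)
        self.conv2 = SAGEConv(hidden_dim, out_dim)

    def encode(self, x, edge_index):
        # x: pretrained node embeddings used as the initial node features
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)

    def decode(self, z, edge_pairs):
        # score a candidate link by the dot product of its endpoint embeddings
        src, dst = edge_pairs
        return (z[src] * z[dst]).sum(dim=-1)

# Usage sketch: x holds the pretrained embeddings, edge_index the constructed
# homogeneous graph, and pos_pairs the observed links (negatives are sampled).
# model = SAGELinkPredictor(in_dim=128, hidden_dim=64, out_dim=64)
# z = model.encode(x, edge_index)
# loss = F.binary_cross_entropy_with_logits(
#     model.decode(z, pos_pairs), torch.ones(pos_pairs.size(1)))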
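
The recommendation metrics mentioned in the abstract (MAP, Recall, and Hit Ratio) are computed over ranked item lists; the helpers below show their standard definitions and are not necessarily the exact evaluation protocol used in the thesis.

def recall_at_k(ranked, relevant, k):
    # fraction of a user's relevant items that appear in the top-k ranking
    hits = sum(1 for item in ranked[:k] if item in relevant)
    return hits / len(relevant) if relevant else 0.0

def hit_ratio_at_k(ranked, relevant, k):
    # 1 if at least one relevant item appears in the top-k ranking, else 0
    return 1.0 if any(item in relevant for item in ranked[:k]) else 0.0

def average_precision_at_k(ranked, relevant, k):
    # mean of the precision values at the ranks where relevant items occur
    hits, precision_sum = 0, 0.0
    for rank, item in enumerate(ranked[:k], start=1):
        if item in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / min(len(relevant), k) if relevant else 0.0

# MAP is the mean of average_precision_at_k over all test users; Recall and
# Hit Ratio are likewise averaged across users.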
References:
[1] P. W. Battaglia, J. B. Hamrick, V. Bapst, A. Sanchez-Gonzalez, V. F. Zambaldi, M. Malinowski, A. Tacchetti, D. Raposo, A. Santoro, R. Faulkner, Ç. Gülçehre, F. Song, A. J. Ballard, J. Gilmer, G. E. Dahl, A. Vaswani, K. Allen, C. Nash, V. Langston, C. Dyer, N. Heess, D. Wierstra, P. Kohli, M. Botvinick, O. Vinyals, Y. Li, and R. Pascanu. Relational inductive biases, deep learning, and graph networks. CoRR, abs/1806.01261, 2018.
[2] R. Burke. Hybrid recommender systems: Survey and experiments. User Modeling and User-Adapted Interaction, 12(4):331–370, Nov. 2002.
[3] Y. Dong, N. V. Chawla, and A. Swami. metapath2vec: Scalable representation learning for heterogeneous networks. In KDD '17, pages 135–144. ACM, 2017.
[4] A. Grover and J. Leskovec. node2vec: Scalable feature learning for networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '16, pages 855–864, New York, NY, USA, 2016. ACM.
[5] W. L. Hamilton, R. Ying, and J. Leskovec. Inductive representation learning on large graphs. In NIPS, 2017.
[6] G. E. Hinton. Learning distributed representations of concepts. In Proceedings of the Eighth Annual Conference of the Cognitive Science Society, volume 1, page 12. Amherst, MA, 1986.
[7] T. N. Kipf and M. Welling. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907, 2016.
[8] T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean. Distributed representations of words and phrases and their compositionality. In Proceedings of the 26th International Conference on Neural Information Processing Systems - Volume 2, NIPS'13, pages 3111–3119, USA, 2013. Curran Associates Inc.
[9] M. J. Pazzani and D. Billsus. Content-based recommendation systems. In The Adaptive Web, pages 325–341. Springer-Verlag, Berlin, Heidelberg, 2007.
[10] B. Perozzi, R. Al-Rfou, and S. Skiena. DeepWalk: Online learning of social representations. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '14, pages 701–710, New York, NY, USA, 2014. ACM.
[11] J. Qiu, Y. Dong, H. Ma, J. Li, K. Wang, and J. Tang. Network embedding as matrix factorization: Unifying DeepWalk, LINE, PTE, and node2vec. In Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining, WSDM '18, pages 459–467, New York, NY, USA, 2018. ACM.
[12] P. Resnick, N. Iacovou, M. Suchak, P. Bergstrom, and J. Riedl. GroupLens: An open architecture for collaborative filtering of netnews. In Proceedings of the 1994 ACM Conference on Computer Supported Cooperative Work, CSCW '94, pages 175–186, New York, NY, USA, 1994. ACM.
[13] B. Sarwar, G. Karypis, J. Konstan, and J. Riedl. Item-based collaborative filtering recommendation algorithms. In Proceedings of the 10th International Conference on World Wide Web, WWW '01, pages 285–295, New York, NY, USA, 2001. ACM.
[14] C. Shi and P. S. Yu. Heterogeneous Information Network Analysis and Applications. Springer Publishing Company, Incorporated, 1st edition, 2017.
[15] J. Tang, M. Qu, M. Wang, M. Zhang, J. Yan, and Q. Mei. LINE: Large-scale information network embedding. In Proceedings of the 24th International Conference on World Wide Web, WWW '15, pages 1067–1077, Republic and Canton of Geneva, Switzerland, 2015. International World Wide Web Conferences Steering Committee.
[16] P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, and Y. Bengio. Graph attention networks. In International Conference on Learning Representations, 2018. Accepted as poster.
[17] K. Xu, W. Hu, J. Leskovec, and S. Jegelka. How powerful are graph neural networks? In International Conference on Learning Representations, 2019.
[18] J. Zhou, G. Cui, Z. Zhang, C. Yang, Z. Liu, and M. Sun. Graph neural networks: A review of methods and applications. CoRR, abs/1812.08434, 2018.
Description: Master's thesis
National Chengchi University
Department of Computer Science
106753004
Source: http://thesis.lib.nccu.edu.tw/record/#G0106753004
Type: thesis
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/126582
Table of Contents:
Acknowledgements
Chinese Abstract
Abstract
Chapter 1  Introduction
1.1  Preface
1.2  Research Objectives
Chapter 2  Related Work
2.1  Network Representation Learning
2.1.1  Neural-Network-Based Methods
2.1.2  Matrix-Factorization-Based Methods
2.1.3  Graph-Neural-Network-Based Methods
2.1.4  Graph-Attention-Based Methods
2.2  Heterogeneous Information Networks
2.3  Recommender Systems
2.3.1  Content-Based Recommendation
2.3.2  Collaborative-Filtering-Based Recommendation
2.3.3  Hybrid Recommendation
Chapter 3  Methodology
3.1  Definition of the Heterogeneous Hub Network
3.2  Hub Node Representations
3.2.1  One-Hot Encoding
3.2.2  DeepWalk
3.2.3  LINE: Large-scale Information Network Embedding
3.2.4  HPE: Heterogeneous Preference Embedding
3.3  Graph Construction
3.3.1  Graph Construction for Link Prediction
3.3.2  Graph Construction for Recommendation
3.4  Model Training
3.5  Objective and Loss Function
Chapter 4  Experimental Results and Discussion
4.1  Datasets
4.2  Experimental Settings
4.3  Evaluation Metrics
4.4  Experimental Results
4.4.1  Link Prediction: F1-score Results
4.4.2  Recommendation: Precision, Recall, and Hit Ratio Results
Chapter 5  Conclusion
5.1  Conclusion
Chapter 6  References
Format: application/pdf, 1341987 bytes
DOI: 10.6814/NCCU201901186