Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/137294
Title: User Embedding Transformation on Cross-domain Preference Ranking for Recommender Systems
Author: Chen, Hsien-Hao (陳先灝)
Contributors: Tsai, Ming-Feng (蔡銘峰); Chen, Hsien-Hao (陳先灝)
Keywords: Recommender System
Recommendation System
Machine Learning
Cross-Domain Recommendation
Cold-Start Problem
Date: 2021
Upload date: 1-Oct-2021
Abstract: With the development of online service platforms such as e-commerce and video-streaming services, service providers' demand for technologies that accurately capture user preferences continues to grow. Recommender systems are the core technology behind such services, and devising solutions that meet specific needs under varied real-world conditions has become a main direction of recent research.

In this work, we focus on the cold-start problem in recommender systems. The cold-start problem arises from data scarcity in particular situations, such as new users or new items entering the system. Because it is difficult and unavoidable in practice, it has long been a challenging problem in recommender systems research.

One effective way to mitigate this problem is to use knowledge from a related domain to compensate for the missing data in the target domain, an approach known as cross-domain recommendation. The main idea is to apply recommendation algorithms across multiple domains to characterize each user's personal preferences, and then use this information to supplement the data missing from the target domain, thereby alleviating the cold-start problem to some extent.

In this thesis, we propose Cross-domain Preference Ranking (CPR), a method based on user-embedding transformation. CPR lets a user draw information from items in both the source domain and the target domain, performs representation learning on this information, and transforms it into a representation vector of the user's own preferences. Through this transformation, CPR not only exploits source-domain information effectively but also directly updates the representations of users and items in the target domain, thereby improving recommendation quality in the target domain.

In our experiments, to demonstrate the capability of CPR, we evaluate it on six industrial-scale datasets under differentiated settings (all target-domain users, cold-start users, and shared users), using state-of-the-art cross-domain and single-domain recommendation algorithms as baselines. The results show that CPR not only improves overall recommendation performance in the target domain but also achieves strong results for cold-start users.
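The sketch below is only an illustrative, hypothetical reading of the user-embedding-transformation idea described in the abstract: a user's preference vector is built from the embeddings of items the user interacted with in both the source and target domains, and that vector is then used to rank target-domain items. The embedding sizes, the mean-pooling aggregation, and the matrix W are assumptions made for illustration, not the thesis's actual CPR implementation.

# Minimal, hypothetical sketch of the cross-domain user-embedding transformation idea.
# All names and modeling choices here are assumptions, not the thesis's CPR method.
import numpy as np

rng = np.random.default_rng(0)

DIM = 16                          # embedding dimension (assumed)
N_SRC_ITEMS, N_TGT_ITEMS = 50, 40

# Pretend these item embeddings were produced by a prior representation-learning step.
src_item_emb = rng.normal(size=(N_SRC_ITEMS, DIM))
tgt_item_emb = rng.normal(size=(N_TGT_ITEMS, DIM))

# A toy transformation mapping the concatenated (source, target) context into a single
# user preference vector; a real system would learn W from ranking losses.
W = rng.normal(size=(2 * DIM, DIM)) / np.sqrt(2 * DIM)

def user_preference_vector(src_items, tgt_items):
    """Build a user embedding from interacted items in both domains.

    For a cold-start user in the target domain, tgt_items may be empty;
    the source-domain half still provides a usable signal.
    """
    src_part = src_item_emb[src_items].mean(axis=0) if len(src_items) else np.zeros(DIM)
    tgt_part = tgt_item_emb[tgt_items].mean(axis=0) if len(tgt_items) else np.zeros(DIM)
    return np.concatenate([src_part, tgt_part]) @ W

def rank_target_items(user_vec, top_k=5):
    """Score every target-domain item by dot product and return the top-k item indices."""
    scores = tgt_item_emb @ user_vec
    return np.argsort(-scores)[:top_k]

# Example: a user with source-domain history but no target-domain history (cold start).
user_vec = user_preference_vector(src_items=[1, 7, 23], tgt_items=[])
print(rank_target_items(user_vec))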
Description: Master's thesis
National Chengchi University
Department of Computer Science
108753107
Source: http://thesis.lib.nccu.edu.tw/record/#G0108753107
Type: thesis
Appears in Collections: Theses (學位論文)

Files in This Item:
310701.pdf (1.79 MB, Adobe PDF)