Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/138941
Title: Time Series Representation Learning for Stock Market Prediction
Author: Yen, Jan (焉然)
Contributors: Tsai, Yen-Lung (蔡炎龍); Yen, Jan (焉然)
Keywords: Deep Learning
Convolutional Neural Network (CNN)
Long Short-Term Memory (LSTM)
Siamese Network
Representation Learning
Contrastive Learning
p-adic Number
Fractal p-adic Representation
Stock Market Prediction
Date: 2021
Date Uploaded: 10-Feb-2022
Abstract: Representation learning has become a popular topic in deep learning, yet most of its applications and research concern image recognition or natural language processing. This thesis applies representation learning to time-series data, taking stock data as the main application. We propose a SiamCL model that performs contrastive representation learning with a Siamese network, aiming to find the most suitable representation of the data. We also combine a fractal p-adic representation of stock prices to assist the contrastive training. For the stock-prediction problem, we find that models trained with representation learning give more stable predictions than plain convolutional neural networks (CNN) or long short-term memory networks (LSTM). SiamCL is also more powerful on extremely imbalanced datasets, and the models trained with the fractal p-adic representation perform best.
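The abstract describes SiamCL only at a high level. As a concrete illustration, below is a minimal PyTorch sketch of contrastive learning with a Siamese time-series encoder, using the margin-based contrastive loss of Hadsell, Chopra, and LeCun ([12] in the reference list below). The layer sizes, window length, and the p_adic_digits helper are illustrative assumptions, not the thesis's actual SiamCL implementation or its p-adic encoding.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SiameseEncoder(nn.Module):
        """Shared 1-D CNN encoder applied to both branches of a Siamese pair.
        All sizes are placeholders, not the thesis's architecture."""
        def __init__(self, window_len=30, embed_dim=16):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(1, 8, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(8 * window_len, embed_dim),
            )

        def forward(self, x):
            # x: (batch, 1, window_len) -> (batch, embed_dim)
            return self.net(x)

    def contrastive_loss(z1, z2, same_label, margin=1.0):
        # Margin-based contrastive loss [12]: pull same-class windows
        # together, push different-class windows at least `margin` apart.
        d = F.pairwise_distance(z1, z2)
        pos = same_label * d.pow(2)
        neg = (1.0 - same_label) * F.relu(margin - d).pow(2)
        return (pos + neg).mean()

    def p_adic_digits(n, p=3, k=8):
        # Hypothetical helper: first k base-p digits of a non-negative
        # integer (least significant first), one plausible way to turn a
        # quantised price level into a discrete p-adic-style code.
        digits = []
        for _ in range(k):
            digits.append(n % p)
            n //= p
        return digits

    encoder = SiameseEncoder()
    x1 = torch.randn(4, 1, 30)                 # four synthetic price windows
    x2 = torch.randn(4, 1, 30)
    same = torch.tensor([1.0, 0.0, 1.0, 0.0])  # 1 = same class, 0 = different
    loss = contrastive_loss(encoder(x1), encoder(x2), same)
    print(loss.item(), p_adic_digits(137, p=3))

In the thesis's setting the pair label would come from the data (for example, whether two price windows precede the same class of market movement); the random tensors here only demonstrate the mechanics.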
References:
[1] Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.
[2] Y. Bengio, P. Simard, and P. Frasconi. Learning long-term dependencies with gradient descent is difficult. IEEE Transactions on Neural Networks, 5(2):157–166, 1994.
[3] Yoshua Bengio, Aaron Courville, and Pascal Vincent. Representation learning: A review and new perspectives. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(8):1798–1828, 2013.
[4] Jane Bromley, James W. Bentz, Léon Bottou, Isabelle Guyon, Yann LeCun, Cliff Moore, Eduard Säckinger, and Roopak Shah. Signature verification using a "Siamese" time delay neural network. International Journal of Pattern Recognition and Artificial Intelligence, 7(4):669–688, 1993.
[5] Ting Chen, Simon Kornblith, Mohammad Norouzi, and Geoffrey Hinton. A simple framework for contrastive learning of visual representations. In International Conference on Machine Learning, pages 1597–1607. PMLR, 2020.
[6] Xinlei Chen and Kaiming He. Exploring simple Siamese representation learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 15750–15758, 2021.
[7] Sounak Dey, Anjan Dutta, J. Ignacio Toledo, Suman K. Ghosh, Josep Lladós, and Umapada Pal. SigNet: Convolutional Siamese network for writer independent offline signature verification. arXiv preprint arXiv:1707.02131, 2017.
[8] Kunihiko Fukushima. Neural network model for a mechanism of pattern recognition unaffected by shift in position: Neocognitron. IEICE Technical Report, A, 62(10):658–665, 1979.
[9] Kunihiko Fukushima, Sei Miyake, and Takayuki Ito. Neocognitron: A neural network model for a mechanism of visual pattern recognition. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13(5):826–834, 1983.
[10] Fernando Q. Gouvêa. Apéritif, pages 9–30. Springer International Publishing, Cham, 2020.
[11] Jean-Bastien Grill, Florian Strub, Florent Altché, Corentin Tallec, Pierre H. Richemond, Elena Buchatskaya, Carl Doersch, Bernardo Avila Pires, Zhaohan Daniel Guo, Mohammad Gheshlaghi Azar, et al. Bootstrap your own latent: A new approach to self-supervised learning. arXiv preprint arXiv:2006.07733, 2020.
[12] R. Hadsell, S. Chopra, and Y. LeCun. Dimensionality reduction by learning an invariant mapping. In 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06), volume 2, pages 1735–1742, 2006.
[13] Kaiming He, Haoqi Fan, Yuxin Wu, Saining Xie, and Ross Girshick. Momentum contrast for unsupervised visual representation learning. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 9729–9738, 2020.
[14] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 770–778, 2016.
[15] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735–1780, 1997.
[16] Phuc H. Le-Khac, Graham Healy, and Alan F. Smeaton. Contrastive representation learning: A framework and review. IEEE Access, 8:193907–193934, 2020.
[17] Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436–444, 2015.
[18] Mengchen Liu, Jiaxin Shi, Zhen Li, Chongxuan Li, Jun Zhu, and Shixia Liu. Towards better analysis of deep convolutional neural networks. IEEE Transactions on Visualization and Computer Graphics, 23(1):91–100, 2016.
[19] Xiao Liu, Fanjin Zhang, Zhenyu Hou, Li Mian, Zhaoyu Wang, Jing Zhang, and Jie Tang. Self-supervised learning: Generative or contrastive. IEEE Transactions on Knowledge and Data Engineering, 2021.
[20] Warren S. McCulloch and Walter Pitts. A logical calculus of the ideas immanent in nervous activity. The Bulletin of Mathematical Biophysics, 5(4):115–133, 1943.
[21] J. Neukirch. The p-Adic Numbers, pages 155–178. Springer New York, New York, NY, 1991.
[22] Dean A. Pomerleau. ALVINN: An autonomous land vehicle in a neural network. Technical report, 1989.
[23] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning internal representations by error propagation. Technical report, 1985.
[24] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning representations by back-propagating errors. Nature, 323(6088):533–536, 1986.
[25] Attaullah Sahito, Eibe Frank, and Bernhard Pfahringer. Semi-supervised learning using Siamese networks. Lecture Notes in Computer Science, pages 586–597, 2019.
[26] Jürgen Schmidhuber. Deep learning in neural networks: An overview. Neural Networks, 61:85–117, 2015.
[27] Avraam Tsantekidis, Nikolaos Passalis, Anastasios Tefas, Juho Kanniainen, Moncef Gabbouj, and Alexandros Iosifidis. Forecasting stock prices from the limit order book using convolutional neural networks. In 2017 IEEE 19th Conference on Business Informatics (CBI), volume 1, pages 7–12. IEEE, 2017.
[28] V. Zharkov. Description of conductivity steps in polymer and other materials by functions of p-adic argument, 2011.
[29] Victor Zharkov. Adelic theory of the stock market. In Market Risk and Financial Markets Modeling, pages 255–267. Springer, 2012.
[30] Viktor Zharkov. Multiagent's model of stock market with p-adic description of prices. arXiv preprint arXiv:1310.8431, 2013.
Description: Master's thesis
National Chengchi University
Department of Applied Mathematics
105751005
Source: http://thesis.lib.nccu.edu.tw/record/#G0105751005
Type: thesis
Appears in Collections: Theses (學位論文)

Files in This Item:
File: 100501.pdf
Size: 3.16 MB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.