Academic Output - Theses

Title An Application of Machine Learning to Standard & Poor's 500 Index Futures (應用機器學習於標準普爾指數期貨)
Authors Lin, Jyun-Hong (林雋鈜)
Contributors Tsaih, Rua-Huan (蔡瑞煌), advisor
Lin, Jyun-Hong (林雋鈜)
Keywords Machine learning; Artificial neural network; GPU; S&P 500; Futures market; TensorFlow; VIX index
Date 2017
Upload time 2-Oct-2017 10:15:01 (UTC+8)
Abstract This study predicts movements in the S&P 500 futures market by analyzing historical transaction data, offering guidance to investors who are hesitant to make trading decisions. We improve the hybrid AI system proposed by Tsaih et al. (1998), which combines a rule-based system with an artificial neural network to generate suggestions from past data. The improvements are as follows: (1) the input data are changed from daily to minute frequency; (2) a moving-window mechanism is adopted, with the goal of completing training for each window within 60 minutes; (3) the VIX price is added as an extra input variable; and (4) to handle the increased computational load, TensorFlow and GPU computing are used.
We find that the VIX variable improves the predictive accuracy of the proposed system. The average training time per window is below 60 minutes, although some windows still slightly exceed that limit.
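The following is a minimal sketch, in Python with TensorFlow's Keras API (which post-dates the 2017 system), of the moving-window idea described in the abstract: a small feed-forward network is retrained on each window of minute-level data, with the VIX appended as an extra input feature. The window length, feature columns, network size, and training settings are illustrative assumptions, not the thesis's actual reasoning-neural-network design or parameters.

# Minimal sketch of moving-window training with a VIX input feature.
# Assumptions (not from the thesis): window length, feature columns,
# network size, and training settings are all illustrative.
import numpy as np
import tensorflow as tf

WINDOW = 1500   # assumed number of minute-level observations per window

def make_windows(features, labels, window=WINDOW, step=60):
    """Yield overlapping training windows that slide forward by `step` minutes."""
    for start in range(0, len(features) - window, step):
        end = start + window
        yield features[start:end], labels[start:end]

def build_model(n_features):
    """A small dense classifier predicting the probability of an upward move."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_features,)),
        tf.keras.layers.Dense(16, activation="tanh"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Synthetic stand-in data; real inputs would be minute-level S&P 500 futures
# fields plus the VIX price as the last column.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 6)).astype("float32")   # e.g. open, high, low, close, volume, vix
y = (rng.random(10_000) > 0.5).astype("float32")     # 1 = price rose over the next minute

for X_win, y_win in make_windows(X, y):
    model = build_model(X.shape[1])   # retrain from scratch for each window
    model.fit(X_win, y_win, epochs=5, batch_size=64, verbose=0)
    break                             # train only the first window in this sketch

If a GPU is visible to TensorFlow, Keras places the training on it automatically, which is the performance lever the thesis exploits; otherwise the same code runs on the CPU.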
References 1. Abadi, M., Agarwal, A., Barham, P., Brevdo, E., Chen, Z., Citro, C., ... & Ghemawat, S. “TensorFlow: Large-scale machine learning on heterogeneous distributed systems,” arXiv preprint arXiv:1603.04467, 2016.
2. Arner, D. W., Barberis, J., & Buckley, R. P., “The Evolution of Fintech: A New Post-Crisis Paradigm?”, 2015.
3. Babcock, B., Datar, M., & Motwani, R. “Sampling from a moving window over streaming data,” Proceedings of the thirteenth annual ACM-SIAM symposium on Discrete algorithms. Society for Industrial and Applied Mathematics, January 2002, pp. 633-634.
4. Bartlett, M. S., Littlewort, G., Frank, M., Lainscsek, C., Fasel, I., & Movellan, J., “Recognizing facial expression: machine learning and application to spontaneous behavior,” Computer Vision and Pattern Recognition, 2005. CVPR 2005. IEEE Computer Society Conference, Vol. 2, June 2005, pp. 568-573.
5. Catanzaro, B., Sundaram, N., & Keutzer, K., “Fast support vector machine training and classification on graphics processors,” Proceedings of the 25th international conference on Machine learning. ACM, July 2008, pp. 104-111.
6. Chen, A. S., Leung, M. T., & Daouk, H., “Application of neural networks to an emerging financial market: forecasting and trading the Taiwan Stock Index,” Computers & Operations Research, 30(6), 2003, pp. 901-923.
7. Clark, J., “Google Turning Its Lucrative Web Search Over to AI Machines,” Bloomberg Technology, October 2015 (available online at https://www.bloomberg.com/news/articles/2015-10-26/google-turning-its-lucrative-web-search-over-to-ai-machines).
8. Cohen, W. W., Machine Learning Proceedings 1994: Proceedings of the Eighth International Conference, Morgan Kaufmann, 2017.
9. Google Brain, “TensorFlow,” Google Brain, 2017, available online at https://www.TensorFlow.org/.
10. Hull, J. C., Options, futures, and other derivatives. Pearson Education India, 2006.
11. Heakal, R., “Futures Fundamentals: Characteristics,” Investopedia (available online at http://www.investopedia.com/university/futures/futures4.asp).
12. Hornik, K., Stinchcombe, M., & White, H., “Multilayer feedforward networks are universal approximators,” Neural Networks, 2(5), 1989, pp. 359-366.
13. Kashani, M. N., Aminian, J., Shahhosseini, S., & Farrokhi, M., “Dynamic crude oil fouling prediction in industrial preheaters using optimized ANN based moving window technique,” Chemical Engineering Research and Design, 90(7), 2012, pp. 938-949.
14. Metz C., “TensorFlow, Google’s Open Source AI, Signals Big Changes in Hardware Too,” Wired.com, November 2015 (available online at https://www.wired.com/2015/11/googles-open-source-ai-TensorFlow-signals-fast-changing-hardware-world/).
15. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. “GPU computing,” Proceedings of the IEEE, 96(5), 2008, pp. 879-899.
16. Rampasek, L., & Goldenberg, A., “TensorFlow: Biology’s gateway to deep learning?,” Cell systems, 2(1), 2016, pp. 12-14.
17. Scherer, K. R., “Studying the emotion-antecedent appraisal process: An expert system approach,” Cognition & Emotion, 7(3-4), 1993, pp. 325-355.
18. Stoll, H. R., & Whaley, R. E., “Commodity index investing and commodity futures prices,” 2015.
19. Thomson Reuters, “Google's AI beats human champion at Go,” CBC News, January 2016 (available online at http://www.cbc.ca/news/technology/alphago-ai-1.3422347).
20. Tsaih, R. R., “The softening learning procedure,” Mathematical and computer modelling, 18(8), 1993, pp. 61-64.
21. Tsaih, R. R., “Reasoning neural networks,” Mathematics of Neural Networks, 1997, pp. 366-371.
22. Tsaih, R., Hsu, Y., & Lai, C. C., “Forecasting S&P 500 stock index futures with a hybrid AI system,” Decision Support Systems, 23(2), 1998, pp. 161-174.
23. Whaley, R. E., “Understanding the VIX,” The Journal of Portfolio Management, 35(3), 2009, pp. 98-105.
24. Yadan, O., Adams, K., Taigman, Y., & Ranzato, M. A., “Multi-gpu training of convnets,” arXiv preprint arXiv:1312.5853, 2013.
25. Zhao, Zhi-Ming, Overview of Futures, Winson, Taipei, 1993.
Description Master's thesis
National Chengchi University
Department of Management Information Systems
104356036
Source http://thesis.lib.nccu.edu.tw/record/#G0104356036
Type thesis
URI http://nccur.lib.nccu.edu.tw/handle/140.119/113286
Table of Contents Chapter 1. Introduction 1
1.1 Background 1
1.2 Motivation 2
1.3 Objective 3
Chapter 2. Literature Review 5
2.1 Futures Market Background Review 5
1. Commodity Cash Market and Commodity Futures Market 5
2. The Standard & Poor's 500 (S&P 500) 6
3. The CBOE Volatility Index (VIX) 6
2.2 Decision Support Mechanism 7
1. Hybrid AI System 7
2. Reasoning Neural Network (RN) 7
2.3 Machine Learning 10
1. History and Introduction 10
2. TensorFlow 10
3. GPU-Computing 15
2.4 Moving Window 15
Chapter 3. Experiment Design 17
3.1 Experiment overview 17
3.2 The design of the Variables 18
1. Data Preprocessing 19
3.3 The design of the System 24
1. Moving window in our system 24
2. The trigger 25
3. Summarize mechanism 25
4. The proposed predicting mechanism 26
5. The voting mechanism 29
3.4 Experiment Environment 30
Chapter 4. Experiment result 31
4.1 Result overview 31
4.2 Result of proposed predicting system 35
The training time for every window 39
4.3 Result without VIX variables 40
Chapter 5. Conclusion and Future work 45
5.1 Conclusions 45
1. The performance of minute data 45
2. The use of VIX variable 45
5.2 Future works 45
Reference 47
Format application/pdf (2,959,535 bytes)