Title | MIG at the NTCIR-15 FinNum-2 Task: Use the transfer learning and feature engineering for numeral attachment task |
Creator | Liu, Chao-Lin (劉昭麟) ; Chen, Yu-Yu |
Contributor | Department of Computer Science (資科系) |
Key Words | Numeral attachment ; financial social media ; transfer learning ; feature engineering |
Date | 2020-12 |
Date Issued | 22-Sep-2021 10:39:33 (UTC+8) |
Summary | In the FinNum-2 task, the goal is to judge whether a specified numeral is related to a given stock symbol in a financial tweet. We employ a transfer-learning mechanism and Google BERT embeddings, so we only need to collect and annotate a small amount of data to train the classifiers for the task. In addition, our classifiers consider some intuitive but useful syntactic features, e.g., the positions of words in the tweets. Experimental results indicate that these new features boost the prediction quality, and we achieved scores better than 68% in the formal-run tests. |
Relation | NTCIR-15 Proceedings, NII, Japan, pp. 79–82 |
Type | conference |
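The abstract describes combining BERT embeddings with intuitive positional features, such as where the numeral and the stock symbol sit in the tweet. A minimal sketch of what such positional features might look like; the function name, feature names, and example tweet are hypothetical illustrations, not taken from the paper:

```python
def position_features(tokens, numeral_idx, symbol_idx):
    """Illustrative positional features for a (numeral, stock symbol) pair.

    tokens      -- the tweet split into word tokens
    numeral_idx -- index of the target numeral in tokens
    symbol_idx  -- index of the stock symbol (cashtag) in tokens
    """
    n = len(tokens)
    return {
        # relative positions of the numeral and the symbol in the tweet
        "numeral_pos": numeral_idx / n,
        "symbol_pos": symbol_idx / n,
        # token distance between the numeral and the symbol
        "distance": abs(numeral_idx - symbol_idx),
        # whether the numeral appears before the symbol
        "numeral_first": numeral_idx < symbol_idx,
    }

# Hypothetical example tweet with one cashtag and one numeral.
tweet = "$AAPL target raised to 150 by analysts".split()
feats = position_features(tweet, tweet.index("150"), tweet.index("$AAPL"))
```

Features like these would typically be concatenated with the contextual embeddings before the final classification layer; the paper itself should be consulted for the exact feature set used.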
URI | http://nccur.lib.nccu.edu.tw/handle/140.119/137222 |
Format | application/pdf (554805 bytes) |