| Graduate Student: | 張群 Chang, Chun |
|---|---|
| Thesis Title: | A Semantic Sentiment Classification Model based on Domain Adaption: the Case of TWSE News (利用領域適應建構自然語言情緒分類模型:以台灣財經新聞為例) |
| Advisor: | 江彌修 Chiang, Mi-Hsiu |
| Committee Members: | 徐之強 Hsu, Chih-Chiang; 池祥萱 Chih, Hsiang-Hsuan; 許育進 Hsu, Yu-Chin |
| Degree: | Master |
| Department: | College of Commerce, Department of Money and Banking |
| Year of Publication: | 2021 |
| Academic Year of Graduation: | 109 |
| Language: | Chinese |
| Pages: | 50 |
| Keywords: | Natural Language Processing, Pre-training, Downstream Task, Sentiment Classification, Domain Adaptation, Topic Grouping |
| DOI URL: | http://doi.org/10.6814/NCCU202100606 |
Using Taiwan financial news, this study applies domain adaptation and topic grouping to address the corpus heterogeneity that BERT faces between its unsupervised pre-training corpus, the domain corpus, and the downstream training corpus, and builds sentiment classification models accordingly. For domain adaptation, the conclusions drawn from the loss function, the confusion matrix, and the ROC curve with its AUC value agree with Araci (2019): no significant difference is found in sentiment classification performance. For topic grouping, by contrast, all of the above metrics show significant differences; moreover, the topic-grouped models display an improved understanding of negative news, mitigating the inherent disadvantage negative news faces from an insufficient sample.
In this paper, we implement domain adaptation and topic grouping to deal with the heterogeneity between the pre-training corpus and the domain corpus, and among the downstream corpora, for a TWSE-news semantic sentiment classification model. The empirical results show that the domain adaptation model yields no significant effect, which agrees with Araci (2019). However, the topic grouping model achieves better performance than the other models and improves the model's understanding of the negative corpus.
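The abstract compares models on the loss function, the confusion matrix, and the ROC curve with its AUC value. As a minimal, self-contained sketch of the latter two metrics (using hypothetical labels and scores for illustration, not the thesis data):

```python
def confusion_matrix(y_true, y_pred):
    """2x2 confusion matrix for binary labels: rows = actual, cols = predicted."""
    m = [[0, 0], [0, 0]]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def roc_auc(y_true, scores):
    """AUC via the Mann-Whitney formulation: the probability that a randomly
    chosen positive example scores above a randomly chosen negative one."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical model outputs (1 = positive news, 0 = negative news)
y_true = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.1]
y_pred = [1 if s >= 0.5 else 0 for s in scores]

print(confusion_matrix(y_true, y_pred))   # [[2, 1], [1, 2]]
print(round(roc_auc(y_true, scores), 3))  # 0.889
```

Under this formulation, comparing AUC values across the domain-adapted and topic-grouped models does not depend on a particular classification threshold, which is why it complements the (threshold-dependent) confusion matrix.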
Chapter 1: Introduction
Chapter 2: Literature Review
Section 1: Sentiment Analysis in Finance
Section 2: Origins of the Two-stage Model
Section 3: Domain Adaptation and Topic Grouping
Chapter 3: Methodology
Section 1: Experimental Procedure
Section 2: Data Collection
Section 3: Data Preprocessing and Topic Classification
Section 4: Model Experiments
Chapter 4: Empirical Results
Section 1: Experimental Setup
Section 2: Experimental Results
Chapter 5: Conclusions and Suggestions for Future Research
Section 1: Conclusions
Section 2: Suggestions for Future Research
References
[1] Araci, D. (2019), "FinBERT: Financial Sentiment Analysis with Pre-trained Language Models", arXiv preprint arXiv:1908.10063
[2] Bengio, Y., Ducharme, R., Vincent, P. and Jauvin, C. (2003), "A Neural Probabilistic Language Model", Journal of Machine Learning Research, 3, 1137–1155
[3] Devlin, J., Chang, M. W., Lee, K. and Toutanova, K. (2018), "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", arXiv preprint arXiv:1810.04805
[4] Santos, C. D. and Gatti, M. (2014), "Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts", Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics, 69–78
[5] Howard, J. and Ruder, S. (2018), "Universal Language Model Fine-tuning for Text Classification", arXiv preprint arXiv:1801.06146
[6] Jotheeswaran, J. and Koteeswaran, S. (2015), "Decision Tree Based Feature Selection and Multilayer Perceptron for Sentiment Analysis", ARPN Journal of Engineering and Applied Sciences, 10(14), 5883–5894
[7] Mikolov, T., Chen, K., Corrado, G. and Dean, J. (2013), "Efficient Estimation of Word Representations in Vector Space", arXiv preprint arXiv:1301.3781
[8] Pranckevičius, T. and Marcinkevičius, V. (2017), "Comparison of Naive Bayes, Random Forest, Decision Tree, Support Vector Machines, and Logistic Regression Classifiers for Text Reviews Classification", Baltic Journal of Modern Computing, 5(2), 221
[9] Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L. and Polosukhin, I. (2017), "Attention Is All You Need", arXiv preprint arXiv:1706.03762
[10] Wang, X., Liu, Y., Sun, C., Wang, B. and Wang, X. (2015), "Predicting Polarities of Tweets by Composing Word Embeddings with Long Short-Term Memory", Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), Beijing, China, 1343–1353
[11] Yang, Y., Cer, D., Ahmad, A., Guo, M., Law, J., Constant, N., Abrego, A. G., Yuan, S., Tar, C., Sung, Y., Strope, B. and Kurzweil, R. (2019), "Multilingual Universal Sentence Encoder for Semantic Retrieval", arXiv preprint arXiv:1907.04307
[12] Wang, Y., Wang, M. and Fujita, H. (2020), "Word Sense Disambiguation: A Comprehensive Knowledge Exploitation Framework", Knowledge-Based Systems, 190 (Feb 2020), 105030
[13] Zhang, D., Xu, H., Su, Z. and Xu, Y. (2015), "Chinese Comments Sentiment Classification Based on word2vec and SVMperf", Expert Systems with Applications, 42(4), 1857–1863
The full text is not authorized for public access.