
Graduate Student: 陳柏勳 (Chen, Bo-Xun)
Thesis Title: 邏輯斯迴歸與隨機森林預測能力比較探討
(A Comparative Study of Predictive Performance between Logistic Regression and Random Forest)
Advisor: 黃子銘 (Huang, Tzee-Ming)
Committee Members: 翁久幸 (Weng, Chiu-Hsing); 鄭宇翔 (Cheng, Yu-Hsiang)
Degree: Master
Department: College of Commerce, Department of Statistics
Publication Year: 2020
Graduation Academic Year: 109 (2020-2021)
Language: Chinese
Pages: 18
Keywords (Chinese): logistic regression; random forest
DOI URL: http://doi.org/10.6814/NCCU202100027
  • In predicting binary variables, logistic regression is often chosen as one of the baselines for comparison, yet in many prediction competitions its results are often unsatisfactory. This study therefore examines the comparison between logistic regression and random forest. We attribute logistic regression's inferior performance relative to random forest to model complexity, and design a simulation around this point to compare four models: traditional logistic regression, logistic regression with an additive B-spline basis, logistic regression with a tensor product B-spline basis, and random forest. Under the simulation settings, the latter three models differ only slightly from one another, while all differ substantially from traditional logistic regression. We conclude that, for a fair comparison, the underfitting problem of logistic regression should be addressed before comparing it with other models.


    In the prediction of binary variables, logistic regression is often selected as a baseline for comparison. In many competitions, the predictive performance of logistic regression is often unsatisfactory. Therefore, this study discusses the comparison between logistic regression and random forest. We design a simulation to compare traditional logistic regression, additive B-spline logistic regression, tensor product B-spline logistic regression, and random forest. The results show that the latter three models differ only slightly in performance, while all differ substantially from traditional logistic regression. Hence, to make the comparison fairer, the underfitting problem of logistic regression needs to be addressed first.

    Acknowledgements i
    Abstract (Chinese) ii
    Abstract iii
    Contents iv
    List of Figures vi
    List of Tables vii

    1 Introduction 1

    2 Literature Review 2

    3 Methodology 3
    3.1 B-spline 4
    3.2 B-spline-Based Logistic Regression 5
    3.3 Knot Selection 7
    3.3.1 Knot Selection Methods 7
    3.3.2 Knot Selection Procedure 9
    3.4 Random Forest 10

    4 Simulation Analysis 11
    4.1 Simulation Procedure 11
    4.2 Simulation Settings 13
    4.3 Simulation Results 15

    5 Conclusion 17
    References 18

    Bessaoud, F., Daures, J.-P., and Molinari, N. (2005). Free knot splines for logistic models and threshold selection. Computer Methods and Programs in Biomedicine, 77(1):1–9.

    Breiman, L. (2001). Random forests. Machine Learning, 45(1):5–32.

    De Boor, C. (1972). On calculating with B-splines. Journal of Approximation Theory, 6(1):50–62.

    Hastie, T. J. and Tibshirani, R. J. (1990). Generalized Additive Models, volume 43. CRC Press.

    Huang, T. M. (2019). A knot selection algorithm for regression splines. 62nd ISI World Statistics Congress, Kuala Lumpur.

    Kay, R. and Little, S. (1987). Transformations of the explanatory variables in the logistic regression model for binary data. Biometrika, 74(3):495–501.

    Leathwick, J., Elith, J., and Hastie, T. (2006). Comparative performance of generalized additive models and multivariate adaptive regression splines for statistical modelling of species distributions. Ecological Modelling, 199(2):188–196.

    Li, M., Zhang, C., Xu, B., Xue, Y., and Ren, Y. (2020). A comparison of GAM and GWR in modelling spatial distribution of Japanese mantis shrimp (Oratosquilla oratoria) in coastal waters. Estuarine, Coastal and Shelf Science, 244:106928.

    Wood, S. N. and Augustin, N. H. (2002). GAMs with integrated model selection using penalized regression splines and applications to environmental modelling. Ecological Modelling, 157(2):157–177.
