
Graduate student: 林哲緯 (Lin, Je-Wei)
Thesis title: 基於 Copula Entropy 與核密度估計的變數選取方法 (A Variable Selection Method Based on Copula Entropy and Kernel Density Estimation)
Advisor: 黃子銘 (Huang, Tzee-Ming)
Oral defense committee: 翁久幸 (Weng, Chiu-Hsing), 鄭宇翔 (Cheng, Yu-Hsiang)
Degree: Master
Department: Department of Statistics, College of Commerce
Year of publication: 2025
Graduating academic year: 114
Language: Chinese
Number of pages: 32
Chinese keywords: variable selection, copula density function, boundary bias, kernel density function, distance correlation, bootstrap, stepwise regression
English keywords: Variable Selection, Copula Density, Entropy, Boundary Bias, Kernel Density Estimation, Distance Correlation, Bootstrap, Stepwise Regression
  • This study addresses the problem of variable selection within the framework of regression models. A test based on copula entropy is proposed to assess whether additional explanatory variables should be included in the model. Building on this test, a forward stepwise variable selection algorithm is constructed. Simulation results show that the proposed method performs well in nonlinear settings, suggesting its potential as a nonparametric variable selection tool for complex nonlinear models.
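The procedure described in the abstract (a copula-entropy score driving forward selection) can be illustrated with a much-simplified sketch. The thesis's actual method estimates copula entropy via boundary-corrected kernel density estimation and calibrates the inclusion test by bootstrap (Chapters 3 and 4); the toy version below instead uses a rank transform plus a plug-in histogram estimate of negative copula entropy (equal to mutual information, per reference [15]) and a fixed threshold. All function names, the bin count, and the threshold value are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def pseudo_obs(x):
    # Rank-transform a sample to (0, 1): empirical-copula margins.
    n = len(x)
    return (np.argsort(np.argsort(x)) + 0.5) / n

def neg_copula_entropy(u, v, bins=6):
    # Plug-in histogram estimate of mutual information on the copula scale,
    # which equals the negative copula entropy of (u, v).
    h, _, _ = np.histogram2d(u, v, bins=bins, range=[[0, 1], [0, 1]])
    p = h / h.sum()
    pu = p.sum(axis=1, keepdims=True)   # marginal over rows
    pv = p.sum(axis=0, keepdims=True)   # marginal over columns
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / (pu @ pv)[mask])).sum())

def screen_variables(X, y, threshold=0.1):
    # Greedy screening pass: rank candidates by estimated dependence with y
    # and keep those whose score exceeds a (hypothetical) fixed threshold.
    # The thesis replaces this threshold with a bootstrap-calibrated test.
    vy = pseudo_obs(y)
    scores = [(neg_copula_entropy(pseudo_obs(X[:, j]), vy), j)
              for j in range(X.shape[1])]
    scores.sort(reverse=True)
    return [j for s, j in scores if s >= threshold]
```

On data where the response depends nonlinearly on one covariate (e.g. `y = x0**2 + noise`), the rank-based score still detects the dependence, which is the property that lets the full method handle the nonlinear settings reported in the simulations.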

    Chapter 1 Introduction 6
    Chapter 2 Literature Review 7
    2.1 Density Estimation 7
    2.2 Copula Functions 8
    2.3 Boundary Bias and Copula Density Estimation 8
    2.4 Distance Correlation 12
    2.5 Copula Entropy and Its Estimation 13
    2.6 Variable Selection 15
    2.7 Stepwise Regression 15
    Chapter 3 Methodology 17
    3.1 Boundary Bias Correction and Density Estimation 17
    3.1.1 Copula Entropy Estimation 18
    3.2 The Variable Inclusion Test 19
    3.3 Forward Stepwise Variable Selection 20
    3.3.1 Algorithm Based on Copula Entropy and Kernel Density Estimation 21
    Chapter 4 Simulation Experiments 22
    4.1 Bandwidth Selection 22
    4.2 Data Generation and Model Settings 23
    4.2.1 Simulation Experiment 1 23
    4.2.2 Simulation Experiment 2 24
    4.2.3 Simulation Experiment 3 25
    4.2.4 Simulation Experiment 4 26
    4.2.5 Simulation Experiment 5 26
    4.3 Simulation Results 28
    Chapter 5 Conclusion 30
    References 31

    [1] Jan Beirlant, Edward J Dudewicz, László Györfi, Edward C Van der Meulen, et al. Nonparametric entropy estimation: An overview. International Journal of Mathematical and Statistical Sciences, 6(1):17–39, 1997.

    [2] Arthur Charpentier, Jean-David Fermanian, and Olivier Scaillet. The estimation of copulas: Theory and practice. Copulas: From theory to application in finance, 35, 2007.

    [3] Song Xi Chen. Beta kernel estimators for density functions. Computational Statistics & Data Analysis, 31(2):131–145, 1999.

    [4] Thomas Cover and Peter Hart. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1):21–27, 1967.

    [5] Luc Devroye. Nonparametric density estimation: The L1 view, 1985.

    [6] Kjell Doksum, Shijie Tang, and Kam-Wah Tsui. Nonparametric variable selection: the earth algorithm. Journal of the American Statistical Association, 103(484):1609–1620, 2008.

    [7] Michael Alin Efroymson. Multiple regression analysis. Mathematical methods for digital computers, pages 191–203, 1960.

    [8] Evelyn Fix. Discriminatory analysis: nonparametric discrimination, consistency properties, volume 1. USAF School of Aviation Medicine, 1985.

    [9] Gery Geenens, Arthur Charpentier, and Davy Paindaveine. Probit transformation for nonparametric kernel estimation of the copula density. Bernoulli, 2017.

    [10] Irène Gijbels and Jan Mielniczuk. Estimating the density of a copula function. Communications in Statistics - Theory and Methods, 19(2):445–464, 1990.

    [11] M Chris Jones. Simple boundary correction for kernel density estimation. Statistics and Computing, 3:135–146, 1993.

    [12] Alexander Kraskov, Harald Stögbauer, and Peter Grassberger. Estimating mutual information. Physical Review E, 69(6):066138, 2004.

    [13] Runze Li, Wei Zhong, and Liping Zhu. Feature screening via distance correlation learning. Journal of the American Statistical Association, 107(499):1129–1139, 2012.

    [14] Jian Ma. Variable selection with copula entropy. arXiv preprint arXiv:1910.12389, 2019.

    [15] Jian Ma and Zengqi Sun. Mutual information is copula entropy. Tsinghua Science & Technology, 16(1):51–54, 2011.

    [16] Marek Omelka, Irène Gijbels, and Noël Veraverbeke. Improved kernel estimation of copulas: weak convergence and goodness-of-fit testing. The Annals of Statistics, 2009.

    [17] Liam Paninski. Estimation of entropy and mutual information. Neural computation, 15(6):1191–1253, 2003.

    [18] Emanuel Parzen. On estimation of a probability density function and mode. The Annals of Mathematical Statistics, 33(3):1065–1076, 1962.

    [19] Murray Rosenblatt. Remarks on some nonparametric estimates of a density function. The Annals of Mathematical Statistics, 27(3):832–837, 1956.

    [20] Eugene F Schuster. Incorporating support constraints into nonparametric estimators of densities. Communications in Statistics - Theory and Methods, 14(5):1123–1136, 1985.

    [21] Gábor J Székely and Maria L Rizzo. Brownian distance covariance. The Annals of Applied Statistics, 2009.

    [22] Gábor J Székely, Maria L Rizzo, and Nail K Bakirov. Measuring and testing dependence by correlation of distances. The Annals of Statistics, 2007.

    Full-text release date: 2031/02/02