dc.contributor.advisor | 黃子銘 | zh_TW |
dc.contributor.author (Authors) | 楊博崴 | zh_TW |
dc.contributor.author (Authors) | Yang, Bo-Wei | en_US |
dc.creator (作者) | 楊博崴 | zh_TW |
dc.creator (作者) | Yang, Bo-Wei | en_US |
dc.date (日期) | 2018 | en_US |
dc.date.accessioned | 30-Jul-2018 14:53:53 (UTC+8) | - |
dc.date.available | 30-Jul-2018 14:53:53 (UTC+8) | - |
dc.date.issued (上傳時間) | 30-Jul-2018 14:53:53 (UTC+8) | - |
dc.identifier (Other Identifiers) | G0105354009 | en_US |
dc.identifier.uri (URI) | http://nccur.lib.nccu.edu.tw/handle/140.119/119034 | - |
dc.description (描述) | Master's | zh_TW |
dc.description (描述) | National Chengchi University | zh_TW |
dc.description (描述) | Department of Statistics | zh_TW |
dc.description (描述) | 105354009 | zh_TW |
dc.description.abstract (摘要) | In nonparametric regression, the regression function is often approximated by a spline function and estimated by least squares. Because the number and locations of the spline knots affect the quality of the final approximation, this thesis uses truncated power functions as the spline basis and selects the knots with three variable selection methods. The first and second methods are forward and backward selection, which judge the importance of variables through hypothesis tests; the estimator of the common variance in their test statistics is replaced by a more robust estimator. The third method is Bayesian variable selection: given suitable prior distributions for the parameters, important variables are selected according to the posterior probabilities of latent variables, and a componentwise Gibbs sampler is used to reduce the computational burden. Finally, the integrated squared error (ISE) is used as the evaluation criterion to compare the estimation performance of the first two methods with that of the third. We simulate functions with different degrees of smoothness and generate data with different sample sizes and error levels. When the function is relatively smooth, the forward and backward selection methods outperform the Bayesian variable selection method regardless of the sample size and the error level, and the Bayesian method tends to select unnecessary knots. When the function is steep and the data error is large, the Bayesian method gives better estimates than the other two methods. | zh_TW |
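For context, a minimal sketch of the regression-spline setup the abstract describes. The notation below (spline degree p and candidate knots κ₁, …, κ_K) is illustrative and not quoted from the thesis:

```latex
% Regression spline with a truncated power basis; the degree p and knots
% \kappa_1 < \cdots < \kappa_K are illustrative, not taken from the thesis.
\[
  y_i = f(x_i) + \varepsilon_i, \qquad
  f(x) \approx s(x) = \sum_{j=0}^{p} \beta_j x^{j}
                     + \sum_{k=1}^{K} b_k (x - \kappa_k)_{+}^{p},
  \qquad (u)_{+} = \max(u, 0).
\]
% Coefficients are estimated by least squares; knot selection decides which
% (x - \kappa_k)_+^p terms remain. A fit \hat{f} is scored by the integrated squared error
\[
  \operatorname{ISE}(\hat{f}) = \int \bigl(\hat{f}(x) - f(x)\bigr)^{2}\, dx .
\]
```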
dc.description.abstract (摘要) | In nonparametric regression, it is common to approximate the regression function by a spline function and then obtain the regression function estimate by least squares. When approximating the regression function with a spline, it is important to choose the number of knots and the knot locations. In this thesis, we use three variable selection methods to select knots. The first and second methods are forward and backward selection; we replace the usual residual-based variance estimator in their test statistics by a more robust estimator. The third method is Bayesian variable selection. Given appropriate prior distributions for the parameters, variables are selected based on the posterior probabilities of latent variables, and the componentwise Gibbs sampler is used to reduce the computational burden of computing these posterior probabilities. Simulation experiments are carried out in this study to compare the three methods in a nonparametric regression setting, with the ISE (integrated squared error) used to evaluate knot selection results. The experiments consider regression functions with different degrees of smoothness and data with different sample sizes and error variance levels. We find that when the function is relatively smooth, both the forward and backward selection methods are superior to the Bayesian variable selection method regardless of the sample size and the error variance level, and the Bayesian method tends to select unnecessary knots. The Bayesian method outperforms the other two methods when the regression function has a steep pattern and the error variance is large. | en_US |
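To make the knot-selection idea concrete, here is a small, self-contained Python sketch. The names (tp_basis, forward_knot_selection) and the greedy residual-sum-of-squares criterion are assumptions made for illustration only; the thesis instead adds or removes knots via hypothesis tests whose statistics use a robust common-variance estimator, and its Bayesian alternative keeps the knots whose latent indicators have high posterior probability, sampled with a componentwise Gibbs sampler.

```python
import numpy as np

def tp_basis(x, knots, degree=3):
    """Truncated power basis: 1, x, ..., x^degree plus one (x - knot)_+^degree column per knot."""
    cols = [x ** j for j in range(degree + 1)]
    cols += [np.clip(x - k, 0.0, None) ** degree for k in knots]
    return np.column_stack(cols)

def fit_spline(x, y, knots, degree=3):
    """Least-squares regression spline coefficients for the given knots."""
    coef, *_ = np.linalg.lstsq(tp_basis(x, knots, degree), y, rcond=None)
    return coef

def forward_knot_selection(x, y, candidates, max_knots=10, degree=3):
    """Greedily add the candidate knot that most reduces the residual sum of squares.
    (Simplified criterion; the thesis uses test-based forward/backward selection or
    Bayesian selection via a componentwise Gibbs sampler.)"""
    def rss(knots):
        resid = y - tp_basis(x, knots, degree) @ fit_spline(x, y, knots, degree)
        return float(resid @ resid)
    selected, remaining = [], list(candidates)
    best = rss(selected)
    while remaining and len(selected) < max_knots:
        scores = [rss(selected + [k]) for k in remaining]
        i = int(np.argmin(scores))
        if scores[i] >= best:          # no remaining knot improves the fit
            break
        best = scores[i]
        selected.append(remaining.pop(i))
    return sorted(selected)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda t: np.sin(2 * np.pi * t)                 # hypothetical "true" regression function
    x = np.sort(rng.uniform(0.0, 1.0, 200))
    y = f(x) + rng.normal(0.0, 0.2, x.size)
    knots = forward_knot_selection(x, y, candidates=np.linspace(0.05, 0.95, 19))
    coef = fit_spline(x, y, knots)
    grid = np.linspace(0.0, 1.0, 1001)
    fhat = tp_basis(grid, knots) @ coef
    ise = float(np.mean((fhat - f(grid)) ** 2))          # approximates the ISE on [0, 1]
    print("selected knots:", np.round(knots, 2), "approx. ISE:", round(ise, 5))
```

The equally spaced candidate grid and the cubic degree are choices made only for this example; the final line approximates the integrated squared error over [0, 1] by averaging squared errors on a fine grid.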
dc.description.tableofcontents | Chapter 1 Introduction 1; Chapter 2 Literature Review 3; Chapter 3 Methodology 7; 3.1 Constructing the Spline Regression Basis and the Evaluation Criterion 7; 3.2 Bayesian Variable Selection 8; 3.3 Modified Forward and Backward Selection 11; Chapter 4 Simulation Analysis 13; 4.1 Data Generation and Experimental Procedure 13; 4.2 Experimental Results 14; Chapter 5 Conclusions and Suggestions 23; Chapter 6 Appendix 25; Chapter 7 References 27 | zh_TW |
dc.source.uri (資料來源) | http://thesis.lib.nccu.edu.tw/record/#G0105354009 | en_US |
dc.subject (關鍵詞) | 無母數迴歸 | zh_TW |
dc.subject (關鍵詞) | 樣條函數 | zh_TW |
dc.subject (關鍵詞) | 貝氏變數選取法 | zh_TW |
dc.subject (關鍵詞) | 吉布斯採樣 | zh_TW |
dc.subject (關鍵詞) | Nonparametric regression | en_US |
dc.subject (關鍵詞) | Spline functions | en_US |
dc.subject (關鍵詞) | Bayesian variable selection | en_US |
dc.subject (關鍵詞) | Gibbs sampling | en_US |
dc.title (題名) | 基於貝氏方法應用於樣條迴歸節點選取 | zh_TW |
dc.title (題名) | A Bayesian Knots Selection Method for Regression Spline Estimation | en_US |
dc.type (資料類型) | thesis | en_US |
dc.relation.reference (參考文獻) | [1] Miller, A. J. (1990). Subset Selection in Regression. Monographs on Statistics and Applied Probability 40. [2] Barbieri, M. and Berger, J. O. (2004). Optimal predictive model selection. Annals of Statistics, 32, 870–897. [3] de Boor, C. (2001). A Practical Guide to Splines, revised edition. Applied Mathematical Sciences. Springer, Berlin. [4] Chen, R.-B., Chu, C.-H., Lai, T.-Y. and Wu, Y. N. (2011). Stochastic matching pursuit for Bayesian variable selection. Statistics and Computing, 21, 247–259. [5] Chen, R.-B. and Lai, T.-Y. (2007). Variable selection via MCMC matching pursuit. Technical Report, Institute of Statistics, National University of Kaohsiung, Kaohsiung, Taiwan. [6] George, E. I. and McCulloch, R. E. (1993). Variable selection via Gibbs sampling. Journal of the American Statistical Association, 88, 881–889. [7] Wu, Y.-N., Zhu, S.-C. and Guo, C. (2002). Statistical modeling of texture sketch. Proceedings of the European Conference on Computer Vision, 240–254. [8] He, X., Shen, L. and Shen, Z. (2001). A data-adaptive knot selection scheme for fitting splines. IEEE Signal Processing Letters, 8, 137–139. | zh_TW |
dc.identifier.doi (DOI) | 10.6814/THE.NCCU.STAT.013.2018.B03 | - |