Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/106389
DC Field | Value | Language
dc.contributor.advisor | 薛慧敏 | zh_TW
dc.contributor.author | 鍾其昀 | zh_TW
dc.creator | 鍾其昀 | zh_TW
dc.date | 2017 | en_US
dc.date.accessioned | 2017-02-08T08:32:56Z | -
dc.date.available | 2017-02-08T08:32:56Z | -
dc.date.issued | 2017-02-08T08:32:56Z | -
dc.identifier | G0103354015 | en_US
dc.identifier.uri | http://nccur.lib.nccu.edu.tw/handle/140.119/106389 | -
dc.description | Master's thesis | zh_TW
dc.description | National Chengchi University (國立政治大學) | zh_TW
dc.description | Department of Statistics (統計學系) | zh_TW
dc.description | 103354015 | zh_TW
dc.description.abstract | In an era of massive data sets with large numbers of explanatory variables, variable selection has become an important issue. In linear regression analysis, the model is traditionally estimated by the least squares method; the resulting coefficient estimates have relatively small bias, but their variance is large and their predictions are not accurate enough. We therefore examine how the estimator obtained by placing a constraint on the regression coefficients (the LASSO) differs from the ordinary least squares estimator, comparing their bias and standard deviation. We then apply this estimation method to the logistic regression model: using three real data sets, we compare the fitted models and prediction accuracy against those obtained by maximum likelihood estimation (MLE), and in a simulation study we present the differences between the two estimators in tables and figures. (An illustrative sketch of this LASSO-versus-MLE comparison follows the record fields below.) | zh_TW
dc.description.tableofcontents | 1. Introduction  1
    2. Methods  3
    2.1 LASSO for linear regression  3
    2.2 LASSO for logistic regression  14
    3. Real data analyses  15
    3.1 Predicting kyphosis  15
    3.2 Classifying cat and dog images  18
    3.3 Predicting Titanic survivors  21
    4. Simulation study  24
    4.1 Simulation procedure and parameter settings  24
    4.2 Comparison of the estimators  25
    5. Conclusion  49
    References  50
    Appendix 1: Derivation of (2.3)  51
    Appendix 2: Derivation of (2.3)  52 | zh_TW
dc.format.extent | 934999 bytes | -
dc.format.mimetype | application/pdf | -
dc.source.uri | http://thesis.lib.nccu.edu.tw/record/#G0103354015 | en_US
dc.subject | least squares method (最小平方法) | zh_TW
dc.subject | maximum likelihood estimation (最大概似估計) | zh_TW
dc.title | LASSO於羅吉斯迴歸模型之估計的應用 | zh_TW
dc.title | Application of LASSO Estimation of a Logistic Regression Model | en_US
dc.type | thesis | en_US
dc.relation.reference | I. English references
    1. Boyd, S. and Vandenberghe, L. (2004), Convex Optimization, Cambridge University Press, 215-244.
    2. Breiman, L. (1995), Better Subset Regression Using the Nonnegative Garrotte, Technometrics, 37, 373-384.
    3. Breiman, L. and Spector, P. (1992), Submodel Selection and Evaluation in Regression: The X-Random Case, International Statistical Review, 60, 291-319.
    4. Dalal, N. and Triggs, B. (2005), Histograms of Oriented Gradients for Human Detection, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), http://lear.inrialpes.fr/.
    5. Friedl, I. and Tilg, N. (1995), Variance Estimates in Logistic Regression Using the Bootstrap, Communications in Statistics - Theory and Methods, 24(2), 473-486.
    6. Hoerl, A. E. and Kennard, R. W. (1970), Ridge Regression: Biased Estimation for Nonorthogonal Problems, Technometrics, 12, 55-67.
    7. Osborne, M., Presnell, B. and Turlach, B. (2000), On the LASSO and Its Dual, Journal of Computational and Graphical Statistics, 9, 319-337.
    8. Sill, M., Hielscher, T. and Becker, M. (2014), Extended Inference with Lasso and Elastic-Net Regularized Cox and Generalized Linear Models, Journal of Statistical Software, 62, 1-22.
    9. Tibshirani, R. (1996), Regression Shrinkage and Selection via the Lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288.
    10. Zhao, X. (2008), Lasso and Its Applications, University of Minnesota Duluth, 4-17.
    11. Zou, H. and Hastie, T. (2005), Regularization and Variable Selection via the Elastic Net, Journal of the Royal Statistical Society, Series B, 67, 301-320.
    II. Chinese references
    1. 全民人體力學健康教室, 淺談三種脊椎歪斜 (A Brief Discussion of Three Types of Spinal Misalignment), online article.
    2. 賈金柱, 高等統計選講 (Selected Lectures on Advanced Statistics), 高等統計入門分析 (Introduction to Advanced Statistical Analysis), Section 2.2, Duality. | zh_TW
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
item.openairecristype | http://purl.org/coar/resource_type/c_46ec | -
item.cerifentitytype | Publications | -
item.openairetype | thesis | -
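
Illustrative note on the comparison described in the abstract: the LASSO replaces the unconstrained fit by a constrained one, minimizing the residual sum of squares subject to sum_j |beta_j| <= t (Tibshirani, 1996); in the logistic case this becomes an L1-penalized likelihood fit. The following is a minimal Python sketch, not the thesis's code: it assumes scikit-learn, uses synthetic toy data in place of the three real data sets, and uses a very weak L2 penalty as a stand-in for the unpenalized MLE fit.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic toy data standing in for the three real data sets in the thesis.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# LASSO-type fit: the L1 penalty typically drives some coefficients exactly to zero.
lasso_fit = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
lasso_fit.fit(X_tr, y_tr)

# Stand-in for the MLE fit: a very weak L2 penalty (large C) leaves the
# coefficients essentially unshrunk.
mle_fit = LogisticRegression(penalty="l2", C=1e6, solver="lbfgs", max_iter=1000)
mle_fit.fit(X_tr, y_tr)

print("nonzero coefficients, LASSO:", int(np.sum(lasso_fit.coef_ != 0)))
print("nonzero coefficients, MLE:  ", int(np.sum(mle_fit.coef_ != 0)))
print("test accuracy, LASSO: %.3f  MLE: %.3f"
      % (lasso_fit.score(X_te, y_te), mle_fit.score(X_te, y_te)))

The sparser LASSO fit and its test accuracy can then be contrasted with the near-MLE fit, mirroring the comparisons reported in the thesis.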
Appears in Collections: 學位論文 (Theses)
Files in This Item:
File | Description | Size | Format
401501.pdf | - | 913.08 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.