Academic Output - Journal Article

Title A Stochastic Approximation View of Boosting
Author 張源俊
Tsao, C. Andy; Chang, Yuan-chin Ivan
Date 2005
Uploaded 19-Dec-2008 09:07:23 (UTC+8)
Abstract Boosting is considered as a stochastic approximation algorithm. This interpretation provides an alternative theoretical framework for investigation. Following results from stochastic approximation theory, a stochastic approximation boosting algorithm, SABoost, is proposed. By adjusting its step sizes, SABoost exhibits different properties. Empirically, SABoost with a small step size yields a smaller difference between training and testing errors, whereas a large step size tends to overfit (i.e., bias towards the training scenario). The choice of step size can therefore be viewed as a smooth (early) stopping rule. Its performance is compared and contrasted with that of AdaBoost.
Relation Computational Statistics and Data Analysis, 52(1), 325-334
Type article
DOI http://dx.doi.org/10.1016/j.csda.2007.06.020
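The abstract treats the boosting step size as a stochastic-approximation gain: small steps keep training and testing errors close, large steps overfit, so the step size acts as a smooth early-stopping rule. Below is a minimal sketch of that effect, assuming a generic L2-boosting update fit_{k+1}(x) = fit_k(x) + step * h_k(x) with regression stumps; the names fit_stump and boost and the constant step schedule are illustrative assumptions, not the paper's actual SABoost update or gain sequence.

```python
# A minimal sketch, not the paper's SABoost: L2-boosting with regression
# stumps and an explicit constant step size ("gain"), to illustrate how the
# step size trades training fit against the train/test error gap.
import numpy as np

def fit_stump(x, r):
    """Least-squares regression stump on a single feature x and residuals r."""
    best = None
    for s in np.unique(x):
        left, right = r[x <= s], r[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        cl, cr = left.mean(), right.mean()
        sse = ((left - cl) ** 2).sum() + ((right - cr) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, s, cl, cr)
    _, s, cl, cr = best
    return lambda z: np.where(z <= s, cl, cr)

def boost(x, y, n_rounds=300, step=0.1):
    """Build an additive model F(x) = sum_k step * h_k(x) on residuals."""
    fit = np.zeros_like(y)
    learners = []
    for _ in range(n_rounds):
        h = fit_stump(x, y - fit)   # weak learner fitted to current residuals
        fit = fit + step * h(x)     # stochastic-approximation-style small step
        learners.append(h)
    return lambda z: sum(step * h(z) for h in learners)

# Toy regression problem: compare a small and a large step size.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 400)
y = np.sin(x) + rng.normal(scale=0.3, size=400)
x_tr, y_tr, x_te, y_te = x[:200], y[:200], x[200:], y[200:]
for step in (0.05, 0.8):
    F = boost(x_tr, y_tr, n_rounds=300, step=step)
    tr = np.mean((y_tr - F(x_tr)) ** 2)
    te = np.mean((y_te - F(x_te)) ** 2)
    print(f"step={step}: train MSE={tr:.3f}, test MSE={te:.3f}, gap={te - tr:.3f}")
```

With these toy settings, the smaller step size typically produces a smaller gap between training and test MSE, consistent with the behaviour described in the abstract.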
dc.creator (Author) 張源俊
dc.creator (Author) Tsao, C. Andy; Chang, Yuan-chin Ivan
dc.date (Date) 2005
dc.date.accessioned 19-Dec-2008 09:07:23 (UTC+8)
dc.date.available 19-Dec-2008 09:07:23 (UTC+8)
dc.date.issued (Uploaded) 19-Dec-2008 09:07:23 (UTC+8)
dc.identifier.uri (URI) https://nccur.lib.nccu.edu.tw/handle/140.119/17956
dc.description.abstract (Abstract) Boosting is considered as a stochastic approximation algorithm. This interpretation provides an alternative theoretical framework for investigation. Following results from stochastic approximation theory, a stochastic approximation boosting algorithm, SABoost, is proposed. By adjusting its step sizes, SABoost exhibits different properties. Empirically, SABoost with a small step size yields a smaller difference between training and testing errors, whereas a large step size tends to overfit (i.e., bias towards the training scenario). The choice of step size can therefore be viewed as a smooth (early) stopping rule. Its performance is compared and contrasted with that of AdaBoost.
dc.format application/
dc.language en
dc.language en-US
dc.language.iso en_US
dc.relation (Relation) Computational Statistics and Data Analysis, 52(1), 325-334
dc.title (Title) A Stochastic Approximation View of Boosting
dc.type (Type) article
dc.identifier.doi (DOI) 10.1016/j.csda.2007.06.020
dc.doi.uri (DOI) http://dx.doi.org/10.1016/j.csda.2007.06.020