Title:
Fast learning rates for plug-in classifiers
---
Authors:
Jean-Yves Audibert, Alexandre B. Tsybakov
---
Year of latest submission:
2007
---
Classification:
Primary classification: Mathematics
Secondary classification: Statistics Theory
Description: Applied, computational and theoretical statistics: e.g. statistical inference, regression, time series, multivariate analysis, data analysis, Markov chain Monte Carlo, design of experiments, case studies
--
Primary classification: Statistics
Secondary classification: Statistics Theory
Description: stat.TH is an alias for math.ST. Asymptotics, Bayesian Inference, Decision Theory, Estimation, Foundations, Inference, Testing.
--
---
Abstract:
It has been recently shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than $n^{-1/2}$. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order $n^{-1}$, and (ii) the plug-in classifiers generally converge more slowly than the classifiers based on empirical risk minimization. We show that both conjectures are not correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
---
PDF link:
https://arxiv.org/pdf/0708.2321