Original poster: kedemingshi

[Statistics] Minimax Robust Function Reconstruction in Reproducing Kernel Hilbert Spaces


Posted by kedemingshi (employment verified) on 2022-3-8 19:55:20, from mobile

English title:
《Minimax Robust Function Reconstruction in Reproducing Kernel Hilbert
  Spaces》
---
Author:
Richard J. Barton
---
Latest submission year:
2013
---
Classification:

Primary category: Mathematics
Secondary category: Statistics Theory (math.ST)
Description: Applied, computational and theoretical statistics: e.g. statistical inference, regression, time series, multivariate analysis, data analysis, Markov chain Monte Carlo, design of experiments, case studies.
--
Primary category: Statistics
Secondary category: Statistics Theory (stat.TH)
Description: stat.TH is an alias for math.ST. Asymptotics, Bayesian inference, decision theory, estimation, foundations, inference, testing.
--

---
English abstract:
  In this paper, we present a unified approach to function approximation in reproducing kernel Hilbert spaces (RKHS) that establishes a previously unrecognized optimality property for several well-known function approximation techniques, such as minimum-norm interpolation, smoothing splines, and pseudo-inverses. We consider the problem of approximating a function belonging to an arbitrary real-valued RKHS on R^d based on approximate observations of the function. The observations are approximate in the sense that the actual observations (i.e., the true function values) are known only to belong to a convex set of admissible observations. We seek a minimax optimal approximation for the function that minimizes the supremum of the RKHS norm on the error between the true function and the chosen approximation subject only to the conditions that the true function belongs to a uniformly bounded uncertainty set of functions that satisfy the constraints on the observations and that the approximation is a member of the RKHS. We refer to such a solution as a minimax robust reconstruction. We characterize the solution to the minimax robust reconstruction problem and show that it is equivalent to solving a straightforward convex optimization problem. We demonstrate that a minimax robust reconstruction will generally be more stable than an approximation based on interpolation through a nominal set of observations and that, subject to some mild regularity conditions on the convex set of admissible observations, the minimax robust reconstruction is unconditionally stable. We motivate our results by characterizing the minimax robust reconstruction for several specific convex observational models and discuss relationships with other approaches to function approximation.
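The abstract names minimum-norm interpolation as one of the known techniques the minimax framework unifies. As an illustration only (not the paper's algorithm), here is a minimal NumPy sketch of minimum-RKHS-norm interpolation through a nominal set of observations; the Gaussian kernel, `gamma`, and the small ridge term `reg` are assumptions of the sketch, not choices from the paper.

```python
import numpy as np

def gaussian_kernel(a, b, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2): a reproducing kernel on R^d
    diff = a[:, None, :] - b[None, :, :]
    return np.exp(-gamma * np.sum(diff ** 2, axis=-1))

def min_norm_interpolant(X, y, gamma=1.0, reg=1e-10):
    # The minimum-RKHS-norm interpolant has the form f(x) = sum_i c_i k(x, x_i),
    # with coefficients solving K c = y; a tiny ridge term keeps K invertible.
    K = gaussian_kernel(X, X, gamma)
    c = np.linalg.solve(K + reg * np.eye(len(X)), y)
    return lambda X_new: gaussian_kernel(X_new, X, gamma) @ c

X = np.array([[0.0], [1.0], [2.0]])   # nominal observation points in R^1
y = np.array([0.0, 1.0, 0.0])         # nominal observed values
f = min_norm_interpolant(X, y)        # f(X) reproduces y (up to reg)
```

In the paper's setting, the exact nominal values `y` would be replaced by a convex set of admissible observations, and the minimax robust reconstruction is obtained by solving a convex program over that set rather than interpolating a single nominal point.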
---
PDF link:
https://arxiv.org/pdf/0707.0082
Keywords: Hilbert, minimax, function, convex set, admissible
