[Computer Science] Higher-Order Partial Least Squares (HOPLS): A Generalized Multi-Linear Regression Method


OP: 何人来此 (employment verified), posted 2022-3-30 08:25:00 from mobile

Abstract:
A new generalized multilinear regression model, termed the Higher-Order Partial Least Squares (HOPLS), is introduced with the aim of predicting a tensor (multiway array) $\tensor{Y}$ from a tensor $\tensor{X}$ by projecting the data onto a latent space and performing regression on the corresponding latent variables. HOPLS differs substantially from other regression models in that it explains the data by a sum of orthogonal Tucker tensors, with the number of orthogonal loadings serving as a parameter to control model complexity and prevent overfitting. The low-dimensional latent space is optimized sequentially via a deflation operation, yielding the best joint subspace approximation for both $\tensor{X}$ and $\tensor{Y}$. Instead of decomposing $\tensor{X}$ and $\tensor{Y}$ individually, a higher-order singular value decomposition of a newly defined generalized cross-covariance tensor is employed to optimize the orthogonal loadings. A systematic comparison on both synthetic data and real-world decoding of 3D movement trajectories from electrocorticogram (ECoG) signals demonstrates the advantages of HOPLS over existing methods in terms of better predictive ability, suitability for small sample sizes, and robustness to noise.
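To make the "sum of orthogonal Tucker tensors" concrete, here is a sketch of the model for the third-order case, in generic Tucker notation. The symbols $\tensor{G}_r$, $\tensor{D}_r$, $\mathbf{t}_r$, $\mathbf{P}_r^{(n)}$ and $\mathbf{Q}_r^{(n)}$ are illustrative labels for the cores, shared latent vectors and orthogonal loadings described above, not necessarily the paper's exact notation:

$$\tensor{X} = \sum_{r=1}^{R} \tensor{G}_r \times_1 \mathbf{t}_r \times_2 \mathbf{P}_r^{(1)} \times_3 \mathbf{P}_r^{(2)} + \tensor{E}_R, \qquad \tensor{Y} = \sum_{r=1}^{R} \tensor{D}_r \times_1 \mathbf{t}_r \times_2 \mathbf{Q}_r^{(1)} \times_3 \mathbf{Q}_r^{(2)} + \tensor{F}_R$$

Here $\times_n$ denotes the mode-$n$ product, each loading matrix has $L_n$ orthonormal columns, and the latent vectors $\mathbf{t}_r$ are shared by $\tensor{X}$ and $\tensor{Y}$, which is what couples the two decompositions for regression; $R$ and the $L_n$ are the complexity-controlling parameters mentioned in the abstract.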
---
Title:
Higher-Order Partial Least Squares (HOPLS): A Generalized Multi-Linear Regression Method
---
Authors:
Qibin Zhao, Cesar F. Caiafa, Danilo P. Mandic, Zenas C. Chao, Yasuo Nagasaka, Naotaka Fujii, Liqing Zhang and Andrzej Cichocki
---
Latest submission year:
2012
---
Classification:

Primary category: Computer Science
Secondary category: Artificial Intelligence
Category description: Covers all areas of AI except Vision, Robotics, Machine Learning, Multiagent Systems, and Computation and Language (Natural Language Processing), which have separate subject areas. In particular, includes Expert Systems, Theorem Proving (although this may overlap with Logic in Computer Science), Knowledge Representation, Planning, and Uncertainty in AI. Roughly includes material in ACM Subject Classes I.2.0, I.2.1, I.2.3, I.2.4, I.2.8, and I.2.11.

---
PDF link:
https://arxiv.org/pdf/1207.1230
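For intuition, below is a minimal numpy sketch of the core procedure the abstract describes: build the generalized cross-covariance tensor, take its HOSVD to obtain the orthogonal loadings, extract a shared latent vector, and deflate. All function and variable names are mine, the third-order case and a single mode-rank L are simplifying assumptions, and prediction on new data is omitted; see the paper for the full algorithm.

```python
import numpy as np

def hopls_component(X, Y, L=2):
    """Extract one HOPLS-style latent component from third-order tensors
    X (samples x I2 x I3) and Y (samples x J2 x J3), then deflate both."""
    # Generalized cross-covariance tensor, contracted over the shared
    # sample mode: C[i2, i3, j2, j3] = sum_n X[n, i2, i3] * Y[n, j2, j3].
    C = np.tensordot(X, Y, axes=([0], [0]))

    # HOSVD of C: the top-L left singular vectors of each mode unfolding
    # give orthonormal loadings for X (P1, P2) and for Y (Q1, Q2).
    loadings = []
    for mode in range(C.ndim):
        unfolding = np.moveaxis(C, mode, 0).reshape(C.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        loadings.append(U[:, :L])
    P1, P2, Q1, Q2 = loadings

    # Latent vector t: dominant left singular vector of X projected
    # onto its loading subspace (unit norm by construction).
    Z = np.einsum('nij,ia,jb->nab', X, P1, P2).reshape(X.shape[0], -1)
    t = np.linalg.svd(Z, full_matrices=False)[0][:, :1]

    # Core tensors by orthogonal projection: G = X x1 t' x2 P1' x3 P2'.
    G = np.einsum('nij,nr,ia,jb->rab', X, t, P1, P2)
    D = np.einsum('nij,nr,ia,jb->rab', Y, t, Q1, Q2)

    # Deflation: subtract the rank-(1, L, L) Tucker term from each tensor.
    X_res = X - np.einsum('rab,nr,ia,jb->nij', G, t, P1, P2)
    Y_res = Y - np.einsum('rab,nr,ia,jb->nij', D, t, Q1, Q2)
    return t, (P1, P2, Q1, Q2), (G, D), X_res, Y_res

# Toy run: extract three components from random data and watch the
# residual energy shrink after each deflation step.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8, 6))
Y = rng.standard_normal((50, 5, 4))
for r in range(3):
    t, loads, cores, X, Y = hopls_component(X, Y, L=2)
    print(f"component {r + 1}: ||X_res|| = {np.linalg.norm(X):.2f}, "
          f"||Y_res|| = {np.linalg.norm(Y):.2f}")
```

Because t and the loading matrices all have orthonormal columns, each deflation step removes an orthogonal projection of the data, so the residual norms are non-increasing; the number of components R plays the overfitting-control role the abstract attributes to the orthogonal loadings.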

Keywords: partial least squares, least squares, regression method, PLS, Sequentially, Partial, model, regression, Higher, space
