OP: yzxcr1

[Book Introduction] Optimization Theory with Machine Learning Applications — Fundamentals of Optimization Theory With Applications to Machine Learning


#1 (OP)
yzxcr1 posted on 2020-4-12 21:20:27

I'm short on forum coins, so I'm sharing a resource from my collection to earn some.
Title: *Fundamentals of Optimization Theory With Applications to Machine Learning*, by Jean Gallier and Jocelyn Quaintance
Preface: In recent years, computer vision, robotics, machine learning, and data science have been some of the key areas that have contributed to major advances in technology. Anyone who looks at papers or books in the above areas will be baffled by a strange jargon involving exotic terms such as kernel PCA, ridge regression, lasso regression, support vector machines (SVM), Lagrange multipliers, KKT conditions, etc. Do support vector machines chase cattle to catch them with some kind of super lasso? No! But one will quickly discover that behind the jargon which always comes with a new field (perhaps to keep the outsiders out of the club) lies a lot of "classical" linear algebra and techniques from optimization theory. And there comes the main challenge: in order to understand and use tools from machine learning, computer vision, and so on, one needs to have a firm background in linear algebra and optimization theory. To be honest, some probability theory and statistics should also be included, but we already have enough to contend with.

Many books on machine learning struggle with the above problem. How can one understand what the dual variables of a ridge regression problem are if one doesn't know about the Lagrangian duality framework? Similarly, how is it possible to discuss the dual formulation of SVM without a firm understanding of the Lagrangian framework? The easy way out is to sweep these difficulties under the rug. If one is just a consumer of the techniques we mentioned above, the cookbook recipe approach is probably adequate. But this approach doesn't work for someone who really wants to do serious research and make significant contributions. To do so, we believe that one must have a solid background in linear algebra and optimization theory. This is a problem because it means investing a great deal of time and energy studying these fields, but we believe that perseverance will be amply rewarded.

This second volume covers some elements of optimization theory and applications, especially to machine learning. This volume is divided into five parts:
(1) Preliminaries of Optimization Theory.
(2) Linear Optimization.
(3) Nonlinear Optimization.
(4) Applications to Machine Learning.
(5) An appendix consisting of two chapters.

Attachment: Fundamentals of Optimization Theory With Applications to Machine Learning by Jea.pdf (13.33 MB, requires: 20 forum coins)
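To make the preface's point about ridge regression and duality concrete, here is a minimal sketch (my own illustration, not taken from the book) showing that the primal closed-form ridge solution and the solution recovered from the dual variables coincide; the random data and the variable names are placeholders.

```python
import numpy as np

# Toy ridge regression: minimize ||Xw - y||^2 + lam * ||w||^2
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))   # 5 samples, 3 features
y = rng.standard_normal(5)
lam = 0.1

# Primal closed form: w = (X^T X + lam I)^(-1) X^T y   (3x3 system)
w_primal = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# Dual closed form: alpha = (X X^T + lam I)^(-1) y, then w = X^T alpha
# (the alphas are the dual variables the preface refers to; 5x5 system)
alpha = np.linalg.solve(X @ X.T + lam * np.eye(5), y)
w_dual = X.T @ alpha

print(np.allclose(w_primal, w_dual))  # True
```

The two agree by the "push-through" identity (X^T X + lam I)^(-1) X^T = X^T (X X^T + lam I)^(-1); the dual form is what generalizes to kernel ridge regression, since it only involves inner products X X^T.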


#2
snzpro (verified trading user) posted on 2020-4-23 06:50:12
Thanks for sharing.
