Original poster: xuehe

[Frontiers] Pattern Recognition and Machine Learning


Stat 231 / CS 276A: Pattern Recognition and Machine Learning

MW 3:30-4:45 PM, Fall 2013, Math Science 5147
www.stat.ucla.edu/~sczhu/Courses/UCLA/Stat_231/Stat_231.html


Course Description

This course introduces fundamental concepts, theories, and algorithms for pattern recognition and machine learning, which are used in computer vision, speech recognition, data mining, statistics, information retrieval, and bioinformatics. Topics include: Bayesian decision theory, parametric and non-parametric learning, data clustering, component analysis, boosting techniques, kernel methods and support vector machines, and fast nearest-neighbor indexing and hashing.
Prerequisites
  • Math 33A Linear Algebra and Its Applications, Matrix Analysis
  • Stat 100B Intro to Mathematical Statistics,
  • CS 180 Intro to Algorithms and Complexity.
Textbook
  • R. Duda, P. Hart, D. Stork, "Pattern Classification", second edition, 2000. [Good for CS students]
  • T. Hastie, R. Tibshirani, and J.H. Friedman, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Springer Series in Statistics, 2001. [Good for Statistics students]
Instructors

Grading Plan: 4 units, letter grades

  • Two homework assignments: 20%
  • Three projects: 15% each (45% total)
  • Midterm exam: none (0%)
  • Final exam: Dec 10, Tuesday 11:30AM-2:30PM (we only need 2 hours, 12:15-14:15; closed-book exam): 35%

Grading policy
  • Homework policy:
    Homework must be finished independently. Do not discuss with classmates.
  • Project policy:
    You are encouraged to work and discuss in a group, but each person must finish his/her own project. Hand in
    (i) a brief description of the experiment in hard copy, (ii) results and plots in hard copy, (iii) your code in e-copy to the reader.
  • Late policy:
    You have a total of three late days for the class, but after using the three late days, no credit will be given for late homework/project.
Tentative Schedule for 2013

Lecture 1 (09-30): Introduction to Pattern Recognition [problems, applications, examples, and project introduction]. Reading: Ch 1. Handouts: syllabus.pdf, Lect1.pdf

Lecture 2 (10-02): Bayesian Decision Theory I [Bayes rule, discriminant functions]. Reading: Ch 2.1-2.6. Handouts: Lect2.pdf

Lecture 3 (10-07): Bayesian Decision Theory II [loss functions and Bayesian error analysis]. Reading: Ch 2.1-2.6. Handouts: Lect3.pdf
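As a sketch of what these two lectures cover (my own illustration, not from the course handouts), the minimum-error-rate rule follows directly from Bayes rule: pick the class whose posterior, or equivalently whose log discriminant g(x) = log p(x|class) + log P(class), is largest. The Gaussian class parameters below are toy numbers chosen for the example:

```python
import numpy as np

def gaussian_discriminant(x, mean, var, prior):
    """g(x) = log p(x | class) + log P(class) for a 1-D Gaussian class-conditional."""
    return -0.5 * np.log(2 * np.pi * var) - (x - mean) ** 2 / (2 * var) + np.log(prior)

def classify(x, params):
    """Minimum-error-rate rule under 0-1 loss: pick the class with the largest g(x)."""
    return int(np.argmax([gaussian_discriminant(x, m, v, p) for m, v, p in params]))

# Toy two-class problem: class 0 ~ N(0, 1), class 1 ~ N(4, 1), equal priors.
params = [(0.0, 1.0, 0.5), (4.0, 1.0, 0.5)]
```

With equal priors and variances the decision boundary falls midway between the means, at x = 2.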

Lecture 4 (10-09): Component Analysis and Dimension Reduction I [principal component analysis (PCA), face modeling] [Explanation of Project 1: code and data format]. Reading: Ch 3.8.1, Ch 10.13.1; Project 1. Handouts: HW1, Lect4-5.pdf
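A minimal PCA sketch (mine, not from the course code): center the data, eigen-decompose the sample covariance, and project onto the top-k eigenvectors. The synthetic data and the function name are assumptions for illustration only:

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components.
    Returns (projections, components, mean)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / (len(X) - 1)        # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)      # eigh returns ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]    # indices of the top-k directions
    W = vecs[:, order]
    return Xc @ W, W, mu

rng = np.random.default_rng(0)
# Toy data that varies mostly along the direction (3, 1), plus small noise.
X = rng.normal(size=(200, 1)) @ np.array([[3.0, 1.0]]) + 0.1 * rng.normal(size=(200, 2))
Z, W, mu = pca(X, 1)
```

The first component should align almost exactly with the dominant direction of variation; the mean is returned so projections can be mapped back for reconstruction.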

Lecture 5 (10-14): Component Analysis and Dimension Reduction II [Fisher linear discriminant, multi-dimensional scaling (MDS)]. Reading: Ch 3.8.2, Ch 10.14. Handouts: FisherFace.pdf, Lect5-6.pdf

Lecture 6 (10-16): Component Analysis and Dimension Reduction III [local linear embedding (LLE), intrinsic dimension]. Reading: paper. Handouts: LLE paper

Lecture 7 (10-21): Boosting Techniques I [perceptron, backpropagation, and AdaBoost]. Reading: Ch 9.5. Handouts: Lect7-9.pdf
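To make the AdaBoost idea concrete, here is a small sketch (my own, not course material) of discrete AdaBoost over threshold stumps on 1-D data: each round picks the stump with the smallest weighted error, then re-weights the samples so mistakes count more in the next round. Candidate thresholds are simply the training points, and the toy data is made up:

```python
import numpy as np

def stump_predict(X, t, s):
    """Decision stump on 1-D data: predict s * sign(x - t), with ties counted as +."""
    return s * np.sign(X - t + 1e-12)

def train_adaboost(X, y, rounds=5):
    """Discrete AdaBoost over threshold stumps; labels y must be in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)                      # sample weights, initially uniform
    ensemble = []                                # (threshold, sign, alpha) triples
    for _ in range(rounds):
        # Pick the stump with the smallest weighted error.
        err, t, s = min(((w[stump_predict(X, t, s) != y].sum(), t, s)
                         for t in X for s in (1, -1)), key=lambda e: e[0])
        err = min(max(err, 1e-10), 1 - 1e-10)    # clamp to avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)    # stump weight in the final vote
        w = w * np.exp(-alpha * y * stump_predict(X, t, s))
        w = w / w.sum()                          # re-normalize: mistakes gain weight
        ensemble.append((t, s, alpha))
    return ensemble

def boost_predict(X, ensemble):
    """Weighted vote of all stumps."""
    return np.sign(sum(a * stump_predict(X, t, s) for t, s, a in ensemble))

# Toy 1-D data, separable at x = 3.
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([-1, -1, -1, 1, 1, 1])
ensemble = train_adaboost(X, y)
```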

Lecture 8 (10-23): Boosting Techniques II [RealBoost and an example on face detection] [Explanation of Project 2]. Reading: Tutorial. Handouts: Handout 1, Handout 2

Lecture 9 (10-28): Boosting Techniques III [probabilistic analysis, LogitBoost]

Lecture 10 (10-30): Non-metric Methods I [tree-structured classification: principle and example]. Reading: Ch 8.1-8.3. Handouts: Lect10.pdf
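The core step of tree-structured classification is choosing a split that maximizes impurity reduction. A minimal stdlib-only sketch (an illustration of the standard information-gain criterion, not the course's code) for a single 1-D split:

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def best_split(xs, ys):
    """Pick the 1-D threshold with the highest information gain.
    Returns (threshold, gain)."""
    base = entropy(ys)
    best_t, best_gain = None, -1.0
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue                     # skip splits that leave a side empty
        gain = (base
                - (len(left) / len(ys)) * entropy(left)
                - (len(right) / len(ys)) * entropy(right))
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

# Toy data: two pure groups, so the best split recovers the gap between them.
xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = ['a', 'a', 'a', 'b', 'b', 'b']
t, gain = best_split(xs, ys)
```

Here the split at 3.0 yields two pure children, so the gain equals the parent entropy of 1 bit; a full tree applies this step recursively.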

Lecture 11 (11-04): Non-metric Methods II [syntactic pattern recognition and an example on human parsing]. Reading: Ch 8.5-8.8. Handouts: Lect11.pdf

Lecture 12 (11-06): Support Vector Machines I [kernel-induced feature space]. Reading: tutorial paper. Handouts: Lect12-15.pdf
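The kernel-induced feature space can be demonstrated without a full SVM solver: a kernel perceptron (a simpler algorithm than the SVM, used here only to illustrate the kernel trick) learns a nonlinear boundary by keeping per-sample mistake counts and never forming the feature space explicitly. All names and the XOR-style toy data below are mine:

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def train_kernel_perceptron(X, y, gamma=1.0, epochs=20):
    """Kernel perceptron; labels y must be in {-1, +1}.
    alpha[i] counts the mistakes made on training point i."""
    alpha = np.zeros(len(X))
    K = np.array([[rbf(a, b, gamma) for b in X] for a in X])  # Gram matrix
    for _ in range(epochs):
        mistakes = 0
        for i in range(len(X)):
            f = np.sum(alpha * y * K[:, i])   # decision value for X[i]
            if y[i] * f <= 0:                 # mistake (or tie): update
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:
            break                             # converged on the training set
    return alpha

def kp_predict(x, X, y, alpha, gamma=1.0):
    f = sum(a * yi * rbf(xi, x, gamma) for a, yi, xi in zip(alpha, y, X))
    return 1 if f > 0 else -1

# XOR-style data: not linearly separable, but separable in the RBF feature space.
X = np.array([[0., 0.], [1., 1.], [0., 1.], [1., 0.]])
y = np.array([1, 1, -1, -1])
alpha = train_kernel_perceptron(X, y)
```

An SVM uses the same kernel machinery but additionally maximizes the margin; here the point is only that every computation goes through k(a, b).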

(11-11): Veterans Day holiday, no lecture

Lecture 13 (11-13): Support Vector Machines II [support vector classifier] [Explanation of Project 3]. Reading: Ch 5.11

Lecture 14 (11-18): Support Vector Machines III [loss functions, latent SVM, neural networks and DeepNet]

Lecture 15 (11-20): Parametric Learning [maximum likelihood estimation (MLE)] [sufficient statistics and maximum entropy]. Reading: Ch 3.1-3.6. Handouts: Lect16.pdf
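For the Gaussian family the MLE has a closed form, a standard result worth seeing in code (this snippet is mine, not from the lecture): the sample mean, and the variance estimate that divides by n rather than n-1, which is biased but maximizes the likelihood:

```python
import numpy as np

def gaussian_mle(x):
    """Closed-form MLE for a univariate Gaussian N(mu, var)."""
    mu = x.mean()
    var = ((x - mu) ** 2).mean()   # divides by n, not n-1: the ML estimate
    return mu, var

x = np.array([1.0, 2.0, 3.0, 4.0])
mu, var = gaussian_mle(x)
```

For this sample the estimates are mu = 2.5 and var = 1.25 (the unbiased 1/(n-1) estimate would give 5/3).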

Lecture 16 (11-25): Non-parametric Learning I [Parzen window and k-NN classifier]. Reading: Ch 4.1-4.5. Handouts: Lect17.pdf
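The k-NN classifier needs no training phase at all, which a short sketch makes clear (my own illustration with made-up toy data, not course code): find the k closest training points and take a majority vote:

```python
import numpy as np

def knn_classify(x, X, y, k=3):
    """k-nearest-neighbor classifier: majority vote among the k closest
    training points under squared Euclidean distance."""
    d = np.sum((X - x) ** 2, axis=1)          # distances to all training points
    nearest = np.argsort(d)[:k]               # indices of the k closest
    labels, counts = np.unique(y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Two well-separated toy clusters labeled 0 and 1.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
y = np.array([0, 0, 0, 1, 1, 1])
```

Using an odd k avoids ties in the two-class vote; the entire "model" is the training set itself, which is why fast indexing (lecture 18) matters at scale.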

Lecture 17 (11-27): Non-parametric Learning II [k-NN classifier and error analysis]. Reading: Ch 4.6, handout. Handouts: Lect18.pdf

Lecture 18 (12-02): Non-parametric Learning III [fast approximate k-NN computation: KD-tree and hashing]. Reading: paper1, paper2. Handouts: Lect19.pdf

Lecture 19 (12-04): Data Clustering and Bi-clustering [k-means clustering, EM clustering by MLE, provable 2-step EM, mean-shift and landscape]. Reading: Ch 10.1-10.4, handout. Handouts: Lect20.pdf
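The k-means part of this last lecture reduces to alternating two steps, assignment and mean update (Lloyd's algorithm); a minimal sketch under that standard formulation (the function and toy data are my own, not course material):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means (Lloyd's algorithm): alternate nearest-center assignment
    and cluster-mean update until the centers stop moving."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]   # init at random points
    for _ in range(iters):
        # Assignment step: each point goes to its nearest center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Update step: each center moves to the mean of its points.
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break                                       # converged
        centers = new
    return centers, labels

# Two tight toy clusters, at the origin and at (10, 10).
X = np.vstack([np.zeros((5, 2)), np.full((5, 2), 10.0)])
centers, labels = kmeans(X, 2)
```

On this toy data the algorithm recovers the two cluster centers regardless of which points seed the initialization; on real data k-means is sensitive to initialization, which is one motivation for the EM and mean-shift views covered in the same lecture.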

