Combining Pattern Classifiers (Using MATLAB), 2nd Edition
Thread author: Lisrelchen

Lisrelchen posted on 2015-3-28 00:31:03
Combining Pattern Classifiers 2nd Edition
Methods and Algorithms


Book Description

A unified, coherent treatment of current classifier ensemble methods, from fundamentals of pattern recognition to ensemble feature selection, now in its second edition

The art and science of combining pattern classifiers has flourished into a prolific discipline since the first edition of Combining Pattern Classifiers was published in 2004. Dr. Kuncheva has plucked from the rich landscape of recent classifier ensemble literature the topics, methods, and algorithms that will guide the reader toward a deeper understanding of the fundamentals, design, and applications of classifier ensemble methods.
Book Details
Publisher: Wiley
By: Ludmila I. Kuncheva
ISBN: 978-1-118-31523-1
Year: 2014
Pages: 384
Language: English
File size: 8.9 MB
File format: PDF

Hidden content in this post

Combining Pattern Classifiers, 2nd Edition.pdf (8.51 MB, requires 25 forum coins)





  • http://pages.bangor.ac.uk/~mas00a/
  • http://ca.wiley.com/WileyCDA/WileyTitle/productCd-1118315235.html


Keywords: classifiers classifier combining Edition Pattern discipline published coherent current methods


jhmath posted on 2015-3-28 00:39:25 (from mobile)


Thoroughly updated, with MATLAB® code and practice data sets throughout, Combining Pattern Classifiers includes:

• Coverage of Bayes decision theory and experimental comparison of classifiers

• Essential ensemble methods such as Bagging, Random forest, AdaBoost, Random subspace, Rotation forest, Random oracle, and Error Correcting Output Code, among others

• Chapters on classifier selection, diversity, and ensemble feature selection

With firm grounding in the fundamentals of pattern recognition, and featuring more than 140 illustrations, Combining Pattern Classifiers, Second Edition is a valuable reference for postgraduate students, researchers, and practitioners in computing and engineering.
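
Of the methods listed above, Bagging is the easiest to reproduce with the decision-tree functions shared later in this thread. Below is a minimal sketch, not the book's code: it assumes tree_build and tree_classify from the post further down, plus data and labels already in the workspace; the ensemble size L = 15 and the mode-based majority vote are my own choices.

%---------------------------------------------------------%
% Bagging sketch: L bootstrap trees combined by majority vote
L = 15;                      % ensemble size (arbitrary choice)
n = size(data, 1);
votes = zeros(n, L);
for k = 1 : L
    bi = randi(n, n, 1);     % bootstrap sample (with replacement)
    Tk = tree_build(data(bi,:), labels(bi), 2, 5);
    votes(:,k) = tree_classify(Tk, data);
end
la_bag = mode(votes, 2);     % majority vote across the L trees
%---------------------------------------------------------%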


http://ca.wiley.com/WileyCDA/WileyTitle/productCd-1118315235.html


auirzxp posted on 2015-3-28 00:41:16

Decision Tree using Matlab

%-----------------------------------------------------------------%
function T = tree_build(data, labels, classN, chi2_threshold)
% --- train tree classifier
% each row of T is a node: [feature, threshold, left, right] for an
% internal node, or [class, 0, 0, 0] for a leaf
if numel(unique(labels)) == 1 % all data are of the same class
    T = [labels(1), 0, 0, 0]; % make a leaf
else
    [chosen_feature, threshold] = tree_select_feature(data, labels);
    leftIndex = data(:,chosen_feature) <= threshold;
    chi2 = tree_chi2(leftIndex, labels, classN);
    if chi2 > chi2_threshold % accept the split
        Tl = tree_build(data(leftIndex,:), labels(leftIndex), ...
            classN, chi2_threshold); % left subtree
        Tr = tree_build(data(~leftIndex,:), labels(~leftIndex), ...
            classN, chi2_threshold); % right subtree
        % merge the two trees, offsetting the child pointers
        Tl(:,[3 4]) = Tl(:,[3 4]) + (Tl(:,[3 4]) > 0) * 1;
        Tr(:,[3 4]) = Tr(:,[3 4]) + (Tr(:,[3 4]) > 0) * (size(Tl,1) + 1);
        T = [chosen_feature, threshold, 2, size(Tl,1) + 2; Tl; Tr];
    else % make a leaf
        T = [mode(labels), 0, 0, 0];
    end
end

%......................................................................
function [top_feature, top_thre] = tree_select_feature(data, labels)
% --- select the best feature and split point
[n, m] = size(data);
i_G = Gini(labels); % Gini index of impurity at the parent node
[D, s] = deal(zeros(1, m)); % preallocate for speed
for j = 1 : m % check each feature
    if numel(unique(data(:,j))) == 1 % the feature has only 1 value
        D(j) = 0; s(j) = -999; % missing
    else
        Dsrt = sort(data(:,j)); % sort j-th feature
        dde_i = zeros(1, n); % preallocate for speed
        for i = 1 : n-1 % check the n-1 split points
            sp = (Dsrt(i) + Dsrt(i+1)) / 2;
            left = data(:,j) <= sp;
            % make sure that there are points in both children nodes
            if sum(left) > 0 && sum(left) < n
                i_GL = Gini(labels(left));
                i_GR = Gini(labels(~left));
                dde_i(i) = i_G - mean(left)*i_GL - mean(~left)*i_GR;
            else % one child node is empty
                dde_i(i) = 0;
            end
        end
        [D(j), index_s] = max(dde_i); % best impurity reduction
        s(j) = (Dsrt(index_s) + Dsrt(index_s+1)) / 2; % threshold
    end
end
[~, top_feature] = max(D); top_thre = s(top_feature);

%......................................................................
function chi2 = tree_chi2(left, labels, classN)
% --- calculate chi^2 statistic for the split of labels on "left"
n = numel(labels); chi2 = 0; n_L = sum(left);
for i = 1 : classN
    n_i = sum(labels == i); n_iL = sum(labels(left) == i);
    if n_i > 0 && n_L > 0 % add only for non-empty children nodes
        chi2 = chi2 + (n * n_iL - n_i * n_L)^2 / ...
            (2 * n_i * n_L * (n - n_L));
    end
end

%......................................................................
function i_G = Gini(labels)
% --- calculate Gini index
P = zeros(1, max(labels)); % preallocate
for i = 1 : max(labels)
    P(i) = mean(labels == i);
end
i_G = 1 - P * P';
%-----------------------------------------------------------------%
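
A note on the two impurity computations above (my summary, not text from the book excerpt): Gini returns i_G = 1 - sum_i P(i)^2, where P(i) is the proportion of class i among the labels; for example, labels [1 1 2] give i_G = 1 - (2/3)^2 - (1/3)^2 = 4/9, about 0.44. tree_chi2 computes a chi-squared-type statistic measuring how far the class proportions in the left child deviate from the parent's, and tree_build accepts a split only when it exceeds chi2_threshold, so a larger threshold acts as stronger pre-pruning and yields smaller trees.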
The function below is the classification part of the decision tree code. The output consists of the labels of the supplied data, using a tree T trained through tree_build.
%---------------------------------------------------------%
function labels = tree_classify(T, test_data)
% classify test_data using the tree classifier T
labels = zeros(size(test_data,1), 1); % preallocate
for i = 1 : size(test_data,1)
    index = 1; leaf = 0;
    while leaf == 0
        if T(index,3) == 0 % leaf is found
            labels(i) = T(index,1); leaf = 1;
        else
            if test_data(i,T(index,1)) <= T(index,2)
                index = T(index,3); % left
            else
                index = T(index,4); % right
            end
        end
    end
end
%---------------------------------------------------------%

%---------------------------------------------------------%
% The TREE code: train on the fish data and plot the labels
[x, y, labels] = fish_data(30, 10);
T = tree_build([x y], labels, 2, 5);
la = tree_classify(T, [x y]);
ax = axes; hold on
scatter(ax, x, y, 12, labels, 'linewidth', 3);
colormap gray, axis square off
plot(x(la==1), y(la==1), 'bo', 'linewidth', 3);
%---------------------------------------------------------%
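
As a quick sanity check (my addition, not part of the book snippet), the resubstitution accuracy of the tree on its own training data is just:

mean(la(:) == labels(:)) % fraction of correctly classified training points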

xiaohulu99 posted on 2015-3-28 01:40:29

Naive Bayes in Matlab

%---------------------------------------------------------%
% usage: train on [x y] with labels, then classify
C = naive_bayes_train([x y], labels);
la = naive_bayes_classify(C, [x y]);
%---------------------------------------------------------%
%---------------------------------------------------------%
function C = naive_bayes_train(data, labels)
% --- train parametric NB classifier
for i = 1 : max(labels)
    c_index = labels == i;
    C(i).prior = mean(c_index);
    C(i).mean = mean(data(c_index,:), 1);
    if sum(c_index) > 1 % more than one object in the class
        C(i).std = std(data(c_index,:));
    else % class with a single object
        C(i).std = zeros(1, size(data,2));
    end
end
%---------------------------------------------------------%
%---------------------------------------------------------%
function labels = naive_bayes_classify(C, data)
% --- classify with the trained NB classifier
g = zeros(numel(C), size(data,1));
for i = 1 : numel(C)
    Ms = repmat(C(i).mean, size(data,1), 1);
    Ss = repmat(C(i).std, size(data,1), 1);
    t = 1./Ss .* exp(-(data - Ms).^2 ./ (2 * Ss.^2));
    g(i,:) = prod(t, 2)' * C(i).prior;
end
[~, labels] = max(g); labels = labels(:);
%---------------------------------------------------------%
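
One caveat about the trainer above: a class with a single object (or a constant feature) gets a zero standard deviation, which turns the density term into Inf or NaN at classification time. A minimal guard, assuming nothing beyond the trained struct C; the floor value 1e-6 is an arbitrary choice of mine:

% floor the per-feature std at a small epsilon before classifying
for i = 1 : numel(C)
    C(i).std = max(C(i).std, 1e-6);
end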

Nicolle posted on 2015-3-28 02:14:23

Multi-Layer Perceptron in Matlab

Note: the author is banned or deleted; the content has been automatically hidden.


Nicolle posted on 2015-3-28 02:21:33
Note: the author is banned or deleted; the content has been automatically hidden.


Nicolle posted on 2015-3-28 02:58:35
Note: the author is banned or deleted; the content has been automatically hidden.


fhc2010 posted on 2015-3-28 03:08:12
Thanks for sharing


lhf8059 posted on 2015-3-28 07:34:34
Taking a look!


lm972 posted on 2015-3-28 08:06:00
Thanks for sharing.

