OP: Lisrelchen

Github: Machine Learning using Matlab


#1 Lisrelchen posted on 2016-4-25 07:30:07

Hidden content in this post:

machine-learning-matlab-master.zip (4.56 MB)


Keywords: machine learning, GitHub, MATLAB


#2 Lisrelchen posted on 2016-4-25 07:31:38
Logistic Regression

clc; clear; close all;

load datax.dat;
load datay.dat;

X = [ones(size(datax,1),1) datax];   % prepend intercept column
Y = datay;

m = size(X,1);       % number of training examples
n = size(X,2) - 1;   % number of features

% initialize parameters
theta = zeros(n+1,1);
thetaold = ones(n+1,1);

% Newton's method: iterate until the squared parameter update is small
while (theta - thetaold)'*(theta - thetaold) > 1e-7
    % gradient of the log-likelihood
    dellltheta = zeros(n+1,1);
    for j = 1:n+1
        for i = 1:m
            h = 1/(1 + exp(-theta'*X(i,:)'));   % sigmoid
            dellltheta(j) = dellltheta(j) + (Y(i) - h)*X(i,j);
        end
    end
    % Hessian of the log-likelihood
    H = zeros(n+1);
    for j = 1:n+1
        for k = 1:n+1
            for i = 1:m
                h = 1/(1 + exp(-theta'*X(i,:)'));
                H(j,k) = H(j,k) - h*(1 - h)*X(i,j)*X(i,k);
            end
        end
    end
    thetaold = theta;
    theta = theta - H\dellltheta;   % Newton step; solve rather than invert
    (theta - thetaold)'*(theta - thetaold)   % display convergence measure
end

% part b: plot the training points and the decision boundary
figure(1); hold on;
X0 = []; Y0 = []; X1 = []; Y1 = [];

% split training points by label
for i = 1:m
    if Y(i) == 0
        X0 = [X0 X(i,2)];
        Y0 = [Y0 X(i,3)];
    else
        X1 = [X1 X(i,2)];
        Y1 = [Y1 X(i,3)];
    end
end
scatter(X0, Y0, 'o');
scatter(X1, Y1, '+');

% decision boundary: p(y=1|x) = 0.5 exactly where theta'*x = 0
Xb = -2:0.01:8;
Yb = -(theta(1) + theta(2)*Xb)/theta(3);
plot(Xb, Yb);
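The triple-nested loops above make the update easy to follow but slow; the same Newton step collapses to a few matrix operations. A minimal vectorized sketch of one update, assuming X, Y, and theta are defined exactly as above:

% Vectorized Newton step (sketch; same quantities as the loops above)
h = 1 ./ (1 + exp(-X*theta));       % m-by-1 vector of sigmoid outputs
grad = X'*(Y - h);                  % gradient of the log-likelihood
H = -X' * diag(h .* (1 - h)) * X;   % Hessian
theta = theta - H\grad;             % Newton update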

#3 Lisrelchen posted on 2016-4-25 07:33:19
Locally Weighted Linear Regression

clc; clear; close all;

load datax.dat;
load datay.dat;

X = [ones(size(datax,1),1) datax];   % prepend intercept column
Y = datay;

m = size(X,1);
n = size(X,2) - 1;

% part a: ordinary least-squares fit
figure(1); hold on;
theta = (X'*X)\(X'*Y);               % normal equations; solve, don't invert
scatter(X(:,2), Y);
Xp = -6:0.01:13;
Yp = theta(1) + theta(2)*Xp;
plot(Xp, Yp);

% part b: locally weighted fit with bandwidth t = 0.8
figure(2); hold on;
W = zeros(m);
t = 0.8;
LWRXp = -6:0.01:13;
LWRYp = zeros(1, size(LWRXp,2));
for temp = 1:size(LWRXp,2)
    % Gaussian weights centered at the query point
    for i = 1:m
        W(i,i) = 0.5 * exp(-((LWRXp(temp) - X(i,2))^2)/(2*t*t));
    end
    theta = (X'*W*X)\(X'*W*Y);       % weighted normal equations
    LWRYp(temp) = theta(1) + theta(2)*LWRXp(temp);
end
scatter(X(:,2), Y);
plot(LWRXp, LWRYp);

% part c: sweep the bandwidth
figure(3); hold on;
scatter(X(:,2), Y);
W = zeros(m);
t_array = [0.1, 0.3, 2, 10];
LWRXp = -6:0.01:13;
LWRYp = zeros(4, size(LWRXp,2));
for t = 1:4
    for temp = 1:size(LWRXp,2)
        for i = 1:m
            W(i,i) = 0.5 * exp(-((LWRXp(temp) - X(i,2))^2)/(2*t_array(t)^2));
        end
        theta = (X'*W*X)\(X'*W*Y);
        LWRYp(t,temp) = theta(1) + theta(2)*LWRXp(temp);
    end
end
plot(LWRXp, LWRYp);
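Each query point solves its own small weighted least-squares problem, so the per-query fit factors naturally into a helper. A minimal sketch for the same 1-D setup; the function name lwrPredict is ours, not from the repository:

function yq = lwrPredict(X, Y, xq, tau)
% Locally weighted prediction at a scalar query xq.
% X is m-by-2 (intercept column plus one feature), Y is m-by-1,
% tau is the Gaussian kernel bandwidth.
w = exp(-(xq - X(:,2)).^2 / (2*tau^2));   % per-example weights
W = diag(w);
theta = (X'*W*X)\(X'*W*Y);                % weighted normal equations
yq = theta(1) + theta(2)*xq;
end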

#4 Lisrelchen posted on 2016-4-25 07:35:14
Linear Regression With Polynomial Basis
clc; clear; close all;

% file read: the first 101 lines are training data, the rest test data;
% rows containing '?' (missing values) are skipped
Xtrain_tmp = []; Ytrain_tmp = [];
Xtest_tmp = []; Ytest_tmp = [];

fileID = fopen('q3.dat');

for linenum = 1:101
    C = textscan(fileID, '%s', 8, 'CommentStyle', {'"', '"'});
    if strcmp(C{1,1}, '?') == 0
        Xtrain_tmp = [Xtrain_tmp; C{1,1}(2:8,1)'];
        Ytrain_tmp = [Ytrain_tmp; C{1,1}(1,1)];
    end
end
Xtrain = str2double(Xtrain_tmp);
Ytrain = str2double(Ytrain_tmp);
mtrain = size(Xtrain,1);

for linenum = 102:399
    C = textscan(fileID, '%s', 8, 'CommentStyle', {'"', '"'});
    if strcmp(C{1,1}, '?') == 0
        Xtest_tmp = [Xtest_tmp; C{1,1}(2:8,1)'];
        Ytest_tmp = [Ytest_tmp; C{1,1}(1,1)];
    end
end
Xtest = str2double(Xtest_tmp);
Ytest = str2double(Ytest_tmp);
mtest = size(Xtest,1);
fclose(fileID);

% normalize: min-max scale the features, z-score the targets
XMin = min([Xtrain; Xtest]);
XMax = max([Xtrain; Xtest]);
YMean = mean([Ytrain; Ytest]);
YStd = std([Ytrain; Ytest]);
Xtrain = (Xtrain - repmat(XMin,mtrain,1))./(repmat(XMax,mtrain,1) - repmat(XMin,mtrain,1));
Xtest = (Xtest - repmat(XMin,mtest,1))./(repmat(XMax,mtest,1) - repmat(XMin,mtest,1));
Ytrain = (Ytrain - YMean)/YStd;
Ytest = (Ytest - YMean)/YStd;

max_dimension = 10;

% parts a and b: polynomial bases of degree 1..10 over all 7 features
training_error = zeros(max_dimension,1);
test_error = zeros(max_dimension,1);
figure(1); hold on;
for d = 1:max_dimension
    % build the polynomial design matrices
    X = [ones(size(Xtrain,1),1) Xtrain];
    Xtest_s = [ones(size(Xtest,1),1) Xtest];
    for j = 2:8
        for k = 2:d
            X = [X X(:,j).^k];
            Xtest_s = [Xtest_s Xtest_s(:,j).^k];
        end
    end

    % closed-form (pseudoinverse) solution, for comparison
    theta_closed = pinv(X'*X)*X'*Ytrain;

    % stochastic gradient descent
    n = size(X,2) - 1;
    alpha = 0.01;
    theta_grad = zeros(n+1,1);
    for k = 1:1000
        for i = 1:mtrain
            theta_grad = theta_grad + alpha*(Ytrain(i) - theta_grad'*X(i,:)')*X(i,:)';
        end
    end

    % average per-coordinate gap between the two solutions
    difference = norm(theta_grad - theta_closed)/size(theta_grad,1);

    for i = 1:mtrain
        training_error(d) = training_error(d) + (Ytrain(i) - theta_grad'*X(i,:)')^2;
    end
    for i = 1:mtest
        test_error(d) = test_error(d) + (Ytest(i) - theta_grad'*Xtest_s(i,:)')^2;
    end
end
plot(1:max_dimension, training_error);
plot(1:max_dimension, test_error);

% parts c and d: the same sweep using only the third feature
training_error = zeros(max_dimension,1);
test_error = zeros(max_dimension,1);
figure(2); hold on;
scatter(Xtrain(:,3), Ytrain);
for d = 1:max_dimension
    X = [ones(size(Xtrain,1),1) Xtrain(:,3)];
    Xtest_s = [ones(size(Xtest,1),1) Xtest(:,3)];
    for k = 2:d
        X = [X X(:,2).^k];
        Xtest_s = [Xtest_s Xtest_s(:,2).^k];
    end

    n = size(X,2) - 1;
    alpha = 0.01;
    theta_grad = zeros(n+1,1);
    for k = 1:1000
        for i = 1:mtrain
            theta_grad = theta_grad + alpha*(Ytrain(i) - theta_grad'*X(i,:)')*X(i,:)';
        end
    end

    % plot the fitted curves for selected degrees
    if (d==1) || (d==2) || (d==8)
        X_plot = 0:0.01:1;
        Y_plot = [];
        for i = 0:0.01:1
            X_matrix = 1;
            for k = 1:d
                X_matrix = [X_matrix; i^k];
            end
            Y_plot = [Y_plot, theta_grad'*X_matrix];
        end
        plot(X_plot, Y_plot);
    end

    for i = 1:mtrain
        training_error(d) = training_error(d) + (Ytrain(i) - theta_grad'*X(i,:)')^2;
    end
    for i = 1:mtest
        test_error(d) = test_error(d) + (Ytest(i) - theta_grad'*Xtest_s(i,:)')^2;
    end
end

figure(3); hold on;
plot(1:max_dimension, training_error);
plot(1:max_dimension, test_error);

% part e: degree-8 fit with L2 regularization, sweeping gamma
training_error = zeros(6,1);
test_error = zeros(6,1);
figure(4); hold on;
d = 8;
gn = 0;
for gamma = [0.01, 0.1, 1, 10, 100, 1000]
    gn = gn + 1;
    X = [ones(size(Xtrain,1),1) Xtrain(:,3)];
    Xtest_s = [ones(size(Xtest,1),1) Xtest(:,3)];
    for k = 2:d
        X = [X X(:,2).^k];
        Xtest_s = [Xtest_s Xtest_s(:,2).^k];
    end

    n = size(X,2) - 1;
    alpha = 0.01;
    theta_grad = zeros(n+1,1);
    for k = 1:1000
        for i = 1:mtrain
            % gradient step with a weight-decay (L2) term
            theta_grad = theta_grad + alpha*((Ytrain(i) - theta_grad'*X(i,:)')*X(i,:)' - gamma*theta_grad);
        end
    end

    X_plot = 0:0.01:1;
    Y_plot = [];
    for i = 0:0.01:1
        X_matrix = 1;
        for k = 1:d
            X_matrix = [X_matrix; i^k];
        end
        Y_plot = [Y_plot, theta_grad'*X_matrix];
    end
    plot(X_plot, Y_plot);

    for i = 1:mtrain
        training_error(gn) = training_error(gn) + (Ytrain(i) - theta_grad'*X(i,:)')^2;
    end
    for i = 1:mtest
        test_error(gn) = test_error(gn) + (Ytest(i) - theta_grad'*Xtest_s(i,:)')^2;
    end
end
figure(5);
semilogx([0.01, 0.1, 1, 10, 100, 1000], training_error, [0.01, 0.1, 1, 10, 100, 1000], test_error);

% part f: choose gamma by 5-fold cross-validation on the training set
% (the fold logic assumes 100 training examples)
XYdata = [Xtrain Ytrain];
I_rand = XYdata(randperm(size(Xtrain,1)), :);   % shuffle once
Xtrain = I_rand(:,1:7);
Ytrain = I_rand(:,8);
test_error = zeros(6,5);
test_error_avg = zeros(6,1);
mtrain_new = 80;
mtest_new = 20;

figure(6); hold on;
scatter(Xtrain(:,3), Ytrain);
d = 8;
gn = 0;
for gamma = [0.01, 0.1, 1, 10, 100, 1000]
    gn = gn + 1;
    for cn = 1:5
        % hold out the cn-th block of 20 examples
        if cn == 5
            Xnewtrain = Xtrain(1:20*(cn-1),:);
            Ynewtrain = Ytrain(1:20*(cn-1));
        else
            Xnewtrain = [Xtrain(1:20*(cn-1),:); Xtrain(20*cn+1:100,:)];
            Ynewtrain = [Ytrain(1:20*(cn-1)); Ytrain(20*cn+1:100)];
        end
        Xnewtest = Xtrain(1+20*(cn-1):20*cn,:);
        Ynewtest = Ytrain(1+20*(cn-1):20*cn);

        X = [ones(size(Xnewtrain,1),1) Xnewtrain(:,3)];
        Xtest_s = [ones(size(Xnewtest,1),1) Xnewtest(:,3)];
        for k = 2:d
            X = [X X(:,2).^k];
            Xtest_s = [Xtest_s Xtest_s(:,2).^k];
        end

        n = size(X,2) - 1;
        alpha = 0.01;
        theta_grad = zeros(n+1,1);
        for k = 1:1000
            for i = 1:mtrain_new
                theta_grad = theta_grad + alpha*((Ynewtrain(i) - theta_grad'*X(i,:)')*X(i,:)' - gamma*theta_grad);
            end
        end

        for i = 1:mtest_new
            test_error(gn,cn) = test_error(gn,cn) + (Ynewtest(i) - theta_grad'*Xtest_s(i,:)')^2;
        end
    end
    test_error_avg(gn) = mean(test_error(gn,:));
end
figure(7);
semilogx([0.01, 0.1, 1, 10, 100, 1000], test_error_avg);
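Part e reaches the regularized fit by stochastic gradient steps with weight decay; ridge regression also has a closed form, which is useful as a sanity check on the iterates (the two differ slightly in how the penalty is scaled per example). A minimal sketch, assuming X, Ytrain, and gamma as in part e:

% Closed-form ridge solution (sketch; X, Ytrain, gamma as in part e)
lambda = gamma;                                                % L2 penalty strength
theta_ridge = (X'*X + lambda*eye(size(X,2))) \ (X'*Ytrain);    % regularized normal equations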

#5 cxzbb posted on 2016-4-25 07:53:54

Spam Classification

clc; clear; close all;

% Note: run wrapper.pl first to produce the LIBSVM-format data files

[Ytrain, Xtrain] = libsvmread('q4train_mod.dat');
[Ytest, Xtest] = libsvmread('q4test_mod.dat');

mtrain = size(Xtrain,1);
mtest = size(Xtest,1);
n = size(Xtrain,2);

% part a: learn a perceptron
Xtrain_perceptron = [ones(mtrain,1) Xtrain];
Xtest_perceptron = [ones(mtest,1) Xtest];
alpha = 0.1;
% initialize
theta_perceptron = zeros(n+1,1);
trainerror_mag = 100000;
iteration = 0;
% loop until the squared training error is small enough
while trainerror_mag > 1000
    iteration = iteration + 1;
    for i = 1:mtrain
        Ypredict_temp = sign(theta_perceptron'*Xtrain_perceptron(i,:)');
        theta_perceptron = theta_perceptron + alpha*(Ytrain(i) - Ypredict_temp)*Xtrain_perceptron(i,:)';
    end
    Ytrainpredict_perceptron = sign(theta_perceptron'*Xtrain_perceptron')';
    trainerror_mag = (Ytrainpredict_perceptron - Ytrain)'*(Ytrainpredict_perceptron - Ytrain)   % display
end
Ytestpredict_perceptron = sign(theta_perceptron'*Xtest_perceptron')';
testerror_mag = (Ytestpredict_perceptron - Ytest)'*(Ytestpredict_perceptron - Ytest)   % display

% part b: linear-kernel SVM (LIBSVM)
theta_svm = svmtrain(Ytrain, Xtrain, '-t 0');
[Ytestpredict_svm, svm_accuracy, decision_values] = svmpredict(Ytest, Xtest, theta_svm);

% part c: accuracy as a function of training-set size
accuracy = zeros(9,1);
for mtemp = 1000:1000:9000
    theta_svm = svmtrain(Ytrain(1:mtemp,:), Xtrain(1:mtemp,:), '-t 0');
    [Ytestpredict_svm, accuracy_temp, decision_values] = svmpredict(Ytest, Xtest, theta_svm);
    accuracy(mtemp/1000) = accuracy_temp(1,1);
end
plot(1000:1000:9000, accuracy);

% part d: inspect the support vectors
numberofSV = theta_svm.totalSV;

[spamValues, spamIndex] = sort(theta_svm.sv_coef, 'ascend');
hamindices = theta_svm.sv_indices(spamIndex(1:5));
spamindices = theta_svm.sv_indices(spamIndex(theta_svm.totalSV-4:theta_svm.totalSV))

% recover the alphas from the signed dual coefficients
for i = 1:numberofSV
    Y(i) = Ytrain(theta_svm.sv_indices(i));
    alpha_svm(i) = theta_svm.sv_coef(i)./Y(i);
end

% part e: margin width from the primal weights
w = theta_svm.SVs' * theta_svm.sv_coef;
b = -theta_svm.rho;
if theta_svm.Label(1) == -1
    w = -w;
    b = -b;
end
distance = 1/norm(w);   % distance from the hyperplane to the margin

% part f: RBF-kernel SVM
theta_svm = svmtrain(Ytrain, Xtrain, '-t 2');
[Ytestpredict_svm, rbf_accuracy, decision_values] = svmpredict(Ytest, Xtest, theta_svm);

% part g: z-score both sets using the training-set statistics
XtrainMean = mean(Xtrain);
XtrainStd = std(Xtrain);
Xtrain_n = (Xtrain - repmat(XtrainMean,mtrain,1))./repmat(XtrainStd,mtrain,1);
Xtest_n = (Xtest - repmat(XtrainMean,mtest,1))./repmat(XtrainStd,mtest,1);

theta_svm = svmtrain(Ytrain, Xtrain_n, '-t 0');
[Ytestpredict_svm, norm_svm_accuracy, decision_values] = svmpredict(Ytest, Xtest_n, theta_svm);
theta_svm = svmtrain(Ytrain, Xtrain_n, '-t 2');
[Ytestpredict_svm, norm_rbf_accuracy, decision_values] = svmpredict(Ytest, Xtest_n, theta_svm);

% part h: sweep the RBF kernel width gamma
rbf_accuracy = zeros(19,1);
i = 0;
for gamma_i = 0.00010:0.00005:0.00100
    i = i + 1;
    arguments_i = sprintf('-t 2 -g %g', gamma_i);
    theta_svm = svmtrain(Ytrain, Xtrain, arguments_i);
    [Ytestpredict_svm, temp_rbf_accuracy, decision_values] = svmpredict(Ytest, Xtest, theta_svm);
    rbf_accuracy(i) = temp_rbf_accuracy(1,1);
end
plot(0.00010:0.00005:0.00100, rbf_accuracy);

% part i: discussion in the report
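Part h scores each gamma directly on the test set; LIBSVM can instead select it by k-fold cross-validation on the training data via the -v option (with -v set, svmtrain returns the cross-validation accuracy rather than a model). A minimal sketch under the same setup:

% Pick the RBF width by 5-fold cross-validation on the training set
best_acc = 0; best_gamma = NaN;
for gamma_i = 0.00010:0.00005:0.00100
    cv_acc = svmtrain(Ytrain, Xtrain, sprintf('-t 2 -v 5 -g %g', gamma_i));
    if cv_acc > best_acc
        best_acc = cv_acc;
        best_gamma = gamma_i;
    end
end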


#6 yangbing1008 posted on 2016-4-25 08:09:46
Thanks for sharing.

#7 CRRAO posted on 2016-4-25 08:17:27
Taking a look, thanks.

#8 fengyg (verified business account) posted on 2016-4-25 08:26:46
Taking a look.

#9 Nicolle (verified student account) posted on 2016-4-25 08:36:18

Decision Trees for Classification

Note: the author has been banned or deleted; the content was automatically hidden.

#10 Nicolle (verified student account) posted on 2016-4-25 08:42:24
Note: the author has been banned or deleted; the content was automatically hidden.
