Original poster: Reader's

Face Detection and Recognition: Theory and Practice


#1 (OP)
Reader's posted on 2016-1-18 01:46:28
Price: 1 forum coin


https://www.crcpress.com/Face-Detection-and-Recognition-Theory-and-Practice/Datta-Datta-Banerjee/9781482226546?source=igodigital

Features
  • Explains the theory and practice of face detection and recognition systems currently in vogue
  • Offers a general review of the available face detection and recognition methods, as well as an indication of future research using cognitive neurophysiology
  • Provides a single source for cutting-edge information on the major approaches, algorithms, and technologies used in automated face detection and recognition
Summary

Face detection and recognition are the nonintrusive biometrics of choice in many security applications. Examples of their use include border control, driver’s license issuance, law enforcement investigations, and physical access control.

Face Detection and Recognition: Theory and Practice elaborates on and explains the theory and practice of face detection and recognition systems currently in vogue. The book begins with an introduction to the state of the art, offering a general review of the available methods and an indication of future research using cognitive neurophysiology. The text then:

  • Explores subspace methods for dimensionality reduction in face image processing, statistical methods applied to face detection, and intelligent face detection methods dominated by the use of artificial neural networks
  • Covers face detection with colour and infrared face images, face detection in real time, face detection and recognition using set estimation theory, face recognition using evolutionary algorithms, and face recognition in the frequency domain
  • Discusses methods for the localization of face landmarks helpful in face recognition, methods of generating synthetic face images using set estimation theory, and databases of face images available for testing and training systems
  • Features pictorial descriptions of every algorithm as well as downloadable source code (in MATLAB®/PYTHON) and hardware implementation strategies with code examples
  • Demonstrates how frequency domain correlation techniques can be used, supplying exhaustive test results (a minimal sketch of the idea follows this list)
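
The frequency-domain correlation idea can be illustrated with a minimal phase-correlation sketch in NumPy. This is a generic illustration of the technique, not the book's specific algorithm; the function name phase_correlation and the toy image are ours.

# Minimal sketch of frequency-domain correlation (phase correlation).
import numpy as np

def phase_correlation(f, g):
    # locate the relative translation between two same-sized images
    F = np.fft.fft2(f)
    G = np.fft.fft2(g)
    cross = F * np.conj(G)
    cross /= np.abs(cross) + 1e-12        # keep phase information only
    corr = np.real(np.fft.ifft2(cross))   # sharp peak at the relative shift
    return np.unravel_index(np.argmax(corr), corr.shape)

# toy usage: shift an image and recover the shift
img = np.random.rand(64, 64)
shifted = np.roll(np.roll(img, 5, axis=0), 9, axis=1)
print(phase_correlation(shifted, img))    # -> (5, 9)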



#2
ReneeBK posted on 2016-1-18 01:49:22
% MATLAB code for eigenface-based face recognition
% Principal Component Analysis (PCA) for face recognition
% M  = number of training images, each N pixels wide by N pixels tall
% c  = number of recognition (test) images, also N by N pixels
% Mp = desired number of principal components

% Feature extraction:
% stack the column vector of each vectorized training face (pseudocode)
X = [x1 x2 ... xM];
% compute the average face
me = mean(X,2);
A = X - repmat(me,1,M);
% svd(A,0) avoids the N^2-by-N^2 computation of [V,D] = eig(A*A')
% and computes only M columns of U in A = U*E*V'
[U,E,V] = svd(A,0);
eigVals = diag(E);   % singular values; squared, these are the eigenvalues of A*A'
lmda = eigVals(1:Mp);
% pick the face-space principal components (eigenfaces)
P = U(:,1:Mp);
% store the weights of the training data projected into eigenspace
train_wt = P'*A;

% Nearest-neighbor classification:
% A2 is built from the recognition data in the same way as A
recog_wt = P'*A2;
% Euclidean distance between the ith recognition face and the jth training face
euDis(i,j) = sqrt(sum((recog_wt(:,i) - train_wt(:,j)).^2));
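For readers working in Python, a compact NumPy sketch of the same eigenface pipeline might look as follows. It assumes faces are already vectorized into the columns of X (training) and X_test (test); the function name eigenface_nn is ours, not from the book.

# NumPy sketch of the eigenface pipeline above (illustrative).
import numpy as np

def eigenface_nn(X, X_test, Mp):
    # X: n_pixels x M training faces, X_test: n_pixels x c test faces
    me = X.mean(axis=1, keepdims=True)                # average face
    A = X - me                                        # mean-subtracted data
    U, s, Vt = np.linalg.svd(A, full_matrices=False)  # thin SVD, like svd(A,0)
    P = U[:, :Mp]                                     # eigenfaces
    train_wt = P.T @ A                                # training weights
    recog_wt = P.T @ (X_test - me)                    # test weights
    # pairwise Euclidean distances: rows = test faces, cols = training faces
    d = np.linalg.norm(recog_wt[:, :, None] - train_wt[:, None, :], axis=0)
    return d.argmin(axis=1)        # index of the nearest training face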

#3
ReneeBK posted on 2016-1-18 01:51:07
% MATLAB code for two-dimensional PCA (2D-PCA)
function [WA,WB] = pca2d(A,B,D)
% Perform two-dimensional PCA on a training set A of size m x n x N
% and a test set B of size m x n x P, i.e. there are N training and
% P test samples (images), each of size m x n.
% D is the dimension to which A will be reduced.
A = double(A); B = double(B);
[m,n,N] = size(A); [m,n,P] = size(B);
total = A(:,:,1);
for k = 2:N
    total = total + A(:,:,k);
end
miu = total/N;                 % mean image of A
for k = 1:N
    A(:,:,k) = A(:,:,k) - miu; % mean-adjust A
end
G = zeros(n,n);
for k = 1:N
    G = G + A(:,:,k)'*A(:,:,k);
end
G = G/N;                       % image scatter (covariance) matrix
[y,l] = eig(G);                % eigenvalues and eigenvectors of G
l = diag(l);
% keep the D largest eigenvalues and store the associated
% eigenvectors in Y
[val,ind] = sort(l,'descend'); % sort eigenvalues in descending order
Y = [];
for j = 1:D
    Y = [Y y(:,ind(j))];
end
Y = Y./D;                      % scale Y by 1/D
X = zeros(m,D,N);
for k = 1:N
    X(:,:,k) = A(:,:,k)*Y;     % project each training image
end
WA = X;
% projection of the test set B onto the same subspace
WB = zeros(m,D,P);
for k = 1:P
    B(:,:,k) = B(:,:,k) - miu; % mean-adjust B with the training mean
    WB(:,:,k) = B(:,:,k)*Y;
end
end
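The same 2D-PCA projection is compact in NumPy. Below is a sketch under the same conventions (A is m x n x N, B is m x n x P); the function name pca2d mirrors the MATLAB version above.

# NumPy sketch of 2D-PCA, mirroring the MATLAB function above.
import numpy as np

def pca2d(A, B, D):
    A = A.astype(float); B = B.astype(float)
    miu = A.mean(axis=2)                       # mean training image
    A = A - miu[:, :, None]                    # mean-adjust A
    # image scatter matrix G = (1/N) * sum_k A_k' * A_k
    G = np.einsum('ijk,ilk->jl', A, A) / A.shape[2]
    l, y = np.linalg.eigh(G)                   # G is symmetric
    Y = y[:, np.argsort(l)[::-1][:D]]          # D leading eigenvectors
    Y = Y / D                                  # same 1/D scaling as above
    WA = np.einsum('ijk,jl->ilk', A, Y)        # project training images
    WB = np.einsum('ijk,jl->ilk', B - miu[:, :, None], Y)
    return WA, WB

A note on the design: 2D-PCA builds an n x n scatter matrix directly from the image matrices, so it avoids the huge N^2-by-N^2 covariance that vectorized PCA on whole images would require.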

#4
ReneeBK posted on 2016-1-18 01:52:51
% MATLAB code for kernel PCA
function [WA,WB] = pcaKernel(A,B,D)
% Perform kernel PCA on training set A (M x N) and test set B (M x P);
% columns are samples. D is the dimension to which A will be reduced.
A = double(A); B = double(B);
[M,N] = size(A); [M,P] = size(B);
miu = mean(A,2);                  % row-wise mean of the training data
A = A - repmat(miu,1,N);          % mean-adjust A
B = B - repmat(miu,1,P);          % adjust B with the same training mean
KA = ((A'*A) + 4).^2;             % degree-2 polynomial kernel of A
KA0 = KA;                         % keep the uncentered kernel for the test set
oneA = ones(N,N)./N;
KA = KA - oneA*KA - KA*oneA + oneA*KA*oneA;   % center the kernel matrix
[y,l] = eig(KA/N);                % eigenvalues and eigenvectors
l = diag(l);
% keep the D largest eigenvalues and store the associated
% eigenvectors in Y
[val,ind] = sort(l,'descend');    % sort eigenvalues in descending order
Y = [];
for j = 1:D
    Y = [Y y(:,ind(j))];
end
Y = Y./D;                         % scale Y by 1/D
WA = KA*Y;                        % N x D projection of the training images
% kernel between test and training samples, centered consistently
KB = ((B'*A) + 4).^2;
oneB = ones(P,N)./N;
KB = KB - oneB*KA0 - KB*oneA + oneB*KA0*oneA;
WB = KB*Y;                        % P x D projection of the test images
end
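For comparison, here is a NumPy sketch of the same kernel PCA (degree-2 polynomial kernel, same centering steps). The function name pca_kernel and the kernel offset argument c are ours.

# NumPy sketch of kernel PCA with a degree-2 polynomial kernel.
import numpy as np

def pca_kernel(A, B, D, c=4.0):
    # A: (M, N) training, B: (M, P) test; columns are samples
    mu = A.mean(axis=1, keepdims=True)
    A = A - mu
    B = B - mu                                 # same training mean
    N, P = A.shape[1], B.shape[1]
    K = (A.T @ A + c) ** 2                     # N x N polynomial kernel
    oneN = np.full((N, N), 1.0 / N)
    Kc = K - oneN @ K - K @ oneN + oneN @ K @ oneN   # centered kernel
    l, y = np.linalg.eigh(Kc / N)
    Y = y[:, np.argsort(l)[::-1][:D]]          # D leading eigenvectors
    WA = Kc @ Y                                # N x D training projections
    KB = (B.T @ A + c) ** 2                    # P x N test kernel
    onePN = np.full((P, N), 1.0 / N)
    KBc = KB - onePN @ K - KB @ oneN + onePN @ K @ oneN
    WB = KBc @ Y                               # P x D test projections
    return WA, WB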

#5
ReneeBK posted on 2016-1-18 01:53:48
% Fisherface (continues the eigenface code above)
%% Same training and recognition images, also sized N by N pixels
% P1 = eigenface result (PCA projection matrix)
% Feature extraction: same mean subtraction as eigenface
A = X - repmat(me,1,M);
% N^2 by N^2 between-class scatter matrix
% (pseudocode: clsMeani = mean of class i; in the standard definition
%  the overall mean is subtracted from it first)
Sb = zeros(size(X,1));
for i = 1:c
    Sb = Sb + clsMeani*clsMeani';
end
% N^2 by N^2 within-class scatter matrix
% (ci = number of samples in class i; X(:,j) ranges over class i)
Sw = zeros(size(X,1));
for i = 1:c
    for j = 1:ci
        Sw = Sw + (X(:,j) - clsMeani)*(X(:,j) - clsMeani)';
    end
end
% project into an (N-c) by (N-c) subspace using PCA
Sbb = P1'*Sb*P1;
Sww = P1'*Sw*P1;
% generalized eigenvalue decomposition: solves Sbb*V = Sww*V*D
[V,D] = eig(Sbb,Sww);
eigVals = diag(D);
lmda = eigVals(1:Mp);
P = P1*V(:,1:Mp);
% store training weights
train_wt = P'*A;
%% Nearest-neighbor classification: same as eigenface
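The Fisherface pseudocode above can be made concrete in Python. Here is a sketch assuming vectorized faces X (n_pixels x M), integer labels of length M, and the PCA basis P1 from the eigenface step; the function name fisherfaces is ours. It forms the scatter matrices in the PCA-reduced space, which equals computing P1'*Sb*P1 and P1'*Sw*P1 directly.

# Python sketch of the Fisherface computation (illustrative).
import numpy as np
from scipy.linalg import eig

def fisherfaces(X, labels, P1, Mp):
    Z = P1.T @ X                          # work in the PCA-reduced space
    me = Z.mean(axis=1, keepdims=True)    # overall mean
    k = Z.shape[0]
    Sbb = np.zeros((k, k)); Sww = np.zeros((k, k))
    for cl in np.unique(labels):
        Zi = Z[:, labels == cl]
        mi = Zi.mean(axis=1, keepdims=True)
        Sbb += Zi.shape[1] * (mi - me) @ (mi - me).T   # between-class scatter
        Di = Zi - mi
        Sww += Di @ Di.T                               # within-class scatter
    w, V = eig(Sbb, Sww)                  # generalized problem Sbb*v = lambda*Sww*v
    order = np.argsort(w.real)[::-1][:Mp]
    return P1 @ V[:, order].real          # Fisherface basis in pixel space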

#6
ReneeBK posted on 2016-1-18 01:55:57

Principal Component Analysis using Python

# Classification for the two-class case using PCA
import numpy as np
from matplotlib import pyplot as plt
from operator import itemgetter

plt.rc("font", family="serif", size=18, weight="light")
#plt.rc("text", usetex=True)
#class1 = np.array([[2.5,2.4],[2.2,2.9],[3.1,3.0],[2.3,2.7],[1.9,2.2]])
#class2 = np.array([[0.5,0.7],[1,1.1],[1.5,1.6],[1.1,0.9],[2,1.6]])
plt.close("all")
class1 = np.array([[1.,2.],[2.,3.],[3.,3.],[4.,5.],[5.,5.]])
N1 = len(class1)
class2 = np.array([[1.,0.],[2.,1.],[3.,1.],[3.,2.],[5.,3.],[6.,5.]])
N2 = len(class2)
data = np.vstack((class1, class2))

plt.figure(1)
plt.scatter(data[0:N1,0], data[0:N1,1], s=240, c=[[0.9,0.9,0.9]],
            marker="o", label="original class-1", alpha=0.9)
plt.scatter(data[N1:N1+N2,0], data[N1:N1+N2,1],
            s=240, c=[[0.,0.,0.]], marker="4", label="original class-2")
plt.grid(axis="both")
plt.legend(loc=0, prop={"size": 14})
plt.title("Original data")
plt.xlim(-1, 1.5*data.max())
plt.ylim(-1, 1.5*data.max())
plt.xlabel("variable-1")
plt.ylabel("variable-2")
plt.show()

# mean-subtract the data and compute the covariance matrix
m = np.array([data.mean(axis=0)])
M = np.tile(m, (data.shape[0], 1))
D = data - M
Cov = np.cov(D.T)                        # same matrix as CovMat below
CovMat = 1./(D.shape[0] - 1.) * np.dot(D.T, D)
val, vec = np.linalg.eig(CovMat)

# sort eigenvalues (and their eigenvectors) in descending order
tmp = np.zeros(val.shape)
tmpvec = np.zeros(vec.shape)
for i in range(len(val)):
    a = max(enumerate(val), key=itemgetter(1))[0]
    tmp[i] = val[a]
    tmpvec[:, i] = vec[:, a]
    val[a] = 0

plt.figure(2)
plt.scatter(D[0:N1,0], D[0:N1,1], s=240, c=[[0.9,0.9,0.9]],
            marker="o", label="class-1,MS", alpha=0.9)
plt.scatter(D[N1:N1+N2,0], D[N1:N1+N2,1], s=240, c=[[0.,0.,0.]],
            marker="4", label="class-2,MS")
plt.grid(axis="both")
plt.plot([-5*tmpvec[0,0], 5*tmpvec[0,0]], [-5*tmpvec[1,0], 5*tmpvec[1,0]],
         "--k", lw=2, label="eigvec_1")
plt.plot([-5*tmpvec[0,1], 5*tmpvec[0,1]], [-5*tmpvec[1,1], 5*tmpvec[1,1]],
         "-k", lw=2, label="eigvec_2")
plt.xlim(-data.max(), data.max())
plt.ylim(-data.max(), data.max())
plt.legend(loc=0, prop={"size": 14})
plt.title("Mean subtracted data")
plt.show()

## Data reconstruction with all eigenvectors
transData = np.dot(D, tmpvec)            # project onto all eigenvectors
plt.figure(3)
pc = tmpvec
reconstructed = np.dot(transData, pc.T) + M
plt.scatter(reconstructed[0:N1,0], reconstructed[0:N1,1],
            s=240, c=[[0.9,0.9,0.9]], marker="o", label="class-1 reconstructed")
plt.scatter(reconstructed[N1:N1+N2,0], reconstructed[N1:N1+N2,1],
            s=240, c=[[0.,0.,0.]], marker="4", label="class-2 reconstructed")
plt.grid(axis="both")
plt.legend(loc=0, prop={"size": 14})
plt.xlim(0, 10)
plt.ylim(-1, 10)
plt.title("Reconstructed with all eigenvectors")
plt.show()

## Data reconstruction with the eigenvector having maximum variance
plt.figure(4)
pc = np.array([tmpvec[:,0]]).T
rec = np.dot(np.dot(D, pc), pc.T) + M    # back-project the 1-D scores
plt.scatter(rec[0:N1,0], rec[0:N1,1], s=240, c=[[0.9,0.9,0.9]],
            marker="o", label="class-1 reconstructed", alpha=0.5)
plt.scatter(rec[N1:N1+N2,0], rec[N1:N1+N2,1], s=240, c=[[0.,0.,0.]],
            marker="4", label="class-2 reconstructed")
plt.plot([-10*tmpvec[0,0], 10*tmpvec[0,0]], [-10*tmpvec[1,0], 10*tmpvec[1,0]],
         "--k", lw=2, label="eigvec_1")
plt.grid(axis="both")
plt.legend(loc=0, prop={"size": 14})
plt.xlim(0, 8)
plt.ylim(-1, 8)
plt.title("Reconstructed with one eigenvector")
plt.show()

## Reduced (1-D) space: scores along the principal eigenvector
plt.figure(5)
scores = np.dot(D, pc)
plt.scatter(scores[0:N1,0], np.zeros(N1), s=200, c=[[1,1,1]],
            edgecolors="k", marker="o", label="Reduced Space-class-1")
plt.scatter(scores[N1:N1+N2,0], np.zeros(N2), s=160, c=[[1,1,1]],
            edgecolors="k", marker="*", label="Reduced Space-class-2")
plt.xlim(-(scores.max()+0.5), scores.max()+0.5)
plt.ylim(-(scores.max()+0.5), scores.max()+0.5)
plt.grid(axis="both")
plt.legend(loc=0, prop={"size": 12})
plt.show()
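As a sanity check, the hand-rolled projection above can be compared against scikit-learn's PCA (this assumes scikit-learn is installed; it is not used elsewhere in the thread). Component signs are arbitrary, so the comparison uses absolute values.

# Cross-check against scikit-learn, run after the code above.
from sklearn.decomposition import PCA

pca = PCA(n_components=2)
sk_scores = pca.fit_transform(data)        # sklearn mean-centers internally
my_scores = np.dot(D, tmpvec)              # our scores on the sorted eigenvectors
print(np.allclose(np.abs(sk_scores), np.abs(my_scores)))   # expected: True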

#7
ReneeBK posted on 2016-1-18 01:58:02

Fisher Linear Discriminant Analysis using Python

# Python code for implementation of Fisher linear discriminant analysis
# Classification for the two-class case using FLDA
import numpy as np
from matplotlib import pyplot as plt
from operator import itemgetter

plt.rc("font", family="serif", size=12, weight="light")
plt.close("all")
class1 = np.array([[1.,2.],[2.,3.],[3.,3.],[4.,5.],[5.,5.]])
#class1 = np.array([[4,2],[2,4],[2,3],[3,6],[4,4]])
N1 = len(class1)
class2 = np.array([[1.,0.],[2.,1.],[3.,1.],[3.,2.],[5.,3.],[6.,5.]])
#class2 = np.array([[9,10],[6,8],[9,5],[8,7],[10,8]])
N2 = len(class2)

# class means and scatter matrices
m1 = np.array([class1.mean(axis=0)])
m2 = np.array([class2.mean(axis=0)])
d1 = class1 - np.tile(m1, (class1.shape[0], 1))
d2 = class2 - np.tile(m2, (class2.shape[0], 1))
S1 = 1./(class1.shape[0] - 1.) * np.dot(d1.T, d1)
S2 = 1./(class2.shape[0] - 1.) * np.dot(d2.T, d2)
Sw = S1 + S2                             # within-class scatter
Sb = np.dot((m1 - m2).T, (m1 - m2))      # between-class scatter

# solve inv(Sw)*Sb*w = lambda*w and sort eigenvectors by eigenvalue
invSw = np.linalg.inv(Sw)
invSwSb = np.dot(invSw, Sb)
val, vec = np.linalg.eig(invSwSb)
tmp = np.zeros(val.shape)
tmpvec = np.zeros(vec.shape)
for i in range(len(val)):
    a = max(enumerate(val), key=itemgetter(1))[0]
    tmp[i] = val[a]
    tmpvec[:, i] = vec[:, a]
    val[a] = 0
w = np.array([tmpvec[:,0]]).T            # eigenvector with largest eigenvalue
#plt.plot([-5*vec[0,1],5*vec[0,1]], [-5*vec[1,1],5*vec[1,1]],
#         "-k", lw=2, label="eigvec_1")

plt.figure(1)
plt.scatter(class1[:,0], class1[:,1], s=240, c=[[0.9,0.9,0.9]],
            marker="o", label="class-1", alpha=0.9)
plt.scatter(class2[:,0], class2[:,1], s=240, c=[[0.,0.,0.]],
            marker="4", label="class-2")
plt.xlim(-5, 15)
plt.ylim(-5, 15)
plt.plot([-5*w[0,0], 20*w[0,0]], [-5*w[1,0], 20*w[1,0]],
         "--k", lw=2, label="Optimal eig_vec")
plt.legend(loc=0, prop={"size": 14})
plt.grid(axis="both")
plt.title("Direction of optimal eigen vector")
plt.show()

p1 = np.dot(class1, tmpvec)   # projection of class-1 on the eigenvectors
p2 = np.dot(class2, tmpvec)
p1[:,1] = 0                   # keep only the projection on the optimal vector
p2[:,1] = 0
rec = np.vstack((p1, p2))

plt.figure(2)
plt.scatter(rec[0:N1,0], rec[0:N1,1], s=250, c=[[1,1,1]],
            edgecolors="k", marker="o", label="Reduced Space-class-1")
plt.scatter(rec[N1:N1+N2,0], rec[N1:N1+N2,1], s=250, c=[[1,1,1]],
            edgecolors="k", marker="*", label="Reduced Space-class-2")
plt.xlim(-(rec.max()+0.5), rec.max()+0.5)
plt.ylim(-(rec.max()+0.5), rec.max()+0.5)
plt.grid(axis="both")
plt.legend(loc=0, prop={"size": 12})
plt.title("Projected data in reduced space using FLDA")
plt.show()
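A similar cross-check against scikit-learn's LDA is possible (again assuming scikit-learn is installed; the helper names Xall and yall are ours). The learned directions should be parallel up to sign; small deviations can occur because scikit-learn pools the within-class covariance with a slightly different normalization than Sw = S1 + S2 above.

# Cross-check of the Fisher direction, run after the code above.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

Xall = np.vstack((class1, class2))
yall = np.array([0]*N1 + [1]*N2)
lda = LinearDiscriminantAnalysis().fit(Xall, yall)
w_sk = lda.coef_.ravel()                   # direction proportional to inv(Sw)(m1-m2)
w_ours = tmpvec[:, 0]
cos = np.dot(w_sk, w_ours) / (np.linalg.norm(w_sk)*np.linalg.norm(w_ours))
print(abs(cos))                            # expected: close to 1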

#8
fumingxu posted on 2016-1-18 10:03:32

Hidden content in this post:

Asit Kumar Datta, Madhura Datta, Pradipta Kumar Banerjee-Face Detection and Reco.pdf (10.38 MB, requires 1 forum coin)



#9
lm972 posted on 2016-2-8 09:14:54

Thanks for sharing

#10
soccy posted on 2016-2-8 18:36:50

......
