OP: Lisrelchen

Neural Network Toolbox User's Guide 2017


#1 Lisrelchen posted on 2017-9-24 01:47:02


Hidden content of this post:

Neural Network Toolbox User’s Guide 2017.pdf (4.65 MB)



Keywords: toolbox network Neural Guide User


#2 Lisrelchen posted on 2017-9-24 01:49:59
Get Started with Transfer Learning
Unzip and load the new images as an image datastore. Divide the data into training and validation data sets. Use 70% of the images for training and 30% for validation.

unzip('MerchData.zip');
images = imageDatastore('MerchData','IncludeSubfolders',true,'LabelSource','foldernames');
[trainingImages,validationImages] = splitEachLabel(images,0.7,'randomized');
Load the pretrained AlexNet network. If Neural Network Toolbox™ Model for AlexNet Network is not installed, the software provides a download link. AlexNet is trained on more than one million images and can classify images into 1000 object categories.

net = alexnet;
To retrain AlexNet to classify new images, replace the last three layers of the network. Set the final fully connected layer to the same size as the number of classes in the new data set (5, in this example). To learn faster in the new layers than in the transferred layers, increase the learning rate factors of the fully connected layer.

layersTransfer = net.Layers(1:end-3);
numClasses = numel(categories(trainingImages.Labels));
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];
Specify the training options, including mini-batch size, learning rate, and validation data.

options = trainingOptions('sgdm',...
    'MiniBatchSize',10,...
    'MaxEpochs',4,...
    'InitialLearnRate',1e-4,...
    'Verbose',false,...
    'Plots','training-progress',...
    'ValidationData',validationImages,...
    'ValidationFrequency',5);
Train the network using the training data.

netTransfer = trainNetwork(trainingImages,layers,options);
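A natural final step, mirroring the validation check in the GoogLeNet post later in this thread, is to classify the held-out validation images with the fine-tuned network and measure accuracy. This is a sketch, not part of the original post; the variable names follow the code above.

```matlab
% Classify the validation images with the fine-tuned network.
predictedLabels = classify(netTransfer,validationImages);

% Fraction of validation images whose predicted label matches the true label.
accuracy = mean(predictedLabels == validationImages.Labels)
```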



#3 auirzxp (student-verified) posted on 2017-9-24 01:54:48
Note: the author has been banned or deleted; content hidden automatically.

#4 Lisrelchen posted on 2017-9-24 01:55:43
Transfer Learning Using AlexNet
% Load Data
unzip('MerchData.zip');
images = imageDatastore('MerchData',...
    'IncludeSubfolders',true,...
    'LabelSource','foldernames');
% Divide the data into training and validation data sets: 70% of the images
% for training and 30% for validation. splitEachLabel splits the image
% datastore into two new datastores.
[trainingImages,validationImages] = splitEachLabel(images,0.7,'randomized');
% Display some sample images.
numTrainImages = numel(trainingImages.Labels);
idx = randperm(numTrainImages,16);
figure
for i = 1:16
    subplot(4,4,i)
    I = readimage(trainingImages,idx(i));
    imshow(I)
end
% Load Pretrained Network
net = alexnet;

% Display the network architecture. The network has five convolutional
% layers and three fully connected layers.
net.Layers
% Transfer Layers to New Network. The last three layers of the pretrained
% network net are configured for 1000 classes and must be fine-tuned for the
% new classification problem. Extract all layers except the last three from
% the pretrained network.
layersTransfer = net.Layers(1:end-3);

numClasses = numel(categories(trainingImages.Labels))
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];
% Train Network
miniBatchSize = 10;
numIterationsPerEpoch = floor(numel(trainingImages.Labels)/miniBatchSize);
options = trainingOptions('sgdm',...
    'MiniBatchSize',miniBatchSize,...
    'MaxEpochs',4,...
    'InitialLearnRate',1e-4,...
    'Verbose',false,...
    'Plots','training-progress',...
    'ValidationData',validationImages,...
    'ValidationFrequency',numIterationsPerEpoch);
netTransfer = trainNetwork(trainingImages,layers,options);



#5 MouJack007 posted on 2017-9-24 02:09:36
Thanks to the OP for sharing!

#6 MouJack007 posted on 2017-9-24 02:10:03

#7 Lisrelchen posted on 2017-9-24 02:20:16
Transfer Learning Using GoogLeNet
unzip('MerchData.zip');
images = imageDatastore('MerchData','IncludeSubfolders',true,'LabelSource','foldernames');
images.ReadFcn = @(loc)imresize(imread(loc),[224,224]);
[trainImages,valImages] = splitEachLabel(images,0.7,'randomized');
net = googlenet;
% Extract the layer graph from the trained network and plot it.
lgraph = layerGraph(net);
figure('Units','normalized','Position',[0.1 0.1 0.8 0.8]);
plot(lgraph)
% To retrain GoogLeNet to classify new images, replace the last three
% layers of the network. These three layers, named 'loss3-classifier',
% 'prob', and 'output', contain the information on how to combine the
% features that the network extracts into class probabilities and labels.
% Add three new layers to the layer graph: a fully connected layer, a
% softmax layer, and a classification output layer. Set the final fully
% connected layer to the same size as the number of classes in the new data
% set (5, in this example). To learn faster in the new layers than in the
% transferred layers, increase the learning rate factors of the fully
% connected layer.
lgraph = removeLayers(lgraph, {'loss3-classifier','prob','output'});

numClasses = numel(categories(trainImages.Labels));
newLayers = [
    fullyConnectedLayer(numClasses,'Name','fc','WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer('Name','softmax')
    classificationLayer('Name','classoutput')];
lgraph = addLayers(lgraph,newLayers);
% Connect the last of the transferred layers remaining in the network
% ('pool5-drop_7x7_s1') to the new layers. To check that the new layers are
% connected correctly, plot the new layer graph and zoom in on the last
% layers of the network.
lgraph = connectLayers(lgraph,'pool5-drop_7x7_s1','fc');

figure('Units','normalized','Position',[0.3 0.3 0.4 0.4]);
plot(lgraph)
ylim([0,10])
% Specify the training options, including learning rate, mini-batch size,
% and validation data.
options = trainingOptions('sgdm',...
    'MiniBatchSize',10,...
    'MaxEpochs',3,...
    'InitialLearnRate',1e-4,...
    'VerboseFrequency',1,...
    'ValidationData',valImages,...
    'ValidationFrequency',3);
% Train the network using the training data.
net = trainNetwork(trainImages,lgraph,options);
% Classify the validation images using the fine-tuned network, and
% calculate the classification accuracy.
predictedLabels = classify(net,valImages);
accuracy = mean(predictedLabels == valImages.Labels)



#8 Lisrelchen posted on 2017-9-24 02:25:54
Feature Extraction Using AlexNet
% Load Data
unzip('MerchData.zip');
images = imageDatastore('MerchData',...
    'IncludeSubfolders',true,...
    'LabelSource','foldernames');
[trainingImages,testImages] = splitEachLabel(images,0.7,'randomized');
% Display some sample images.
numTrainImages = numel(trainingImages.Labels);
idx = randperm(numTrainImages,16);
figure
for i = 1:16
    subplot(4,4,i)
    I = readimage(trainingImages,idx(i));
    imshow(I)
end
% Load Pretrained Network
net = alexnet;

% Display the network architecture.
net.Layers
% Extract the class labels from the training and test data.
trainingLabels = trainingImages.Labels;
testLabels = testImages.Labels;
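The next snippet references trainingFeatures and testFeatures, but this post never defines them. A sketch of the missing step, following the usual AlexNet feature-extraction pattern; the choice of the 'fc7' layer is an assumption, not part of the original post:

```matlab
% Extract image features from a deep layer of the pretrained network.
% 'fc7' is an assumed choice; another fully connected layer could be used.
layer = 'fc7';
trainingFeatures = activations(net,trainingImages,layer);
testFeatures = activations(net,testImages,layer);
```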
% Use the features extracted from the training images as predictor
% variables and fit a multiclass support vector machine (SVM) using
% fitcecoc (Statistics and Machine Learning Toolbox).
classifier = fitcecoc(trainingFeatures,trainingLabels);
% Classify the test images using the trained SVM model and the features
% extracted from the test images.
predictedLabels = predict(classifier,testFeatures);
% Display four sample test images with their predicted labels.
idx = [1 5 10 15];
figure
for i = 1:numel(idx)
    subplot(2,2,i)
    I = readimage(testImages,idx(i));
    label = predictedLabels(idx(i));
    imshow(I)
    title(char(label))
end
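To quantify the SVM classifier, the same accuracy check used in the GoogLeNet post applies here. This is a sketch, not part of the original post; it assumes the predictedLabels and testLabels variables defined above.

```matlab
% Fraction of test images classified correctly by the SVM.
accuracy = mean(predictedLabels == testLabels)
```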


#9 tmdxyz posted on 2017-9-24 04:09:34
Neural Network Toolbox User's Guide 2017

#10 longitudinal posted on 2017-9-24 05:08:44
Note: the author has been banned or deleted; content hidden automatically.
