MATLAB SVM tutorial (fitcsvm)

Sorry to everyone that I did not originally put the code in the description. Here it is:

--

clear; close all; clc;

%% preparing dataset

load fisheriris

species_num = grp2idx(species);
%%

% use only the first 100 samples (two species) to turn this into a binary classification problem
X = randn(100,10); % start from 10 noise features
X(:,[1,3,5,7]) = meas(1:100,:); % features 1, 3, 5, and 7 carry the real measurements, so they should be the features useful for classification.
y = species_num(1:100);

rand_num = randperm(size(X,1));
X_train = X(rand_num(1:round(0.8*length(rand_num))),:);
y_train = y(rand_num(1:round(0.8*length(rand_num))),:);

X_test = X(rand_num(round(0.8*length(rand_num))+1:end),:);
y_test = y(rand_num(round(0.8*length(rand_num))+1:end),:);
%% CV partition

c = cvpartition(y_train,'KFold',5);
%% feature selection

opts = statset('Display','iter');
classf = @(train_data, train_labels, test_data, test_labels)...
sum(predict(fitcsvm(train_data, train_labels,'KernelFunction','rbf'), test_data) ~= test_labels);

[fs, history] = sequentialfs(classf, X_train, y_train, 'cv', c, 'options', opts,'nfeatures',2);
%% Best hyperparameter

X_train_w_best_feature = X_train(:,fs);

Md1 = fitcsvm(X_train_w_best_feature,y_train,'KernelFunction','rbf','OptimizeHyperparameters','auto',...
'HyperparameterOptimizationOptions',struct('AcquisitionFunctionName',...
'expected-improvement-plus','ShowPlots',true)); % uses Bayesian optimization.

%% Final test with test set
X_test_w_best_feature = X_test(:,fs);
test_accuracy_for_iter = sum((predict(Md1,X_test_w_best_feature) == y_test))/length(y_test)*100

%% inspect the hyperplane

figure;
hgscatter = gscatter(X_train_w_best_feature(:,1),X_train_w_best_feature(:,2),y_train);
hold on;
h_sv=plot(Md1.SupportVectors(:,1),Md1.SupportVectors(:,2),'ko','markersize',8);

% plot the test-set data points one by one to see where they fall.

gscatter(X_test_w_best_feature(:,1),X_test_w_best_feature(:,2),y_test,'rb','xx')

% decision plane
XLIMs = get(gca,'xlim');
YLIMs = get(gca,'ylim');
[xi,yi] = meshgrid([XLIMs(1):0.01:XLIMs(2)],[YLIMs(1):0.01:YLIMs(2)]);
dd = [xi(:), yi(:)];
pred_mesh = predict(Md1, dd);
redcolor = [1, 0.8, 0.8];
bluecolor = [0.8, 0.8, 1];
pos = find(pred_mesh == 1);
h1 = plot(dd(pos,1), dd(pos,2),'s','Color',redcolor,'MarkerSize',5,'MarkerEdgeColor',redcolor,'MarkerFaceColor',redcolor);
pos = find(pred_mesh == 2);
h2 = plot(dd(pos,1), dd(pos,2),'s','Color',bluecolor,'MarkerSize',5,'MarkerEdgeColor',bluecolor,'MarkerFaceColor',bluecolor);
uistack(h1,'bottom');
uistack(h2,'bottom');
legend([hgscatter;h_sv],{'setosa','versicolor','support vectors'})
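
One commenter below asks how to get a confusion matrix. A minimal sketch of how the final test section could be extended, reusing `Md1`, `X_test_w_best_feature`, and `y_test` from the script above (`confusionmat` is part of the Statistics and Machine Learning Toolbox):

```matlab
%% confusion matrix (optional extension)
y_pred = predict(Md1, X_test_w_best_feature);  % class predictions on the test set
C = confusionmat(y_test, y_pred)               % rows = true class, columns = predicted class
```

With two classes, `C` is a 2-by-2 matrix whose diagonal entries count the correctly classified test samples, so `sum(diag(C))/sum(C(:))` reproduces the test accuracy computed above.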
Comments

2 years later and this has saved me today 🙏🏿

lukundokampeshi

Excellent video, I recommend it. Basic tips, and everything is perfectly explained. Many thanks.

FELIPEPALTA

Wow, best MATLAB video I think I have ever seen.

davidross

Very excellent work. I need more information about SVM.

shunmugakumari.d

Thanks for posting. Feature selection and hyperparameter optimization in MATLAB seem pretty cumbersome. There are a few lines of code in here that no one wants to write :)

pcomitz

Thank you very much. You have made my life a lot easier as a beginner.

jezzaminejacob

Great tutorial!
Can you make a video on one-class SVM and give an example of how to do leave-one-subject-out?
Thank you

murtadhaaldeer

Very helpful video. How can we get a confusion matrix here? An early response would be appreciated.

Nbbgtygrrtghb

Thank you very much for this detailed explanation and for your efforts

montazermnhrmohsen

Wow, thank you so much for this. But where do I find the weight vector 'w' of the support vectors, for the w·x + b = 0 hyperplane formula? I only see the bias 'b'. It would be a big help. Thank you so much.

elbertabrea

Is finding the best parameters using cross-validation a mandatory step?
By the way, your video is great. I am doing my project using SVM but couldn't find a single video that explained the implementation in MATLAB; this was very helpful, and I now have an idea of how my project can be done. Thank you once again, and please keep creating more content like this.

poojithaborra

Thank you very much for your help. I love you.

tinAbraham_Indy

Can you please explain the significance of the margin in the case of support vector regression (SVR), and can you also provide an example for SVR?

manojbhatt

Hi, cool video! How can I modify the last part of the code so it correctly plots the hyperplane when there are, for example, 4 features?

lasher

Does this code only work when the label (class) is binary? I have to do machine learning with 3 classes right now; I would really appreciate any advice on how to adapt the code.

서의진-cg

I believe SVM can also be used for multiclass classification, not only binary classification. Can you make a video on multiclass classification?

Mkuladeep

You should make more videos!

THANKS!!

gabrielafonso

Is the Classification Learner app dividing the features into training and testing by itself? What is the accuracy of the model based on: the training data or the testing data?

krishnachauhan

Thank you very much 👌👌👌👍👍, it was really helpful.

tannamaleki

Can you post a video solving the non-binary problem with the same dataset (fisheriris)?

EDIT: There is an error in this video. There is no need to separate test/training sets prior to k-fold. cvpartition does that for you.

Pmaisterify