SVM classifier - grid search on C and gamma no affect

I'm working to find the best C and gamma values for the RBF kernel on my dataset. This is, unfortunately, also my first time using an SVM classifier. Specifically, I'm using LibSVM with MATLAB.

g = [2e-5, 2e-4, 2e-3, 2e-2, 2e-1, 2e0, 2e1, 2e2, 2e3, 2e4, 2e5, 2e6, 2e7, 2e8, 2e9, 2e10, 2e11, 2e12, 2e13, 2e14, 2e15];
C = g';
matrix_gC_aust = zeros(21,21);
matrix_gC_germ = zeros(21,21);
%% cross-validate SVM test
for i = 1:21
    for j = 1:21
        model_1 = svmtrain(aust_train(:,15), aust_train(:,1:14), ['-g', num2str(g(i)), '-c', num2str(C(j)), '-h 0 -t 2']);
        [pre1, acc1, decv1] = svmpredict(aust_test(:,15), aust_test(:,1:14), model_1);
        model_2 = svmtrain(germ_numeric_train(:,25), germ_numeric_train(:,1:24), ['-g', num2str(g(i)), '-c', num2str(C(j)), '-h 0 -t 2']);
        [pre2, acc2, decv2] = svmpredict(germ_numeric_test(:,25), germ_numeric_test(:,1:24), model_2);
        matrix_gC_aust(i,j) = acc1(1,1);
        matrix_gC_germ(i,j) = acc2(1,1);
    end
end
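For reference, here is a minimal sketch of the same kind of grid search with an explicit space between each flag and its value, and with LibSVM's built-in n-fold cross-validation switch (-v), which makes svmtrain return the cross-validation accuracy directly instead of a model. The 2.^(-5:15) grid and the 5 folds are placeholder choices (roughly the LibSVM guide's suggestion), not the values used above, and only the Australian dataset is shown for brevity:

% Sketch: grid search over log2(gamma) and log2(C) with built-in CV.
log2g = -5:15;
log2c = -5:15;
cv_acc_aust = zeros(numel(log2g), numel(log2c));
for i = 1:numel(log2g)
    for j = 1:numel(log2c)
        % Build the option string with spaces between each flag and its value.
        opts = sprintf('-t 2 -h 0 -g %g -c %g -v 5', 2^log2g(i), 2^log2c(j));
        % With -v, svmtrain returns the 5-fold cross-validation accuracy.
        cv_acc_aust(i,j) = svmtrain(aust_train(:,15), aust_train(:,1:14), opts);
    end
end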

The results from this show that altering C and gamma does not change the accuracy of the SVM classifier at all. Is this typical or even possible? Has anyone else experienced this?

One of my datasets has 1000 samples with 24 features and the other has 690 samples with 14 features. Since I'm a bit inexperienced, I'm not sure how these sample/feature ratios compare to typical datasets.

My guess is that, with so many features relative to the number of samples, the classes are easily separable, so the nonlinear RBF kernel doesn't add much and altering its parameters has little influence. That's just a hypothesis, though, and I was hoping to run it by some people who might be able to give me feedback. If there is a better "ask Machine Learning questions" subreddit, let me know.
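If that separability hunch is right, a quick baseline with a linear kernel should already score very high, leaving the RBF parameters little room to matter. A minimal sketch of that check, assuming the same column layout as above and LibSVM's -t 0 (linear kernel) and -v (cross-validation) options; the 5-fold setting is a placeholder:

% Linear-kernel baseline: near-perfect CV accuracy here would suggest the
% data is (close to) linearly separable.
lin_acc_aust = svmtrain(aust_train(:,15), aust_train(:,1:14), '-t 0 -h 0 -v 5');
lin_acc_germ = svmtrain(germ_numeric_train(:,25), germ_numeric_train(:,1:24), '-t 0 -h 0 -v 5');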

Thanks all

submitted by mcjoness
