Channel: Machine Learning

Confusion matrix with leave-one-out cross validation


I have started working on a project where we use a nearest-mean classifier on a noisy data set to evaluate different features.

We do leave-one-out cross validation: we leave out one sample, fit the classifier to the remaining data, and then predict the left-out sample. We repeat this for each data point and build the confusion matrix from these predictions.
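The procedure described above can be sketched in a few lines. This is a minimal illustration, assuming scikit-learn and its iris toy data set (neither is mentioned in the question); `NearestCentroid` plays the role of the nearest-mean classifier, and `cross_val_predict` collects the held-out prediction for every sample:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.neighbors import NearestCentroid
from sklearn.metrics import confusion_matrix

# Example data; the question's own (noisy) data set would go here.
X, y = load_iris(return_X_y=True)

# Nearest-mean classifier: predicts the class whose centroid is closest.
clf = NearestCentroid()

# One fit per left-out sample; y_pred[i] is the prediction made for
# sample i by the classifier trained on all the other samples.
y_pred = cross_val_predict(clf, X, y, cv=LeaveOneOut())

# Pool all held-out predictions into a single confusion matrix.
cm = confusion_matrix(y, y_pred)
print(cm)
```

Note that each cell of `cm` aggregates predictions from different fitted classifiers, which is exactly the situation the question asks about.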

I am trying to get a deeper understanding of the theory, and I get the impression that usually the confusion matrix is generated from a whole set of test data (not used for fitting) and describes a single classifier, whereas here we are building the matrix from many different (leave-one-out) classifiers.

Is this still called a confusion matrix, or is there another term for this procedure? (I am a beginner in the ML field.) Is there a reference I could read for more details on the theory behind this kind of procedure, or on how it could be improved?

submitted by thrope
