Best Binary Classifier In This Situation?

The problem: I have a vector of 30 inputs (all binary except one) and a single binary output. Over 1 million samples are 'negative' (output = 0) and only around 5,000 are 'positive' (output = 1), roughly a 200:1 imbalance.

I tried training a few different neural nets (30I-15H-1O, 30I-15H-15H-1O), and they either failed to converge or got stuck on the degenerate solution of always outputting 0, which yields a low error simply because only a tiny fraction of the dataset is 'positive'.
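
To make the trap concrete: with the class counts above, the all-zeros predictor already looks excellent on raw accuracy.

    # With ~1,000,000 negatives and ~5,000 positives, a model that
    # always predicts 0 is wrong on only the positives:
    n_neg, n_pos = 1_000_000, 5_000
    baseline_accuracy = n_neg / (n_neg + n_pos)
    print(f"{baseline_accuracy:.4f}")  # 0.9950, i.e. only ~0.5% error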

I then tried training a network on a 'handcrafted' training set composed of roughly 33% 'positive' and 67% 'negative' samples. In this case, only the two-hidden-layer network achieved any sort of predictive power, and it generalized fairly poorly.
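
A minimal sketch of building that kind of rebalanced subset, assuming the data sits in NumPy arrays X and y (the array names and the helper are mine, not from the original post):

    import numpy as np

    def make_balanced_subset(X, y, neg_per_pos=2, seed=0):
        # Keep every positive and draw neg_per_pos negatives per positive,
        # giving the ~33%/67% mix described above.
        rng = np.random.default_rng(seed)
        pos_idx = np.flatnonzero(y == 1)
        neg_idx = rng.choice(np.flatnonzero(y == 0),
                             size=neg_per_pos * len(pos_idx),
                             replace=False)
        idx = rng.permutation(np.concatenate([pos_idx, neg_idx]))
        return X[idx], y[idx]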

Edit: So, I tried resampling the training set. First, a 1:1 class split as a proof of concept. Then, I repeatedly doubled the number of 'zero class' samples in the training set until the model would no longer converge to a useful solution. The next step is to experiment with cost functions that penalize false negatives, to see if I can improve accuracy further while using more 'zero class' training samples (a sketch of one such loss is below).
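
One standard way to encode that penalty is a class-weighted cross-entropy. A sketch in NumPy; the weight of 200 simply mirrors the ~200:1 imbalance and is a starting point, not a tuned value:

    import numpy as np

    def weighted_bce(y_true, p_pred, w_pos=200.0, eps=1e-12):
        # Cross-entropy where a missed positive (false negative) costs
        # w_pos times more than a mistake on a negative.
        p = np.clip(p_pred, eps, 1.0 - eps)
        loss = -(w_pos * y_true * np.log(p)
                 + (1.0 - y_true) * np.log(1.0 - p))
        return loss.mean()

Many off-the-shelf estimators expose the same idea directly, e.g. the class_weight parameter on several scikit-learn classifiers.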

SVMs certainly did not work, and random forests didn't work well with any reasonable number of trees. I'll give some of the algorithms in Vowpal Wabbit a try, since it lets me train on the heavily skewed dataset and apply per-example weights where necessary.
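
For reference, Vowpal Wabbit's plain-text input format accepts an optional per-example importance weight right after the label, which is how the rare positives can be up-weighted without resampling. A small formatting sketch (the feature names and the 200x weight are my assumptions):

    def to_vw_line(label, weight, features):
        # VW text format: "<label> <importance> |<namespace> k:v ..."
        # Binary labels are -1/+1; the importance weight multiplies the
        # example's contribution to the loss.
        feats = " ".join(f"x{i}:{v}" for i, v in enumerate(features))
        return f"{1 if label == 1 else -1} {weight} |f {feats}"

    print(to_vw_line(1, 200, [1, 0, 1]))  # -> "1 200 |f x0:1 x1:0 x2:1"
    print(to_vw_line(0, 1, [0, 1, 0]))    # -> "-1 1 |f x0:0 x1:1 x2:0"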

submitted by fx101