What kind of criteria to optimize when building a tree for Boosting algorithms?


For example, in AdaBoost, at each iteration we select the weak classifier that gives the smallest weighted error rate. If we want to build a tree with depth larger than 1, it is natural to grow the tree greedily, node by node. What criterion should we use to find the best split at each node? Should we pick the split that gives the smallest weighted error, or use another rule such as the Gini impurity?
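
To make the question concrete, here is a minimal sketch in Python/NumPy of the two candidate rules I have in mind, applied to a single threshold split on one feature. The helper names (weighted_error, weighted_gini, best_split) and the toy data are just my own illustration, not from any library:

    import numpy as np

    def weighted_error(y, w):
        # Weighted misclassification error if this node predicts its weighted majority class.
        pos = w[y == 1].sum()
        neg = w[y == -1].sum()
        return min(pos, neg)

    def weighted_gini(y, w):
        # Gini impurity of the node, weighted by the node's total sample weight.
        total = w.sum()
        if total == 0:
            return 0.0
        p = w[y == 1].sum() / total
        return total * 2.0 * p * (1.0 - p)

    def best_split(x, y, w, criterion):
        # Greedy search over thresholds on a single feature; lower criterion value is better.
        best_t, best_score = None, np.inf
        for t in np.unique(x):
            left = x <= t
            right = ~left
            score = criterion(y[left], w[left]) + criterion(y[right], w[right])
            if score < best_score:
                best_t, best_score = t, score
        return best_t, best_score

    # Toy data: labels in {-1, +1}, AdaBoost-style (non-uniform, normalized) sample weights.
    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = np.where(x + 0.3 * rng.normal(size=100) > 0, 1, -1)
    w = rng.random(100)
    w /= w.sum()

    print("split by weighted error:", best_split(x, y, w, weighted_error))
    print("split by weighted Gini: ", best_split(x, y, w, weighted_gini))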

Thanks a lot!

submitted by zl1zl
