Hi,
Reading about AdaBoost, it says that it uses an exponential loss function. But in the sklearn version you can change the loss function, which I thought only gradient boosting could do. So what is going on?