Is there a way to implement 'Stochastic Gradient Boosting' (SGB), as described by Friedman (2002), using WEKA? Basically, 'Stochastic Gradient Boosting' is the same as 'Gradient Boosting', except that a random subsample of the training data is used in each iteration. As far as I understand, 'Gradient Boosting' is implemented in WEKA as LogitBoost (or at least Friedman mentioned that LogitBoost is very similar to his 'Gradient Boosting' algorithm). The problem is that there is no option to use a random subsample in each iteration, which is what the algorithm requires. Are there any implementations of this algorithm, or of one that is at least very similar to it?
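
To make the subsampling idea concrete, here is a rough, untested sketch of the kind of loop I have in mind, written against the standard WEKA API (Instances, REPTree). It is a plain least-squares gradient booster for a numeric class attribute that draws a fresh random subsample at every iteration, in the spirit of Friedman (2002); it is not a modified LogitBoost, and the class name SGBSketch, the constants, and the choice of REPTree as base learner are placeholders rather than an existing WEKA classifier.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;

    import weka.classifiers.Classifier;
    import weka.classifiers.trees.REPTree;
    import weka.core.Instance;
    import weka.core.Instances;

    // Sketch of stochastic gradient boosting with squared loss.
    // Assumes the caller passes Instances whose class attribute is numeric
    // and already set via setClassIndex(); names and constants are illustrative.
    public class SGBSketch {

        static final int ITERATIONS = 100;            // boosting rounds
        static final double SHRINKAGE = 0.1;          // learning rate
        static final double SUBSAMPLE_FRACTION = 0.5; // fraction used per round

        private final List<Classifier> stages = new ArrayList<Classifier>();
        private double initialPrediction; // mean of the target

        public void buildClassifier(Instances data, Random rng) throws Exception {
            // start from the mean of the numeric class
            double sum = 0;
            for (int i = 0; i < data.numInstances(); i++) {
                sum += data.instance(i).classValue();
            }
            initialPrediction = sum / data.numInstances();

            // current ensemble prediction for every training instance
            double[] f = new double[data.numInstances()];
            Arrays.fill(f, initialPrediction);

            for (int m = 0; m < ITERATIONS; m++) {
                // residuals = negative gradient of the squared loss
                Instances residuals = new Instances(data);
                for (int i = 0; i < residuals.numInstances(); i++) {
                    residuals.instance(i).setClassValue(
                            data.instance(i).classValue() - f[i]);
                }

                // draw a fresh random subsample (without replacement) this round
                List<Integer> idx = new ArrayList<Integer>();
                for (int i = 0; i < residuals.numInstances(); i++) {
                    idx.add(i);
                }
                Collections.shuffle(idx, rng);
                int keep = (int) Math.round(SUBSAMPLE_FRACTION * idx.size());
                Instances sample = new Instances(residuals, keep);
                for (int i = 0; i < keep; i++) {
                    sample.add(residuals.instance(idx.get(i)));
                }

                // fit a small regression tree to the subsampled residuals
                REPTree tree = new REPTree();
                tree.setMaxDepth(3);
                tree.setNoPruning(true);
                tree.buildClassifier(sample);
                stages.add(tree);

                // update the ensemble prediction on the *full* training set
                for (int i = 0; i < data.numInstances(); i++) {
                    f[i] += SHRINKAGE * tree.classifyInstance(data.instance(i));
                }
            }
        }

        public double predict(Instance inst) throws Exception {
            double pred = initialPrediction;
            for (Classifier stage : stages) {
                pred += SHRINKAGE * stage.classifyInstance(inst);
            }
            return pred;
        }
    }

What I am really after is this same per-iteration subsampling inside a classification booster such as LogitBoost, ideally as an existing, tested implementation rather than hand-rolled code like the above.
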
There are quite a few packages in R that implement variations of boosting algorithms. Have a look here, which discusses Friedman's stochastic gradient boosting. If you want to subsample, also have a look at random forests.

Thanks! That already looks similar to what I'm looking for, but I actually need the algorithm to run in WEKA/Java, as it will form part of a large automated setup running several algorithms on several data sets. RandomForest is already included and working perfectly. I'm actually surprised that no implementation is easily found, since I would guess the changes to the LogitBoost code are minor. I'm not familiar enough with the algorithm's code, though, so I can't do it myself.
(Aug 25 '13 at 05:19)
Axel Teich