blackboost {mboost}                                        R Documentation
Description

Gradient boosting for optimizing arbitrary loss functions, where regression trees are utilized as base-learners.
Usage

blackboost(formula, data = list(),
           tree_controls = ctree_control(
               teststat = "max",
               testtype = "Teststatistic",
               mincriterion = 0,
               maxdepth = 2,
               savesplitstats = FALSE),
           ...)
Arguments

formula        a symbolic description of the model to be fit.

data           a data frame containing the variables in the model.

tree_controls  an object of class "TreeControl", which can be obtained
               using ctree_control; it defines hyperparameters for the
               conditional inference trees used as base-learners
               (a short sketch follows this table).

...            additional arguments passed to the fitting function
               mboost_fit, including family and control.
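As a sketch of how tree_controls is used, deeper base-learner trees can be requested by modifying ctree_control; maxdepth = 4 below is an arbitrary value chosen only for illustration, not a recommendation from this page:

    ## deeper trees than the default maxdepth = 2 (illustration only)
    ctrl <- ctree_control(teststat = "max", testtype = "Teststatistic",
                          mincriterion = 0, maxdepth = 4,
                          savesplitstats = FALSE)
    mod <- blackboost(dist ~ speed, data = cars, tree_controls = ctrl)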
Details

This function implements the 'classical' gradient boosting algorithm utilizing regression trees as base-learners. Essentially the same algorithm is implemented in package gbm. The main difference is that arbitrary loss functions to be optimized can be specified via the family argument to blackboost, whereas gbm uses hard-coded loss functions. Moreover, the base-learners (conditional inference trees, see ctree) are somewhat more flexible.

The regression fit is a black-box prediction machine and thus hardly interpretable.
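As an illustration of the family argument, the squared-error default can be replaced by an L1 (absolute error) loss by passing mboost's Laplace() family; the cars data and mstop = 50 below are chosen only to mirror the example at the end of this page:

    ## boost under absolute-error loss instead of the default squared error
    cars.l1 <- blackboost(dist ~ speed, data = cars,
                          family = Laplace(),
                          control = boost_control(mstop = 50))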
Value

An object of class mboost with print and predict methods being available.
References

Peter Buehlmann and Torsten Hothorn (2007). Boosting algorithms: Regularization, prediction and model fitting. Statistical Science, 22(4), 477–505.

Torsten Hothorn, Kurt Hornik and Achim Zeileis (2006). Unbiased recursive partitioning: A conditional inference framework. Journal of Computational and Graphical Statistics, 15(3), 651–674.

Yoav Freund and Robert E. Schapire (1996). Experiments with a new boosting algorithm. In Machine Learning: Proceedings of the Thirteenth International Conference, 148–156.

Jerome H. Friedman (2001). Greedy function approximation: A gradient boosting machine. The Annals of Statistics, 29, 1189–1232.

Greg Ridgeway (1999). The state of boosting. Computing Science and Statistics, 31, 172–181.
See Also

mboost for the generic boosting function, glmboost for boosted linear models, and gamboost for boosted additive models. See cvrisk for the cross-validated stopping iteration (a short sketch follows the examples below). Furthermore see boost_control, Family and methods.
Examples

### a simple two-dimensional example: cars data
cars.gb <- blackboost(dist ~ speed, data = cars,
                      control = boost_control(mstop = 50))
cars.gb

### plot fit
plot(dist ~ speed, data = cars)
lines(cars$speed, predict(cars.gb), col = "red")
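A sketch of the cross-validated choice of the stopping iteration mentioned under See Also; cvrisk draws 25 bootstrap samples by default, so the selected value depends on the random seed:

### choose mstop by bootstrap cross-validation (illustration only)
cvm <- cvrisk(cars.gb)
plot(cvm)
mstop(cvm)            ## estimated optimal number of boosting iterations
cars.gb[mstop(cvm)]   ## set the model to this iteration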