cforest {party}    R Documentation
Description

An implementation of the random forest and bagging ensemble algorithms utilizing conditional inference trees as base learners.
Usage

cforest(formula, data = list(), subset = NULL, weights = NULL,
        controls = ctree_control(teststattype = "maxabs",
                                 testtype = "Teststatistic",
                                 mincriterion = qnorm(0.9),
                                 mtry = 5, savesplitstats = FALSE),
        xtrafo = ptrafo, ytrafo = ptrafo, scores = NULL, ntree = 500)
Arguments

formula     a symbolic description of the model to be fit.

data        a data frame containing the variables in the model.

subset      an optional vector specifying a subset of observations to be
            used in the fitting process.

weights     an optional vector of weights to be used in the fitting
            process. Only non-negative integer valued weights are
            allowed.

controls    an object of class TreeControl, which can be obtained using
            ctree_control.

xtrafo      a function to be applied to all input variables. By default,
            the ptrafo function is applied.

ytrafo      a function to be applied to all response variables. By
            default, the ptrafo function is applied.

scores      an optional named list of scores to be attached to ordered
            factors (see the sketch following this section).

ntree       number of trees to grow.
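Because scores only matter for ordered factors, a minimal hedged sketch may help. The toy data frame d, the variable grade, and the chosen score values are illustrative assumptions, not part of the package:

### illustrative only: attach scores to the levels of an ordered factor
library("party")
set.seed(290875)
n <- 100
d <- data.frame(matrix(rnorm(n * 5), ncol = 5))   # five numeric inputs X1..X5
d$grade <- ordered(sample(c("low", "mid", "high"), n, replace = TRUE),
                   levels = c("low", "mid", "high"))
d$y <- rnorm(n)
### score the levels as 1, 2, 4: the "mid"/"high" distance counts twice
### the "low"/"mid" distance when split points in 'grade' are evaluated
cf <- cforest(y ~ ., data = d, ntree = 25,
              scores = list(grade = c(1, 2, 4)))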
Details

This implementation of the random forest (and bagging) algorithm differs from the reference implementation in randomForest with respect to the base learner used and the aggregation scheme applied. Conditional inference trees (see ctree) are fitted to each of the B bootstrap samples of the learning sample. There are many hyper parameters that can be controlled (see ctree_control); you MUST NOT change anything you don't understand completely. A hedged sketch of overriding a single hyper parameter follows below.
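As a hedged illustration of changing one hyper parameter, the sketch below repeats the default controls exactly as printed in the Usage section and lowers only mtry; the airquality data and the value mtry = 3 are illustrative choices, not recommendations:

### illustrative only: pass a modified TreeControl object to cforest()
library("party")
ctrl <- ctree_control(teststattype = "maxabs", testtype = "Teststatistic",
                      mincriterion = qnorm(0.9), mtry = 3,
                      savesplitstats = FALSE)
cf <- cforest(Ozone ~ ., data = na.omit(airquality),
              controls = ctrl, ntree = 100)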
The aggregation scheme works by averaging observation weights extracted from each of the B trees and NOT by averaging predictions directly; see Hothorn et al. (2004) for a description. A toy illustration of the difference is sketched below.
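The following toy sketch (plain base R, NOT party's internal code) contrasts the two schemes for a single new observation; the weight matrix w is an assumed stand-in for the weights B = 3 trees would deliver:

### illustrative only: weight averaging vs. direct prediction averaging
y <- c(1, 2, 3, 4)               # responses of the learning sample
w <- rbind(c(2, 1, 0, 0),        # weights tree 1 assigns to the new case
           c(0, 1, 2, 0),        # weights tree 2 assigns
           c(0, 0, 1, 1))        # weights tree 3 assigns
weighted.mean(y, colMeans(w))    # average weights first: 2.375
mean(apply(w, 1, function(wi) weighted.mean(y, wi)))  # average predictions: 2.5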
Ensembles of conditional inference trees have not yet been extensively tested, so this routine is meant for the expert user only and its current state is rather experimental. However, there are some things that can't be done with randomForest, for example fitting forests to censored response variables. By default, raw test statistics are maximized and five inputs are randomly examined for possible splits in each node.
Value

An object of class RandomForest-class.
References

Leo Breiman (2001). Random Forests. Machine Learning, 45(1), 5–32.

Torsten Hothorn, Berthold Lausen, Axel Benner and Martin Radespiel-Troeger (2004). Bagging Survival Trees. Statistics in Medicine, 23(1), 77–91.
Examples

### honest (i.e., out-of-bag) cross-classification of
### true vs. predicted classes
data("mammoexp", package = "party")
table(mammoexp$ME,
      predict(cforest(ME ~ ., data = mammoexp, ntree = 50), OOB = TRUE))

### fit forest to censored response
if (require("ipred")) {

    data("GBSG2", package = "ipred")
    bst <- cforest(Surv(time, cens) ~ ., data = GBSG2, ntree = 50)

    ### estimate conditional Kaplan-Meier curves
    treeresponse(bst, newdata = GBSG2[1:2,], OOB = TRUE)

    ### if you can't resist looking at individual trees ...
    party:::prettytree(bst@ensemble[[1]], names(bst@data@get("input")))
}