train {caret}                                                R Documentation

Fit Predictive Models over Different Tuning Parameters

Description

This function sets up a grid of tuning parameters for a number of classification and regression routines, fits each model and calculates a resampling-based performance measure.

Usage

train(x, ...)

## Default S3 method:
train(x, y, 
      method = "rf",  
      ..., 
      metric = ifelse(is.factor(y), "Accuracy", "RMSE"),   
      maximize = ifelse(metric == "RMSE", FALSE, TRUE),
      trControl = trainControl(), 
      tuneGrid = NULL, 
      tuneLength = 3)

## S3 method for class 'formula':
train(form, data, ..., subset, na.action, contrasts = NULL)

Arguments

x a data frame containing training data where samples are in rows and features are in columns.
y a numeric or factor vector containing the outcome for each sample.
form A formula of the form y ~ x1 + x2 + ...
data Data frame from which variables specified in formula are preferentially to be taken.
subset An index vector specifying the cases to be used in the training sample. (NOTE: If given, this argument must be named.)
na.action A function to specify the action to be taken if NAs are found. The default action is for the procedure to fail. An alternative is na.omit, which leads to rejection of cases with missing values on any required variable. (NOTE: If given, this argument must be named.)
contrasts a list of contrasts to be used for some or all of the factors appearing as variables in the model formula.
method a string specifying which classification or regression model to use. Possible values are: ada, bagEarth, bagFDA, blackboost, cforest, ctree, ctree2, earth, enet, fda, gamboost, gaussprPoly, gaussprRadial, gbm, glm, glmboost, glmnet, gpls, J48, JRip, knn, lars, lasso, lda, lm, lmStepAIC, LMT, logitBoost, lssvmPoly, lssvmRadial, lvq, M5Rules, mda, multinom, nb, nnet, OneR, pam, pcaNNet, pda, pda2, penalized, pls, ppr, qda, rda, rf, rpart, rvmPoly, rvmRadial, sda, sddaLDA, sddaQDA, slda, sparseLDA, spls, superpc, svmPoly, svmRadial and treebag. See the Details section below.
... arguments passed to the classification or regression routine (such as randomForest). Errors will occur if values for tuning parameters are passed here.
metric a string that specifies what summary metric will be used to select the optimal model. By default, possible values are "RMSE" and "Rsquared" for regression and "Accuracy" and "Kappa" for classification. If custom performance metrics are used (via the summaryFunction argument in trainControl), the value of metric should match one of the arguments. If it does not, a warning is issued and the first metric given by the summaryFunction is used. (NOTE: If given, this argument must be named.)
maximize a logical: should the metric be maximized or minimized?
trControl a list of values that define how this function acts. See trainControl. (NOTE: If given, this argument must be named.)
tuneGrid a data frame with possible tuning values. The columns are named the same as the tuning parameters in each method preceded by a period (e.g. .decay, .lambda). See the function createGrid in this package for more details. (NOTE: If given, this argument must be named.)
tuneLength an integer denoting the number of levels for each tuning parameter that should be generated by createGrid. (NOTE: If given, this argument must be named; see the sketch after this list.)
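
Because metric, maximize, trControl, tuneGrid and tuneLength follow the three dots in the default method, a call that sets them must give their full names. For example (a minimal sketch using the iris data):

  data(iris)
  knnFit <- train(iris[, 1:4], iris[, 5],
                  method = "knn",
                  tuneLength = 5,
                  trControl = trainControl(method = "cv"))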

Details

train can be used to tune models by picking the complexity parameters that are associated with the optimal resampling statistics. For each model, a grid of tuning parameters (if any) is created and the model is fit to a slightly different version of the data for each candidate combination. For every combination, performance on the held-out samples is calculated, and the mean and standard deviation are summarized across the resampled data sets. The combination with the optimal resampling statistic is chosen and the entire training set is then used to fit the final model.
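
As a conceptual illustration (a sketch of the idea only, not caret's actual internals), the loop below hand-rolls this procedure for a k nearest neighbor model over a few candidate values of k, using bootstrap resamples and the knn3 function from this package:

  library(caret)
  data(iris)
  x <- iris[, 1:4]
  y <- iris[, 5]
  candidates <- c(5, 7, 9)  # candidate values of the tuning parameter k
  acc <- sapply(candidates, function(k) {
    mean(replicate(25, {
      inBag <- sample(nrow(x), replace = TRUE)    # bootstrap resample
      held  <- setdiff(seq_len(nrow(x)), inBag)   # held-out samples
      fit   <- knn3(x[inBag, ], y[inBag], k = k)
      mean(predict(fit, x[held, ], type = "class") == y[held])
    }))
  })
  ## train would refit the winning k on the entire training set
  candidates[which.max(acc)]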

A variety of models are currently available. The table below enumerates the models and the values of the method argument, as well as the complexity parameters used by train.

Model                                         method Value      Package       Tuning Parameter(s)
Generalized linear model                      glm               stats         none
Recursive partitioning                        rpart             rpart         maxdepth
                                              ctree             party         mincriterion
                                              ctree2            party         maxdepth
Boosted trees                                 gbm               gbm           interaction.depth, n.trees, shrinkage
                                              blackboost        mboost        maxdepth, mstop
                                              ada               ada           maxdepth, iter, nu
Boosted regression models                     glmboost          mboost        mstop
                                              gamboost          mboost        mstop
                                              logitBoost        caTools       nIter
Random forests                                rf                randomForest  mtry
                                              cforest           party         mtry
Bagged trees                                  treebag           ipred         none
Elastic net (glm)                             glmnet            glmnet        alpha, lambda
Neural networks                               nnet              nnet          decay, size
Projection pursuit regression                 ppr               stats         nterms
Partial least squares                         pls               pls, caret    ncomp
Sparse partial least squares                  spls              spls, caret   K, eta, kappa
Support vector machines (RBF)                 svmRadial         kernlab       sigma, C
Support vector machines (polynomial)          svmPoly           kernlab       scale, degree, C
Relevance vector machines (RBF)               rvmRadial         kernlab       sigma
Relevance vector machines (polynomial)        rvmPoly           kernlab       scale, degree
Least squares support vector machines (RBF)   lssvmRadial       kernlab       sigma
Gaussian processes (RBF)                      gaussprRadial     kernlab       sigma
Gaussian processes (polynomial)               gaussprPoly       kernlab       scale, degree
Linear least squares                          lm                stats         none
Multivariate adaptive regression splines      earth             earth         degree, nprune
Bagged MARS                                   bagEarth          caret, earth  degree, nprune
M5 rules                                      M5Rules           RWeka         pruned
Elastic net                                   enet              elasticnet    lambda, fraction
Least angle regression                        lars              lars          fraction
                                              lars2             lars          steps
The lasso                                     lasso             elasticnet    fraction
Penalized linear models                       penalized         penalized     lambda1, lambda2
Supervised principal components               superpc           superpc       n.components, threshold
Linear discriminant analysis                  lda               MASS          none
Quadratic discriminant analysis               qda               MASS          none
Stabilised linear discriminant analysis       slda              ipred         none
Stepwise diagonal discriminant analysis       sddaLDA, sddaQDA  SDDA          none
Shrinkage discriminant analysis               sda               sda           diagonal
Regularized discriminant analysis             rda               klaR          lambda, gamma
Mixture discriminant analysis                 mda               mda           subclasses
Penalized discriminant analysis               pda               mda           lambda
                                              pda2              mda           df
Flexible discriminant analysis (MARS)         fda               mda, earth    degree, nprune
Bagged FDA                                    bagFDA            caret, earth  degree, nprune
Logistic/multinomial regression               multinom          nnet          decay
C4.5 decision trees                           J48               RWeka         C
Single rule                                   OneR              RWeka         none
PART                                          PART              RWeka         threshold, pruned
k nearest neighbors                           knn               caret         k
Nearest shrunken centroids                    pam               pamr          threshold
Naive Bayes                                   nb                klaR          usekernel
Generalized partial least squares             gpls              gpls          K.prov
Learned vector quantization                   lvq               class         k

By default, the function createGrid is used to define the candidate values of the tuning parameters. The user can also specify their own grid. To do this, a data frame is created with a column for each tuning parameter in the model. The column names must be the same as those listed in the table above with a leading dot. For example, ncomp would have the column heading .ncomp. This data frame can then be passed to train via the tuneGrid argument.
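
For example, a user-defined grid of odd values of k for a nearest neighbor model could look like this (a minimal sketch using the iris data):

  data(iris)
  knnGrid <- data.frame(.k = seq(5, 13, by = 2))
  knnFit <- train(iris[, 1:4], iris[, 5],
                  "knn",
                  tuneGrid = knnGrid)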

In some cases, models may require control arguments. These can be passed via the three dots argument. Note that some models can specify tuning parameters in their control objects; if specified there, these values will be superseded by those given in the tuneGrid argument.
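
For instance (a sketch, assuming the randomForest package is installed), ntree is not a tuning parameter handled by train for method = "rf", so it can be passed safely through the dots:

  library(randomForest)
  data(iris)
  rfFit <- train(iris[, 1:4], iris[, 5],
                 "rf",
                 ntree = 1000)  # passed through ... to randomForest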

The vignette entitled "caret Manual – Model Building" has more details and examples related to this function.

train can be used with "explicit parallelism", where different resamples (e.g. cross-validation groups) can be split up and run on multiple machines or processors. By default, train will use a single processor on the host machine. To use more, the computeFunction and computeArgs arguments in trainControl can be used. computeFunction is used to pass a function that takes arguments named X and FUN; internally, train passes the data and modeling functions through these arguments. By default, train uses lapply. Alternatively, any function that emulates lapply but distributes jobs across multiple machines/processors can be used. Arguments to such a function can be passed (if needed) via the computeArgs argument in trainControl. Examples are given below using the Rmpi package (via snow) and NetWorkSpaces (via the nws package).

Value

A list of class train is returned, containing:

modelType an identifier of the model type.
results a data frame containing the training error rate and values of the tuning parameters.
call the (matched) function call with dots expanded.
dots a list containing any ... values passed to the original call.
metric a string that specifies what summary metric will be used to select the optimal model.
trControl the list of control parameters.
finalModel a fit object using the best parameters.
trainingData a data frame containing the data used for training.
resample a data frame with columns for each performance metric. Each row corresponds to a resample. If leave-one-out cross-validation or out-of-bag estimation methods are requested, this will be NULL. The returnResamp argument of trainControl controls how much of the resampled results are saved.
perfNames a character vector of performance metrics that are produced by the summary function.
maximize a logical recycled from the function arguments.
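
For example, the documented components can be inspected directly from a fitted object (a minimal sketch):

  data(iris)
  fit <- train(iris[, 1:4], iris[, 5], "knn")
  fit$results     # resampled performance for each candidate value of k
  fit$finalModel  # the model fit to the entire training set
  fit$metric      # the summary metric used to select this model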

Author(s)

Max Kuhn (the guts of train.formula were based on Ripley's nnet.formula)

References

Kuhn (2008), ``Building Predictive Models in R Using the caret Package'', Journal of Statistical Software, 28(5) (http://www.jstatsoft.org/v28/i05/)

See Also

trainControl, createGrid, createFolds

Examples

#######################################
## Classification Example

data(iris)
TrainData <- iris[,1:4]
TrainClasses <- iris[,5]

knnFit1 <- train(TrainData, TrainClasses,
                 "knn",
                 tuneLength = 10,
                 trControl = trainControl(method = "cv"))

knnFit2 <- train(TrainData, TrainClasses,
                 "knn", tuneLength = 10, 
                 trControl = trainControl(method = "boot"))

library(nnet)
nnetFit <- train(TrainData, TrainClasses,
                 "nnet", 
                 tuneLength = 2,
                 trace = FALSE,
                 maxit = 100)

#######################################
## Regression Example

library(mlbench)
data(BostonHousing)

lmFit <- train(medv ~ . + rm:lstat,
               data = BostonHousing, 
               "lm")

library(rpart)
rpartFit <- train(medv ~ .,
                  data = BostonHousing,
                  "rpart",
                  tuneLength = 9)

#######################################
## Example with a custom metric

madSummary <- function (data,
                        lev = NULL,
                        model = NULL) 
{
  out <- mad(data$obs - data$pred, 
             na.rm = TRUE)  
  names(out) <- "MAD"
  out
}

robustControl <- trainControl(summaryFunction = madSummary)
marsGrid <- expand.grid(.degree = 1,
                        .nprune = (1:10) * 2)

earthFit <- train(medv ~ .,
                  data = BostonHousing, 
                  "earth",
                  tuneGrid = marsGrid,
                  metric = "MAD",
                  maximize = FALSE,
                  trControl = robustControl)

#######################################
## Parallel Processing Example via MPI

## Not run: 

## A function to emulate lapply in parallel
mpiCalcs <- function(X, FUN, ...)
  {
    theDots <- list(...)
    parLapply(theDots$cl, X, FUN)
  }

library(snow)
cl <- makeCluster(5, "MPI")

## 50 bootstrap models distributed across 5 workers
mpiControl <- trainControl(workers = 5,
                           number = 50,
computeFunction = mpiCalcs,
                           computeArgs = list(cl = cl))
set.seed(1)
usingMPI <-  train(medv ~ .,
                   data = BostonHousing, 
                   "glmboost",
                   trControl = mpiControl)

stopCluster(cl)
## End(Not run)

#######################################
## Parallel Processing Example via NWS
## Not run: 

nwsCalcs <- function(X, FUN, ...)
  {
    theDots <- list(...)
    eachElem(theDots$sObj,
             fun = FUN,
             elementArgs = list(X))
  }

library(nws)
sObj <- sleigh(workerCount = 5)

nwsControl <- trainControl(workers = 5,
                           number = 50,
computeFunction = nwsCalcs,
                           computeArgs = list(sObj = sObj))
set.seed(1)
usingNWS <-  train(medv ~ .,
                   data = BostonHousing, 
                   "glmboost",
                   trControl = nwsControl)

close(sObj)

## End(Not run)


[Package caret version 4.10 Index]