linear.pls {plsdof}    R Documentation
Description

This function computes the Partial Least Squares solution and the first derivative of the regression coefficients.
Usage

linear.pls(X, y, m = ncol(X), model.selection = "aic")
Arguments

X: matrix of predictor observations.
y: vector of response observations. The length of y equals the number of rows of X.
m: maximal number of Partial Least Squares components. Default is m = ncol(X).
model.selection: the model selection criterion to be used; one of c("aic", "bic", "gmdl").
Details

We first standardize X to zero mean and unit variance.
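As an illustration of this preprocessing step, here is a minimal sketch (not taken from the package source) assuming plain column-wise centering and scaling:

## assumed preprocessing: center each column of X to mean zero and
## scale it to unit variance (scale() divides by the sample standard deviation)
X <- matrix(rnorm(50 * 5), ncol = 5)
X.std <- scale(X, center = TRUE, scale = TRUE)
round(colMeans(X.std), 10)   # column means are numerically zero
apply(X.std, 2, var)         # column variances are one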
Value

The function returns an object of class "plsdof" with the following components.
coefficients: matrix of regression coefficients
intercept: vector of regression intercepts
DoF: Degrees of Freedom
sigmahat: vector of estimated model errors
dBeta: array of the first derivatives of the regression coefficients
covariance: array of the covariance matrices of the regression coefficients
m.opt: optimal number of PLS components, as determined by model.selection
Author(s)

Nicole Kraemer
References

Kraemer, N., Braun, M.L. (2007) "Kernelizing PLS, Degrees of Freedom, and Efficient Model Selection", Proceedings of the 24th International Conference on Machine Learning, Omni Press, 441-448.
See Also

kernel.pls.ic, kernel.pls.cv, kernel.pls
Examples

n <- 50   # number of observations
p <- 5    # number of variables
X <- matrix(rnorm(n * p), ncol = p)
y <- rnorm(n)
pls.object <- linear.pls(X, y, m = 5, model.selection = "bic")
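A short continuation of this example is sketched below; it only inspects the returned object, and the comments on the indexing of DoF and sigmahat are assumptions rather than documented behaviour.

## continuing from the example above: inspect the fitted object
pls.object$m.opt                 # number of components selected by BIC
pls.object$DoF                   # Degrees of Freedom (assumed: one entry per model size)
pls.object$sigmahat              # estimated model errors (assumed: same indexing)
str(pls.object$coefficients)     # layout of the coefficient matrix
str(pls.object$covariance)       # layout of the covariance array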