kpca {kernlab}                                                R Documentation

Kernel Principal Components Analysis

Description:
Kernel Principal Components Analysis is a nonlinear form of principal component analysis.
Usage:

## S4 method for signature 'formula'
kpca(x, data = NULL, na.action, ...)

## S4 method for signature 'matrix'
kpca(x, kernel = "rbfdot", kpar = list(sigma = 0.1),
     features = 0, th = 1e-4, ...)
Arguments:

x: the data matrix indexed by row, or a formula describing the model. Note that an intercept is always included, whether given in the formula or not.
data: an optional data frame containing the variables in the model (when using a formula).
kernel: the kernel function used in training and predicting. This parameter can be set to any function of class kernel which computes a dot product between two vector arguments. kernlab provides the most popular kernel functions, which can be used by setting the kernel parameter to the corresponding string, e.g. "rbfdot" (radial basis / Gaussian), "polydot" (polynomial), "vanilladot" (linear), "tanhdot" (hyperbolic tangent), "laplacedot" (Laplacian), "besseldot" (Bessel), "anovadot" (ANOVA RBF) or "splinedot" (spline).
kpar: the list of hyper-parameters (kernel parameters). This is a list which contains the parameters to be used with the kernel function, for example sigma (the inverse kernel width) for the Gaussian kernel "rbfdot". A short example of setting kernel and kpar follows the argument list.
features: number of features (principal components) to return (default: 0, meaning all).
th: the eigenvalue threshold below which principal components are ignored (only valid when features = 0) (default: 0.0001).
na.action: a function to specify the action to be taken if NAs are found. The default action is na.omit, which leads to rejection of cases with missing values on any required variable. An alternative is na.fail, which causes an error if NA cases are found. (NOTE: If given, this argument must be named.)
...: additional parameters.
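For instance, the two calls below (a minimal sketch on the iris measurements, with arbitrarily chosen polynomial-kernel parameters) show the two ways of specifying the kernel: by name together with a kpar list, or by passing a kernel object built with the corresponding generator function.

library(kernlab)
data(iris)
x <- as.matrix(iris[, -5])

# kernel named by a string, hyper-parameters supplied in the kpar list
kp1 <- kpca(x, kernel = "polydot", kpar = list(degree = 2), features = 3)

# equivalently, pass a kernel object constructed by its generator function
kp2 <- kpca(x, kernel = polydot(degree = 2), features = 3)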
Details:

Using kernel functions one can efficiently compute principal components in high-dimensional feature spaces that are related to the input space by some nonlinear map.
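As a rough illustration of what the function computes (a from-scratch sketch following the Schoelkopf et al. formulation, not the package's internal code; scaling and signs of the components may differ from kpca's output), kernel PCA amounts to building the kernel matrix, double-centering it, and taking its leading eigenvectors:

library(kernlab)
data(iris)
x <- as.matrix(iris[, -5])
n <- nrow(x)

rbf <- rbfdot(sigma = 0.2)            # Gaussian kernel function
K   <- kernelMatrix(rbf, x)           # n x n kernel matrix K[i, j] = k(x_i, x_j)

# double-center the kernel matrix (centers the mapped data in feature space)
J  <- matrix(1/n, n, n)
Kc <- K - J %*% K - K %*% J + J %*% K %*% J

# eigen-decompose the centered kernel matrix
e <- eigen(Kc, symmetric = TRUE)

# scale the leading eigenvectors by 1/sqrt(eigenvalue) (analogous to the pcv slot)
# and project the training data on the first two nonlinear components
alpha <- sweep(e$vectors[, 1:2], 2, sqrt(e$values[1:2]), "/")
proj  <- Kc %*% alpha
plot(proj, xlab = "1st Principal Component", ylab = "2nd Principal Component")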
Value:

An S4 object containing the principal component vectors along with the corresponding eigenvalues.
pcv: a matrix containing the principal component vectors (column wise).

eig: the corresponding eigenvalues.

rotated: the original data projected (rotated) on the principal components.

xmatrix: the original data matrix.
All the slots of the object can be accessed by accessor functions.
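For example, assuming kpc is a fitted kpca object (such as the one created in the Examples section below), each slot is read with the accessor of the same name:

pcv(kpc)       # the principal component vectors
eig(kpc)       # the corresponding eigenvalues
rotated(kpc)   # the original data projected on the components
xmatrix(kpc)   # the original data matrix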
Author(s):

Alexandros Karatzoglou
alexandros.karatzoglou@ci.tuwien.ac.at
References:

Schoelkopf B., A. Smola, K.-R. Mueller:
Nonlinear component analysis as a kernel eigenvalue problem.
Neural Computation 10, 1299-1319.
http://mlg.anu.edu.au/~smola/papers/SchSmoMul98.pdf
See Also:

kcca, pca
Examples:

# another example using the iris data
data(iris)
test <- sample(1:150, 20)

kpc <- kpca(~., data = iris[-test, -5], kernel = "rbfdot",
            kpar = list(sigma = 0.2), features = 2)

# print the principal component vectors
pcv(kpc)

# plot the data projection on the components
plot(rotated(kpc), col = as.integer(iris[-test, 5]),
     xlab = "1st Principal Component", ylab = "2nd Principal Component")

# embed remaining points
emb <- predict(kpc, iris[test, -5])
points(emb, col = as.integer(iris[test, 5]))