condentropy {infotheo}    R Documentation

Description

condentropy takes two random vectors, X and Y, as input and returns the conditional entropy, H(X|Y), in nats (base e), computed with the chosen entropy estimator (method). If Y is not supplied, the function returns the entropy of X; see entropy.
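The conditional entropy satisfies the chain rule H(X|Y) = H(X,Y) - H(Y). The short sketch below is an illustration (not part of the package examples) that checks this identity on discretized data using entropy and discretize from the same package; with the empirical estimator the two quantities should agree up to numerical precision.

library(infotheo)

data(USArrests)
dat <- discretize(USArrests)

# conditional entropy H(X|Y) with the empirical estimator
h_cond <- condentropy(dat[,1], dat[,2], method = "emp")

# chain rule check: H(X|Y) = H(X,Y) - H(Y)
h_joint <- entropy(dat[,c(1,2)], method = "emp")
h_y     <- entropy(dat[,2], method = "emp")
all.equal(h_cond, h_joint - h_y)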
Usage

condentropy(X, Y=NULL, method="emp")
Arguments

X: data.frame denoting a random variable or random vector where columns contain variables/features and rows contain outcomes/samples.

Y: data.frame denoting a conditioning random variable or random vector where columns contain variables/features and rows contain outcomes/samples.

method: The name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink", "sg" (default: "emp"); see details. These estimators require discrete data values; see discretize. (A short comparison of the four estimators is sketched right after this argument list.)
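As a rough illustration (not part of the original documentation), the four estimators can be compared on the same discretized data; the exact values depend on the sample size and the binning chosen by discretize.

# assumes infotheo is already loaded
data(USArrests)
dat <- discretize(USArrests)

# same conditional entropy, estimated four ways
sapply(c("emp", "mm", "shrink", "sg"),
       function(m) condentropy(dat[,1], dat[,2], method = m))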
Value

condentropy returns the conditional entropy, H(X|Y), of X given Y, in nats. (A conversion to bits using natstobits is sketched after the examples.)
Author(s)

Patrick E. Meyer
References

Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis, Universite Libre de Bruxelles.

Cover, T. M. and Thomas, J. A. (1990). Elements of Information Theory. John Wiley, New York.
See Also

entropy, mutinformation, natstobits
Examples

data(USArrests)
dat <- discretize(USArrests)
H <- condentropy(dat[,1], dat[,2], method = "mm")
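Since the value is returned in nats, a small follow-up (an illustration, not part of the package examples): natstobits, listed in See Also, converts the result to bits.

# convert the conditional entropy from nats to bits
natstobits(H)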