kl.norm {monomvn}    R Documentation
Description:

Returns the Kullback-Leibler (KL) divergence (a.k.a. distance) between two multivariate normal (MVN) distributions, each described by its mean vector and covariance matrix.
Usage:

kl.norm(mu1, S1, mu2, S2, quiet=FALSE, symm=FALSE)
Arguments:

mu1:    mean vector of the first MVN

S1:     covariance matrix of the first MVN

mu2:    mean vector of the second MVN

S2:     covariance matrix of the second MVN

quiet:  when FALSE (default), a warning is given if posdef.approx
        finds a non-positive definite covariance matrix

symm:   when TRUE, a symmetrized version of the KL divergence is
        used; see the Note below
Details:

The KL divergence is given by the formula:

    0.5 * (log(det(S2)/det(S1)) + tr(inv(S2) S1)
           + t(mu2-mu1) inv(S2) (mu2-mu1) - N)

where N is length(mu1).
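The formula above can be computed directly in base R. The following sketch is purely illustrative (it is not the monomvn implementation, and the name kl_mvn is invented here):

```r
## Direct computation of the (asymmetric) KL divergence between two MVNs,
## following the formula above. Illustrative sketch only; not the
## monomvn implementation.
kl_mvn <- function(mu1, S1, mu2, S2) {
  N <- length(mu1)
  S2inv <- solve(S2)
  d <- mu2 - mu1
  0.5 * (log(det(S2) / det(S1)) +
         sum(diag(S2inv %*% S1)) +       # tr(inv(S2) %*% S1)
         drop(t(d) %*% S2inv %*% d) -    # quadratic form in the mean gap
         N)
}

## Sanity check: the divergence of a distribution from itself is zero.
mu <- c(0, 1); S <- diag(2)
kl_mvn(mu, S, mu, S)  # 0
```

For univariate normals with unit variance the formula reduces to half the squared difference of the means, e.g. kl_mvn(0, matrix(1), 1, matrix(1)) gives 0.5.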
As a preprocessing step, S1 and S2 are checked for positive definiteness via posdef.approx. If one is not positive definite, and if the accuracy package is installed, it can be coerced to the nearest positive definite matrix.
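To illustrate what such a coercion does, the sketch below uses Matrix::nearPD from the recommended Matrix package; this is a stand-in for demonstration and is not the accuracy package routine that posdef.approx relies on:

```r
## Coerce a non-positive definite symmetric matrix to a nearby positive
## definite one. Uses Matrix::nearPD as an illustration; this is NOT the
## accuracy-package routine used by posdef.approx.
library(Matrix)

A <- matrix(c( 1.0,  0.9, -0.9,
               0.9,  1.0,  0.9,
              -0.9,  0.9,  1.0), nrow=3)
min(eigen(A)$values)    # negative, so A is not positive definite

Apd <- as.matrix(nearPD(A)$mat)
min(eigen(Apd)$values)  # now positive (up to numerical tolerance)
```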
Value:

Returns a positive real number giving the KL divergence between the two normal distributions.
Note:

The KL divergence is not symmetric; therefore

    kl.norm(mu1,S1,mu2,S2) != kl.norm(mu2,S2,mu1,S1)

in general. However, a symmetric metric can be constructed as

    0.5 * (kl.norm(mu1,S1,mu2,S2) + kl.norm(mu2,S2,mu1,S1)),

which is used when symm = TRUE (the default is FALSE).
Author(s):

Robert B. Gramacy bobby@statslab.cam.ac.uk

http://www.statslab.cam.ac.uk/~bobby/monomvn.html
Examples:

mu1 <- rnorm(5)
s1 <- matrix(rnorm(100), ncol=5)
S1 <- t(s1) %*% s1

mu2 <- rnorm(5)
s2 <- matrix(rnorm(100), ncol=5)
S2 <- t(s2) %*% s2

## not symmetric
kl.norm(mu1,S1,mu2,S2)
kl.norm(mu2,S2,mu1,S1)

## symmetric
0.5 * (kl.norm(mu1,S1,mu2,S2) + kl.norm(mu2,S2,mu1,S1))