kl.norm {monomvn}R Documentation

KL Divergence Between Two Multivariate Normal Distributions

Description

Returns the Kullback-Leibler (KL) divergence (also loosely called the KL distance) between two multivariate normal (MVN) distributions described by their mean vectors and covariance matrices.

Usage

kl.norm(mu1, S1, mu2, S2, quiet=FALSE)

Arguments

mu1 mean vector of first MVN
S1 covariance matrix of first MVN
mu2 mean vector of second MVN
S2 covariance matrix of second MVN
quiet when FALSE (default), gives a warning if posdef.approx finds a non-positive definite covariance matrix

Details

The KL divergence is given by the formula:

0.5 * (log(det(S2)/det(S1)) + tr(inv(S2) S1) + t(mu2-mu1) inv(S2) (mu2-mu1) - N)

where N is length(mu1).

As a preprocessing step, S1 and S2 are checked for positive definiteness via posdef.approx. If either is not positive definite, and if the accuracy package is installed, it can be coerced to the nearest positive definite matrix.
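The formula above can be evaluated directly in base R. The sketch below (a hypothetical helper, not part of the package; it omits the posdef.approx preprocessing) mirrors the computation term by term using solve and determinant:

```r
## direct evaluation of the MVN KL divergence formula
kl_mvn <- function(mu1, S1, mu2, S2) {
  N <- length(mu1)
  S2inv <- solve(S2)                                  # inv(S2)
  ld <- as.numeric(determinant(S2, logarithm=TRUE)$modulus -
                   determinant(S1, logarithm=TRUE)$modulus)  # log(det(S2)/det(S1))
  tr <- sum(diag(S2inv %*% S1))                       # tr(inv(S2) S1)
  d  <- mu2 - mu1
  quad <- drop(t(d) %*% S2inv %*% d)                  # t(mu2-mu1) inv(S2) (mu2-mu1)
  0.5 * (ld + tr + quad - N)
}

## identical distributions give zero divergence
kl_mvn(c(0, 0), diag(2), c(0, 0), diag(2))
```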

Value

Returns a nonnegative real number giving the KL divergence between the two normal distributions; the divergence is zero exactly when the two distributions are identical.

Note

The KL-divergence is not symmetric. Therefore

kl.norm(mu1,S1,mu2,S2) != kl.norm(mu2,S2,mu1,S1).

But a symmetrized divergence can be constructed from

0.5 * (kl.norm(mu1,S1,mu2,S2) + kl.norm(mu2,S2,mu1,S1))

Author(s)

Robert B. Gramacy bobby@statslab.cam.ac.uk

References

http://www.statslab.cam.ac.uk/~bobby/monomvn.html

See Also

posdef.approx

Examples

## first MVN: random mean and a random positive definite covariance
mu1 <- rnorm(5)
s1 <- matrix(rnorm(100), ncol=5)
S1 <- t(s1) %*% s1

## second MVN, constructed the same way
mu2 <- rnorm(5)
s2 <- matrix(rnorm(100), ncol=5)
S2 <- t(s2) %*% s2

## not symmetric
kl.norm(mu1,S1,mu2,S2)
kl.norm(mu2,S2,mu1,S1)

## symmetric
0.5 * (kl.norm(mu1,S1,mu2,S2) + kl.norm(mu2,S2,mu1,S1))

[Package monomvn version 1.1-2 Index]