The goal of kldest is to estimate the Kullback-Leibler (KL) divergence \(D_{KL}(P||Q)\) between two probability distributions \(P\) and \(Q\), based on a sample from \(P\) together with either a sample from \(Q\) or the density of \(Q\).
The distributions \(P\) and \(Q\) may be uni- or multivariate, and they may be discrete, continuous or mixed discrete/continuous.
Different estimation algorithms are provided for continuous distributions, based on either nearest neighbour density estimation or kernel density estimation. Confidence intervals for the KL divergence can also be computed, either via subsampling (preferred) or via bootstrapping.
You can install kldest from CRAN:
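The standard installation command for a CRAN package is:

install.packages("kldest")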
Alternatively, you can install the development version of kldest from GitHub with:
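A common way to do this is via the devtools package; the repository path below is an assumption for illustration:

# install.packages("devtools")
devtools::install_github("niklhart/kldest")  # repository path assumed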
KL divergence estimation based on nearest neighbour density estimates is the most flexible approach.
Set a seed for reproducibility
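For example (the particular seed value is arbitrary):

set.seed(0)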
Analytical KL divergence:
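As a sketch, assume for illustration that the two univariate Gaussians are \(P = N(0, 1)\) and \(Q = N(1, 2^2)\); kld_gaussian() takes means and (co)variances, as in the bivariate example further below:

kld_gaussian(mu1 = 0, sigma1 = 1, mu2 = 1, sigma2 = 2^2)
# closed form: log(2) + (1 + 1)/(2 * 4) - 1/2 = 0.4431...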
Estimate based on two samples from these Gaussians:
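Continuing under the same assumed parameters:

X <- rnorm(100)                    # sample from P = N(0, 1)
Y <- rnorm(100, mean = 1, sd = 2)  # sample from Q = N(1, 2^2)
kld_est_nn(X, Y)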
Estimate based on a sample from the first Gaussian and the density of the second:
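Under the same assumptions, the density of \(Q\) is supplied as a function; passing it via the q argument of kld_est_nn() is assumed here, analogous to the kld_ci_subsampling() call below:

q <- function(x) dnorm(x, mean = 1, sd = 2)  # assumed density of Q
kld_est_nn(X, q = q)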
Uncertainty quantification via subsampling:
kld_ci_subsampling(X, q = q)
#> $est
#> [1] 0.6374628
#>
#> $ci
#> 2.5% 97.5%
#> 0.2601375 0.9008446
Analytical KL divergence between an uncorrelated and a correlated Gaussian:
kld_gaussian(mu1 = rep(0,2), sigma1 = diag(2),
mu2 = rep(0,2), sigma2 = matrix(c(1,1,1,2),nrow=2))
#> [1] 0.5
Estimate based on two samples from these Gaussians:
X1 <- rnorm(100)        # uncorrelated sample: two independent N(0, 1) components
X2 <- rnorm(100)
Y1 <- rnorm(100)        # correlated sample: cov(Y1, Y2) = 1, var(Y2) = 2
Y2 <- Y1 + rnorm(100)
X <- cbind(X1, X2)
Y <- cbind(Y1, Y2)
kld_est_nn(X, Y)
#> [1] 0.3358918