The elementwise Kullback-Leibler divergence, \(x\log(x/y) - x + y\).
kl_div(x, y)
x: An Expression, vector, or matrix.
y: An Expression, vector, or matrix.
An Expression representing the elementwise KL divergence of the inputs.
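As a quick numeric illustration of the elementwise definition, the atom can be compared against a direct evaluation of \(x\log(x/y) - x + y\) in base R. This is a sketch: it assumes CVXR is attached and that value() evaluates the atom on numeric constants.

library(CVXR)
x0 <- c(1, 2, 3)
y0 <- c(3, 2, 1)
# Direct elementwise evaluation of x*log(x/y) - x + y
x0 * log(x0 / y0) - x0 + y0
# The same quantity via the atom; value() on a constant expression is assumed here
value(kl_div(x0, y0))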
# Problem data
n <- 5
alpha <- seq(10, n-1+10)/n
beta <- seq(10, n-1+10)/n
P_tot <- 0.5
W_tot <- 1.0
# Optimization variables: power and bandwidth allocations
P <- Variable(n)
W <- Variable(n)
# Since kl_div(a, b) = a*log(a/b) - a + b, R simplifies to -alpha*W*log(1 + beta*P/W),
# so minimizing sum(R) maximizes sum(alpha*W*log(1 + beta*P/W))
R <- kl_div(alpha*W, alpha*(W + beta*P)) - alpha*beta*P
obj <- sum(R)
# Nonnegativity plus total power and bandwidth budget constraints
constr <- list(P >= 0, W >= 0, sum(P) == P_tot, sum(W) == W_tot)
prob <- Problem(Minimize(obj), constr)
result <- solve(prob)
result$value
#> [1] -2.451312
result$getValue(P)
#> [,1]
#> [1,] 2.078807e-09
#> [2,] 2.458762e-09
#> [3,] 3.965245e-09
#> [4,] 2.404350e-08
#> [5,] 5.000000e-01
result$getValue(W)
#> [,1]
#> [1,] 1.039364e-08
#> [2,] 1.084280e-08
#> [3,] 1.423584e-08
#> [4,] 5.213990e-08
#> [5,] 9.999999e-01
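The solution concentrates essentially all of the power and bandwidth on the last entry. As a quick sanity check (a sketch reusing the result object from above), the budget constraints can be verified directly:

# Totals should match the budgets P_tot = 0.5 and W_tot = 1.0 up to solver tolerance
sum(result$getValue(P))
sum(result$getValue(W))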