Approximate Inference by Kullback-Leibler Tensor Belief Propagation

Probabilistic programming provides a structured approach to signal processing algorithm design. The design task is formulated as a generative model, and the algorithm is derived through automatic inference. Efficient inference is a major challenge; e.g., the Shafer-Shenoy algorithm (SS) performs poorly on models with large treewidth, which arise in many real-world problems. We focus on reducing the size of these models by storing intermediate factors in compressed form, thereby decoupling the variables through conditioning on introduced weights.
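
A minimal sketch of this compressed representation, under the assumption that a factor is stored as a weighted sum of rank-1 terms over small discrete variables (illustrative only, not the implementation used in this work):

```python
import numpy as np

# Hypothetical illustration: a factor over three discrete variables stored in
# compressed form as a weighted sum of rank-1 terms,
#   F(x1, x2, x3) ~= sum_k w[k] * f1[k, x1] * f2[k, x2] * f3[k, x3],
# so that conditioning on the introduced weight index k decouples the variables.
rng = np.random.default_rng(0)
K, d1, d2, d3 = 4, 3, 3, 3           # number of weights, variable cardinalities
w  = rng.random(K)                   # introduced weights
f1 = rng.random((K, d1))             # per-variable factors, one row per weight
f2 = rng.random((K, d2))
f3 = rng.random((K, d3))

# Reconstruct the full factor (exponentially large in general) for comparison;
# the compressed form needs only K * (d1 + d2 + d3) + K numbers.
F = np.einsum('k,ki,kj,kl->ijl', w, f1, f2, f3)
print(F.shape)                       # (3, 3, 3)
```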

This work proposes pruning of these weights using the Kullback-Leibler divergence. We adapt a strategy from the Gaussian mixture reduction literature, leading to Kullback-Leibler Tensor Belief Propagation (KL-TBP), in which agglomerative hierarchical clustering is used to successively merge pairs of weights. Experiments on benchmark problems show that KL-TBP consistently outperforms existing methods at competitive runtime.
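
The sketch below illustrates the agglomerative merging idea on small discrete factors, where the KL divergence can be evaluated exactly. The helper names and the product-of-marginals merge rule are illustrative assumptions (analogous to moment matching in Gaussian mixture reduction), not the paper's exact procedure.

```python
import numpy as np
from itertools import combinations

def rank1(vectors):
    """Outer product of per-variable vectors -> full joint tensor."""
    t = vectors[0]
    for v in vectors[1:]:
        t = np.multiply.outer(t, v)
    return t

def kl(p, q):
    """KL divergence between two (unnormalized) nonnegative tensors."""
    p, q = p / p.sum(), q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def merge(wa, ta, wb, tb):
    """Merge two weighted rank-1 terms into one; return (cost, weight, term).

    The two-term mixture is replaced by the product of its marginals, which
    minimizes KL(mixture || rank-1 approximation); the KL value is the cost.
    """
    mix = wa * rank1(ta) + wb * rank1(tb)
    margs = [mix.sum(axis=tuple(j for j in range(mix.ndim) if j != i))
             for i in range(mix.ndim)]
    margs = [m / m.sum() for m in margs]
    return kl(mix, rank1(margs)), wa + wb, margs

def reduce_weights(weights, terms, target):
    """Greedily merge the cheapest pair of weights until `target` remain."""
    weights, terms = list(weights), list(terms)
    while len(weights) > target:
        (cost, w_new, t_new), a, b = min(
            ((merge(weights[a], terms[a], weights[b], terms[b]), a, b)
             for a, b in combinations(range(len(weights)), 2)),
            key=lambda item: item[0][0])
        for idx in (b, a):                    # remove merged pair, add new term
            del weights[idx]; del terms[idx]
        weights.append(w_new); terms.append(t_new)
    return weights, terms

# Toy usage: reduce a 6-term decomposition of a 3-variable factor to 3 terms.
rng = np.random.default_rng(1)
weights = list(rng.random(6))
terms = [[rng.random(3), rng.random(3), rng.random(3)] for _ in range(6)]
weights, terms = reduce_weights(weights, terms, target=3)
print(len(weights))  # 3
```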