Title
Efficient Probabilistic Inference with Orthogonal Tensor Train Decomposition
Abstract
Even though probabilistic graphical models (PGMs) reduce the memory complexity of full joint distributions and can therefore make inference algorithms such as variable elimination or the junction tree algorithm tractable in some cases, the runtime of these algorithms still depends on the number of variables and their dimensionality. Answering queries can therefore remain slow when the considered model contains many variables of potentially high dimensionality. Such models are often unavoidable: a large number of variables may be needed to represent a system accurately, and high-dimensional variables may be needed to represent continuous parameters through discretization. Moreover, if a system exhibits few conditional independencies, a PGM can only reduce the computational expense to a limited degree, because the conditional probability distributions (CPDs) in the model become large, which increases the runtime of the variable elimination algorithm exponentially.
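The following minimal sketch (in Python with NumPy; the chain network X -> Y -> Z, the chosen dimension, and all names are illustrative assumptions, not part of the thesis description) shows why variable elimination stays cheap when the CPDs are small, and how its cost is driven by the size of the largest factors that appear during elimination:

import numpy as np

d = 50  # dimension of each discrete variable, e.g. a discretized continuous parameter
rng = np.random.default_rng(0)

# CPDs of a chain X -> Y -> Z, each normalized over its child variable
p_x = rng.random(d)
p_x /= p_x.sum()
p_y_given_x = rng.random((d, d))
p_y_given_x /= p_y_given_x.sum(axis=1, keepdims=True)
p_z_given_y = rng.random((d, d))
p_z_given_y /= p_z_given_y.sum(axis=1, keepdims=True)

# Variable elimination for P(Z): eliminate X, then Y.
# Each contraction only touches factors with at most d**2 entries, ...
phi_y = np.einsum("x,xy->y", p_x, p_y_given_x)    # eliminate X
p_z = np.einsum("y,yz->z", phi_y, p_z_given_y)    # eliminate Y

# ... whereas the full joint distribution would already hold d**3 entries;
# with fewer conditional independencies the intermediate factors grow accordingly.
print("entries of full joint:", d ** 3)
print("P(Z) sums to", p_z.sum())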
The objective of this thesis is to explore how orthogonal decompositions of PGMs can be used to reduce the computational effort of inference. The idea is to improve the performance of inference algorithms by running them on an orthogonally decomposed tensor representation of a PGM rather than on the PGM itself. Of particular interest are the special properties of orthogonal decompositions, such as the additive structure of the tensor train carriages, and how these properties can make inference on decomposed PGMs more efficient. The thesis will mainly examine the possibilities that orthogonal tensor decomposition offers for inference on Bayesian networks.
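To make the decomposition concrete, here is a minimal sketch (in Python with NumPy; the function tt_svd, the toy joint distribution, and all names are illustrative assumptions, not part of the thesis description). It builds a tensor train via successive SVDs, which yields left-orthogonal carriages, and then computes a marginal directly on the carriages without forming the full joint again; the additive structure mentioned above is not shown here.

import numpy as np

def tt_svd(tensor, eps=1e-12):
    # Decompose a dense tensor into tensor train carriages of shape
    # (left rank, physical dimension, right rank) via successive SVDs.
    # The left SVD factors have orthonormal columns, so every carriage
    # except the last one is left-orthogonal.
    dims = tensor.shape
    carriages, rank = [], 1
    mat = tensor.reshape(1, -1)
    for k in range(len(dims) - 1):
        mat = mat.reshape(rank * dims[k], -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        keep = max(1, int((s > eps * s[0]).sum()))  # truncate negligible singular values
        carriages.append(u[:, :keep].reshape(rank, dims[k], keep))
        mat = s[:keep, None] * vt[:keep]
        rank = keep
    carriages.append(mat.reshape(rank, dims[-1], 1))
    return carriages

rng = np.random.default_rng(0)
joint = rng.random((4, 4, 4, 4))
joint /= joint.sum()          # toy joint distribution P(X1, X2, X3, X4)

carriages = tt_svd(joint)

# Marginal P(X4): sum every carriage over its physical index and chain the
# resulting small matrices; the full joint tensor is never rebuilt.
vec = np.ones(1)
for core in carriages[:-1]:
    vec = vec @ core.sum(axis=1)
marginal = vec @ carriages[-1][:, :, 0]

print("max deviation from direct marginal:",
      np.abs(marginal - joint.sum(axis=(0, 1, 2))).max())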
Requirements
Basic understanding of probabilistic inference and linear algebra
Person working on it
Frederik Schittny
Category
Bachelor thesis