Consider a vector (Xi)i∈I of random variables which follow a multivariate normal distribution. To such a vector we can associate the collection of all statements of the form "Xi is independent of Xj given the subvector XK" which are true for this vector. These statements are called conditional independence (CI) statements and are abbreviated "Xi ⫫ Xj | XK". Certain inference rules for CI statements hold universally for all gaussian distributions: for example, if X1 ⫫ X2 and X1 ⫫ X2 | X3 hold and the distribution is gaussian, then at least one of the statements X1 ⫫ X3 and X2 ⫫ X3 must also hold.
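The example rule can be verified directly on the covariance matrix; the following one-line derivation is a sketch of the standard argument, using the (standard) fact that these CI statements translate into polynomial conditions on the covariance entries:

```latex
% For a gaussian vector with covariance matrix \Sigma = (\sigma_{ij}),
% X1 ⫫ X2 means \sigma_{12} = 0 and
% X1 ⫫ X2 | X3 means \sigma_{12}\sigma_{33} - \sigma_{13}\sigma_{23} = 0.
\sigma_{12} = 0
  \ \text{and}\ \sigma_{12}\sigma_{33} - \sigma_{13}\sigma_{23} = 0
  \;\Longrightarrow\; \sigma_{13}\sigma_{23} = 0
  \;\Longrightarrow\; \sigma_{13} = 0 \ \text{or}\ \sigma_{23} = 0
```

That is, X1 ⫫ X3 or X2 ⫫ X3, since a gaussian CI statement Xi ⫫ Xj holds exactly when the covariance σij vanishes.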
These conditional independences are encoded in the vanishing of almost-principal minors of the distribution's covariance matrix Σ: the statement Xi ⫫ Xj | XK holds if and only if the minor of Σ with rows {i} ∪ K and columns {j} ∪ K vanishes. Determining the CI structures of gaussians is therefore equivalent to determining which subsets of the almost-principal minors of a positive-definite matrix can vanish simultaneously.
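As a minimal illustrative sketch (the helper names are ours and indices are 0-based), a CI statement can be tested by evaluating the corresponding almost-principal minor; the covariance matrix below is chosen so that X0 ⫫ X1 | X2 holds while X0 and X1 are marginally dependent:

```python
def det(m):
    # Laplace expansion along the first row; fine for tiny matrices.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def almost_principal_minor(sigma, i, j, K):
    # Xi ⫫ Xj | XK holds for a gaussian with covariance sigma
    # iff this minor (rows {i} ∪ K, columns {j} ∪ K) vanishes.
    rows, cols = [i] + K, [j] + K
    return det([[sigma[r][c] for c in cols] for r in rows])

# Positive-definite covariance with σ01·σ22 = σ02·σ12, i.e. X0 ⫫ X1 | X2:
sigma = [[1.0, 0.3, 0.5],
         [0.3, 1.0, 0.6],
         [0.5, 0.6, 1.0]]

print(almost_principal_minor(sigma, 0, 1, [2]))  # → 0.0: X0 ⫫ X1 | X2 holds
print(almost_principal_minor(sigma, 0, 1, []))   # → 0.3: X0, X1 marginally dependent
```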
Since these sets are known to be fairly complicated, one tries to approximate them with formal methods. A tractable subset of gaussian CI inference rules is taken as a system of axioms, and any combinatorial structure on formal triples (ij|K), representing CI statements, which satisfies these axioms is called a gaussoid. Gaussoids are in many ways similar to matroids, for example in their respective notions of duality, minors and realizability. The defining axioms for gaussoids have polynomial counterparts which are analogous to the quadratic Grassmann–Plücker relations giving rise to oriented matroids. Similar extensions of gaussoids are possible and lead, in turn, to natural objects in statistics: for example, positively orientable gaussoids correspond to MTP2 distributions.
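To make the axiomatics concrete, the following sketch (our own encoding, not an official implementation) brute-forces all gaussoids on the three-element ground set, using the four compound gaussoid axioms in the form common in the literature; for n = 3 the extra conditioning set L in the axioms is always empty:

```python
from itertools import combinations, permutations

# Ground set {1,2,3}; a formal triple (ij|K) is encoded as
# (frozenset({i,j}), frozenset(K)), so it is symmetric in i and j.
GROUND = (1, 2, 3)

def t(i, j, K=()):
    """Formal CI triple (ij|K)."""
    return (frozenset({i, j}), frozenset(K))

TRIPLES = [t(i, j, K) for i, j in combinations(GROUND, 2)
           for K in ((), tuple(set(GROUND) - {i, j}))]

def is_gaussoid(G):
    for i, j, k in permutations(GROUND):
        # (ij|) & (ik|j)  =>  (ik|) & (ij|k)    (semigraphoid)
        if t(i, j) in G and t(i, k, (j,)) in G:
            if not (t(i, k) in G and t(i, j, (k,)) in G):
                return False
        # (ij|k) & (ik|j) =>  (ij|) & (ik|)     (intersection)
        if t(i, j, (k,)) in G and t(i, k, (j,)) in G:
            if not (t(i, j) in G and t(i, k) in G):
                return False
        # (ij|) & (ik|)   =>  (ij|k) & (ik|j)   (composition-type)
        if t(i, j) in G and t(i, k) in G:
            if not (t(i, j, (k,)) in G and t(i, k, (j,)) in G):
                return False
        # (ij|) & (ij|k)  =>  (ik|) or (jk|)    (weak transitivity)
        if t(i, j) in G and t(i, j, (k,)) in G:
            if not (t(i, k) in G or t(j, k) in G):
                return False
    return True

gaussoids = [G for r in range(len(TRIPLES) + 1)
             for G in combinations(TRIPLES, r)
             if is_gaussoid(set(G))]
print(len(gaussoids))  # → 11 gaussoids on a three-element ground set
```

The count 11 matches the known enumeration for n = 3 (the empty set, the six singletons, three four-element structures and the full set of triples).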
Unlike matroid theory, however, the study of gaussoids is still in its infancy. The goal of this project is to develop the theory of gaussoids and to strengthen its links to matroid theory, algebraic statistics, optimization and complexity theory.