
On Sparse, Spectral and Other Parameterizations of Binary Probabilistic Models

David Buchman, Mark W. Schmidt, Shakir Mohamed, David Poole and Nando de Freitas

Abstract

This paper studies issues relating to the parameterization of probability distributions over binary data sets. Several such parameterizations of models for binary data are known, including the Ising, generalized Ising, canonical and full parameterizations. We also discuss a parameterization that we call the "spectral parameterization", which has received significantly less coverage in the existing literature. We provide this parameterization with a spectral interpretation by casting log-linear models in terms of orthogonal Walsh-Hadamard harmonic expansions. Using various standard and group sparse regularizers for structural learning, we provide a comprehensive theoretical and empirical comparison of these parameterizations. We show that the spectral and canonical parameterizations achieve the best performance and sparsity levels, while the spectral parameterization, unlike the canonical, does not depend on any particular reference state. The spectral interpretation also provides a new starting point for analyzing the statistics of binary data sets; we measure the magnitude of higher order interactions in the underlying distributions for several data sets.
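As a rough illustration (not taken from the paper), the sketch below shows the kind of Walsh-Hadamard expansion the abstract refers to: the log-probabilities of a distribution over n binary variables are expanded in the orthogonal {-1, +1} Walsh-Hadamard basis, so each spectral coefficient is indexed by a subset of variables, i.e. an interaction of a given order. The +-1 encoding, sign and scaling conventions, and the grouping of coefficients by interaction order are assumptions made for this example only.

```python
# Minimal sketch (illustrative only) of a Walsh-Hadamard "spectral" expansion
# of log-probabilities over n binary variables. Conventions are assumptions.
import numpy as np
from itertools import product

n = 3                                                 # number of binary variables
states = np.array(list(product([1, -1], repeat=n)))   # 2^n joint states in +-1 coding
subsets = np.array(list(product([0, 1], repeat=n)))   # 2^n subsets = interaction terms

# Walsh-Hadamard design matrix: entry (s, A) = prod_{i in A} x_i(s), values in {-1, +1}
H = np.array([[np.prod(x[a == 1]) for a in subsets] for x in states])

# Orthogonality of the basis: H^T H = 2^n I, so the expansion is exactly invertible
assert np.allclose(H.T @ H, 2**n * np.eye(2**n))

# Arbitrary strictly positive distribution over the 2^n joint states
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(2**n))

# Spectral parameters: coefficients of log p in the Walsh-Hadamard basis
theta = H.T @ np.log(p) / 2**n

# Reconstruction check: log p(x) = sum_A theta_A * prod_{i in A} x_i
assert np.allclose(H @ theta, np.log(p))

# Grouping |theta_A| by |A| gives a crude measure of higher-order interaction strength
for order in range(n + 1):
    mask = subsets.sum(axis=1) == order
    print(f"order {order}: ||theta|| = {np.linalg.norm(theta[mask]):.4f}")
```

In this toy setup the empty-subset coefficient plays the role of the log normalization constant, and the norms printed per order give one way to read off how much weight the distribution places on higher-order interactions, in the spirit of the measurements described in the abstract.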

Journal
Journal of Machine Learning Research - Proceedings Track for Artificial Intelligence and Statistics (AISTATS)
Pages
173–181
Volume
22
Year
2012