Neural Networks for Approximate DNF Counting: An Abridged Report

Ralph Abboud, İsmail İlkan Ceylan, and Thomas Lukasiewicz

Abstract

Weighted model counting (WMC) has emerged as a prevalent approach for probabilistic inference. In its most general form, WMC is #P-hard. Weighted DNF counting (weighted #DNF) is a special case where approximations with probabilistic guarantees can be obtained in O(nm) time, where n denotes the number of variables and m the number of clauses of the input DNF, but this is not scalable in practice. In this paper, we propose a novel approach for weighted #DNF that combines approximate model counting with deep learning and accurately approximates model counts in linear time when clause width is bounded. We conduct experiments to validate our method and show that our model learns and generalizes very well to large-scale #DNF instances.
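
Since the abstract refers to weighted #DNF and its O(nm) approximation only in passing, the sketch below (not the authors' neural model; the example formula, weights, and function names are illustrative assumptions) shows the quantity being computed: an exact brute-force weighted count, and a Karp-Luby-style sampler of the classical kind that yields the probabilistic guarantees mentioned above.

```python
# A minimal, self-contained sketch, NOT the authors' neural model: the DNF
# formula, weights, and function names below are illustrative assumptions.
import random
from itertools import product

# A DNF is a list of clauses; a clause is a list of signed variable indices
# (positive literal = variable, negative literal = its negation).
# Example: (x1 AND NOT x2) OR (x2 AND x3)
dnf = [[1, -2], [2, 3]]
weights = {1: 0.5, 2: 0.3, 3: 0.8}  # P(x_i = True), chosen arbitrarily

def clause_satisfied(clause, assignment):
    """True iff every literal of the clause holds under the assignment."""
    return all(assignment[abs(l)] == (l > 0) for l in clause)

def exact_weighted_count(dnf, weights):
    """Exact weighted model count by enumerating all 2^n assignments."""
    variables = sorted(weights)
    total = 0.0
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if any(clause_satisfied(c, assignment) for c in dnf):
            prob = 1.0
            for v in variables:
                prob *= weights[v] if assignment[v] else 1.0 - weights[v]
            total += prob
    return total

def clause_weight(clause, weights):
    """Probability mass of the assignments satisfying a single clause."""
    p = 1.0
    for l in clause:
        p *= weights[abs(l)] if l > 0 else 1.0 - weights[abs(l)]
    return p

def karp_luby_estimate(dnf, weights, num_samples=100_000, rng=random):
    """Unbiased Monte Carlo estimate of the weighted #DNF."""
    clause_probs = [clause_weight(c, weights) for c in dnf]
    z = sum(clause_probs)
    hits = 0
    for _ in range(num_samples):
        # Pick a clause proportionally to its weight, then sample a random
        # model of that clause from the weighted variable distribution.
        i = rng.choices(range(len(dnf)), weights=clause_probs)[0]
        assignment = {v: rng.random() < weights[v] for v in weights}
        for l in dnf[i]:
            assignment[abs(l)] = l > 0
        # Count the sample only when i is the first satisfied clause, which
        # corrects for assignments covered by several clauses at once.
        first = next(j for j, c in enumerate(dnf) if clause_satisfied(c, assignment))
        hits += first == i
    return z * hits / num_samples

print(exact_weighted_count(dnf, weights))  # 0.59 for this toy formula
print(karp_luby_estimate(dnf, weights))    # close to 0.59 in expectation
```

The paper's neural approach instead learns to map DNF instances directly to approximate counts; the sampler above only illustrates the classical approximation scheme the abstract refers to.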

Book Title
Proceedings of the 1st International Workshop on Deep Learning on Graphs: Methodologies and Applications, DLGMA 2020, New York, New York, USA, February 8, 2020
Editor
Lingfei Wu, Jian Tang, Yinglong Xia, and Charu Aggarwal
Month
February
Year
2020