A Theoretical and Numerical Analysis of the Worst-Case Size of Reduced Ordered Binary Decision Diagrams


Abstract

Binary Decision Diagrams (BDDs), and in particular ROBDDs (Reduced Ordered BDDs), are a common data structure for manipulating Boolean expressions, used in integrated circuit design, type inferencers, model checkers, and many other applications. Although the ROBDD is a lightweight data structure to implement, its behavior in terms of memory allocation may not be obvious to the program architect. We explore experimentally, numerically, and theoretically the typical and worst-case ROBDD sizes, measured as the number of nodes and as residual compression ratios relative to unreduced BDDs. While our theoretical results are not surprising, as they are in keeping with previously known results, we believe our method contributes to the current body of research through its experimental and statistical treatment of ROBDD sizes. In addition, we provide an algorithm to calculate the worst-case size. Finally, we present an algorithm for constructing a worst-case ROBDD of a given number of variables. Our approach may be useful to projects in deciding whether the ROBDD is the appropriate data structure to use, and in building worst-case examples to test their code.
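
The paper's own algorithm is not reproduced on this page. As an illustration only, the sketch below computes the classical level-by-level upper bound on ROBDD size for n variables: level i (counted from the root) holds at most 2^(i-1) nodes (one per path from the root) and at most 2^(2^(n-i+1)) - 2^(2^(n-i)) nodes (distinct residual functions that still depend on variable i), plus the two terminal nodes. The function names and the choice of 2^(n+1) - 1 nodes for the unreduced BDD are our assumptions for this sketch, not the paper's notation.

    def worst_case_robdd_size(n):
        """Classical upper bound on the node count of an ROBDD over n variables.

        Level i (1-indexed from the root) contains at most 2**(i-1) nodes
        (one per path from the root) and at most 2**(2**(n-i+1)) - 2**(2**(n-i))
        nodes (distinct residual Boolean functions that still depend on
        variable i).  The two terminal nodes are added at the end.
        """
        internal = sum(min(2 ** (i - 1),
                           2 ** (2 ** (n - i + 1)) - 2 ** (2 ** (n - i)))
                       for i in range(1, n + 1))
        return internal + 2

    def compression_ratio(n):
        """Residual compression ratio relative to an unreduced ordered BDD,
        taken here to be a complete binary tree with 2**(n+1) - 1 nodes
        (an assumption of this sketch)."""
        return worst_case_robdd_size(n) / (2 ** (n + 1) - 1)

    if __name__ == "__main__":
        for n in range(1, 9):
            print(n, worst_case_robdd_size(n), round(compression_ratio(n), 4))

For example, this sketch yields a worst-case size of 5 for two variables (e.g. XOR: three internal nodes plus two terminals) and 7 for three variables, with the compression ratio shrinking as n grows.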


BibTeX (lrde.bib)

@Article{	  newton.18.tocl,
  author	= {Jim Newton and Didier Verna},
  title		= {A Theoretical and Numerical Analysis of the Worst-Case
		  Size of Reduced Ordered Binary Decision Diagrams},
  journal	= {ACM Transactions on Computational Logic},
  year		= 2018,
  abstract	= { Binary Decision Diagrams (BDDs) and in particular ROBDDs
		  (Reduced Ordered BDDs) are a common data structure for
		  manipulating Boolean expressions, integrated circuit
		  design, type inferencers, model checkers, and many other
		  applications. Although the ROBDD is a lightweight data
		  structure to implement, the behavior, in terms of memory
		  allocation, may not be obvious to the program architect. We
		  explore experimentally, numerically, and theoretically the
		  typical and worst-case ROBDD sizes in terms of number of
		  nodes and residual compression ratios, as compared to
		  unreduced BDDs. While our theoretical results are not
		  surprising, as they are in keeping with previously known
		  results, we believe our method contributes to the current
		  body of research by our experimental and statistical
		  treatment of ROBDD sizes. In addition, we provide an
		  algorithm to calculate the worst-case size. Finally, we
		  present an algorithm for constructing a worst-case ROBDD of
		  a given number of variables. Our approach may be useful to
		  projects deciding whether the ROBDD is the appropriate data
		  structure to use, and in building worst-case examples to
		  test their code.},
  lrdestatus	= {accepted}
}