Difference between revisions of "Publications/newton.18.tocl"
From LRDE
Revision as of 15:23, 24 June 2020
- Authors: Jim Newton, Didier Verna
- Journal: ACM Transactions on Computational Logic
- Type: article
- Projects: Climb
- Date: 2018-08-28
Abstract
Binary Decision Diagrams (BDDs) and in particular ROBDDs (Reduced Ordered BDDs) are a common data structure for manipulating Boolean expressions, integrated circuit design, type inferencers, model checkers, and many other applications. Although the ROBDD is a lightweight data structure to implement, the behavior, in terms of memory allocation, may not be obvious to the program architect. We explore experimentally, numerically, and theoretically the typical and worst-case ROBDD sizes in terms of number of nodes and residual compression ratios, as compared to unreduced BDDs. While our theoretical results are not surprising, as they are in keeping with previously known results, we believe our method contributes to the current body of research by our experimental and statistical treatment of ROBDD sizes. In addition, we provide an algorithm to calculate the worst-case size. Finally, we present an algorithm for constructing a worst-case ROBDD of a given number of variables. Our approach may be useful to projects deciding whether the ROBDD is the appropriate data structure to use, and in building worst-case examples to test their code.
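The abstract contrasts ROBDD sizes with those of unreduced BDDs. As an illustrative sketch only (this is not the paper's code, and the class and function names are invented for the example), the two classic reduction rules — skip a node whose branches are identical, and share structurally equal nodes through a unique table — fit in a few lines:

```python
class ROBDD:
    """Minimal ROBDD sketch: terminals are 0 and 1; internal nodes
    are deduplicated via a unique table keyed by (var, low, high)."""

    def __init__(self, num_vars):
        self.num_vars = num_vars
        self.unique = {}  # (var, low, high) -> node id; ids 0/1 reserved

    def mk(self, var, low, high):
        if low == high:                      # rule 1: redundant test, skip node
            return low
        key = (var, low, high)
        if key not in self.unique:           # rule 2: structural sharing
            self.unique[key] = len(self.unique) + 2
        return self.unique[key]

    def build(self, f, assignment=()):
        """Shannon-expand the Boolean function f (called with a full
        tuple of booleans) over the remaining variables."""
        if len(assignment) == self.num_vars:
            return int(f(assignment))
        low = self.build(f, assignment + (False,))
        high = self.build(f, assignment + (True,))
        return self.mk(len(assignment), low, high)


# Example: 3-variable parity. The unreduced decision tree has
# 2**3 - 1 = 7 internal nodes; reduction shares and skips nodes.
bdd = ROBDD(3)
root = bdd.build(lambda bits: bits[0] ^ bits[1] ^ bits[2])
print(len(bdd.unique))  # number of internal ROBDD nodes
```

For parity the table ends up with 5 internal nodes (2 per level below the root, 1 at the root), versus 7 in the unreduced tree; the worst-case sizes studied in the paper arise from functions where far less sharing is possible.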
Documents
Bibtex (lrde.bib)
@Article{ newton.18.tocl,
  author     = {Jim Newton and Didier Verna},
  title      = {A Theoretical and Numerical Analysis of the Worst-Case Size of Reduced Ordered Binary Decision Diagrams},
  journal    = {ACM Transactions on Computational Logic},
  year       = 2019,
  volume     = {20},
  number     = {1},
  month      = jan,
  pages      = {1--36},
  abstract   = {Binary Decision Diagrams (BDDs) and in particular ROBDDs (Reduced Ordered BDDs) are a common data structure for manipulating Boolean expressions, integrated circuit design, type inferencers, model checkers, and many other applications. Although the ROBDD is a lightweight data structure to implement, the behavior, in terms of memory allocation, may not be obvious to the program architect. We explore experimentally, numerically, and theoretically the typical and worst-case ROBDD sizes in terms of number of nodes and residual compression ratios, as compared to unreduced BDDs. While our theoretical results are not surprising, as they are in keeping with previously known results, we believe our method contributes to the current body of research by our experimental and statistical treatment of ROBDD sizes. In addition, we provide an algorithm to calculate the worst-case size. Finally, we present an algorithm for constructing a worst-case ROBDD of a given number of variables. Our approach may be useful to projects deciding whether the ROBDD is the appropriate data structure to use, and in building worst-case examples to test their code.},
  lrdestatus = {accepted}
}