diff --git a/docs/getting_started.md b/docs/getting_started.md
index ea97207..dbbfd25 100644
--- a/docs/getting_started.md
+++ b/docs/getting_started.md
@@ -428,6 +428,48 @@ A: Use UAI format if you want to:
 - Perform exact inference or marginal probability calculations
 - Use tensor network methods for decoding
 
+## Benchmark Results
+
+This section shows the performance benchmarks for the decoders included in BPDecoderPlus.
+
+### Decoder Threshold Comparison
+
+The threshold is the physical error rate below which increasing the code distance reduces the logical error rate. Our benchmarks compare BP and BP+OSD decoders:
+
+![Threshold Plot](images/threshold_plot.png)
+
+The threshold plot shows logical error rate vs physical error rate for different code distances. Lines that cross indicate the threshold point.
+
+### BP vs BP+OSD Comparison
+
+![Threshold Comparison](images/threshold_comparison.png)
+
+BP+OSD (Ordered Statistics Decoding) significantly improves upon standard BP, especially near the threshold region.
+
+### Decoding Examples
+
+**BP Failure Case:**
+
+![BP Failure Demo](images/bp_failure_demo.png)
+
+This shows a case where standard BP fails to find the correct error pattern.
+
+**OSD Success Case:**
+
+![OSD Success Demo](images/osd_success_demo.png)
+
+The same syndrome decoded successfully with BP+OSD post-processing.
+
+### Benchmark Summary
+
+| Decoder | Threshold (approx.) | Notes |
+|---------|---------------------|-------|
+| BP (damped) | ~8% | Fast, but limited by graph loops |
+| BP+OSD | ~10% | Higher threshold, slightly slower |
+| MWPM (reference) | ~10.3% | Gold standard for comparison |
+
+The BP+OSD decoder achieves near-MWPM performance while being more scalable to larger codes.
+
 ## Next Steps
 
 1. **Generate your first dataset** using the Quick Start command
diff --git a/docs/images/bp_failure_demo.png b/docs/images/bp_failure_demo.png
new file mode 100644
index 0000000..52674da
Binary files /dev/null and b/docs/images/bp_failure_demo.png differ
diff --git a/docs/images/osd_success_demo.png b/docs/images/osd_success_demo.png
new file mode 100644
index 0000000..ae92927
Binary files /dev/null and b/docs/images/osd_success_demo.png differ
diff --git a/docs/images/threshold_comparison.png b/docs/images/threshold_comparison.png
new file mode 100644
index 0000000..e29877d
Binary files /dev/null and b/docs/images/threshold_comparison.png differ
diff --git a/docs/images/threshold_plot.png b/docs/images/threshold_plot.png
new file mode 100644
index 0000000..767986f
Binary files /dev/null and b/docs/images/threshold_plot.png differ
diff --git a/docs/javascripts/mathjax.js b/docs/javascripts/mathjax.js
new file mode 100644
index 0000000..632954c
--- /dev/null
+++ b/docs/javascripts/mathjax.js
@@ -0,0 +1,8 @@
+window.MathJax = {
+  tex: {
+    inlineMath: [["\\(", "\\)"]],
+    displayMath: [["\\[", "\\]"]],
+    processEscapes: true,
+    processEnvironments: true
+  }
+};
diff --git a/docs/mathematical_description.md b/docs/mathematical_description.md
index d2b284b..f055c76 100644
--- a/docs/mathematical_description.md
+++ b/docs/mathematical_description.md
@@ -7,35 +7,53 @@ See https://github.com/TensorBFS/TensorInference.jl for the Julia reference.
 ### Factor Graph Notation
 
-- Variables are indexed by x_i with domain size d_i.
-- Factors are indexed by f and connect a subset of variables.
-- Each factor has a tensor (potential) phi_f defined over its variables.
+- Variables are indexed by \(x_i\) with domain size \(d_i\).
+- Factors are indexed by \(f\) and connect a subset of variables.
+- Each factor has a tensor (potential) \(\phi_f\) defined over its variables.
 
 ### Messages
 
-Factor to variable message:
+**Factor to variable message:**
 
-mu_{f->x}(x) = sum_{all y in ne(f), y != x} phi_f(x, y, ...) * product_{y != x} mu_{y->f}(y)
+\[
+\mu_{f \to x}(x) = \sum_{\{y \in \text{ne}(f), y \neq x\}} \phi_f(x, y, \ldots) \prod_{y \neq x} \mu_{y \to f}(y)
+\]
 
-Variable to factor message:
+**Variable to factor message:**
 
-mu_{x->f}(x) = product_{g in ne(x), g != f} mu_{g->x}(x)
+\[
+\mu_{x \to f}(x) = \prod_{g \in \text{ne}(x), g \neq f} \mu_{g \to x}(x)
+\]
 
 ### Damping
 
 To improve stability on loopy graphs, a damping update is applied:
 
-mu_new = damping * mu_old + (1 - damping) * mu_candidate
+\[
+\mu_{\text{new}} = \alpha \cdot \mu_{\text{old}} + (1 - \alpha) \cdot \mu_{\text{candidate}}
+\]
+
+where \(\alpha\) is the damping factor (typically between 0 and 1).
 
 ### Convergence
 
-We use an L1 difference threshold between consecutive factor->variable
-messages to determine convergence.
+We use an \(L_1\) difference threshold between consecutive factor-to-variable
+messages to determine convergence:
+
+\[
+\max_{f,x} \| \mu_{f \to x}^{(t)} - \mu_{f \to x}^{(t-1)} \|_1 < \epsilon
+\]
 
 ### Marginals
 
 After convergence, variable marginals are computed as:
 
-b(x) = (1 / Z) * product_{f in ne(x)} mu_{f->x}(x)
+\[
+b(x) = \frac{1}{Z} \prod_{f \in \text{ne}(x)} \mu_{f \to x}(x)
+\]
+
+The normalization constant \(Z\) is obtained by summing the unnormalized vector:
 
-The normalization constant Z is obtained by summing the unnormalized vector.
+\[
+Z = \sum_x \prod_{f \in \text{ne}(x)} \mu_{f \to x}(x)
+\]
 
diff --git a/mkdocs.yml b/mkdocs.yml
index 335e670..d8d0f29 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -47,6 +47,12 @@ markdown_extensions:
   - pymdownx.details
   - attr_list
   - md_in_html
+  - pymdownx.arithmatex:
+      generic: true
+
+extra_javascript:
+  - javascripts/mathjax.js
+  - https://unpkg.com/mathjax@3/es5/tex-mml-chtml.js
 
 nav:
   - Home: index.md
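For readers skimming the patch: the damping and convergence rules documented in `docs/mathematical_description.md` above reduce to a few lines of array code. The sketch below is illustrative only and is not taken from the BPDecoderPlus source; the function names, the message dictionary keyed by (factor, variable) pairs, and the tolerance default are assumptions.

```python
import numpy as np

def damped_update(mu_old: np.ndarray, mu_candidate: np.ndarray, alpha: float) -> np.ndarray:
    # mu_new = alpha * mu_old + (1 - alpha) * mu_candidate, as in the Damping section.
    return alpha * mu_old + (1.0 - alpha) * mu_candidate

def converged(old_msgs: dict, new_msgs: dict, eps: float = 1e-6) -> bool:
    # L1 convergence test: max over factor-to-variable messages of
    # || mu^(t) - mu^(t-1) ||_1, compared against the threshold eps.
    max_diff = max(np.abs(new_msgs[k] - old_msgs[k]).sum() for k in new_msgs)
    return max_diff < eps

# Toy usage on a single binary message (hypothetical (factor, variable) key).
old = {("f1", "x1"): np.array([0.6, 0.4])}
candidate = np.array([0.2, 0.8])
new = {("f1", "x1"): damped_update(old[("f1", "x1")], candidate, alpha=0.5)}
print(new[("f1", "x1")])    # [0.4 0.6]
print(converged(old, new))  # False: the message is still moving
```

A larger damping factor keeps more of the previous message, which tends to suppress oscillations on loopy graphs at the cost of more iterations to converge.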