Merged
24 commits
b9e1a2e
Add noisy circuit dataset for BP decoding demonstration
ChanceSiyuan Jan 17, 2026
d6573a4
Refactor to proper Python package structure
ChanceSiyuan Jan 17, 2026
133c617
Add Makefile and uv support for automated workflow
ChanceSiyuan Jan 18, 2026
504d0ee
Add GitHub Actions CI/CD workflow for automated testing
ChanceSiyuan Jan 18, 2026
85aedc0
Add test coverage reporting and README badges
ChanceSiyuan Jan 18, 2026
4b8961f
Fix CI: allow uv cache without lock file
ChanceSiyuan Jan 18, 2026
fdcf068
Fix CI: disable uv caching
ChanceSiyuan Jan 18, 2026
86dad3b
Remove PNG visualization files from dataset
ChanceSiyuan Jan 18, 2026
e258fd3
Add syndrome database generation (Issue #5)
ChanceSiyuan Jan 18, 2026
3230244
Add detector error model generation (Issue #4)
ChanceSiyuan Jan 18, 2026
9bddaae
Fix CI: accept bool dtype in syndrome tests
ChanceSiyuan Jan 18, 2026
369de2b
Add comprehensive syndrome dataset documentation
ChanceSiyuan Jan 18, 2026
7472fe7
Add minimum working example and pipeline illustration
ChanceSiyuan Jan 18, 2026
57dd24f
Add getting started guide and demo dataset generator
ChanceSiyuan Jan 19, 2026
c2bdf21
Organize datasets into subdirectories and complete Issues #4 and #5
ChanceSiyuan Jan 19, 2026
bf9d31f
Add UAI format support for probabilistic inference (Issue #4)
ChanceSiyuan Jan 19, 2026
9dbf70f
Consolidate documentation into unified getting started guide
ChanceSiyuan Jan 19, 2026
bc5d57f
Organize UAI files into separate datasets/uais/ directory
ChanceSiyuan Jan 19, 2026
b561964
Organize demonstration code into examples/ directory
ChanceSiyuan Jan 20, 2026
cedee04
Update settings.local.json to expand allowed Bash commands and modify…
ChanceSiyuan Jan 20, 2026
ad71d08
add a notebook
ChanceSiyuan Jan 20, 2026
1458f69
Merge branch 'main' into feat/add-noisy-circuits-dataset
GiggleLiu Jan 20, 2026
d1dc69f
Organize scripts into dedicated scripts/ directory
GiggleLiu Jan 20, 2026
8aa716d
Set up MkDocs documentation with GitHub Pages deployment
GiggleLiu Jan 20, 2026
9 changes: 9 additions & 0 deletions .claude/settings.local.json
@@ -0,0 +1,9 @@
{
  "permissions": {
    "allow": [
      "Bash(gh issue view:*)",
      "Bash(gh pr comment:*)",
      "Bash(gh run view:*)"
    ]
  }
}
37 changes: 37 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,37 @@
name: Tests

on:
  push:
    branches: [ main, feat/* ]
  pull_request:
    branches: [ main ]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.10", "3.11", "3.12"]

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v4

      - name: Set up Python ${{ matrix.python-version }}
        run: uv python install ${{ matrix.python-version }}

      - name: Install dependencies
        run: uv sync --dev

      - name: Run tests with coverage
        run: uv run pytest --verbose --cov=bpdecoderplus --cov-report=xml --cov-report=term

      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v4
        with:
          file: ./coverage.xml
          flags: unittests
          name: codecov-umbrella
          fail_ci_if_error: false
27 changes: 27 additions & 0 deletions .gitignore
@@ -1,7 +1,34 @@
# macOS
.DS_Store

# Jupyter
.ipynb_checkpoints/

# Julia
Manifest.toml

# IDE
.vscode/
.idea/

# Python
.venv/
__pycache__/
*.py[cod]
*$py.class
*.egg-info/
dist/
build/
.eggs/
*.egg
.pytest_cache/
.coverage
coverage.xml
htmlcov/
.uv/
uv.lock

# LaTeX
*.aux
*.fls
*.log
45 changes: 45 additions & 0 deletions Makefile
@@ -0,0 +1,45 @@
.PHONY: help install setup test test-cov generate-dataset generate-dem generate-syndromes clean

help:
	@echo "Available targets:"
	@echo "  install             - Install uv package manager"
	@echo "  setup               - Set up development environment with uv"
	@echo "  generate-dataset    - Generate noisy circuit dataset"
	@echo "  generate-dem        - Generate detector error models"
	@echo "  generate-syndromes  - Generate syndrome database (1000 shots)"
	@echo "  test                - Run tests"
	@echo "  test-cov            - Run tests with coverage report"
	@echo "  clean               - Remove generated files and caches"

install:
	@command -v uv >/dev/null 2>&1 || { \
		echo "Installing uv..."; \
		curl -LsSf https://astral.sh/uv/install.sh | sh; \
	}

setup: install
	uv sync --dev

generate-dataset:
	uv run generate-noisy-circuits --distance 3 --p 0.01 --rounds 3 5 7 --task z --output datasets/noisy_circuits

generate-dem:
	uv run generate-noisy-circuits --distance 3 --p 0.01 --rounds 3 5 7 --task z --output datasets/noisy_circuits --generate-dem

generate-syndromes:
	uv run generate-noisy-circuits --distance 3 --p 0.01 --rounds 3 5 7 --task z --output datasets/noisy_circuits --generate-syndromes 1000

test:
	uv run pytest

test-cov:
	uv run pytest --cov=bpdecoderplus --cov-report=html --cov-report=term

clean:
	rm -rf .pytest_cache
	rm -rf __pycache__
	rm -rf htmlcov
	rm -rf .coverage
	rm -rf coverage.xml
	find . -type d -name "__pycache__" -exec rm -rf {} + 2>/dev/null || true
	find . -type f -name "*.pyc" -delete
3 changes: 3 additions & 0 deletions README.md
@@ -1,5 +1,8 @@
# BPDecoderPlus: Quantum Error Correction with Belief Propagation

[![Tests](https://github.com/GiggleLiu/BPDecoderPlus/actions/workflows/test.yml/badge.svg)](https://github.com/GiggleLiu/BPDecoderPlus/actions/workflows/test.yml)
[![codecov](https://codecov.io/gh/GiggleLiu/BPDecoderPlus/branch/main/graph/badge.svg)](https://codecov.io/gh/GiggleLiu/BPDecoderPlus)

A winter school project on circuit-level decoding of surface codes using belief propagation and integer programming decoders, with extensions for atom loss in neutral atom quantum computers.

## Project Goals
222 changes: 222 additions & 0 deletions datasets/README.md
@@ -0,0 +1,222 @@
# Noisy Circuit Dataset (Surface Code, d=3)

Circuit-level surface-code memory experiments generated with Stim for **Belief Propagation (BP) decoding** demonstrations.

## Dataset Organization

The dataset is organized into subdirectories by file type:

```
datasets/
├── circuits/ # Noisy quantum circuits (.stim)
├── dems/ # Detector error models (.dem)
├── uais/ # UAI format for probabilistic inference (.uai)
└── syndromes/ # Syndrome databases (.npz)
```
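
The `.npz` syndrome databases are ordinary NumPy archives. A minimal round-trip sketch follows; the array name `syndromes` is illustrative, not necessarily the generator's actual key, so inspect a real file with `data.files` first:

```python
import numpy as np

# Write and read back a toy syndrome database in .npz form.
# The key name "syndromes" is illustrative; check a real file with data.files.
demo = np.random.default_rng(0).integers(0, 2, size=(1000, 24), dtype=np.uint8)
np.savez_compressed("demo_syndromes.npz", syndromes=demo)

data = np.load("demo_syndromes.npz")
print(data["syndromes"].shape)  # (1000, 24)
```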

## Overview

| Parameter | Value |
|-----------|-------|
| Code | Rotated surface code |
| Distance | d = 3 |
| Noise model | i.i.d. depolarizing |
| Error rate | p = 0.01 |
| Task | Z-memory experiment |
| Rounds | 3, 5, 7 |

### Noise Application Points
- Clifford gates (`after_clifford_depolarization`)
- Data qubits between rounds (`before_round_data_depolarization`)
- Resets (`after_reset_flip_probability`)
- Measurements (`before_measure_flip_probability`)

## Files

| File | Description |
|------|-------------|
| `sc_d3_r3_p0010_z.stim` | 3 rounds, p=0.01, Z-memory |
| `sc_d3_r5_p0010_z.stim` | 5 rounds, p=0.01, Z-memory |
| `sc_d3_r7_p0010_z.stim` | 7 rounds, p=0.01, Z-memory |

## Using This Dataset for BP Decoding

### Step 1: Load Circuit and Extract Detector Error Model (DEM)

The Detector Error Model is the key input for BP decoding. It describes which errors trigger which detectors.

```python
import stim
import numpy as np

# Load circuit
circuit = stim.Circuit.from_file("datasets/circuits/sc_d3_r3_p0010_z.stim")

# Extract DEM - this is what BP needs
dem = circuit.detector_error_model(decompose_errors=True)
print(f"Detectors: {dem.num_detectors}") # 24
print(f"Error mechanisms: {dem.num_errors}") # 286
print(f"Observables: {dem.num_observables}") # 1
```

### Step 2: Build Parity Check Matrix H

BP operates on the parity check matrix where `H[i,j] = 1` means error `j` triggers detector `i`.

```python
def build_parity_check_matrix(dem):
    """Convert DEM to parity check matrix H and prior probabilities."""
    errors = []
    for inst in dem.flattened():
        if inst.type == 'error':
            prob = inst.args_copy()[0]
            dets = [t.val for t in inst.targets_copy() if t.is_relative_detector_id()]
            obs = [t.val for t in inst.targets_copy() if t.is_logical_observable_id()]
            errors.append({'prob': prob, 'detectors': dets, 'observables': obs})

    n_detectors = dem.num_detectors
    n_errors = len(errors)

    # Parity check matrix
    H = np.zeros((n_detectors, n_errors), dtype=np.uint8)
    # Prior error probabilities (for BP initialization)
    priors = np.zeros(n_errors)
    # Which errors flip the logical observable
    obs_flip = np.zeros(n_errors, dtype=np.uint8)

    for j, e in enumerate(errors):
        priors[j] = e['prob']
        for d in e['detectors']:
            H[d, j] = 1
        if e['observables']:
            obs_flip[j] = 1

    return H, priors, obs_flip

H, priors, obs_flip = build_parity_check_matrix(dem)
print(f"H shape: {H.shape}") # (24, 286)
```
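
As a quick sanity check of the `H` convention (pure NumPy, with a toy 3×4 matrix rather than the real 24×286 DEM), a single error on mechanism `j` produces the syndrome given by column `j` of `H`:

```python
import numpy as np

# Toy parity check matrix: 3 detectors, 4 error mechanisms (not the real DEM)
H_toy = np.array([[1, 1, 0, 0],
                  [0, 1, 1, 0],
                  [0, 0, 1, 1]], dtype=np.uint8)

# A single error on mechanism 1 fires detectors 0 and 1 (column 1 of H_toy)
e = np.array([0, 1, 0, 0], dtype=np.uint8)
print(H_toy @ e % 2)  # [1 1 0]
```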

### Step 3: Sample Syndromes (Detection Events)

```python
# Compile sampler
sampler = circuit.compile_detector_sampler()

# Sample detection events + observable flip
n_shots = 1000
samples = sampler.sample(n_shots, append_observables=True)

# Split into syndrome and observable
syndromes = samples[:, :-1] # shape: (n_shots, n_detectors)
actual_obs_flips = samples[:, -1] # shape: (n_shots,)

print(f"Syndrome shape: {syndromes.shape}")
print(f"Example syndrome: {syndromes[0]}")
```
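
Note that Stim's samplers return `bool` arrays by default; casting to `uint8` before doing mod-2 arithmetic with `H` avoids dtype surprises (this mirrors the "accept bool dtype" fix in the commit history):

```python
import numpy as np

# Stim detector samplers return boolean arrays; cast before mod-2 arithmetic
samples_bool = np.array([[True, False, True, False]])
samples_u8 = samples_bool.astype(np.uint8)
print(samples_u8)  # [[1 0 1 0]]
```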

### Step 4: BP Decoding (Min-Sum)

A minimal dense-NumPy sketch of min-sum BP. Production decoders use sparse message passing and typically add OSD post-processing, but this is enough to run on the d=3 dataset:

```python
def bp_decode(H, syndrome, priors, max_iter=50, damping=0.5):
    """
    Belief Propagation decoder (min-sum variant).

    Args:
        H: Parity check matrix (n_detectors, n_errors)
        syndrome: Detection events (n_detectors,)
        priors: Prior error probabilities (n_errors,)
        max_iter: Maximum BP iterations
        damping: Fraction of the old message kept at each update

    Returns:
        estimated_errors: Most likely error pattern (n_errors,)
        soft_output: Log-likelihood ratios (n_errors,)
    """
    n_checks, n_vars = H.shape
    mask = H.astype(bool)
    syndrome = np.asarray(syndrome, dtype=np.uint8)

    # Initialize LLRs from priors: LLR = log((1-p)/p)
    llr_prior = np.log((1 - priors) / priors)

    # Check-to-variable messages, initialized to zero
    m_cv = np.zeros((n_checks, n_vars))
    # Syndrome signs: +1 where s_c = 0, -1 where s_c = 1
    syn_sign = 1.0 - 2.0 * syndrome

    soft_output = llr_prior.copy()
    for _ in range(max_iter):
        # Variable-to-check: total belief minus the incoming message
        total = llr_prior + m_cv.sum(axis=0)
        m_vc = np.where(mask, total[None, :] - m_cv, 0.0)

        # Check-to-variable (min-sum): sign product times leave-one-out min
        new_m_cv = np.zeros_like(m_cv)
        for c in range(n_checks):
            vs = np.flatnonzero(H[c])
            if len(vs) < 2:
                continue  # degenerate check: nothing to propagate
            signs = np.where(m_vc[c, vs] >= 0, 1.0, -1.0)
            abs_m = np.abs(m_vc[c, vs])
            for k, v in enumerate(vs):
                rest = np.arange(len(vs)) != k
                new_m_cv[c, v] = syn_sign[c] * signs[rest].prod() * abs_m[rest].min()

        # Damped update, then hard decision with early stopping
        m_cv = damping * m_cv + (1 - damping) * new_m_cv
        soft_output = llr_prior + m_cv.sum(axis=0)
        estimated_errors = (soft_output < 0).astype(int)
        if np.array_equal(H.astype(int) @ estimated_errors % 2, syndrome):
            break

    estimated_errors = (soft_output < 0).astype(int)
    return estimated_errors, soft_output

# Decode each syndrome
for i in range(n_shots):
    syndrome = syndromes[i]
    estimated_errors, _ = bp_decode(H, syndrome, priors)

    # Predict observable flip
    predicted_obs_flip = np.dot(estimated_errors, obs_flip) % 2

    # Check if decoding succeeded
    success = (predicted_obs_flip == actual_obs_flips[i])
```

### Step 5: Evaluate Decoder Performance

After decoding, compare predicted vs actual observable flips to measure logical error rate.

```python
def evaluate_decoder(decoder_fn, circuit, n_shots=10000):
    """Evaluate decoder logical error rate."""
    dem = circuit.detector_error_model(decompose_errors=True)
    H, priors, obs_flip = build_parity_check_matrix(dem)

    sampler = circuit.compile_detector_sampler()
    samples = sampler.sample(n_shots, append_observables=True)
    syndromes = samples[:, :-1]
    actual_obs = samples[:, -1]

    errors = 0
    for i in range(n_shots):
        est_errors, _ = decoder_fn(H, syndromes[i], priors)
        pred_obs = np.dot(est_errors, obs_flip) % 2
        if pred_obs != actual_obs[i]:
            errors += 1

    return errors / n_shots

# logical_error_rate = evaluate_decoder(bp_decode, circuit)
```

## Regenerating the Dataset

```bash
# Install the package with uv
uv sync

# Generate circuits using the CLI
python -m bpdecoderplus.cli \
--distance 3 \
--p 0.01 \
--rounds 3 5 7 \
--task z \
--generate-dem \
--generate-uai \
--generate-syndromes 10000
```

## Extending the Dataset

```bash
# Different error rates
python -m bpdecoderplus.cli --p 0.005 --rounds 3 5 7 --generate-dem --generate-uai

# Different distances
python -m bpdecoderplus.cli --distance 5 --rounds 5 7 9 --generate-dem --generate-uai

# X-memory experiment
python -m bpdecoderplus.cli --task x --rounds 3 5 7 --generate-dem --generate-uai
```

## References

- [Stim Documentation](https://github.com/quantumlib/Stim)
- [BP+OSD Decoder Paper](https://arxiv.org/abs/2005.07016)
- [Surface Code Decoding Review](https://quantum-journal.org/papers/q-2024-10-10-1498/)