Shu-Feather/Alternating-Back-Propagation
Learning a Deep Generative Model by Alternating Back-Propagation (ABP)

This repo implements a top-down deep generative model trained with Alternating Back-Propagation (ABP).
Given a low-dimensional latent variable z (here 2D), a generator network g(z; θ) synthesizes an image.
Training alternates between:

  1. Inferential back-propagation: infer z for each training image using Langevin dynamics (approximate sampling from p(z | x, θ)).
  2. Learning back-propagation: update generator parameters θ by back-propagation given inferred z.

The experiments are conducted on a lion–tiger image dataset resized to 128×128.
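As a minimal sketch (not the exact training script), the two alternating steps might look like the following in PyTorch; the helper names `langevin_infer` and `learning_step` are illustrative:

```python
import torch

def langevin_infer(g, x, z, step_size=0.05, num_steps=120,
                   mse_sigma=1.0, prior_sigma=1.0):
    """Step 1: refine z toward p(z | x, theta) with Langevin dynamics."""
    z = z.detach().clone().requires_grad_(True)
    for _ in range(num_steps):
        recon = g(z)
        # log p(x, z) up to an additive constant:
        # Gaussian reconstruction term plus Gaussian prior on z
        log_joint = (-((x - recon) ** 2).sum() / (2 * mse_sigma ** 2)
                     - (z ** 2).sum() / (2 * prior_sigma ** 2))
        (grad,) = torch.autograd.grad(log_joint, z)
        with torch.no_grad():
            # Langevin update: gradient ascent step plus injected noise
            z = z + 0.5 * step_size ** 2 * grad \
                  + step_size * torch.randn_like(z)
        z.requires_grad_(True)
    return z.detach()

def learning_step(g, optimizer, x, z, mse_sigma=1.0):
    """Step 2: update generator parameters theta given the inferred z."""
    optimizer.zero_grad()
    loss = ((x - g(z)) ** 2).sum() / (2 * mse_sigma ** 2)
    loss.backward()
    optimizer.step()
    return loss.item()
```

One full ABP epoch simply runs `langevin_infer` over the training batch and then `learning_step` on the result.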


Repository Structure


.
├── abp.py                   # Main training script (ABP + generator)
├── run_all_experiments.sh   # Runs warm-start + cold-start experiments
├── report.tex               # LaTeX report (placeholders reference output images)
├── images/                  # Dataset folder (not tracked by git)
└── .gitignore

Note: your main script may be named adp.py in some setups. The provided run_all_experiments.sh will auto-detect abp.py or adp.py.


Requirements

  • Python 3.10+ recommended
  • PyTorch
  • torchvision
  • numpy
  • matplotlib
  • pillow

Example installation (conda):

conda create -n abp python=3.10 -y
conda activate abp
pip install torch torchvision numpy matplotlib pillow

Dataset Setup

Place all lion/tiger images under:

./images/

The script loads all files inside images/ as training data. Images are resized to 128×128 and normalized to [-1, 1].


Running Experiments

1) Warm-start

python abp.py \
  --start warm \
  --lr 4e-4 \
  --langevin_step_size 0.05 \
  --langevin_num_steps 120 \
  --mse_sigma 1 \
  --prior_sigma 1 \
  --n_epochs 2000 \
  --n_log 100 \
  --n_stats 100 \
  --n_plot 200 \
  --seed 1

2) Cold-start

python abp.py \
  --start cold \
  --lr 4e-4 \
  --langevin_step_size 0.05 \
  --langevin_num_steps 120 \
  --mse_sigma 1 \
  --prior_sigma 1 \
  --n_epochs 2000 \
  --n_log 100 \
  --n_stats 100 \
  --n_plot 200 \
  --seed 1

Outputs (What to Submit)

The project requires:

  1. Reconstructed images (using inferred z): XXXX_recon.png
  2. Randomly generated images (sampling z ~ N(0, I)): XXXX_sampled.png
  3. Latent interpolation grid (2D interpolation): XXXX_interp.png
  4. Loss plot over iterations: stat.png and stat.pdf

Each experiment folder also includes:

  • output.log — training logs (loss, z mean/std)

Example output files:

warm_0.0004_0.05_120_1/
  0200_recon.png
  0200_sampled.png
  0200_interp.png
  ...
  2000_recon.png
  2000_sampled.png
  2000_interp.png
  stat.png
  stat.pdf
  output.log
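The XXXX_interp.png grid sweeps both latent dimensions. A minimal sketch of how such a grid can be decoded (assuming a generator `g` that maps (N, 2) latents to image tensors; `interp_grid` and the [-2, 2] range are illustrative):

```python
import torch

def interp_grid(g, n=8, lo=-2.0, hi=2.0):
    """Sweep both latent dimensions over [lo, hi] and decode an n x n grid."""
    ticks = torch.linspace(lo, hi, n)
    zx, zy = torch.meshgrid(ticks, ticks, indexing="ij")
    z = torch.stack([zx, zy], dim=-1).reshape(-1, 2)   # (n*n, 2)
    with torch.no_grad():
        return g(z)                                    # (n*n, C, H, W)
```

The resulting batch can be tiled into one image with `torchvision.utils.save_image(imgs, "interp.png", nrow=n, normalize=True)`.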

Notes / Tips

  • Warm-start vs cold-start:
    • Warm-start reuses the z values inferred in the previous epoch and often converges faster and more stably.
    • Cold-start reinitializes z from the prior each epoch and can be noisier.
  • If samples look noisy or unstable:
    • Reduce --langevin_step_size (e.g., 0.03).
    • Increase --langevin_num_steps (e.g., 200).
    • Try adjusting --mse_sigma (e.g., 0.5).
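A minimal sketch of the warm- vs cold-start bookkeeping (the function name and signature are illustrative; the actual script's interface may differ):

```python
import torch

def epoch_latents(z_bank, start, n_images, z_dim=2, prior_sigma=1.0):
    """Choose the starting z for an epoch's Langevin inference.

    Warm start reuses the z values refined in the previous epoch;
    cold start redraws every z from the prior N(0, prior_sigma^2 I).
    """
    if start == "cold" or z_bank is None:
        return torch.randn(n_images, z_dim) * prior_sigma
    return z_bank.clone()
```

After Langevin inference, a warm-start run writes the refined z values back into the bank for the next epoch; a cold-start run discards them.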

Reproducibility

All runs are controlled by --seed. To reproduce results exactly, use the same seed and hyperparameters; note that some CUDA operations are nondeterministic by default, so GPU runs may still differ slightly.
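Seeding typically covers every RNG the run touches; a minimal sketch (the helper name is illustrative):

```python
import random

import numpy as np
import torch

def set_seed(seed: int) -> None:
    """Seed all RNGs used during training for reproducible runs."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)  # no-op when CUDA is unavailable
```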


License

MIT License

About

This is Project 5 for the Computer Vision course at Peking University.
