src.experiments.saasgd module
Test SA-ASGD against standard SGD on linear regression with synthetic overparameterized data.
From the base repository directory: python -m src.experiments.saasgd
The implemented SA-ASGD algorithm is taken from: “Staleness-aware Asynchronous SGD for Distributed Deep Learning” (https://arxiv.org/pdf/1511.05950).
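The core idea of the cited paper is to divide the learning rate of each asynchronous update by the gradient's staleness, i.e. the number of parameter updates that occurred between when the gradient was computed and when it is applied. A minimal sketch of that rule (function and argument names are illustrative, not taken from this module):

```python
import numpy as np

def staleness_aware_step(w, grad, base_lr, staleness):
    """One SGD step with the staleness-dependent learning rate of
    arXiv:1511.05950: a gradient delayed by tau updates is applied
    with step size base_lr / tau."""
    tau = max(int(staleness), 1)  # treat a fresh gradient (tau = 0) as tau = 1
    return w - (base_lr / tau) * grad

w = np.zeros(3)
g = np.ones(3)
w1 = staleness_aware_step(w, g, base_lr=0.1, staleness=4)
# the effective step size shrinks from 0.1 to 0.025 for a gradient
# that is 4 updates stale
```

The scaling keeps very stale gradients from dragging the parameters in outdated directions, while fresh gradients are applied at the full base learning rate.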
- src.experiments.saasgd.main()
Main function for running the experiments comparing the SA-ASGD algorithm with standard SGD.
In the script, a comparative experiment between SA-ASGD and standard SGD is performed by:

1. Generating a synthetic linear regression dataset (with a specified degree of overparameterization).
2. Training models with both SGD and SA-ASGD across multiple random seeds (200 by default).
3. Evaluating the trained models on multiple metrics: test loss, weight properties (L2 norm, sparsity, kurtosis), and convergence statistics.
4. Comparing the results via statistical tests (paired t-tests).
5. Visualizing the results: loss distributions, staleness patterns, and weight characteristics.
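Steps 1, 3, and 4 of the pipeline above can be sketched as follows. All names, dimensions, and the placeholder loss values are illustrative assumptions, not the module's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: synthetic overparameterized regression, i.e. more features (d)
# than samples (n).
n, d = 50, 200
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Step 3: weight metrics on a (stand-in) trained weight vector.
w = w_true
l2_norm = np.linalg.norm(w)
sparsity = np.mean(np.abs(w) < 1e-3)       # fraction of near-zero weights
z = (w - w.mean()) / w.std()
excess_kurtosis = np.mean(z**4) - 3.0      # 0 for a Gaussian weight profile

# Step 4: paired t-test on per-seed test losses (placeholder numbers here;
# in the experiment these would come from the 200 training runs).
losses_sgd = rng.normal(1.00, 0.05, size=200)
losses_saasgd = losses_sgd - rng.normal(0.02, 0.01, size=200)
diff = losses_sgd - losses_saasgd
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(diff.size))
# equivalent to scipy.stats.ttest_rel(losses_sgd, losses_saasgd)
```

A paired test is the right choice here because the two optimizers are run on the same seeds, so the per-seed loss differences are the quantity of interest rather than the two loss populations independently.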
Strong reproducibility is ensured by using a fixed master seed. Checkpoints are created to save losses, weight properties, and staleness distributions for both SA-ASGD and SGD training.
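One standard way to derive independent per-run seeds from a single master seed (a sketch of the pattern, not necessarily how this module implements it) is NumPy's `SeedSequence`:

```python
import numpy as np

MASTER_SEED = 42   # illustrative value, not the module's actual seed
n_runs = 200

# Spawn one statistically independent child seed per run; rerunning with
# the same master seed reproduces exactly the same child generators, so
# every per-seed training run is repeatable.
children = np.random.SeedSequence(MASTER_SEED).spawn(n_runs)
rngs = [np.random.default_rng(s) for s in children]

first_draws = [r.integers(0, 1000) for r in rngs[:3]]
```

Spawning child sequences avoids the classic pitfall of seeding runs with `master_seed + i`, which can produce correlated streams in some generators.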