src.experiments.asap_sgd module

Test ASAP-SGD against standard SGD on linear regression with synthetic overparameterized data.

Run from the base repository directory: python -m src.experiments.asap_sgd

The implemented ASAP-SGD algorithm is taken from: “Instance-based Adaptiveness to Staleness in Asynchronous SGD” (https://proceedings.mlr.press/v162/backstrom22a/backstrom22a.pdf).
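At a high level, ASAP-SGD scales the step size of each asynchronous update by a function of that update's staleness, so that stale gradients contribute smaller steps. The sketch below is only an illustration of that idea under an assumed inverse-staleness scaling; the paper's actual adaptiveness function (and this module's implementation) may differ, and the names asap_step_size and asap_sgd_update are hypothetical.

    import numpy as np

    def asap_step_size(base_lr: float, staleness: int) -> float:
        # Illustrative adaptiveness function: fresher gradients get larger
        # steps, staler gradients smaller ones. This is an assumed stand-in,
        # not the function proposed in the ASAP-SGD paper.
        return base_lr / (1.0 + staleness)

    def asap_sgd_update(w: np.ndarray, grad: np.ndarray,
                        base_lr: float, staleness: int) -> np.ndarray:
        # A standard SGD step whose step size is scaled by the staleness
        # of the gradient that the asynchronous worker delivered.
        return w - asap_step_size(base_lr, staleness) * grad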

src.experiments.asap_sgd.main()

Main function for running the experiments comparing the ASAP-SGD algorithm and standard SGD.

The script performs a comparative experiment between ASAP-SGD and standard SGD as follows:

1. Generates a synthetic linear regression dataset (with a specified degree of overparameterization).

2. Trains models using both SGD and ASAP-SGD across multiple random seeds (200 by default).

3. Evaluates the trained models on several metrics: test loss, weight properties (L2 norm, sparsity, kurtosis), and convergence statistics.

4. Compares the results via statistical tests (paired t-tests).

5. Visualizes the results: loss distributions, staleness patterns, and weight characteristics.
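As a rough illustration of steps 1, 3, and 4, the sketch below generates an overparameterized dataset, computes the weight properties listed above, and runs a paired t-test on per-seed test losses. The function names, the sparsity threshold, and the exact metric definitions are assumptions for illustration, not the module's actual implementation.

    import numpy as np
    from scipy import stats

    def make_overparameterized_data(n_samples=50, n_features=200, noise=0.1, rng=None):
        # Overparameterized: more features than samples.
        rng = rng or np.random.default_rng()
        X = rng.standard_normal((n_samples, n_features))
        w_true = rng.standard_normal(n_features)
        y = X @ w_true + noise * rng.standard_normal(n_samples)
        return X, y

    def weight_metrics(w, sparsity_eps=1e-3):
        return {
            "l2_norm": float(np.linalg.norm(w)),
            # Sparsity here means the fraction of near-zero weights;
            # the threshold sparsity_eps is an assumed value.
            "sparsity": float(np.mean(np.abs(w) < sparsity_eps)),
            "kurtosis": float(stats.kurtosis(w)),
        }

    def compare_losses(sgd_losses, asap_losses):
        # Paired t-test: the same random seeds are used for both optimizers,
        # so the per-seed test losses form matched pairs.
        t_stat, p_value = stats.ttest_rel(sgd_losses, asap_losses)
        return t_stat, p_value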

Reproducibility is ensured by deriving all randomness from a fixed master seed. Checkpoints save the losses, weight properties, and staleness distributions for both ASAP-SGD and SGD training.
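One common way to derive many reproducible runs from a single master seed is NumPy's SeedSequence; the sketch below shows that pattern as an assumption about how the seeding could work, not necessarily this module's exact mechanism, and the value of MASTER_SEED is illustrative.

    import numpy as np

    MASTER_SEED = 42  # illustrative value, not the module's actual master seed

    def per_run_rngs(n_runs: int, master_seed: int = MASTER_SEED):
        # Spawn one independent, reproducible generator per random seed/run,
        # so every run is fully determined by the master seed.
        children = np.random.SeedSequence(master_seed).spawn(n_runs)
        return [np.random.default_rng(s) for s in children]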