
BlockFLow Supplementary Materials

Contents

Our supplementary materials submission contains the following:

  1. Paper with appendix
  2. Data for all experiments, in the results folder
  3. Code to reproduce the figures that appear in the paper, in the experiments_and_figures folder
  4. Source code, in the src folder
  5. Instructions to reproduce results, below

Result Reproduction Instructions

  1. Install Python 3.8 on Linux. We have not tested BlockFLow on other platforms.
  2. Run pip install -r src/requirements.txt
  3. Run all cells in src/examples/logistic_regression/adult/generate_datasets.ipynb, specifying paths as appropriate (a sketch of running the notebooks non-interactively appears after this list)
  4. Run all cells in src/examples/logistic_regression/kdd99/generate_datasets.ipynb, specifying paths as appropriate
  5. Install Docker
  6. Results in experiments 1-4 can be reproduced through an off-chain simulation of BlockFLow. To do so,
    1. Run all cells in experiments_and_figures/exp1_thru_4.ipynb, specifying paths as appropriate
    2. For experiments 1, 2, or 4:
      1. Run all cells in experiments_and_figures/exp{NUMBER}.ipynb, where NUMBER is 1, 2, or 4. Specify paths as appropriate
    3. For experiment 3:
      1. Run all cells in experiments_and_figures/score_exp3.ipynb, specifying paths as appropriate
      2. Run all cells in experiments_and_figures/exp3.ipynb, specifying paths as appropriate
  7. Results in experiment 5 can only be reproduced through an on-chain simulation of BlockFLow. To do so,
    1. Install Docker Compose
    2. Copy src/examples/example.config.py to src/examples/config.py. Set RESULTS_FOLDER_PATH to an empty folder (see the config sketch after this list)
    3. Run all cells in src/infra/geth/generate_gensis.ipynb
    4. Copy the JSON-stringification of genesis to src/infra/geth/genesis.json (see the genesis sketch after this list)
    5. Copy the value of the private_keys variable into PRIVATE_KEYS in src/examples/config.py
    6. From src/infra/local, run docker-compose up --build
    7. From src, run python3 -m examples.logistic_regression.run_multi_clients --exp_name adult --dataset_folder /path/to/data/root/adult/split_sym_eq_1_validation_fraction_0.2 --ground_truth_dataset /path/to/data/root/adult/test.dat
    8. From src, run python3 -m examples.logistic_regression.run_multi_clients --exp_name adult --dataset_folder /path/to/data/root/adult/split_sym_eq_3_validation_fraction_0.2 --ground_truth_dataset /path/to/data/root/adult/test.dat
    9. From src, run python3 -m examples.logistic_regression.run_multi_clients --exp_name adult --dataset_folder /path/to/data/root/adult/split_sym_eq_5_validation_fraction_0.2 --ground_truth_dataset /path/to/data/root/adult/test.dat
    10. Inside RESULTS_FOLDER_PATH/tensorboard, run tensorboard --logdir .
    11. TensorBoard visualizes gas consumption by federated learning round and provides links to export the data as CSV or JSON. Copy the exported data into statistical software to perform the linear regressions (see the regression sketch after this list)
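
The notebook steps above assume an interactive Jupyter session. As a convenience, here is a minimal sketch of running the dataset-generation notebooks non-interactively with jupyter nbconvert, assuming Jupyter is installed in your environment; paths inside the notebooks still need to be edited first:

```python
# Minimal sketch: execute the dataset-generation notebooks headlessly with
# "jupyter nbconvert --execute". Assumes Jupyter is installed in the current
# environment; paths inside the notebooks must be edited beforehand.
import subprocess

NOTEBOOKS = [
    "src/examples/logistic_regression/adult/generate_datasets.ipynb",
    "src/examples/logistic_regression/kdd99/generate_datasets.ipynb",
]

for nb in NOTEBOOKS:
    subprocess.run(
        ["jupyter", "nbconvert", "--to", "notebook", "--execute", "--inplace", nb],
        check=True,
    )
```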
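
Config sketch: for steps 7.2 and 7.5, the authoritative contents of src/examples/config.py come from example.config.py in the repository. The sketch below only illustrates the two values the instructions ask you to set; the path and keys are placeholders, not real values:

```python
# Illustrative sketch of the two settings referenced in steps 7.2 and 7.5 of
# src/examples/config.py. Keep any other variables defined in
# example.config.py; the values below are placeholders.
RESULTS_FOLDER_PATH = "/path/to/an/empty/results/folder"

# Paste the private_keys value printed by src/infra/geth/generate_gensis.ipynb.
PRIVATE_KEYS = [
    "0x...",  # one hex-encoded private key per simulated client (placeholder)
]
```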
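
Genesis sketch: steps 7.3 and 7.4 ask you to copy the JSON-stringification of genesis into src/infra/geth/genesis.json. If the notebook leaves you with a Python dict, one way to write it out is sketched below; the variable name genesis is an assumption based on the instructions:

```python
# Sketch of step 7.4: serialize the genesis dict produced by
# generate_gensis.ipynb into src/infra/geth/genesis.json.
import json

# Replace this placeholder with the dict produced by the notebook
# (the variable name `genesis` is an assumption, not a documented API).
genesis = {}

with open("src/infra/geth/genesis.json", "w") as f:
    json.dump(genesis, f, indent=2)
```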
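
Regression sketch: for step 7.11, a minimal example of the linear regression on a TensorBoard CSV export, assuming the export has Step and Value columns with Value holding the gas consumed per federated learning round (rename the columns and the filename to match your export):

```python
# Sketch of step 7.11: fit gas consumption vs. federated-learning round from a
# TensorBoard CSV export. Assumes columns named "Step" and "Value"; the
# filename is a placeholder.
import numpy as np
import pandas as pd

df = pd.read_csv("gas_by_round.csv")
rounds = df["Step"].to_numpy(dtype=float)
gas = df["Value"].to_numpy(dtype=float)

slope, intercept = np.polyfit(rounds, gas, deg=1)
print(f"gas ~= {slope:.1f} * round + {intercept:.1f}")
```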
