Add README.md for dp, mf and optimization #325

Merged
merged 4 commits into from
Aug 15, 2022
Changes from 2 commits
69 changes: 69 additions & 0 deletions scripts/README.md
@@ -2,10 +2,43 @@
We provide some scripts for reproducing existing algorithms with FederatedScope, which are constantly being updated.
We greatly appreciate any [contribution](https://federatedscope.io/docs/contributor/) to FederatedScope!

- [Federated Optimization Algorithms](#fed-optimization)
Collaborator: an invalid link here: #fed-optimization -> #federated-optimization-algorithm

Collaborator (Author): modified accordingly

- [Distribute Mode](#distribute-mode)
- [Asynchronous Training Strategy](#asynchronous-training-strategy)
- [Graph Federated Learning](#graph-federated-learning)
- [Attacks in Federated Learning](#attacks-in-FL)
- [Differential Privacy in Federated Learning](#dp-in-FL)
Collaborator: Invalid link

Collaborator (Author): modified accordingly

- [Matrix Factorization in Federated Learning](#mf-in-FL)
Collaborator: Invalid link


### Federated Optimization Algorithm
Users can replace the FedAvg algorithm with other federated optimization algorithms.
Below we provide running scripts for FedOpt [1] and FedProx [2] on different datasets.

#### FedOpt
Run FedOpt on different datasets via
```bash
# on femnist
bash optimization_exp_scripts/fedopt_exp_scripts/run_fedopt_femnist.sh
# on synthetic
bash optimization_exp_scripts/fedopt_exp_scripts/run_fedopt_lr.sh
# on shakespeare
bash optimization_exp_scripts/fedopt_exp_scripts/run_fedopt_shakespeare.sh
```
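As a rough illustration of the idea behind FedOpt-style aggregation (a hypothetical sketch, not FederatedScope's actual implementation — the function name and signature below are made up): the server averages the client models, treats the resulting delta as a pseudo-gradient, and applies its own optimizer step to it.

```python
import numpy as np

def fedopt_server_step(global_w, client_ws, server_lr=1.0):
    """One FedOpt-style server round: average the client models,
    treat the delta from the current global model as a pseudo-gradient,
    and apply a server-side optimizer step (plain SGD here; adaptive
    variants would use Adagrad/Adam-style server updates instead)."""
    avg_w = np.mean(client_ws, axis=0)            # FedAvg aggregate
    pseudo_grad = global_w - avg_w                # server "gradient"
    return global_w - server_lr * pseudo_grad     # server optimizer step
```

With `server_lr=1.0` the step reduces exactly to FedAvg, which makes the relationship between the two algorithms easy to see.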

#### FedProx
Run FedProx on different datasets via
```bash
# on femnist
bash optimization_exp_scripts/fedprox_exp_scripts/run_fedprox_femnist.sh
# on lr
bash optimization_exp_scripts/fedprox_exp_scripts/run_fedprox_lr.sh
# on shakespeare
bash optimization_exp_scripts/fedprox_exp_scripts/run_fedprox_shakespeare.sh
```
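FedProx differs from FedAvg only in the local objective: each client adds a proximal term (mu/2)·||w − w_global||² that keeps local updates close to the global model. A minimal sketch of the resulting gradient correction (hypothetical names, not the FederatedScope API):

```python
import numpy as np

def fedprox_local_grad(w, task_grad, w_global, mu=0.1):
    """Gradient of the FedProx local objective: the task-loss gradient
    plus mu * (w - w_global), contributed by the proximal term
    (mu/2) * ||w - w_global||^2."""
    return task_grad + mu * (w - w_global)
```

With a zero task gradient, the update purely pulls the local weights toward the global model; larger `mu` means stronger pull, which is why it helps under heterogeneous (non-IID) client data.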

[1] Asad M, Moustafa A, Ito T. "FedOpt: Towards communication efficiency and privacy preservation in federated learning". Applied Sciences, 2020, 10(8): 2864.
Collaborator: Move all the references to the end of this docs?

Collaborator (Author): modified accordingly


[2] Anit Kumar Sahu, Tian Li, Maziar Sanjabi, Manzil Zaheer, Ameet Talwalkar, Virginia Smith. "On the Convergence of Federated Optimization in Heterogeneous Networks." ArXiv abs/1812.06127 (2018).

### Distribute Mode
Users can train an LR model on generated toy data in distribute mode via:
@@ -98,4 +131,40 @@ Run the BadNet attack:
python federatedscope/main.py --cfg scripts/attack_exp_scripts/backdoor_attack/backdoor_badnet_fedavg_convnet2_on_femnist.yaml
```

### Differential Privacy in Federated Learning

Users can train models under differential privacy protection.
Taking FEMNIST as an example, execute the running script via:
```bash
bash dp_exp_scripts/run_femnist_dp_standalone.sh
```
You can also enable the DP algorithm with other datasets and models by adding the following configurations:
```yaml
nbafl:
use: True
mu: 0.1
epsilon: 10
constant: 30
w_clip: 0.1
federate:
join_in_info: ["num_sample"]
```
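The `join_in_info` entry is there because NbAFL calibrates its noise using each client's sample count. A very rough sketch of the protection step these options control (the noise-scale formula below is a simplification assumed for illustration; the paper's calibration is more involved, and the function name is hypothetical):

```python
import numpy as np

def nbafl_protect(weights, w_clip=0.1, epsilon=10.0, constant=30.0,
                  num_sample=1000, rng=None):
    """Sketch of NbAFL-style protection: clip the parameter vector to
    norm w_clip, then add Gaussian noise whose scale shrinks as the
    privacy budget epsilon or the local sample count grows."""
    rng = rng or np.random.default_rng(0)
    norm = np.linalg.norm(weights)
    if norm > w_clip:
        weights = weights * (w_clip / norm)               # norm clipping
    sigma = constant * w_clip / (num_sample * epsilon)    # assumed scale
    return weights + rng.normal(0.0, sigma, size=weights.shape)
```

This also shows why `w_clip` matters twice: it bounds the sensitivity of the released weights and directly scales the injected noise.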

### Matrix Factorization in Federated Learning
We support federated matrix factorization tasks in both vertical and horizontal federated learning scenarios.
Users can run matrix factorization tasks on the MovieLens dataset via
```bash
# vfl
bash mf_exp_scripts/run_movielens1m_vfl_standalone.sh
# hfl
bash mf_exp_scripts/run_movielens1m_hfl_standalone.sh
```
We also support the SGDMF [1] algorithm in federated learning, which users can run via
```bash
# hfl
bash mf_exp_scripts/run_movielens1m_hflsgdmf_standalone.sh
# vfl
bash mf_exp_scripts/run_movielens1m_vflsgdmf_standalone.sh
```
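For reference, the update that both script families distribute is ordinary SGD on the factorization loss; SGDMF additionally perturbs it to obtain its privacy guarantee. A plain, non-private, non-federated sketch (hypothetical helper, not the FederatedScope API):

```python
import numpy as np

def mf_sgd_step(U, V, i, j, r_ij, lr=0.05, reg=0.1):
    """One SGD step on (r_ij - u_i . v_j)^2 + reg * (||u_i||^2 + ||v_j||^2),
    updating user factor U[i] and item factor V[j] in place.
    HFL partitions the rating matrix by user, VFL by item; SGDMF would
    additionally clip and perturb these gradients (omitted here)."""
    err = r_ij - U[i] @ V[j]                  # prediction residual
    U[i] += lr * (err * V[j] - reg * U[i])    # user-factor step
    V[j] += lr * (err * U[i] - reg * V[j])    # item-factor step
    return err
```

Repeating this step on observed ratings drives the residual down, which is the whole training loop once the factors are partitioned across parties.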

[1] Zitao Li, Bolin Ding, Ce Zhang, Ninghui Li, Jingren Zhou. "Federated Matrix Factorization with Privacy Guarantee." Proceedings of the VLDB Endowment, 15(4): 900-913 (2021).
52 changes: 0 additions & 52 deletions scripts/dp_exp_scripts/parse_nbafl_results.py

This file was deleted.

59 changes: 9 additions & 50 deletions scripts/dp_exp_scripts/run_femnist_dp_standalone.sh
@@ -1,54 +1,13 @@
set -e

cudaid=$1
cd ..

if [ ! -d "out_nbafl" ];then
mkdir out_nbafl
fi

echo "NbAFL starts..."

clips=(0.1)
epsilons=(10. 50. 100.)
mus=(0.01)
constants=(1. 2. 3.)

for ((iw=0; iw<${#clips[@]}; iw++ ))
do
for ((ie=0; ie<${#epsilons[@]}; ie++ ))
do
for ((im=0; im<${#mus[@]}; im++ ))
do
for ((ic=0; ic<${#constants[@]}; ic++ ))
do
python federatedscope/main.py --cfg federatedscope/cv/baseline/fedavg_convnet2_on_femnist.yaml device ${cudaid} nbafl.use True \
data.root /mnt/gaodawei.gdw/data/ \
nbafl.mu ${mus[$im]} \
nbafl.epsilon ${epsilons[$ie]} \
nbafl.constant ${constants[$ic]} \
nbafl.w_clip ${clips[$iw]} \
>>out_nbafl/temp.out \
2>>out_nbafl/clip_${clips[$iw]}_eps_${epsilons[$ie]}_mu_${mus[$im]}_const_${constants[$ic]}.log
done
done
done
done

for ((iw=0; iw<${#clips[@]}; iw++ ))
do
for ((ie=0; ie<${#epsilons[@]}; ie++ ))
do
for ((im=0; im<${#mus[@]}; im++ ))
do
for ((ic=0; ic<${#constants[@]}; ic++ ))
do
python federatedscope/../scripts/dp_exp_scripts/parse_nbafl_results.py --input out_nbafl/clip_${clips[$iw]}_eps_${epsilons[$ie]}_mu_${mus[$im]}_const_${constants[$ic]}.log \
--round 300\
>>out_nbafl/parse.log
done
done
done
done

echo "Ends."
echo "Run NbAFL on femnist."

python federatedscope/main.py --cfg federatedscope/cv/baseline/fedavg_convnet2_on_femnist.yaml\
nbafl.use True \
nbafl.mu 0.1 \
nbafl.epsilon 20. \
nbafl.constant 1. \
nbafl.w_clip 0.1 \
federate.join_in_info ["num_sample"]
17 changes: 0 additions & 17 deletions scripts/dp_exp_scripts/run_femnist_standard_standalone.sh

This file was deleted.

52 changes: 0 additions & 52 deletions scripts/fedopt_exp_scripts/parse_fedopt_results.py

This file was deleted.

31 changes: 0 additions & 31 deletions scripts/fedopt_exp_scripts/run_fedopt_femnist.sh

This file was deleted.

31 changes: 0 additions & 31 deletions scripts/fedopt_exp_scripts/run_fedopt_lr.sh

This file was deleted.

31 changes: 0 additions & 31 deletions scripts/fedopt_exp_scripts/run_fedopt_shakespeare.sh

This file was deleted.
