
Can this work be federated? #169

Open
njuptlht opened this issue Apr 13, 2023 · 1 comment

Comments

@njuptlht

I read your other work, "Federated XGBoost", which states that it does not support differential privacy. If I implement this work in a distributed system, can it be considered federated?

@chester-leung
Member

Hi @njuptlht, thanks for your interest in our project.

The Secure XGBoost model is that of multiparty outsourced computation, i.e. where each party's data is individually encrypted and transferred in full to a central cluster for processing. This is different from federated learning, in which only data summaries are exchanged between parties and a central aggregator service, and in which (in its vanilla form) those summaries are unencrypted. The XGBoost training algorithm, however, natively exchanges summaries -- see Section 3 of the paper here for more.
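To make the contrast concrete, here is a minimal sketch of the two data-flow models described above. The "encryption" is a toy XOR stream cipher and the "summary" is a plain sum/count pair; this is purely illustrative and is not Secure XGBoost's actual protocol or cryptography.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher (XOR with a repeating key) -- illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def outsourced_mean(parties: dict) -> float:
    """Multiparty outsourced computation: each party encrypts its *entire*
    dataset and ships it to a central cluster, which decrypts it (inside a
    trusted enclave, in Secure XGBoost's case) and computes on the raw data."""
    combined = []
    for values in parties.values():
        key = secrets.token_bytes(16)
        ciphertext = xor_cipher(bytes(values), key)  # party-side: encrypt all data
        plaintext = xor_cipher(ciphertext, key)      # cluster-side: decrypt in enclave
        combined.extend(plaintext)
    return sum(combined) / len(combined)             # compute on the pooled raw data

def federated_mean(parties: dict) -> float:
    """Vanilla federated learning: raw data never leaves a party; only a
    local summary (here, sum and count) reaches the aggregator."""
    total, count = 0, 0
    for values in parties.values():
        total += sum(values)   # local summary only
        count += len(values)
    return total / count

data = {"party_a": [1, 2, 3], "party_b": [4, 5, 6]}
print(outsourced_mean(data))  # both models compute the same statistic,
print(federated_mean(data))   # but move very different things over the wire
```

Both functions compute the same result; the difference is what crosses the network: full (encrypted) datasets in the outsourced model versus small plaintext summaries in vanilla federated learning.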

If you are looking for differential privacy, this project does not yet support it either.
