An incomplete survey of Split Learning (fairly comprehensive for SL) and Federated Learning (only the most representative works).
This list covers most Split Learning works. We also include some Federated Learning works for an easy comparison with SL; these are just a sub-list of https://github.com/chaoyanghe/Awesome-Federated-Learning.
*** Updated at 2022/3/7 ***
*** Updated at 2022/1/26 ***
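For readers new to the area, the sketch below illustrates the basic (vanilla) Split Learning training step that most of the listed papers build on, attack, or improve: the client runs the first layers on its private data and sends the cut-layer ("smashed") activations to the server, which finishes the forward pass, backpropagates, and returns the gradient at the cut. This is only a minimal PyTorch sketch under those assumptions; the model shapes and variable names are illustrative and not taken from any specific paper below.

```python
# Minimal, illustrative sketch of one vanilla Split Learning training step
# (no label protection, compression, or parallel clients). All names and
# shapes here are assumptions for illustration only.
import torch
import torch.nn as nn

# Client keeps the raw data and the first layers ("bottom" model).
client_model = nn.Sequential(nn.Linear(20, 32), nn.ReLU())
# Server keeps the remaining layers ("top" model); in vanilla SL it also holds the labels.
server_model = nn.Sequential(nn.Linear(32, 2))

client_opt = torch.optim.SGD(client_model.parameters(), lr=0.1)
server_opt = torch.optim.SGD(server_model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(8, 20)          # private client data (never leaves the client)
y = torch.randint(0, 2, (8,))   # labels (shared with the server in this vanilla setup)

# --- client: forward up to the cut layer ---
smashed = client_model(x)
# "Send" the cut-layer activations; detach() mimics the network boundary.
smashed_sent = smashed.detach().requires_grad_(True)

# --- server: finish forward, compute loss, backprop down to the cut layer ---
server_opt.zero_grad()
loss = loss_fn(server_model(smashed_sent), y)
loss.backward()
server_opt.step()

# --- client: receive the gradient at the cut and finish backprop ---
client_opt.zero_grad()
smashed.backward(smashed_sent.grad)
client_opt.step()
```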
- Distributed learning of deep neural network over multiple agents
- Split learning for health: Distributed deep learning without sharing raw patient data
- Split Learning for collaborative deep learning in healthcare
- (NeurIPS '21) Federated Split Task-Agnostic Vision Transformer for COVID-19 CXR Diagnosis
- Combining split and federated architectures for efficiency and privacy in deep learning
- Comparison of Privacy-Preserving Distributed Deep Learning Methods in Healthcare
- Server-Side Local Gradient Averaging and Learning Rate Acceleration for Scalable Split Learning
- Efficient Privacy Preserving Edge Intelligent Computing Framework for Image Classification in IoT
- Flexible Parallel Learning in Edge Scenarios: Communication, Computational and Energy Cost
- (ICLR '21 workshop) PyVertical: A vertical federated learning framework for multi-headed SplitNN
- (ICLR '22) Label Leakage and Protection in Two-party Split Learning
- Gradient Inversion Attack: Leaking Private Labels in Two-Party Split Learning
- FedV: Privacy-Preserving Federated Learning over Vertically Partitioned Data
- End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things
- Detailed comparison of communication efficiency of split learning and federated learning
- Communication and Computation Reduction for Split Learning using Asynchronous Training
- Communication-Efficient Multimodal Split Learning for mmWave Received Power Prediction
- Communication-Efficient Split Learning Based on Analog Communication and Over the Air Aggregation
- FedLite: A Scalable Approach for Federated Learning on Resource-constrained Clients
- Improving the Communication and Computation Efficiency of Split Learning for IoT Applications
- (CCS '15) Model Inversion Attacks that Exploit Confidence Information and Basic Countermeasures
- (Asia-CCS '20) Can We Use Split Learning on 1D CNN Models for Privacy Preserving Training?
- (ACSAC '19) Model inversion attacks against collaborative inference
- (CCS '21) Unleashing the Tiger: Inference Attacks on Split Learning
- (PPAI '22 workshop) Feature Space Hijacking Attacks against Differentially Private Split Learning
- (Asia-CCS '20, increasing depth and differential privacy) Can We Use Split Learning on 1D CNN Models for Privacy Preserving Training?
- NoPeek: Information leakage reduction to share activations in distributed deep learning
- Practical Defences Against Model Inversion Attacks for Split Neural Networks
- FedAdapt: Adaptive Offloading for IoT Devices in Federated Learning
- AdaSplit: Adaptive Trade-offs for Resource-constrained Distributed Deep Learning
- Efficient binarizing split learning based deep models for mobile applications
- PrivColl: Practical Privacy-Preserving Collaborative Machine Learning
- Split HE: Fast Secure Inference Combining Split Learning and Homomorphic Encryption
- FedSL: Federated Split Learning on Distributed Sequential Data in Recurrent Neural Networks
- LSTMSPLIT: Effective SPLIT Learning based LSTM on Sequential Time-Series Data
- Deep Models Under the GAN: Information Leakage from Collaborative Deep Learning
- Beyond Inferring Class Representatives: User-Level Privacy Leakage From Federated Learning