SPU (Secure Processing Unit) aims to be a provably, measurably secure computation device that provides computation capabilities while keeping your private data protected.
SPU can be treated as a programmable device, but it is not designed to be used directly. Normally you would use the SecretFlow framework, which uses SPU as its underlying secure computing device, as sketched below.
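For a rough picture of that relationship, here is a minimal sketch of driving SPU through SecretFlow. All `secretflow` API names below (`sf.init`, `sf.PYU`, `sf.SPU`, `sf.reveal`, `cluster_def`) belong to the separate SecretFlow project and are illustrative assumptions, not part of this repository:

```python
# Hypothetical sketch: SecretFlow using SPU as a secure device.
# The secretflow API calls below are assumptions; consult the SecretFlow docs.
import jax.numpy as jnp
import secretflow as sf

sf.init(['alice', 'bob'], address='local')  # local multi-process setup
spu_device = sf.SPU(sf.utils.testing.cluster_def(['alice', 'bob']))

alice, bob = sf.PYU('alice'), sf.PYU('bob')
x = alice(lambda: jnp.array([1.0, 2.0]))()  # plaintext held by alice
y = bob(lambda: jnp.array([3.0, 4.0]))()    # plaintext held by bob

# Inputs are secret-shared onto the SPU device; the addition runs under MPC.
z = spu_device(jnp.add)(x.to(spu_device), y.to(spu_device))
print(sf.reveal(z))  # reveal the result back to plaintext
```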
Currently, we mainly focus on provable security. SPU contains a secure runtime that evaluates XLA-like tensor operations, using MPC (secure multi-party computation) as the underlying evaluation engine to protect private information.
The SPU Python package also contains a simple distributed module to demo SPU usage, but it is NOT designed for production due to system security and performance concerns; please DO NOT use it directly in production. For a quick feel of the programming model, see the simulation sketch below.
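The following is a minimal local-simulation sketch. The module path `spu.utils.simulation`, the `Simulator`/`sim_jax` helpers, and the `spu.ProtocolKind`/`spu.FieldType` enums are assumptions about the SPU Python package layout; check the current documentation before relying on them:

```python
# Hedged sketch: evaluating a JAX function with SPU's local simulator.
# Module/enum names here are assumptions; see the SPU docs for the real API.
import jax.numpy as jnp
import numpy as np

import spu
import spu.utils.simulation as spsim

def compare(x, y):
    # An XLA-expressible tensor op; SPU lowers it to MPC protocols.
    return jnp.maximum(x, y)

# Simulate 3 parties running the ABY3 protocol over a 64-bit ring.
sim = spsim.Simulator.simple(3, spu.ProtocolKind.ABY3, spu.FieldType.FM64)

x = np.random.randn(3, 4)
y = np.random.randn(3, 4)
z = spsim.sim_jax(sim, compare)(x, y)  # inputs are secret-shared internally
```

The simulator runs all simulated parties in one process, so it is useful for correctness checks and experimentation, not for measuring real deployment performance.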
If you would like to contribute to SPU, please check the Contribution guidelines, which also contain build and testing instructions.
| | Linux x86_64 | Linux aarch64 | macOS x64 | macOS Apple Silicon | Windows x64 | Windows WSL2 x64 |
| --- | --- | --- | --- | --- | --- | --- |
| CPU | yes | yes | yes¹ | yes | no | yes |
| NVIDIA GPU | experimental | no | no | n/a | no | no |
¹ Due to CI resource limitations, the macOS x64 prebuilt binary will no longer be available starting with the next release (0.9.x).
Please follow Installation Guidelines to install SPU.
| | General Features | FourQ-based PSI | GPU |
| --- | --- | --- | --- |
| Requirement | AVX/ARMv8 | AVX2/ARMv8 | CUDA 11.8+ |
If you think SPU is helpful for your research or development, please consider citing our paper:
```bibtex
@inproceedings{spu,
  author    = {Junming Ma and Yancheng Zheng and Jun Feng and Derun Zhao and Haoqi Wu and Wenjing Fang and Jin Tan and Chaofan Yu and Benyu Zhang and Lei Wang},
  title     = {{SecretFlow-SPU}: A Performant and {User-Friendly} Framework for {Privacy-Preserving} Machine Learning},
  booktitle = {2023 USENIX Annual Technical Conference (USENIX ATC 23)},
  year      = {2023},
  isbn      = {978-1-939133-35-9},
  address   = {Boston, MA},
  pages     = {17--33},
  url       = {https://www.usenix.org/conference/atc23/presentation/ma},
  publisher = {USENIX Association},
  month     = jul,
}
```
We thank Alibaba Gemini Lab for their significant contributions.