This paper introduces α-NeuS, a new method for simultaneously reconstructing thin transparent objects and opaque objects based on neural implicit surfaces (NeuS).
Project Page | Paper | Data
This is the official repo for the implementation of *From Transparent to Opaque: Rethinking Neural Implicit Surfaces with α-NeuS*.
The data is organized as follows:
```
<case_name>
|-- cameras_xxx.npz    # camera parameters
|-- image
    |-- 000.png        # target image for each view
    |-- 001.png
    ...
|-- mask
    |-- 000.png        # target mask for each view (for the unmasked setting, set all pixels to 255)
    |-- 001.png
    ...
```
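A quick way to catch a malformed case directory before training is to check that every view in `image/` has a matching file in `mask/`. The helper below is a hypothetical sanity check, not part of the repo; it builds a toy case directory to demonstrate.

```python
import os
import tempfile

def check_case_dir(case_dir):
    """Return the image filenames that have no matching mask (empty list = OK)."""
    images = sorted(os.listdir(os.path.join(case_dir, "image")))
    masks = set(os.listdir(os.path.join(case_dir, "mask")))
    return [name for name in images if name not in masks]

# Build a toy case directory: two views, but only one mask.
root = tempfile.mkdtemp()
for sub in ("image", "mask"):
    os.makedirs(os.path.join(root, sub))
for name in ("000.png", "001.png"):
    open(os.path.join(root, "image", name), "w").close()
open(os.path.join(root, "mask", "000.png"), "w").close()

check_case_dir(root)  # → ["001.png"]
```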
Here `cameras_xxx.npz` follows the data format of IDR, where `world_mat_xx` denotes the world-to-image projection matrix and `scale_mat_xx` denotes the normalization matrix.
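The snippet below sketches how such a file is typically read under the IDR convention described above. The key names (`world_mat_0`, `scale_mat_0`) follow that convention; the file created here is a dummy stand-in for a real download, with identity matrices in place of real camera data.

```python
import os
import tempfile

import numpy as np

# Dummy cameras file standing in for a real cameras_xxx.npz.
path = os.path.join(tempfile.mkdtemp(), "cameras_sphere.npz")
np.savez(path, world_mat_0=np.eye(4), scale_mat_0=np.eye(4))

cams = np.load(path)
world_mat = cams["world_mat_0"]  # world-to-image projection matrix
scale_mat = cams["scale_mat_0"]  # normalization matrix (region of interest -> unit sphere)

# NeuS-style loaders compose the two and keep the top 3x4 projection.
P = (world_mat @ scale_mat)[:3, :4]
P.shape  # → (3, 4)
```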
Set up the environment as specified in NeuS.
For synthetic scenes:

```bash
bash train_synthetic.sh
```

For real-world scenes:

```bash
bash train_real.sh
```
Or, you can train it step by step as follows:
- Train NeuS:

```bash
python exp_runner.py --mode train --conf ${config_name} --case ${data_dirname}
```

- Validate the mesh from NeuS:

```bash
python exp_runner.py --is_continue --mode validate_mesh --conf ${config_name} --case ${data_dirname} --mcube_threshold -0.0
```

- Validate the mesh using DCUDF:

```bash
python exp_runner.py --is_continue --mode validate_dcudf --conf ${config_name} --case ${data_dirname} --mcube_threshold 0.005
```
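A toy 1D illustration (not repo code) of why the DCUDF step extracts a small positive level set (0.005) rather than the zero level set used for a plain signed distance field: an unsigned distance never changes sign, so its zero level set is degenerate, while a small positive level yields a double cover of the surface.

```python
import numpy as np

xs = np.linspace(0.0, 1.0, 1000)
sdf = xs - 0.5       # signed distance to a "surface" at x = 0.5
udf = np.abs(sdf)    # unsigned distance: non-negative, touches zero at the surface

def level_crossings(f, level):
    """Count sign changes of f - level between adjacent samples."""
    s = np.sign(f - level)
    return int(np.count_nonzero(s[:-1] * s[1:] < 0))

level_crossings(sdf, 0.0)    # → 1 (single surface, as in validate_mesh)
level_crossings(udf, 0.005)  # → 2 (double cover, as in validate_dcudf)
```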
This work is built upon NeuS and DoubleCoverUDF. We sincerely thank the authors for their outstanding work.
If you find our work useful, please feel free to cite us:
```bibtex
@inproceedings{zhang2024from,
  title={From Transparent to Opaque: Rethinking Neural Implicit Surfaces with $\alpha$-NeuS},
  author={Zhang, Haoran and Deng, Junkai and Chen, Xuhui and Hou, Fei and Wang, Wencheng and Qin, Hong and Qian, Chen and He, Ying},
  booktitle={Proceedings of the Neural Information Processing Systems (NeurIPS)},
  year={2024}
}
```