The H36M, MPI-INF-3DHP, MPII, and TikTok datasets can be downloaded from their official websites.
For H36M preprocessing, refer to this repo.
MPII annotations in JSON format can be found at this URL.
The file tree should be like:
├── data
│ ├── hm36
│ │ ├── annot
│ │ │ ├── s_01_act_02_subact_01_ca_01
│ │ │ │ ├── matlab_meta.mat
│ │ │ │ ├── matlab_meta.txt
│ │ │ ├── s_01_act_02_subact_01_ca_02
│ │ │ ...
│ │ ├── images
│ ├── mpi_inf_3dhp
│ ├── mpii
│ ├── TikTok_dataset
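As a quick sanity check of the H36M annotation layout above, the per-sequence matlab_meta.mat files can be inspected with scipy; this is only a minimal sketch, and the exact fields depend on the preprocessing repo.

```python
# Minimal sketch (assumes scipy can read this .mat format): list the fields stored
# in one H36M annotation file to confirm the download and preprocessing worked.
import scipy.io

meta = scipy.io.loadmat("data/hm36/annot/s_01_act_02_subact_01_ca_01/matlab_meta.mat")
print([k for k in meta.keys() if not k.startswith("__")])  # field names, minus MATLAB header keys
```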
We provide the masks used in our experiments at this link.
For the mask processing code, refer to this file, where we use SAM with 2D keypoints to generate the masks.
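As a rough illustration of that step, here is a minimal sketch of prompting SAM with 2D keypoints; the checkpoint path and keypoint file below are placeholders, not files shipped with this repo.

```python
# Minimal sketch (assumed checkpoint/keypoint paths): prompt SAM with 2D keypoints
# as foreground points and keep the highest-scoring mask proposal.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")  # assumed local checkpoint
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("example.jpg"), cv2.COLOR_BGR2RGB)
keypoints_2d = np.load("example_joints2d.npy")  # hypothetical (J, 2) array of 2D keypoints

predictor.set_image(image)
masks, scores, _ = predictor.predict(
    point_coords=keypoints_2d.astype(np.float32),
    point_labels=np.ones(len(keypoints_2d), dtype=np.int32),  # all keypoints are foreground prompts
    multimask_output=True,
)
best_mask = masks[np.argmax(scores)]  # keep the highest-scoring proposal
cv2.imwrite("example_mask.png", (best_mask * 255).astype(np.uint8))
```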
The file tree should be like:
├── data
│ ├── sam_masks
│ │ ├── h36m
│ │ │ ├── s_01_act_02_subact_01_ca_01
│ │ │ │ ├── s_01_act_02_subact_01_ca_01_000001.png
│ │ │ │ ├── ...
│ │ ├── mpi_inf_3dhp
│ │ │ ├── S1
│ │ │ │ ├── Seq1
│ │ │ │ ├── ...
│ │ ├── mpii_val
│ │ │ ├── 000025245.jpg
│ │ │ ├── ...
SMPL models can be downloaded from this link.
The regressor J_regressor_h36m.npy can be downloaded from this link.
(Optional) The vertex segmentation file can be found here.
The file tree should be like:
├── data
│ ├── smpl_models
│ │ ├── basicModel_f_lbs_10_207_0_v1.0.0.pkl
│ │ ├── basicmodel_m_lbs_10_207_0_v1.0.0.pkl
│ │ ├── basicModel_neutral_lbs_10_207_0_v1.0.0.pkl
│ │ ├── smpl_vert_segmentation.json
│ │ └── J_regressor_h36m.npy
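As a minimal usage sketch (the vertex file below is hypothetical), the regressor maps the 6890 SMPL vertices to the 17 H36M joints with a single matrix product:

```python
# Minimal sketch: regress H36M joints from posed SMPL vertices.
import numpy as np

J_regressor_h36m = np.load("data/smpl_models/J_regressor_h36m.npy")  # (17, 6890)
vertices = np.load("example_smpl_vertices.npy")  # hypothetical (6890, 3) posed SMPL vertices

joints_h36m = J_regressor_h36m @ vertices  # (17, 3) H36M joints
print(joints_h36m.shape)
```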
For simplicity, you can download the SURREAL dataset by following the instructions in this repo.
You can also process it into our curated distribution with our reimplementation. We refactored the code of custumized_main_part.py and custumized_main_part2.py into a more structured form. The absolute paths in config and run.sh must be modified for your environment.
If successful, you will have the following file tree:
├── data
│ ├── surreal
│ │ ├── test
│ │ │ ├── run0
│ │ │ ├── ...
│ │ ├── train
│ │ └── val
│ │
│ ├── surreal_pseudo
│ │ ├── run0_0
│ │ ├── run0_1
│ │ │ ├── ...
Download the smpl_webuser folder from FLAME and place it under the surreal_data_construct folder.
The original SURREAL distribution and our synthetic distribution are selected by commenting/uncommenting lines in the main function of surreal_reader.py. You should verify this before processing.
```
cd surreal_data_construct
python surreal_reader.py
```
If successful, you will have the final file tree, where check_image visualizes the projected 3D keypoints and mesh on the image once every 1000 iterations:
├── data
│ ├── surreal_h36m_pose
│ │ ├── check_image
│ │ │ ├── check_000000_check.png
│ │ │ ├── ...
│ │ ├── image
│ │ │ ├── image_000000.png
│ │ │ ├── ...
│ │ ├── joints
│ │ ├── mask
│ │ └── info.npy
│ │
│ ├── surreal_h36m_pose_pseudo
│ │ ├── ...
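The check_image visualization above can be reproduced roughly with a sketch like the following; the intrinsics and keypoint file are placeholders, not the values used by our code.

```python
# Minimal sketch (assumed intrinsics/paths): project camera-space 3D keypoints onto
# an image with a pinhole model and draw them, similar in spirit to check_image.
import cv2
import numpy as np

image = cv2.imread("data/surreal_h36m_pose/image/image_000000.png")
joints_3d = np.load("example_joints3d.npy")  # hypothetical (J, 3) points in camera coordinates
K = np.array([[600.0, 0.0, 160.0],           # assumed focal lengths and principal point
              [0.0, 600.0, 120.0],
              [0.0, 0.0, 1.0]])

uvz = (K @ joints_3d.T).T        # perspective projection
uv = uvz[:, :2] / uvz[:, 2:3]    # divide by depth to get pixel coordinates
for u, v in uv:
    cv2.circle(image, (int(u), int(v)), 3, (0, 0, 255), -1)  # draw each projected joint
cv2.imwrite("check_projection.png", image)
```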
For the alternative part-segmentation synthetic data, you can download the code from this file and run it on Slurm:
```
./launch_gen.sh <partition> <num_gpu>
```
If successful, you will have the final file tree:
├── data
│ ├── surreal_h36m_pose
│ │ ├── image
│ │ │ ├── 0_cam_0_0.png
│ │ │ ├── ...
│ │ ├── joints
│ │ ├── mask
│ │ └── info.npy
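To quickly verify the generated data, a sketch like the one below can load one image/mask pair and the metadata file; the matching mask file name and the structure of info.npy are assumptions.

```python
# Minimal sketch (assumed file pairing and info.npy layout): load one generated sample.
import cv2
import numpy as np

info = np.load("data/surreal_h36m_pose/info.npy", allow_pickle=True)  # metadata; layout depends on the generator
image = cv2.imread("data/surreal_h36m_pose/image/0_cam_0_0.png")
mask = cv2.imread("data/surreal_h36m_pose/mask/0_cam_0_0.png", cv2.IMREAD_GRAYSCALE)  # assumed matching name

print(type(info), getattr(info, "shape", None))
print(image.shape, None if mask is None else mask.shape)
```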