This is the official implementation of our CVPR 2019 paper "Learning Attraction Field Representation for Robust Line Segment Detection".
We reformulate the problem of line segment detection (LSD) as a coupled region coloring problem. This new formulation lets us address LSD with convolutional neural networks.
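As a rough illustration of the idea (a minimal sketch, not the repository's code; all helper names below are ours), each pixel stores the 2-D offset to its closest point on its closest line segment, which implicitly "colors" the pixel with that segment's support region:

```python
# Toy attraction field map: for every pixel, the vector to the nearest
# point on the nearest segment. Illustrative sketch only.
import numpy as np

def point_to_segment(px, py, x1, y1, x2, y2):
    """Closest point on segment (x1,y1)-(x2,y2) to each pixel (px,py)."""
    dx, dy = x2 - x1, y2 - y1
    t = ((px - x1) * dx + (py - y1) * dy) / max(dx * dx + dy * dy, 1e-12)
    t = np.clip(t, 0.0, 1.0)
    return x1 + t * dx, y1 + t * dy

def attraction_field(height, width, segments):
    """2 x H x W map of offsets from each pixel to its nearest segment."""
    afm = np.zeros((2, height, width), dtype=np.float32)
    best = np.full((height, width), np.inf)
    ys, xs = np.mgrid[0:height, 0:width]
    for (x1, y1, x2, y2) in segments:
        cx, cy = point_to_segment(xs, ys, x1, y1, x2, y2)
        d2 = (cx - xs) ** 2 + (cy - ys) ** 2
        closer = d2 < best                  # pixels "colored" by this segment
        best[closer] = d2[closer]
        afm[0][closer] = (cx - xs)[closer]  # x-offset to nearest segment point
        afm[1][closer] = (cy - ys)[closer]  # y-offset to nearest segment point
    return afm

afm = attraction_field(8, 8, [(1.0, 1.0, 6.0, 2.0)])  # tiny example
```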
| Methods | Wireframe Dataset (F-measure) | YorkUrban Dataset (F-measure) | FPS |
| --- | --- | --- | --- |
| LSD | 0.647 | 0.591 | 19.6 |
| MCMLSD | 0.566 | 0.564 | 0.2 |
| Linelet | 0.644 | 0.585 | 0.14 |
| Wireframe Parser | 0.728 | 0.627 | 2.24 |
| Ours (U-Net) | 0.752 | 0.639 | 10.3 |
| Ours (atrous) | 0.773 | 0.646 | 6.6 |
Check INSTALL.md for installation instructions.
- Wireframe Dataset: https://github.com/huangkuns/wireframe
- YorkUrban Dataset: http://www.elderlab.yorku.ca/resources/york-urban-line-segment-database-information/
Please follow the links above to download the Wireframe and YorkUrban datasets. For the Wireframe dataset, only the file pointlines.zip is needed; it contains the images and line segment annotations for training and testing.
Once the files are downloaded, please unzip them into <AFM_root>/data/wireframe_raw and <AFM_root>/data/york_raw respectively. The structures of the wireframe_raw and york_raw folders are as follows:
```
wireframe_raw/
  - pointlines/*.pkl
  - train.txt
  - test.txt

york_raw/
  - filename0_rgb.png
  - filename0.mat
  ...
  - filename{N}_rgb.png
  - filename{N}.mat
```
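Each pointlines/*.pkl file bundles an image with its line segment annotations. A quick way to peek at one, without assuming its internal key names (paths are relative to <AFM_root>):

```python
# Inspect one annotation file; prints the top-level structure rather
# than assuming particular key names.
import glob
import pickle

path = sorted(glob.glob("data/wireframe_raw/pointlines/*.pkl"))[0]
with open(path, "rb") as f:
    ann = pickle.load(f)
print(type(ann), list(ann.keys()) if isinstance(ann, dict) else None)
```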
Please run the following commands to prepare the datasets:

```
cd <AFM_root>/data/
python preparation_wireframe.py
python preparation_york.py
```
We use YACS to manage the hyperparameters. Our configuration files for the U-Net (afm_unet.yaml) and the atrous Residual U-Net (afm_atrous.yaml) are saved in the "<AFM_root>/experiments" folder.
In each YAML file, SAVE_DIR specifies where the network weights and experimental results are stored: weights go to SAVE_DIR/weights and results to SAVE_DIR/results/DATASET_name.
The TEST configuration controls how results are output during the testing phase. We currently provide two output modes, "display" and "save"; you can add custom output methods in modeling/output/output.py.
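For reference, a YACS config can be loaded and overridden from Python with the standard YACS API; the key name TEST.OUTPUT_MODE below is a hypothetical example, not necessarily the one used in our YAML files:

```python
# Sketch of loading/overriding a YACS config; run from <AFM_root>.
from yacs.config import CfgNode as CN

cfg = CN(new_allowed=True)  # accept whatever keys the yaml defines
cfg.merge_from_file("experiments/afm_atrous.yaml")
# Override a value from code; "TEST.OUTPUT_MODE" is a hypothetical key,
# so check the yaml for the actual name before uncommenting:
# cfg.merge_from_list(["TEST.OUTPUT_MODE", "display"])
cfg.freeze()                # make the config read-only
print(cfg)
```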
The pretrained models for U-Net and atrous Residual U-Net can be downloaded from this link. Please place the weights into "<AFM_root>/experiments/unet/weight" and "<AFM_root>/experiments/atrous/weight" respectively.
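As a quick sanity check that a downloaded checkpoint deserializes correctly (PyTorch is assumed as the serialization format, and the filename below is a placeholder for whatever the download contains):

```python
# Hedged sanity check; the filename and format are assumptions.
import torch

ckpt = torch.load("experiments/atrous/weight/checkpoint.pth", map_location="cpu")
print(type(ckpt))
```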
- For testing, please run the following command:

```
python test.py --config-file experiments/afm_atrous.yaml --gpu 0
```
- For training, please run the following command:

```
python train.py --config-file experiments/afm_atrous.yaml --gpu 0
```

To speed up training, the code saves the generated attraction field maps into <AFM_root>/data/wireframe/.cache the first time the training code runs; later runs reuse the cache.
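The caching could look roughly like the sketch below, which reuses the toy attraction_field helper from the first snippet; the real code's cache layout may differ:

```python
# Illustrative cache-then-reuse pattern for generated attraction field
# maps; paths and names are assumptions, not the repository's API.
import os
import numpy as np

CACHE_DIR = "data/wireframe/.cache"

def cached_afm(image_id, segments, height, width):
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, f"{image_id}.npy")
    if os.path.exists(path):
        return np.load(path)  # reuse the map generated on a previous run
    afm = attraction_field(height, width, segments)  # from the sketch above
    np.save(path, afm)
    return afm
```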
If you find our work useful in your research, please consider citing:
```
@inproceedings{AFM,
  title     = {Learning Attraction Field Representation for Robust Line Segment Detection},
  author    = {Nan Xue and Song Bai and Fudong Wang and Gui-Song Xia and Tianfu Wu and Liangpei Zhang},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2019}
}
```