Official Implementation of Stingray Detection of Aerial Images Using Augmented Training Images Generated by a Conditional Generative Model
Created by Yi-Min Chou, Chien-Hung Chen, Keng-Hao Liu, Chu-Song Chen
- Crop the object patches (rotating and flipping them to augment the data) and randomly crop background patches to build the training set for the conditional GLO (C-GLO); a data-preparation sketch follows this list.
- Train the C-GLO and use the trained model to generate fake stingray patches.
- Paste the generated stingray patches back into their original positions.
- Use the augmented data generated by C-GLO to train the detection models.
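Below is a minimal sketch of the cropping, augmentation, and paste-back steps, assuming PIL images, `(x1, y1, x2, y2)` bounding boxes, and a 64x64 patch size; these names and parameters are illustrative assumptions, not the repository's actual interface:

```python
# Sketch of the data-preparation steps above. The box format, patch size,
# and function names are assumptions for illustration only.
import numpy as np
from PIL import Image

PATCH = 64  # assumed C-GLO patch size

def crop_object_patches(image, boxes):
    """Crop each object box and augment it with 4 rotations x 2 flips."""
    patches = []
    for box in boxes:
        patch = image.crop(box).resize((PATCH, PATCH))
        for k in range(4):
            rotated = patch.rotate(90 * k)
            patches.append(rotated)
            patches.append(rotated.transpose(Image.FLIP_LEFT_RIGHT))
    return patches

def overlaps(a, b):
    """True if axis-aligned boxes a and b intersect."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def crop_background_patches(image, object_boxes, num, rng=np.random):
    """Randomly crop background patches that avoid every object box."""
    w, h = image.size
    crops = []
    while len(crops) < num:
        x, y = rng.randint(0, w - PATCH), rng.randint(0, h - PATCH)
        box = (x, y, x + PATCH, y + PATCH)
        if not any(overlaps(box, b) for b in object_boxes):
            crops.append((box, image.crop(box)))
    return crops

def paste_back(image, box, generated):
    """Paste a generated stingray patch back at its original position."""
    out = image.copy()
    patch = Image.fromarray(generated.astype(np.uint8))
    out.paste(patch.resize((box[2] - box[0], box[3] - box[1])), box[:2])
    return out
```

The object boxes themselves would come from the dataset annotations fetched by `download.py` below.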
- Python 2.7
- TensorFlow 1.4.0 or higher
- Clone the ConditionalGLO repository:
$ git clone --recursive https://github.com/ivclab/ConditionalGLO.git
- Install required packages:
$ pip install -r requirements.txt
- Download Stingray Data:
$ python download.py
- Run the training code (a sketch of the C-GLO training objective follows these steps):
# The training result will be saved in `./logs/FOLDER_NAME/`
$ python main.py --is_train=True
- Run the testing code:
# The testing result will be saved in `./logs/FOLDER_NAME_test/`
$ python main.py --is_train=False --load_path=FOLDER_NAME
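For intuition about what the training step above optimizes: in GLO-style training, every training patch owns a learnable latent code, and the codes are optimized jointly with the generator weights under a reconstruction loss; C-GLO additionally feeds a condition label (background vs. stingray) to the generator, so that a background patch's code with a flipped label can be decoded into a stingray at that location. The following is a minimal TensorFlow 1.x sketch of this objective; the architecture, sizes, and names are illustrative assumptions, not the repository's actual model:

```python
# Minimal C-GLO-style objective (TensorFlow 1.x). Everything here is a
# simplified illustration; see main.py for the actual model.
import tensorflow as tf

N, Z_DIM, H, W = 1000, 64, 64, 64  # patch count, code size, patch size (assumed)

# One learnable latent code per training patch, optimized jointly with G.
codes = tf.get_variable('codes', shape=[N, Z_DIM],
                        initializer=tf.random_normal_initializer(stddev=0.01))

idx = tf.placeholder(tf.int32, [None])                # batch indices into `codes`
cond = tf.placeholder(tf.float32, [None, 1])          # 0 = background, 1 = stingray
target = tf.placeholder(tf.float32, [None, H, W, 3])  # real patches

z = tf.concat([tf.gather(codes, idx), cond], axis=1)  # condition the generator

def generator(z):
    x = tf.layers.dense(z, 8 * 8 * 128, activation=tf.nn.relu)
    x = tf.reshape(x, [-1, 8, 8, 128])
    x = tf.layers.conv2d_transpose(x, 64, 4, strides=2, padding='same',
                                   activation=tf.nn.relu)      # 16x16
    x = tf.layers.conv2d_transpose(x, 32, 4, strides=2, padding='same',
                                   activation=tf.nn.relu)      # 32x32
    return tf.layers.conv2d_transpose(x, 3, 4, strides=2, padding='same',
                                      activation=tf.nn.tanh)   # 64x64x3

fake = generator(z)

# Reconstruction loss; minimize() updates both the codes and the generator.
loss = tf.reduce_mean(tf.abs(fake - target))
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
```

At generation time, feeding a background patch's learned code with `cond` flipped to 1 yields the mixed background-and-foreground synthesis illustrated below.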
Original background image (top), mixed background-and-foreground synthesis generated by C-GLO (bottom)
Please cite the following papers if this code helps your research:
@inproceedings{chou2018stingray,
  title     = {Stingray Detection of Aerial Images Using Augmented Training Images Generated by a Conditional Generative Model},
  author    = {Chou, Yi-Min and Chen, Chien-Hung and Liu, Keng-Hao and Chen, Chu-Song},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops},
  pages     = {1403--1409},
  year      = {2018}
}
@inproceedings{chou2018changing,
  title     = {Changing Background to Foreground: An Augmentation Method Based on Conditional Generative Network for Stingray Detection},
  author    = {Chou, Yi-Min and Chen, Chien-Hung and Liu, Keng-Hao and Chen, Chu-Song},
  booktitle = {Proceedings of the IEEE International Conference on Image Processing (ICIP)},
  year      = {2018}
}
Please feel free to send suggestions or comments to Yi-Min Chou, Chien-Hung Chen ([email protected]), Keng-Hao Liu ([email protected]), or Chu-Song Chen ([email protected]).