
Generalizing to Evolving Domains with Latent Structure-Aware Sequential Autoencoder


This repository provides the official implementations and experiments for our research on evolving domain generalization, covering LSSAE and MMD-LSAE. Both employ a sequential autoencoder architecture and are implemented within the same framework; the core code for both models can be found in network/vae_algorithms.py. Please follow the steps below to prepare the datasets, install the required packages, and run the code.

LSSAE (ICML 2022)



The network architecture for LSSAE.

Generalizing to Evolving Domains with Latent Structure-Aware Sequential Autoencoder
Tiexin Qin, Shiqi Wang and Haoliang Li
Paper: https://proceedings.mlr.press/v162/qin22a/qin22a.pdf

LSSAE is a VAE-based probabilistic framework that incorporates variational inference to identify the continuous latent structures of covariate shift and concept shift in latent space, separately and simultaneously, for the problem of non-stationary evolving domain generalization.
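To make the latent split concrete, here is a minimal PyTorch sketch of inferring separate covariate-shift and concept-shift codes via the reparameterization trick. The class name, layer sizes and single-vector input are illustrative assumptions, not the actual architecture in network/vae_algorithms.py.

import torch
import torch.nn as nn

class TwoFactorEncoder(nn.Module):
    # Toy two-branch inference network: one Gaussian head per latent factor.
    # All names, dimensions and layer choices are illustrative assumptions.
    def __init__(self, in_dim=784, feat_dim=128, z_dim=16):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.covariate_head = nn.Linear(feat_dim, 2 * z_dim)  # mu and logvar
        self.concept_head = nn.Linear(feat_dim, 2 * z_dim)    # mu and logvar

    @staticmethod
    def reparameterize(stats):
        # z = mu + sigma * eps, so gradients flow through mu and logvar.
        mu, logvar = stats.chunk(2, dim=-1)
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def forward(self, x):
        h = self.backbone(x)
        z_cov = self.reparameterize(self.covariate_head(h))  # covariate shift
        z_con = self.reparameterize(self.concept_head(h))    # concept shift
        return z_cov, z_con

z_cov, z_con = TwoFactorEncoder()(torch.randn(8, 784))  # two separate codes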

MMD-LSAE (TPAMI 2023)



The network architecture for MMD-LSAE.

Evolving Domain Generalization Via Latent Structure-Aware Sequential Autoencoder
Tiexin Qin, Shiqi Wang and Haoliang Li
Paper: https://ieeexplore.ieee.org/abstract/document/10268347

MMD-LSAE is built on LSSAE. Whereas LSSAE uses KL divergence to align each individual posterior distribution with its corresponding prior over the latent codes, MMD-LSAE aligns the aggregated posteriors with the priors by minimizing MMD, which yields a tighter lower bound for optimization, better representation learning and improved stability.
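For intuition, below is a minimal PyTorch sketch of a (biased) RBF-kernel MMD estimate between a batch of latent codes and prior samples. The kernel choice, bandwidth and tensor shapes are illustrative assumptions, not the exact estimator used in network/vae_algorithms.py.

import torch

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF kernel matrix between two sample batches.
    sq_dists = torch.cdist(x, y) ** 2
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def mmd2(post_samples, prior_samples, sigma=1.0):
    # Biased (V-statistic) estimate of squared MMD between the
    # aggregated posterior samples and the prior samples.
    k_pp = rbf_kernel(post_samples, post_samples, sigma).mean()
    k_qq = rbf_kernel(prior_samples, prior_samples, sigma).mean()
    k_pq = rbf_kernel(post_samples, prior_samples, sigma).mean()
    return k_pp + k_qq - 2 * k_pq

# Example: align a batch of 64 latent codes with a standard Gaussian prior.
z_post = torch.randn(64, 16)   # stand-in for encoder outputs
z_prior = torch.randn(64, 16)  # samples from N(0, I)
loss = mmd2(z_post, z_prior, sigma=1.0)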

Datasets

We provide the Google Drive links here, so you can download the datasets directly and move them to your preferred storage path.

Requirements

  • python 3.8
  • PyTorch 1.10 or above
  • PyYAML
  • tqdm

All the required packages can be installed via conda.
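For example (a sketch; the environment name lssae is arbitrary):

# Create and activate a fresh environment.
conda create -n lssae python=3.8
conda activate lssae

# Install the dependencies listed above.
conda install pytorch=1.10 torchvision -c pytorch
conda install pyyaml tqdm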

🚀 Quick Start

1. Toy Circle / Circle-C

cd ./LSSAE
chmod +x ./scripts/*

# 1. Specify the path for Circle in the script
    --data_path "/data/Toy_Circle/half-circle.pkl"

# 2. Run the script for LSSAE
./scripts/train_circle_lssae.sh    # Circle-C: ./scripts/train_circle_c_lssae.sh

# 3. Or run the script for MMD-LSAE
./scripts/train_circle_mmd.sh      # Circle-C: ./scripts/train_circle_c_mmd.sh

2. Other Datasets

To apply our method to other datasets, first copy one of the train_circle_*.sh scripts, then specify the dataset information and the network architectures to employ. An example for RMNIST is provided below:

# 1. Specify the dataset info for RMNIST
    --data_path "/data/DataSets"
    --num_classes 10
    --data_size '[1, 28, 28]'
    --source-domains 10
    --intermediate-domains 3
    --target-domains 6

# 2. Specify the feature extractor and classifier
    --model-func MNIST_CNN
    --cla-func Linear_Cla

# 3. Run the script for LSSAE
./scripts/train_rmnist_lssae.sh

# 4. Or run the script for MMD-LSAE
./scripts/train_rmnist_mmd.sh

For each dataset, the feature extractor (model_func in our implementation), the classifier (cla_func in our implementation) and the hyper-parameters need to be specified; a sketch of what such a pair might look like follows below. We provide detailed descriptions of the network architectures and most of the hyper-parameters in our Appendix. As this is a reproduced version that implements LSSAE and MMD-LSAE in one unified framework, the results may differ slightly from the papers. See ./logs for running records.
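As a rough illustration, here is a hypothetical PyTorch feature-extractor/classifier pair in the spirit of model_func and cla_func. The class names, layers and dimensions are assumptions for illustration; the actual interfaces are defined in the repository.

import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    # Hypothetical feature extractor (model_func-style component).
    def __init__(self, in_channels=1, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class LinearClassifier(nn.Module):
    # Hypothetical classifier (cla_func-style component).
    def __init__(self, feat_dim=128, num_classes=10):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_classes)

    def forward(self, feats):
        return self.fc(feats)

logits = LinearClassifier()(SmallCNN()(torch.randn(8, 1, 28, 28)))  # (8, 10)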

⭐ Notice

The originally released code for LSSAE alone can be found in the ori-lssae branch.

Citations

If you find this repo useful for your research, please cite the following papers:

@inproceedings{Qin2022LSSAE,
  title     = {Generalizing to Evolving Domains with Latent Structure-Aware Sequential Autoencoder},
  author    = {Tiexin Qin and Shiqi Wang and Haoliang Li},
  booktitle = {ICML},
  year      = {2022}
}

@article{Qin2023MMDLSAE,
  author  = {Qin, Tiexin and Wang, Shiqi and Li, Haoliang},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  title   = {Evolving Domain Generalization Via Latent Structure-Aware Sequential Autoencoder},
  year    = {2023},
  pages   = {1-14},
  doi     = {10.1109/TPAMI.2023.3319984}
}

Acknowledgments

Our code is influenced by the following repos: DomainBed and Disentangled Sequential Autoencoder.
