A vision-based end-to-end autonomous driving framework for the CARLA 0.8 benchmarks. This is a PyTorch implementation of the Mixture of Domain-specific Experts (MoDE) framework (paper).
This repository contains the following modules:
- Disentanglement_VAE: disentangles domain-specific and domain-general features from paired images using a cycle-consistent VAE.
- ACTION_MINE: predicts action values to control the ego vehicle using representation learning and a mixture-of-experts model.
- Major dependencies
- Python 3.7
- PyTorch 1.6
- CUDA 10.2
- Importing the provided Anaconda environment (torch.yaml) is recommended.
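  (For example, assuming Anaconda is installed, the environment can be imported with: conda env create -f torch.yaml)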
- The method for collecting driving data in the CARLA simulator is described in this repository.
- You can download the data from this document.
- First Stage: Training the cycle-consistent VAE
- Collect paired images using the CARLA data collector
- Move to the "Disentanglement_VAE" directory
- Modify the dataset path variables (train_pair, eval_pair) in train_CycleVAE_lusr_v2.py (see the sketch below)
- Run the script using the command below
python train_CycleVAE_lusr_v2.py --id="ID for this training"
(Download a pre-trained weight file from here)
- The trained weights are saved at save_models/id/id.pth, where id is the value passed to --id
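The snippet below is a minimal, hypothetical sketch of this first-stage setup, assuming train_CycleVAE_lusr_v2.py reads plain path variables and writes a standard PyTorch checkpoint; the paths and run id are placeholders, and the actual variable layout in the script may differ.

```python
import torch

# Hypothetical placeholder paths: point train_pair / eval_pair inside
# train_CycleVAE_lusr_v2.py at your paired-image dataset.
train_pair = "/path/to/pair_images/train"
eval_pair = "/path/to/pair_images/eval"

# After running the script with --id="my_run", the checkpoint is expected at
# save_models/my_run/my_run.pth. Whether it holds a state_dict or a whole
# serialized module depends on how the training script calls torch.save.
ckpt = torch.load("save_models/my_run/my_run.pth", map_location="cpu")
print(type(ckpt))
```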
- Second Stage: Training the autonomous driving framework
- Collect a driving dataset using the CARLA data collector
- Move to the "ACTION_MINE" directory
- Run the script using the command below
python main_wo_weatmask_posi_50_v2_gating.py --id="ID for this training" --train-dir="Training Dataset Path" --eval-dir="Evaluating Dataset Path" --vae-model-dir="Weight path trained by train_CycleVAE_lusr_v2.py"
(Download a pre-trained weight file from here)
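(Here, --vae-model-dir should point to the first-stage checkpoint, i.e. the save_models/id/id.pth file produced by train_CycleVAE_lusr_v2.py.)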
- Evaluate the trained framework using the CARLA benchmark
- Third Stage: Running the benchmark
- Go to the CARLA 0.8.X folder
- Run the CARLA simulator
(Town01) sh CarlaUE4.sh /Game/Maps/Town01 -windowed -world-port=2000 -benchmark -fps=10 -ResX=800 -ResY=600
(Town02) sh CarlaUE4.sh /Game/Maps/Town02 -windowed -world-port=2000 -benchmark -fps=10 -ResX=800 -ResY=600
* You can change these parameters (window mode, world port, fps, resolution) according to your experimental conditions; the benchmark client must connect to the same world port.
- Run the evaluation script in 'driving-benchmark-AAAI' using the command below
python run_representation_action_mine_posi50_gating.py --corl-2017 (or --carla100) --continue-experiment --model-path='Weight path trained by main_wo_weatmask_posi_50_v2_gating.py' --vae-model-dir="Weight path trained by train_CycleVAE_lusr_v2.py"
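* The --corl-2017 and --carla100 flags select which benchmark suite is evaluated, and --continue-experiment is intended to resume a previously started run under the same experiment name instead of restarting it from scratch.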