Non-modular SNN: Spiking Neural Networks for Visual Place Recognition via Weighted Neuronal Assignments
This code is licensed under the MIT License. If you use our non-modular SNN code, please cite our paper:

```bibtex
@article{hussaini2022spiking,
  title={Spiking Neural Networks for Visual Place Recognition via Weighted Neuronal Assignments},
  author={Hussaini, Somayeh and Milford, Michael J and Fischer, Tobias},
  journal={IEEE Robotics and Automation Letters},
  year={2022},
  publisher={IEEE}
}
```
Our recommended tool for installing all dependencies is conda (or, better, mamba). Please download and install conda or mamba if you have not done so already.
You can create the conda environment using one of the following options.
Option 1:

```shell
conda create -n vprsnn -c conda-forge python numpy matplotlib pathlib opencv tqdm pickle5 brian2 scikit-learn ipykernel numba cudatoolkit pytorch autopep8 pandas seaborn wandb
```

Option 2 (using the provided environment.yml file):

```shell
conda env create -f environment.yml
```

Option 3 (using the provided requirements.txt file):

```shell
conda create --name vprsnn --file requirements.txt -c conda-forge
```

Activate the conda environment:

```shell
conda activate vprsnn
```
- Please ensure you have created and activated the conda environment.
- Download the Nordland dataset (if not already available) from: https://huggingface.co/datasets/Somayeh-h/Nordland
Notes:
- We use the Spring and Fall traverses as our reference dataset, which we use to train our non-modular SNN.
- We use the Summer traverse as our query dataset.
- We remove sections where the train is moving at speeds below 15 km/h; the filtered list of images is provided in `dataset_imagenames/nordland_imageNames.txt`. Please note that this filtered image list corresponds to the variant of the Nordland dataset available at the link above.
- We sample both our reference and query datasets to extract places approximately every 100 m (every 8th image). Our sampling code is provided in `tools/data_utils.py`.
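The sampling step can be sketched as follows. This is a minimal illustration of keeping every 8th image, not the exact logic in `tools/data_utils.py`, and the image names are made up:

```python
# Sketch of the place-sampling step: keep every `step`-th image name
# (~100 m apart after the low-speed sections are filtered out).
def sample_places(image_names, step=8):
    """Return every `step`-th image name from the filtered list."""
    return image_names[::step]

# Example with hypothetical image names:
filtered = [f"images-{i:05d}.png" for i in range(32)]
places = sample_places(filtered)
print(len(places))  # 4 places sampled from 32 filtered images
```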
- Run `non_modular_snn/single_snn_model_processing.py` with `args.process_mode="train"` to:
    - Generate the initial synaptic weights of the SNN model using `tools/random_connection_generator.py`.
    - Train the SNN model using `non_modular_snn/snn_model.py` on the reference set.
- The trained weights will be stored in a subfolder of the "weights" folder, and can be used to test the performance.
- The output will be stored in a subfolder of the "outputs" folder, which also contains log files.
Train the non-modular SNN with the default configs locally:

```shell
python non_modular_snn/single_snn_model_processing.py --process_mode="train"
```
The trained weights of our Non-modular SNN on the Nordland dataset, using reference traverses spring and fall, are available here.
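The initial-weight generation step can be sketched as below. This is an illustration only: the population sizes, weight range, and storage format are assumptions, not the defaults used by `tools/random_connection_generator.py`.

```python
import numpy as np

# Sketch of generating random initial synaptic weights between an input
# population and an excitatory population. The sizes and weight range here
# are illustrative assumptions, not the repo's configuration.
rng = np.random.default_rng(seed=0)
n_input, n_e = 784, 100                      # assumed population sizes
weights = rng.random((n_input, n_e)) * 0.3   # uniform weights in [0, 0.3)

# Brian2-style synapse data is often stored as (i, j, w) triples:
i_idx, j_idx = np.meshgrid(np.arange(n_input), np.arange(n_e), indexing="ij")
triples = np.column_stack([i_idx.ravel(), j_idx.ravel(), weights.ravel()])
print(triples.shape)  # (78400, 3)
```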
- Run `non_modular_snn/single_snn_model_processing.py` with `args.process_mode="test"` to:
    - Test your trained model on the query set. The trained weights for a model with 100 places (the current configuration across all files) are provided in a subfolder of the "weights" folder.
    - Evaluate the performance of the model on the query set using `non_modular_snn/snn_model_evaluation.py`.
- Run `tools/weight_visualisations.py` to visualise the learnt weights.
- The output will be stored in the same "outputs" subfolder as during training, which also contains log files.
Test the non-modular SNN with the default configs locally:

```shell
python non_modular_snn/single_snn_model_processing.py --process_mode="test"
```
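The matching idea behind the evaluation can be sketched as: each output neuron is assigned to a reference place, and a query is matched to the place whose assigned neurons spike most. The sketch below uses made-up numbers and a plain mean-rate vote; it is not the paper's exact weighted neuronal assignment scheme:

```python
import numpy as np

# Sketch: predict a place from output-neuron spike counts via assignments.
# `assignments[k]` is the reference place label assigned to output neuron k.
# All numbers below are illustrative.
def predict_place(spike_counts, assignments, n_places):
    votes = np.zeros(n_places)
    for place in range(n_places):
        mask = assignments == place
        if mask.any():
            votes[place] = spike_counts[mask].sum() / mask.sum()  # mean rate
    return int(votes.argmax())

assignments = np.array([0, 0, 1, 1, 2])   # 5 output neurons, 3 places
spike_counts = np.array([1, 2, 9, 7, 3])  # spikes for one query image
print(predict_place(spike_counts, assignments, 3))  # place 1 wins the vote
```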
This work is supported by the Australian Government, Intel Labs, and the Queensland University of Technology (QUT) through the Centre for Robotics.