Rotation-equivariant convolutional neural network for design of visual prosthetic stimulation protocol
Code for Martin Picek's bachelor thesis, supervised by Ján Antolík and Luca Baroni.
To clone the repository:
git clone --recurse-submodules git@github.com:mpicek/reCNN_visual_prosthesis.git
The bachelor thesis itself is in this GitHub repo.
Use a Docker image from this repository. It can be obtained from Docker Hub here; more on installation in the previous repository.
Run the image locally:
docker run --gpus all -it --rm -v local_dir:$(pwd) picekma/csng_docker_dl:0.1
Or on MetaCentrum:
singularity shell --nv -B $SCRATCHDIR /path/to/the/image.img
where you specify the path to your built Singularity container. The build is described in the repository with the Dockerfile.
In the container, execute source activate csng-dl to activate the conda environment.
Then run python train_on_lurz.py to start training the network.
To run an evaluation on the best models and see the results, run python present_best_models.py --dataset_type both.
Run python experiments/experiments.py to obtain information from the experiments as well as the generated graphs in the img/ directory.
Connect to MetaCentrum, clone this repository, build a Singularity image
from the Docker image we provide (previous section), and specify the path
to this image in metacentrum/qsub_script.sh, as well as the path to this repository.
Add your wandb API key to the file metacentrum/wandb_api_key.yaml in this format:
WANDB_API_KEY: your_api_key
Your wandb API key can be found in your wandb settings.
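The file can also be created from the shell; a minimal sketch (replace your_api_key with the key from your wandb settings):

```shell
# Write the key file expected by the MetaCentrum scripts.
# "your_api_key" is a placeholder for your actual wandb API key.
mkdir -p metacentrum
echo "WANDB_API_KEY: your_api_key" > metacentrum/wandb_api_key.yaml
cat metacentrum/wandb_api_key.yaml
```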
Configure a sweep in sweep.yaml.
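A minimal sweep.yaml sketch in wandb's sweep configuration format. The program name matches the training script above, but the metric and hyperparameter names here are hypothetical and must match what the training script actually logs and accepts:

```yaml
program: train_on_lurz.py
method: bayes              # wandb also supports grid and random
metric:
  name: val_loss           # hypothetical metric name; must be logged by the script
  goal: minimize
parameters:
  lr:                      # hypothetical hyperparameter name
    values: [0.01, 0.001, 0.0001]
```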
Create a sweep with wandb sweep sweep.yaml and copy the agent command it prints into metacentrum/cmd
so that the file looks like this (for example):
# run the same command again and again
wandb agent csng-cuni/reCNN_visual_prosthesis/6ggort1b
To start 3 machines on MetaCentrum that connect as sweep agents, use this command:
python3 ./run_commands.py --command_file=cmd --script=qsub_script.sh --wandb_api_key --num_of_command_repetitions=3
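run_commands.py itself is not shown here, but judging from its flags it presumably submits the command from the cmd file as repeated qsub jobs. A rough shell sketch of what --num_of_command_repetitions=3 implies (illustrative only; it echoes the submission instead of actually calling qsub, so it can run anywhere):

```shell
# Illustrative sketch: submit the same sweep-agent job 3 times.
# echo stands in for the real qsub call.
for i in 1 2 3; do
  echo "qsub metacentrum/qsub_script.sh  # submission $i"
done
```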