This is a comprehensive analysis of five hyperparameter optimization (HPO) algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Differential Evolution (DE), PyHopper HPO, and Bayesian Optimization HyperBand (BOHB). Our comparison results are presented in our paper.
To execute the code, install the dependencies listed in requirements.txt (e.g., `pip install -r requirements.txt`) and download the Jupyter notebooks provided for each evolutionary algorithm and the other hyperparameter optimization algorithms.
As a collaborator on this project, I was primarily responsible for:
Implemented custom crossover, mutation, and selection strategies (see the operator sketch after this list)
Integrated fitness evaluation based on ANN performance (see the fitness-function sketch after this list)
Added early stopping and performance monitoring
Tuned hidden layer sizes, activation functions, and learning rates
Helped structure the solution encoding and bound constraints
Reviewed optimizer parameter tuning and comparison plots
Assisted in designing a combined optimizer architecture
Supported integration testing and analysis
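
To illustrate the GA operators mentioned above, here is a minimal sketch. It assumes each candidate is encoded as a real-valued vector of [hidden units, learning rate, activation index] clipped to bound constraints; the bounds, operator names, and parameter values are illustrative assumptions, not the repository's exact implementation.

```python
import numpy as np

# Illustrative sketch only: the notebooks' exact operators may differ.
# Each candidate encodes [hidden_units, learning_rate, activation_index]
# as a real-valued vector kept within the bounds below.
LOWER = np.array([8.0, 1e-4, 0.0])    # assumed lower bounds
UPPER = np.array([256.0, 1e-1, 2.0])  # assumed upper bounds

rng = np.random.default_rng(0)

def random_candidate():
    """Sample a candidate uniformly within the bound constraints."""
    return rng.uniform(LOWER, UPPER)

def tournament_selection(population, fitnesses, k=3):
    """Return a copy of the best of k randomly chosen candidates (higher fitness wins)."""
    idx = rng.choice(len(population), size=k, replace=False)
    best = max(idx, key=lambda i: fitnesses[i])
    return population[best].copy()

def uniform_crossover(parent_a, parent_b, p=0.5):
    """Take each gene from parent_a with probability p, otherwise from parent_b."""
    mask = rng.random(parent_a.shape) < p
    return np.where(mask, parent_a, parent_b)

def gaussian_mutation(candidate, rate=0.2, scale=0.1):
    """Perturb each gene with probability `rate`, then re-apply the bounds."""
    mask = rng.random(candidate.shape) < rate
    noise = rng.normal(0.0, scale * (UPPER - LOWER))
    mutated = np.where(mask, candidate + noise, candidate)
    return np.clip(mutated, LOWER, UPPER)
```

A generation would evaluate every candidate with the fitness function, pick parents by tournament selection, and produce offspring via crossover followed by mutation.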
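The fitness evaluation decodes a candidate into ANN hyperparameters, trains the network, and scores it on a validation split. The sketch below uses scikit-learn's MLPClassifier with its built-in early stopping and a placeholder dataset (load_digits); the actual notebooks may use a different network, dataset, and stopping criterion.

```python
import numpy as np
from sklearn.datasets import load_digits          # placeholder dataset, not the one from the paper
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

ACTIVATIONS = ["relu", "tanh", "logistic"]        # assumed activation choices

X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

def fitness(candidate):
    """Decode a candidate vector into ANN hyperparameters and return validation accuracy."""
    hidden_units = int(round(candidate[0]))
    learning_rate = float(candidate[1])
    activation = ACTIVATIONS[int(round(candidate[2]))]

    model = MLPClassifier(
        hidden_layer_sizes=(hidden_units,),
        activation=activation,
        learning_rate_init=learning_rate,
        early_stopping=True,        # stop once the internal validation score stops improving
        n_iter_no_change=10,
        max_iter=200,
        random_state=0,
    )
    model.fit(X_train, y_train)
    return model.score(X_val, y_val)  # higher is better; used directly as the GA fitness
```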
Kabir Grewal
GitHub Profile