
Project Varro

If FPGAs are universal function approximators, can they be used like neural networks?

This project builds on Adrian Thompson's well-known work on evolvable hardware, using modern FPGAs and recent advances in evolutionary algorithms, with the goal of universal function approximation (just like a neural network!).

Installation

Training (Fitting)

To train an individual on a problem, for example evolving the network over 500 generations using the multi-objective novelty search with reward (nsr-es) strategy, run:

```shell
python -m varro.algo.experiment --purpose 'fit' --cxpb 0 --ngen 500 --strategy 'nsr-es' --problem_type 'sinx'
```
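The nsr-es strategy scores each individual on both how novel its behavior is and how well it performs. A minimal sketch of such a blended fitness, using hypothetical helper names (not the project's actual code), assuming behaviors are fixed-length vectors:

```python
import numpy as np

def novelty(behavior, archive, k=3):
    """Mean Euclidean distance to the k nearest behaviors in the archive."""
    dists = np.linalg.norm(archive - behavior, axis=1)
    return float(np.sort(dists)[:k].mean())

def nsr_fitness(behavior, reward, archive, weight=0.5):
    """Blend novelty and task reward, as in novelty search with reward."""
    return weight * novelty(behavior, archive) + (1 - weight) * reward

# Toy archive of previously seen behaviors
archive = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
score = nsr_fitness(np.array([0.5, 0.5]), reward=1.0, archive=archive)
print(round(score, 3))
```

A behavior close to the archive gets a low novelty term, so the blend keeps pressure on the reward while still encouraging exploration.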

Prediction

To predict from a saved checkpoint using a NumPy file of inputs:

```shell
python -m varro.algo.experiment --purpose 'predict' --ckptfolder ./checkpoint/varro/algo/sinx_2019-Nov-16-19\:00\:41 --strategy 'nsr-es' --X ./varro/algo/X_test.npy
```
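The --X flag expects inputs saved in NumPy's .npy format. For the sinx problem, an input file could be generated like this (the filename and shape here are illustrative, not mandated by the project):

```python
import numpy as np

# 100 evenly spaced inputs over one period of sin(x)
X_test = np.linspace(0, 2 * np.pi, 100).astype(np.float32)
np.save("X_test.npy", X_test)

# Verify the round-trip
loaded = np.load("X_test.npy")
print(loaded.shape)
```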

Results

The following runs compare the three strategies on the sinx problem:

```shell
python -m varro.algo.experiment --purpose 'fit' --cxpb 0.0 --mutpb 1.0 --imutsigma 0.1 --ngen 100 --popsize 500 --strategy 'sga' --problem_type 'sinx' sinx_evolve_sga

python -m varro.algo.experiment --purpose 'fit' --cxpb 0.0 --mutpb 1.0 --imutsigma 0.1 --ngen 100 --popsize 500 --strategy 'ns-es' --problem_type 'sinx' sinx_evolve_ns-es

python -m varro.algo.experiment --purpose 'fit' --cxpb 0.0 --mutpb 1.0 --imutsigma 0.1 --ngen 100 --popsize 500 --strategy 'nsr-es' --problem_type 'sinx' sinx_evolve_nsr-es
```
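Since the three runs differ only in the --strategy value and the trailing run label, a small shell loop avoids repeating the command. This is a dry run that prints each command; drop the leading `echo` to actually launch the training runs:

```shell
# Dry run: print one fit command per strategy
for strategy in sga ns-es nsr-es; do
  echo python -m varro.algo.experiment --purpose 'fit' --cxpb 0.0 --mutpb 1.0 \
    --imutsigma 0.1 --ngen 100 --popsize 500 --strategy "$strategy" \
    --problem_type 'sinx' "sinx_evolve_${strategy}"
done
```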