This is a reusable neural network implemented completely from scratch. It comes pre-trained on a couple of example problems.
NOTE:
Using the sine-wave approximation model requires you to normalize inputs to the range (0, 1) and to map the output back to the sine's range using
(output * 2) - 1
The hidden layers use tanh activation, while the output layer uses sigmoid activation. These activations can easily
be modified inside the forward_h() and forward_o() functions in the Neuron class.
To test the result of a single input, use the predict() function. This function does not change the network's
weights or bias values.
This is the function that begins the training process. The number of epochs determines the number of training cycles
the network will run for. The outputs are logged every 1000 epochs.
formatOutputs is a bool parameter set to false by default. Setting it to true will round the outputs to 0 or 1.
After finishing the training process, you can save all learning progress into a .txt file using the N.save() function.
If you wish to continue training later, use the N.load() function to resume from where you left off.
NOTE:
Continuing the training process requires you to use the same architecture as the first time, since the number of weights and biases depends on the number of layers, neurons, inputs, and outputs.
Network(numInputs, numOutputs, numHidden, nPerHidden)
This is the Network constructor. It consists of the following elements:
int numInputs               --> The number of inputs the Network will receive
int numOutputs              --> The number of neurons in the output layer
int numHidden               --> The number of hidden layers
std::vector<int> nPerHidden --> The number of neurons for each hidden layer
Compile the project with:
g++ main.cpp funcs.cpp classes.cpp -o main.exe