These are some assignments I have done.
In most cases, the cost functions, gradient calculations, polynomial feature creation, feature normalization, and debugging plots are implemented manually using linear algebra, calculus, and the theory behind ML.
Below is a high-level overview of the contents of each folder:
1.) Ridge regression:
- cost function and gradient for ridge regression (see the sketch after this list)
- learning curves
- polynomial features
- validation curve
- learning curves with randomly selected examples
- model tuning using sklearn (Pipeline, GridSearchCV)
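
As a minimal sketch of what the ridge cost and gradient look like (names, signatures, and the assumption of a design matrix with a leading bias column are illustrative, not the exact code in the folder):

```python
import numpy as np

def ridge_cost_and_gradient(theta, X, y, lam):
    """Ridge (regularized) cost J(theta) and its gradient.

    X: (m, n) design matrix with a leading column of ones,
    y: (m,) targets, lam: regularization strength.
    The bias term theta[0] is conventionally not regularized.
    """
    m = X.shape[0]
    residuals = X @ theta - y
    cost = (1 / (2 * m)) * np.sum(residuals ** 2)
    cost += (lam / (2 * m)) * np.sum(theta[1:] ** 2)

    grad = (X.T @ residuals) / m
    grad[1:] += (lam / m) * theta[1:]
    return cost, grad
```

The sklearn tuning step listed above would typically chain PolynomialFeatures, StandardScaler, and Ridge in a Pipeline and sweep the regularization strength with GridSearchCV.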
2.) Linear regression:
- cost function
- gradient descent (parameter updates; see the sketch after this list)
- feature normalization
- cost function for multivariate regression
- normal equation
- verifying results using sklearn
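
A rough sketch of the main pieces above, feature normalization, batch gradient descent, and the normal equation (function names and signatures are illustrative only):

```python
import numpy as np

def feature_normalize(X):
    """Standardize each feature to zero mean and unit variance."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for (multivariate) linear regression.

    X: (m, n) design matrix with a leading column of ones, y: (m,) targets.
    Returns the learned theta and the cost history for debugging plots.
    """
    m, n = X.shape
    theta = np.zeros(n)
    history = []
    for _ in range(num_iters):
        residuals = X @ theta - y
        history.append((1 / (2 * m)) * np.sum(residuals ** 2))  # cost at current theta
        theta = theta - (alpha / m) * (X.T @ residuals)          # simultaneous update
    return theta, history

def normal_equation(X, y):
    """Closed-form solution theta = pinv(X^T X) X^T y."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y
```

Verifying against sklearn then amounts to comparing these parameters with the intercept and coefficients of a LinearRegression model fitted on the same data.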
3.) Logistic regression:
- sigmoid
- cost function and gradient
- polynomial features
- cost function and gradient for regularized logistic regression (see the sketch after this list)
- verifying results using sklearn
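
As a hedged sketch of the sigmoid and the regularized logistic cost/gradient (assuming, as above, a design matrix with a leading bias column; lam=0 recovers the unregularized case):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost_and_gradient(theta, X, y, lam=0.0):
    """Regularized logistic regression cost and gradient.

    X: (m, n) design matrix with a leading column of ones,
    y: (m,) labels in {0, 1}, lam: regularization strength.
    The bias term theta[0] is not regularized.
    """
    m = X.shape[0]
    h = sigmoid(X @ theta)
    eps = 1e-12  # guard against log(0)
    cost = -(y @ np.log(h + eps) + (1 - y) @ np.log(1 - h + eps)) / m
    cost += (lam / (2 * m)) * np.sum(theta[1:] ** 2)

    grad = (X.T @ (h - y)) / m
    grad[1:] += (lam / m) * theta[1:]
    return cost, grad
```

When checking against sklearn, note that LogisticRegression uses C as an inverse regularization strength, so the results should roughly match with C = 1/lam on the same polynomial features.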