This repository contains an implementation of a multi-feature linear regression model built from scratch in Python using Jupyter Notebook.
This project demonstrates the creation of a linear regression model that can handle multiple input features without relying on external machine learning libraries. The model includes the key steps of training via gradient descent and making predictions on new data.
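The gradient-descent training step mentioned above can be sketched as follows. This is a minimal batch-gradient-descent illustration on toy data; the function name, hyperparameters, and variable names are assumptions, not the notebook's exact code:

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=1000):
    """Fit weights w and bias b for y ~ X @ w + b by batch gradient descent on MSE."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        err = X @ w + b - y              # residuals, shape (n,)
        w -= lr * (2 / n) * (X.T @ err)  # gradient of MSE w.r.t. w
        b -= lr * (2 / n) * err.sum()    # gradient of MSE w.r.t. b
    return w, b

# Toy data: 5 samples, 2 features, generated from y = 2*x1 + 3*x2 + 1
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = X @ np.array([2.0, 3.0]) + 1.0

w, b = gradient_descent(X, y, lr=0.01, epochs=20000)
# w ≈ [2., 3.], b ≈ 1.
```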
- Uses the California Housing train dataset from Kaggle (https://www.kaggle.com/datasets/ujwal06/california-housing-train-csv)
- Support for multiple features (independent variables)
- Closed-form linear regression (the normal equation) implemented with NumPy matrix multiplication
- Data visualisation using Matplotlib
- Clear, commented code for educational purposes
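The closed-form fit listed above solves for all the weights in a single matrix computation. A minimal NumPy sketch on toy data (the notebook's actual variables and dataset columns may differ):

```python
import numpy as np

# Toy data: 5 samples, 2 features, generated from y = 2*x1 + 3*x2 + 1
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = X @ np.array([2.0, 3.0]) + 1.0

# Prepend a column of ones so the intercept is learned as theta[0]
Xb = np.hstack([np.ones((X.shape[0], 1)), X])

# Normal equation: theta = (X^T X)^(-1) X^T y.
# np.linalg.solve is preferred over an explicit inverse for numerical stability.
theta = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)  # theta ≈ [1., 2., 3.]

# Predict for a new sample, adding the same bias column
X_new = np.array([[2.0, 3.0]])
pred = np.hstack([np.ones((1, 1)), X_new]) @ theta  # pred ≈ [14.]
```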
- Clone the repository: `git clone https://github.com/razancodes/Multifeature-Linear-Regression.git`
- Open the Jupyter Notebook file in your preferred environment.
- Run the notebook cells sequentially to see the implementation and results step-by-step.
- Python 3.x
- Jupyter Notebook
This project is licensed under the MIT License - see the LICENSE file for details.
Contributions and suggestions are welcome :)
For any questions or feedback, do let me know :) Inspired by Greg Hogg's implementation of multi-variable linear regression from scratch: https://www.youtube.com/watch?v=KYNuzfn5Fx0
This repository is ideal for learners who want to understand the inner workings of multi-feature linear regression by building a model from scratch.