
Ironhack-Final-Project

Nowadays, most companies rely on demand planning forecasts to avoid surpluses and shortages, so that they can capture all available demand and turn it into revenue, as well as prevent any impact on their customer base due to infrastructure capacity.

The main motivation for this project lies in the second reason: allocating the appropriate resources at the necessary time periods so that the company can operate business as usual.

Goal

The aim of this project is to train a time series forecasting model to predict the dollar amount that will be disbursed in each market over a given time period. This helps the Operations Analyst determine the optimal collateral amount in each payout account so that eligible-to-be-advanced payments are delivered within 2-3 business days. The collateral figure to be allocated is defined by the below:
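A minimal sketch of the training step described above, using the fbprophet library named in the Stack section. The sample data, column values, and forecast horizon are illustrative assumptions; Prophet itself requires the history to be framed as `ds` (dates) and `y` (values):

```python
import pandas as pd

# Hypothetical weekly disbursement history for one market (values are
# made up for illustration; Prophet requires columns "ds" and "y").
history = pd.DataFrame({
    "ds": pd.date_range("2020-01-05", periods=104, freq="W"),
    "y":  [1_000_000 + 5_000 * i for i in range(104)],  # dollars disbursed
})

try:
    from fbprophet import Prophet  # library named in the Stack section
except ImportError:
    Prophet = None  # fbprophet may not be installed in every environment

if Prophet is not None:
    model = Prophet(interval_width=0.95)  # 95% confidence interval
    model.fit(history)
    future = model.make_future_dataframe(periods=12, freq="W")
    forecast = model.predict(future)  # yields yhat, yhat_lower, yhat_upper
```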

Deliverable

The pipeline produces a .csv file containing the prediction (yhat), the actual history (y), and the upper and lower boundaries of the 95% confidence interval (yhat_upper & yhat_lower). This file then feeds a Tableau dashboard where the user can evaluate the market forecast for a given time period and its reliability, measured by the MAPE (mean absolute percentage error).
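The deliverable can be sketched as follows. The frame below uses the column names listed above; the numbers and the output filename are illustrative assumptions:

```python
import numpy as np
import pandas as pd

# Toy forecast frame mirroring the pipeline's output columns.
out = pd.DataFrame({
    "ds": pd.date_range("2021-01-03", periods=4, freq="W"),
    "y":          [100.0, 120.0, 110.0, 130.0],   # actual history
    "yhat":       [ 98.0, 125.0, 108.0, 126.0],   # prediction
    "yhat_lower": [ 90.0, 115.0, 100.0, 118.0],   # 95% CI lower bound
    "yhat_upper": [106.0, 135.0, 116.0, 134.0],   # 95% CI upper bound
})

# MAPE over the rows where actuals exist, as a percentage.
mape = np.abs((out["y"] - out["yhat"]) / out["y"]).mean() * 100

# Export for the Tableau dashboard (filename is hypothetical).
out.to_csv("market_forecast.csv", index=False)
```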

Model explanation

Interpreting the results of the trained model using Prophet is quite intuitive. The three major components of a time series model built with Prophet are:

  • Trend
  • Holidays or special events
  • Seasonality

The library offers built-in methods for simple visualizations of the model components, as seen below:

Model Components

Below is a graph showing the previous model components, the actual history of the time series, and the observed trend changepoints:

Model Forecast
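Figures like the two above can be reproduced with Prophet's built-in plotting helpers. The sketch below assumes a fitted model and its forecast frame; the function name is illustrative:

```python
try:
    from fbprophet.plot import add_changepoints_to_plot
except ImportError:
    add_changepoints_to_plot = None  # fbprophet may not be installed

def plot_model(model, forecast):
    """Render the component and forecast figures for a fitted Prophet
    model (sketch; the function name is illustrative)."""
    # Trend, holidays, and seasonality panels.
    fig_components = model.plot_components(forecast)
    # Actual history plus the forecast and its confidence band.
    fig_forecast = model.plot(forecast)
    if add_changepoints_to_plot is not None:
        # Overlay the detected trend changepoints on the forecast plot.
        add_changepoints_to_plot(fig_forecast.gca(), model, forecast)
    return fig_components, fig_forecast
```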

Script execution

It is suggested to first run python main_script.py -h to see the available parameters for tuning the model. If no parameter is given when executing the program, it uses the default values specified for each variable in the .env file.
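The flag-with-.env-fallback pattern described above can be sketched with argparse. The flag and variable names here are hypothetical (the real ones live in the script and in package2/variables.py), and a loader such as python-dotenv would normally populate the environment from the .env file first:

```python
import argparse
import os

# Illustrative sketch: each CLI flag falls back to an environment
# variable (assumed to be loaded from .env) and then to a default.
parser = argparse.ArgumentParser(
    description="Train the forecasting model and export the predictions.")
parser.add_argument("--periods", type=int,
                    default=int(os.getenv("PERIODS", "12")),
                    help="number of future periods to forecast")
parser.add_argument("--market", default=os.getenv("MARKET", "all"),
                    help="market to forecast")

# Passing an empty list simulates running with no CLI arguments,
# so only the .env/default values apply.
args = parser.parse_args([])
```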

Directory Structure

├── cron.log
├── .env
├── .gitignore
├── data
│   ├── processed
│   └── raw
├── final-project-tableua-data-5c8837f9ded0.json
├── images
│   ├── forecast_changepoint.png
│   ├── forecast_components.png
│   └── Ironhack-Data_Flow-Diagram.png
├── main_script.py
├── notebooks
│   ├── notebook_1.ipynb
│   └── notebook_2.ipynb
├── package1
│   ├── acquire.py
│   ├── analysis.py
│   ├── load.py
│   └── wrangle.py
├── package2
│   └── variables.py
├── README.md
├── requirements.txt
└── __trash__

Stack

  • The model is trained using the Python library fbprophet.
  • Other Python libraries used to model the dataset are pandas, numpy, glob, argparse, pygsheets and matplotlib.
  • Tableau is the visualization tool used to show the results of the model.

Next iteration

  • Add consumer confidence index as regression variable to the model.
  • Train new model using ARIMA and show Prophet vs. ARIMA forecast results in Tableau visualizations.
  • Automate the data extraction directly from a database.
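The planned ARIMA comparison could start from something like the sketch below, using statsmodels as one common ARIMA implementation. The series values and the (1, 1, 1) order are placeholder assumptions, not tuned choices:

```python
import numpy as np

# Placeholder disbursement series for illustration only.
series = np.array([100.0, 120.0, 110.0, 130.0, 125.0, 140.0, 135.0, 150.0])

try:
    from statsmodels.tsa.arima.model import ARIMA  # candidate library
except ImportError:
    ARIMA = None  # statsmodels may not be installed

if ARIMA is not None:
    # Order (p, d, q) = (1, 1, 1) is an arbitrary starting point;
    # a real comparison would select it via AIC or cross-validation.
    fit = ARIMA(series, order=(1, 1, 1)).fit()
    arima_forecast = fit.forecast(steps=4)  # next four periods
```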
