This repository contains tools and data for detecting classical chart patterns from OHLCV time series.
- `algo_dataset/` – Automatically generated CSV datasets for six chart patterns.
- `create_dataset/` – Python package for creating synthetic chart pattern datasets.
- `backtest/` – Notebook demonstrating model predictions on real data.
- `train/` – Notebook for training a CNN‑LSTM classifier.
- `save_model/` – Pretrained model (`chart_pattern_model.h5`).
- `notepad.txt` – Notes about the external Kaggle dataset and pattern codes.
- Ascending Triangle
- Ascending Wedge
- Descending Triangle
- Descending Wedge
- Double Top
- Double Bottom
Synthetic datasets are produced using the utilities in `create_dataset`. Example usage:

```python
from create_dataset import create_dataset

create_dataset(generation_count=1000, n_min=50, n_max=120)
```

This generates random OHLCV sequences for each pattern type and saves them under `algo_dataset/`.
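For a quick sanity check, a generated file can be inspected with pandas and rendered as a candlestick chart using mplfinance (both listed under requirements below). The file path and column names in this sketch are illustrative assumptions; adjust them to match the CSVs that `create_dataset` actually writes.

```python
import pandas as pd
import mplfinance as mpf

# Hypothetical file path and lowercase OHLCV column names -- adapt these to
# the layout actually produced under algo_dataset/.
df = pd.read_csv("algo_dataset/ascending_triangle/sample_0.csv")

# mplfinance expects a DatetimeIndex and Open/High/Low/Close/Volume columns.
df.index = pd.date_range("2020-01-01", periods=len(df), freq="D")
df = df.rename(columns=str.capitalize)[["Open", "High", "Low", "Close", "Volume"]]

mpf.plot(df, type="candle", volume=True, title="Synthetic ascending triangle")
```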
The notebook `train/train.ipynb` loads the generated CSV files, constructs sliding windows, and trains a CNN‑LSTM model to classify the six patterns. The resulting model is saved to `save_model/chart_pattern_model.h5`.
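For orientation, the sketch below shows the general shape of such a pipeline: slicing OHLCV arrays into fixed-length windows and stacking Conv1D layers in front of an LSTM. The window length, feature count, and layer sizes are assumptions for illustration, not the exact values used in `train/train.ipynb`.

```python
import numpy as np
from tensorflow.keras import layers, models

WINDOW = 60      # assumed window length in timesteps
N_FEATURES = 5   # open, high, low, close, volume
N_CLASSES = 6    # the six chart patterns listed above

def make_windows(series: np.ndarray, window: int = WINDOW) -> np.ndarray:
    """Slice a (T, features) array into overlapping (window, features) segments."""
    return np.stack([series[i:i + window] for i in range(len(series) - window + 1)])

# Convolutional front end extracts local shape features; the LSTM summarizes
# their temporal ordering before the softmax classifier.
model = models.Sequential([
    layers.Input(shape=(WINDOW, N_FEATURES)),
    layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
    layers.LSTM(64),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train, y_train, validation_split=0.1, epochs=20, batch_size=64)
```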
Required libraries include:
- TensorFlow / Keras
- pandas
- numpy
- scikit-learn
- mplfinance
`backtest/backtest.ipynb` shows how to load the trained model and run predictions on OHLCV data fetched with `pykrx`. The notebook visualizes each segment with the predicted pattern and its probability.
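As a rough outline of that prediction step, the snippet below loads the pretrained model, fetches daily OHLCV data, and scores sliding windows. The pykrx call, ticker, date range, preprocessing, and label order are assumptions made for illustration; the notebook remains the reference.

```python
import numpy as np
from pykrx import stock
from tensorflow.keras.models import load_model

model = load_model("save_model/chart_pattern_model.h5")

# Daily OHLCV for a KRX ticker (Samsung Electronics, purely as an example).
df = stock.get_market_ohlcv_by_date("20240101", "20241231", "005930")
prices = df.iloc[:, :5].to_numpy(dtype="float32")   # first five columns: OHLCV

WINDOW = 60   # must match the window length and preprocessing used in training
windows = np.stack([prices[i:i + WINDOW]
                    for i in range(len(prices) - WINDOW + 1)])

probs = model.predict(windows)                      # (n_windows, 6) probabilities
labels = ["ascending_triangle", "ascending_wedge", "descending_triangle",
          "descending_wedge", "double_top", "double_bottom"]  # assumed label order
for p in probs[-5:]:                                # show the most recent windows
    print(labels[int(p.argmax())], f"{p.max():.2%}")
```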
The repository includes over 600 MB of synthetic CSV data under `algo_dataset/`. References to the original Kaggle data are listed in `notepad.txt`.