I design and build modular data automation pipelines that ingest, clean, structure, analyze, and forecast data with minimal human intervention. My foundation is in Mechanical Engineering, which taught me systems thinking: how to design frameworks that balance logic, precision, and scalability. Today, I apply that mindset to Data Engineering, Machine Learning, Applied AI, and Automation, building tools and infrastructure that turn raw data into continuous, self-learning intelligence. I don’t just analyze data; I build the systems that make intelligence repeatable, scalable, and automatic.
I’m fluent in Python and SQL, and deeply familiar with the major data and AI libraries that power modern analytics, from pandas, numpy, scikit-learn, Prophet, and Streamlit to automation workflows built on APIs and schedulers.
My focus is engineering data intelligence systems that operate independently, learn continuously, and serve real-world contexts.
🚀 ADIP (Automated Data Intelligence Platform)
I’m developing a modular engineering framework, sketched in code after this list, that automates:
- Data ingestion (APIs, scrapers, batch feeds)
- ETL processing
- Forecasting models
- Analytical summaries and human-readable insights
- Optional dashboards for exploration
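A minimal sketch of how these stages can be composed as independent, swappable modules. The endpoint URL, column names, and function names here are illustrative placeholders, not ADIP's actual interfaces.

```python
"""Sketch of a modular ingest -> clean -> summarize pipeline.
All names (EXAMPLE_URL, 'enrollment', etc.) are illustrative placeholders."""
import pandas as pd
import requests

EXAMPLE_URL = "https://example.com/api/education-stats"  # placeholder endpoint


def ingest(url: str) -> pd.DataFrame:
    """Pull raw records from an API and load them into a DataFrame."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())


def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Basic ETL step: drop duplicates and coerce the date column."""
    df = df.drop_duplicates()
    df["date"] = pd.to_datetime(df["date"], errors="coerce")
    return df.dropna(subset=["date"])


def summarize(df: pd.DataFrame, value_col: str = "enrollment") -> str:
    """Produce a short human-readable insight from the cleaned data."""
    latest = df.sort_values("date").iloc[-1]
    return f"Latest {value_col}: {latest[value_col]:,.0f} on {latest['date']:%Y-%m-%d}"


def run_pipeline(url: str) -> str:
    """Compose the stages; each one can be swapped independently."""
    return summarize(clean(ingest(url)))


if __name__ == "__main__":
    print(run_pipeline(EXAMPLE_URL))
```

Keeping each stage a plain function over a DataFrame is what lets ingestion sources, models, and output formats be swapped without touching the rest of the pipeline.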
The first deployment of this system is the Autonomous EdTech Analyst for Africa, a system built to process education datasets, detect patterns, generate narratives, and support real-world decision-making.
ADIP is my long-term foundation for industry-grade data intelligence systems applicable to FinTech, AgriTech, Logistics, and Public Policy analytics.
I’m driven by a long-term vision to build Africa’s Analytical Intelligence Infrastructure: scalable systems that democratize access to intelligence, improve data-driven decision-making, and empower local innovation. Every project I build is a node in that vision: automation pipelines, modular analysts, forecasting engines, and intelligence frameworks designed for reliability in low-resource environments.
Languages
- Python
- SQL
Data Engineering & Automation
- pandas, numpy
- ETL pipelines
- API integration
- Web scraping
- Scheduling: Cron, GitHub Actions, Render
- Uptime monitoring
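As an illustration of how these pieces fit together, here is a small ETL job written to be triggered by cron or a scheduled GitHub Actions workflow; the source URL and output path are placeholder assumptions.

```python
"""Sketch of a scheduled ETL job intended to run under cron or GitHub Actions.
The source URL and output path are placeholders for real pipeline settings."""
import logging
from pathlib import Path

import pandas as pd
import requests

SOURCE_URL = "https://example.com/api/daily-feed"  # placeholder API
OUTPUT_PATH = Path("data/daily_feed.csv")          # placeholder destination

logging.basicConfig(level=logging.INFO)


def extract() -> pd.DataFrame:
    """Fetch the day's records from the source API."""
    resp = requests.get(SOURCE_URL, timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Normalize column names and drop fully empty rows."""
    df.columns = [c.strip().lower() for c in df.columns]
    return df.dropna(how="all")


def load(df: pd.DataFrame) -> None:
    """Write the cleaned batch to local storage (CSV keeps the sketch dependency-free)."""
    OUTPUT_PATH.parent.mkdir(parents=True, exist_ok=True)
    df.to_csv(OUTPUT_PATH, index=False)


if __name__ == "__main__":
    frame = transform(extract())
    load(frame)
    logging.info("Loaded %d rows into %s", len(frame), OUTPUT_PATH)
```

A cron entry such as `0 6 * * * python etl_job.py`, or a `schedule:` trigger in a GitHub Actions workflow, can then run the job daily.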
Machine Learning & Forecasting
- scikit-learn
- Prophet
- Feature engineering
- Model evaluation
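A hedged example of the forecast-and-evaluate loop: Prophet fitted on a synthetic monthly series, with a six-month holdout scored via scikit-learn. A real pipeline would substitute actual data and richer feature engineering.

```python
"""Sketch of a Prophet forecast with a simple holdout evaluation.
The synthetic monthly series stands in for a real dataset."""
import pandas as pd
from prophet import Prophet
from sklearn.metrics import mean_absolute_error

# Synthetic monthly series in Prophet's expected ds/y format.
history = pd.DataFrame({
    "ds": pd.date_range("2020-01-01", periods=48, freq="MS"),
    "y": [100 + i * 2 + (i % 12) * 5 for i in range(48)],
})

# Hold out the last six months for evaluation.
train, test = history.iloc[:-6], history.iloc[-6:]

model = Prophet()
model.fit(train)

# Forecast far enough ahead to cover the holdout window.
future = model.make_future_dataframe(periods=6, freq="MS")
forecast = model.predict(future)

# Score the holdout months against the model's predictions.
predicted = forecast.set_index("ds").loc[test["ds"], "yhat"]
print("Holdout MAE:", mean_absolute_error(test["y"], predicted))
```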
Visualization & Data Apps
- matplotlib, seaborn, plotly
- Streamlit
- FastAPI / Flask
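A minimal Streamlit sketch of the kind of exploratory data app this stack enables; the expected `ds`/`yhat` columns are an assumption matching the forecasting example above.

```python
"""Minimal Streamlit sketch for exploring a forecast table.
The expected CSV columns ('ds', 'yhat') are placeholder assumptions."""
import pandas as pd
import plotly.express as px
import streamlit as st

st.title("Forecast Explorer")

# Placeholder: in practice this would be the pipeline's exported forecast.
uploaded = st.file_uploader("Upload a forecast CSV with 'ds' and 'yhat' columns")

if uploaded is not None:
    df = pd.read_csv(uploaded, parse_dates=["ds"])
    st.plotly_chart(px.line(df, x="ds", y="yhat", title="Forecast"))
    st.dataframe(df.tail(12))
```

Saved as `app.py`, it runs with `streamlit run app.py`.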
Software Engineering
- Git & GitHub
- Modular code design
- Environment & dependency management
- Deployment workflows (Render, Streamlit Cloud)
- Data Engineering
- Automated ETL Pipelines
- Machine Learning & Forecasting
- Full-Stack Data Applications
- Applied AI & Automation
- Systems thinking & architecture design
- Cloud Architectures (AWS, GCP)
- Containerization (Docker)
- CI/CD for data systems
- MLOps automation
Every line of code, every pipeline, and every model I deploy serves a singular goal: To create machines that reason, so humans can focus on wisdom.
📧 Email: charleskohwo@gmail.com
💼 LinkedIn: www.linkedin.com/in/charles-onokohwomo
🧠 Portfolio: Coming soon through ADIP and related projects