- Senior engineer delivering data platforms from ingestion to insight and operations.
- Focused on reliability, cost awareness, and clear documentation and automation.
- Currently studying AI/MLOps to broaden impact beyond pipelines.
- Data Engineering and Data Lake/Warehouse (batch + streaming where needed)
- Analytics Engineering (dbt, modeling, testing, documentation)
- Platform/IaC and CI/CD (Terraform, GitHub Actions)
- MLOps (packaging, deployment, and monitoring of models and data apps)
- Cloud/IaC: AWS, Terraform, Kubernetes, Docker, GitHub Actions
- Data: Snowflake, Databricks, S3, Parquet, Delta
- Orchestration: Airflow, dbt
- Processing: Python, PySpark, SQL
- Observability: Grafana/Prometheus (where applicable)
Pinned repos:
- Snowflake + Airflow + dbt + Terraform
  - Production-ready Terraform for Snowflake warehouses, databases, and schemas; Makefile automation; Airflow 3.1 setup; dbt execution via DAGs (a minimal DAG sketch follows this list).
- Databricks + dbt + Airflow
  - dbt transformations orchestrated on Databricks via Airflow, with provider setup, troubleshooting notes, and ops scripts.
- PySpark API Lab
  - Minimal Spark I/O and environment sanity checks for reliable local/dev runs (see the sanity-check sketch below).
- Flink Quickstart
  - Local Flink setup, job submission, and UI overview.
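As a rough illustration of "dbt execution via DAGs" from the first repo above, here is a minimal sketch of a dbt build orchestrated by Airflow. The DAG id, schedule, project directory, and target are illustrative assumptions, not values from the repo; the BashOperator import shown is the Airflow 3 provider path (on Airflow 2.x it lives in `airflow.operators.bash`).

```python
# Minimal sketch: run and test a dbt project from an Airflow DAG.
# dag_id, schedule, project dir, and target are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.standard.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the models, then run dbt tests against the same target.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/airflow/dbt --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/airflow/dbt --target prod",
    )
    dbt_run >> dbt_test
```

A plain BashOperator keeps the dependency surface small; operator-based integrations such as Cosmos are an alternative when per-model task visibility in the Airflow UI is needed.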
Tip: See the pinned repos for docs, diagrams, and scripts. Each repo includes Quickstart and Troubleshooting sections.
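And in the spirit of the PySpark API Lab entry above, a minimal local sanity check: round-trip a tiny DataFrame through Parquet to confirm the Spark session, local I/O, and Python/JVM plumbing all work. The output path and column names are illustrative assumptions.

```python
# Minimal sketch: local Spark environment sanity check (assumed path/columns).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")                 # local dev run, all cores
    .appName("pyspark-sanity-check")
    .getOrCreate()
)

# Round-trip a tiny DataFrame through Parquet to exercise read/write I/O.
df = spark.createDataFrame([(1, "ok"), (2, "ok")], ["id", "status"])
df.write.mode("overwrite").parquet("/tmp/pyspark_sanity_check")

result = spark.read.parquet("/tmp/pyspark_sanity_check")
assert result.count() == 2, "unexpected row count after Parquet round trip"
print(f"Spark {spark.version} sanity check passed")

spark.stop()
```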
- Medium: https://medium.com/@thestoneageddeveloper
  - From Chemical Engineering to Tech
  - Growth in Big Data
  - Embracing Failure and Imposter Syndrome
- LinkedIn: https://www.linkedin.com/in/faisalmomoniat/
- Email: faisalmomoniat@gmail.com

