A beginner-friendly series on logging and monitoring FastAPI, Scrapy, Celery, and Docker containers using Prometheus, Grafana, and the EFK stack—all on one self-hosted machine.
This setup gives you centralized logging, metrics, and dashboards with minimal moving parts, making it ideal for small projects, self-hosted environments, and small production or dev/test workloads.
Visit my Medium publication for more detailed explanations:
- Observability on a Single EC2 - Intro & Setup
- Shipping Logs to Elasticsearch with Fluent Bit
- Auto-SSL/TLS with Traefik and Let’s Encrypt
- VM with at least 2 CPUs and 4 GB RAM
- Docker Compose Installed
- Domain (optional)
```bash
git clone https://github.com/Valeron-T/devops-for-one.git
cd devops-for-one
docker compose --profile traefik up -d
docker compose --profile services up -d
```

This will spin up:
- Elasticsearch
- Kibana
- Prometheus
- Grafana
- Traefik (for routing via subdomains like `kibana.localhost` and `grafana.localhost`)
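The two `--profile` flags used above correspond to the `profiles:` key in the compose files, which lets one setup define groups of services that start independently. A minimal sketch of that pattern (the service names, images, and profile names here are illustrative; the repo's actual compose files may differ):

```yaml
# Illustrative only: shows how compose profiles group services.
services:
  traefik:
    image: traefik:v3.0
    profiles: ["traefik"]    # started by: docker compose --profile traefik up -d

  grafana:
    image: grafana/grafana
    profiles: ["services"]   # started by: docker compose --profile services up -d
```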
Generally you will have services defined across several Docker Compose files. Traefik watches the Docker socket, reads the labels declared on each container, and automatically works out which routing configuration to apply, as shown in the sketch below.
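For example, routing a container through a subdomain typically takes only a few labels on the service. The snippet below is a sketch of that pattern (the router name, hostname, image tag, and port are assumptions, not the repo's exact configuration):

```yaml
# Illustrative Traefik labels for a single service.
services:
  kibana:
    image: docker.elastic.co/kibana/kibana:8.14.0
    labels:
      - "traefik.enable=true"
      # Route requests for kibana.localhost to this container
      - "traefik.http.routers.kibana.rule=Host(`kibana.localhost`)"
      # Kibana listens on port 5601 inside the container
      - "traefik.http.services.kibana.loadbalancer.server.port=5601"
```

When the container starts, Traefik picks up these labels and creates the corresponding router and service without any extra configuration files.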
🔧 Add entries to your `/etc/hosts` if you're running locally:

```
127.0.0.1 kibana.localhost grafana.localhost prometheus.localhost
```
- To ensure Grafana and Prometheus are set up correctly, visit `grafana.localhost`. You should see the login page. Log in as the `admin` user; the initial password is `admin` (this is configured in the compose file). Set a new, safe password when prompted after login.
- Add Prometheus as a data source (if you prefer configuration files over the UI, see the provisioning sketch after this list).
- Verify that Grafana can read data from Prometheus.
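Grafana can also provision the Prometheus data source from a file instead of the UI. A minimal sketch, assuming the Prometheus container is reachable as `http://prometheus:9090` on the shared Docker network and that the file is mounted under `/etc/grafana/provisioning/datasources/`:

```yaml
# e.g. provisioning/datasources/prometheus.yml (path and URL assumed)
apiVersion: 1
datasources:
  - name: Prometheus
    type: prometheus
    access: proxy
    url: http://prometheus:9090
    isDefault: true
```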
- The official Elasticsearch docs (https://www.elastic.co/docs/deploy-manage/security/set-up-minimal-security) explain the user setup process clearly; we will follow the same steps here.
- By default, Elasticsearch enables security features like authentication and role-based access control — which is great, but you’ll need to manually reset passwords the first time you start the stack.
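Depending on the Elasticsearch version, security may be enabled explicitly in the compose file or simply be on by default. A rough sketch of an explicit configuration (the image tag and environment entries are assumptions; the repo's file may differ):

```yaml
# Illustrative Elasticsearch service definition with security enabled.
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.14.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=true   # authentication and RBAC turned on
```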
```bash
docker exec -it elasticsearch bash
```

Inside the container, you'll use a built-in tool to reset user passwords. Start with the `elastic` user:

```bash
./bin/elasticsearch-reset-password -i -u elastic
```

You'll be prompted to enter and confirm a new password.

The `kibana_system` user is required for Kibana to authenticate with Elasticsearch, so reset its password as well:

```bash
./bin/elasticsearch-reset-password -i -u kibana_system
```

Once both are set, you can exit the container.
```bash
docker exec -it kibana bash
```

Inside the container, we'll create the keystore and set the password for the `kibana_system` user:

```bash
./bin/kibana-keystore create
./bin/kibana-keystore add elasticsearch.password
```

Then restart Kibana from the host:

```bash
docker restart kibana
```

- You should now see the Kibana login page, where you can log in as the `elastic` user using its credentials.
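For reference, the keystore entry pairs with Kibana's connection settings in `kibana.yml` (or their environment-variable equivalents). A sketch under the assumption that Elasticsearch is reachable as `http://elasticsearch:9200`:

```yaml
# Illustrative kibana.yml fragment; values are assumptions.
elasticsearch.hosts: ["http://elasticsearch:9200"]
elasticsearch.username: "kibana_system"
# elasticsearch.password is deliberately not set here;
# Kibana reads it from the keystore entry added above.
```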
My second article describes how to create roles and users in Elasticsearch, and how to configure Fluent Bit to ship logs to Elasticsearch. Please check that out for details: Shipping Logs to Elasticsearch with Fluent Bit
My third article describes how to deploy the stack on an EC2 instance, including setting up Traefik for automatic SSL/TLS with Let's Encrypt. Please check that out for details: Auto-SSL/TLS with Traefik and Let’s Encrypt