Multi-container application consisting of multiple microservices and data sources.
Technologies: Docker, Kubernetes, Helm, Nginx, React, Node.js, Express, GraphQL, Postgres, MongoDB, Elasticsearch, Logstash, Kibana, Jenkins, and Google Cloud.
The Nginx web server accepts requests from a client, forwards each request to an upstream server that can fulfill it, and returns that server's response to the client.
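As a sketch, the reverse-proxy routing described above could look like the Nginx config below. The upstream names, hostnames, and path rewrites are assumptions inferred from the service list in this README, not the project's actual config.

```nginx
# Hypothetical upstreams; hostnames/ports inferred from the service list below.
upstream person_server  { server person-server:5005; }
upstream address_server { server address-server:5007; }
upstream company_server { server company-server:5008; }

server {
    listen 8888;

    # Strip the /api/<service> prefix and forward to the matching upstream,
    # e.g. /api/person/all -> person-server's /person/all
    location /api/person/  { proxy_pass http://person_server/person/; }
    location /api/address/ { proxy_pass http://address_server/address/; }
    location /api/company/ { proxy_pass http://company_server/company/; }
}
```

Because each `proxy_pass` ends with a URI part, Nginx replaces the matched `location` prefix with it, which is what maps `/api/person/all` on port 8888 to `/person/all` on the person-server.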
# Client running on
http://localhost:8888
# Upstream person-server
http://localhost:8888/api/person/all
# Upstream address-server
http://localhost:8888/api/address/all
# Upstream company-server
http://localhost:8888/api/company/all

Express Server that fetches addresses from the MongoDB database.
# Running on
http://localhost:5007
# Endpoints
/address/all
/address/:id

Express Server that fetches persons from the Postgres database.
# Running on
http://localhost:5005
# Endpoints
/person/all
/person/:id

Express Server that fetches companies from Elasticsearch.
# Running on
http://localhost:5008
# Endpoints
/company/all
/company/:id

Apollo Server that fetches data from an external API and exposes the data in a GraphQL API.
# Running on
http://localhost:5009
# Sample request
curl --request POST \
--header 'content-type: application/json' \
--url http://localhost:5009/graphql \
--data '{"query":"query { __typename }"}'

Express Server that fetches data from an external API and populates pieces of that data into three different data sources (MongoDB, Postgres and Elasticsearch).
# Running on
http://localhost:5006
# Endpoints
/pour/all # populates all data sources
/pour/address # populates MongoDB
/pour/person # populates Postgres
/pour/company # populates Elasticsearch

Elasticsearch is used to store, search, and manage data for the company-server (Express server).
Obtaining Elasticsearch for Docker is as simple as issuing a docker pull command
against the Elastic Docker registry:
docker pull docker.elastic.co/elasticsearch/elasticsearch:7.10.2

Docker Compose is used to start a multi-node Elasticsearch cluster in Docker. This configuration provides a simple method of starting a secured cluster for development before building a distributed deployment with multiple hosts.
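For reference, a minimal Compose file for running Elasticsearch locally can look like the sketch below. This is a single-node development shortcut, not the project's actual multi-node file; the service name, memory settings, and port mapping are assumptions.

```yaml
# Hypothetical docker-compose.yml fragment for local development.
version: "3.8"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.2
    environment:
      - discovery.type=single-node      # skip cluster bootstrapping for dev
      - ES_JAVA_OPTS=-Xms512m -Xmx512m  # cap the JVM heap on a dev machine
    ports:
      - "9200:9200"
```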
Communication with Elasticsearch is done through the index APIs, which are used to manage individual indices, index settings, aliases, mappings, and index templates.
# Container running
http://localhost:9200
# Create index
PUT /<index>
# Get index
GET /<target>
# Delete index
DELETE /<index>

Elasticsearch provides a full Query DSL (Domain Specific Language) based on JSON to define queries.
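For example, a basic full-text match query in the Query DSL looks like the request below; the index and field names here are illustrative, not taken from the project's mappings.

```
GET /<index>/_search
{
  "query": {
    "match": {
      "name": "acme"
    }
  }
}
```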

Logstash is a server-side data processing pipeline that ingests data and persists it in Elasticsearch.
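A Logstash pipeline is configured with input and output plugin blocks. The fragment below is only an illustrative sketch of that shape — the input source and index name are placeholders, not the project's actual pipeline.

```
# Hypothetical pipeline.conf; input source and index name are placeholders.
input {
  stdin { }                          # placeholder input source
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "company"               # assumed index name
  }
}
```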
Run the following script before starting Logstash with Docker Compose for local development:
# Creates a 'logstash/query' directory in 'tmp' with read/write permissions
./scripts/fix-logstash.sh

Access the Postgres CLI inside the container:
# Enter container
docker exec -it postgres /bin/sh
# Postgres CLI
psql --username postgres
# Commands
$ \c <dbname> # switch connection to new database
$ \l # list databases
$ \dt # list tables
$ \d+ <table> # describe table

# Build image
docker build -t person-server -f ./person-server/Dockerfile.dev ./person-server
# Run container
docker run -p 5005:5005 person-server

# Start containers
docker-compose up
# Stop containers
docker-compose down

The local Jenkins server is integrated with a local Kubernetes cluster (Minikube) and Google Cloud.
Continuous Integration and Continuous Delivery are handled by a local Jenkins server running on Kubernetes via Minikube. The Jenkins documentation is located here.
Todo...
Deploy Kubernetes resources using a Helm Chart:
npm run helm:deploy

Accessing services (ClusterIP):
kubectl port-forward service/person-server-service 5005:5005
# Navigate to
http://localhost:5005

TODO...

