The 100 Letters Project is driven by the desire to promote real human interaction in an increasingly digital world and create meaningful connections through handwritten communication. Over the course of a year I will write 100 letters to 100 individuals.
The 100 Letters Project website showcases these exchanges, offering a digital display of the letters with details about the recipients and the reasons behind their selection.
This repository is part of the 100 Letters Project, which includes the following repositories:

- 100 Letters Project API: The backend API.
- 100 Letters Project Client: The Next.js client.
- 100 Letters Project Lambda@Edge: The Lambda@Edge functions.
- 100 Letters Project Authorizer: The custom authorizer.
## Table of Contents

- Project Overview
- Environments
- Tech Stack
- Setup Instructions
- Commits & Commitizen
- Linting & Formatting
- Unit Tests & Code Coverage
- Error & Performance Metrics
- Development Website Proxy
- E2E Tests
- Lighthouse
- Accessibility
- Deep Linking & Cognito Authentication
- Build
- Deployment Pipelines
- Cognito Access Token
- Connecting to the Bastion Host
- License
## Project Overview

This project uses a Next.js static export served from an S3 bucket behind AWS CloudFront. All read data is fetched at build time from the 100 Letters Project API. Write operations and routes are protected via user login and AWS Cognito authentication.
## Environments

The 100 Letters Project operates in multiple environments to ensure smooth development, testing, and production workflows. Each environment includes custom DNS and a separate CloudFront distribution.
The test environment is protected via signed cookies and is inaccessible to the public.
- https://dev.onehundredletters.com
- https://www.dev.onehundredletters.com
The production environment is open to the public.
- https://onehundredletters.com
- https://www.onehundredletters.com
## Tech Stack

The 100 Letters Project website is built using modern web technologies to ensure a fast, scalable, and cost-effective site.
- **AWS CloudFormation**: Infrastructure as Code (IaC) used to define and provision AWS resources like S3, CloudFront and IAM roles in a consistent and repeatable manner.
- **Next.js**: A React-based framework used to build the website, providing server-side rendering (SSR), static site generation (SSG), and optimized performance for SEO and fast load times.
- **AWS CloudFront**: A content delivery network (CDN) used to distribute the website globally, reducing latency and improving load times by caching static assets close to users.
- **AWS S3**: Hosts the statically generated website, ensuring high availability, durability, and cost-efficient storage for the project's front-end assets.
- **GitHub Actions**: A CI/CD pipeline automating the deployment process, including build verification, static analysis, and rollback capabilities in case of failed deployments.
- **Cypress**: An end-to-end (E2E) testing framework that ensures the website functions correctly across different user flows by simulating real interactions with the UI.
- **Lighthouse**: A performance and accessibility auditing tool that evaluates site speed, best practices, and SEO, helping optimize the user experience.
- **Jest**: A JavaScript testing framework used for unit and integration testing, ensuring code reliability and preventing regressions.
- **ESLint & Prettier**: Linting and formatting tools that enforce code consistency, reduce syntax errors, and improve maintainability across the codebase.
- **Commitizen**: A tool for enforcing a standardized commit message format, improving version control history and making collaboration more structured.
- **HTTP Proxy**: Used to sign cookies and securely proxy requests to the development website, enabling local testing and authentication workflows.
- **AWS Lambda@Edge**: Provides authentication and request routing logic at CloudFront's edge locations, enabling low-latency security enforcement.
- **Sentry.io**: Monitors the website for runtime errors and performance issues in production, helping quickly identify and fix bugs that occur in users' browsers.
This tech stack ensures that the 100 Letters Project website remains performant, secure, and easily maintainable while leveraging AWS infrastructure for scalability and reliability.
## Setup Instructions

To clone the repository, install dependencies, and run the project locally, follow these steps:
1. Clone the repository:

   ```bash
   git clone https://github.com/jessemull/100-letters-project.git
   ```

2. Navigate into the project directory:

   ```bash
   cd 100-letters-project
   ```

3. Install the root dependencies:

   ```bash
   npm install
   ```

4. Set environment variables:

   The development server has a `predev` script that checks for static data loaded during the build at `public/data/data.json`. If the file is missing, the script fetches the data required to run the application. The `predev` script uses machine user credentials to obtain a Cognito token using the following environment variables, set in `.env.test` and `.env.production` files in the root of the project:

   ```bash
   COGNITO_USER_POOL_USERNAME=cognito-user-pool-username
   COGNITO_USER_POOL_PASSWORD=cognito-user-pool-password
   NEXT_PUBLIC_COGNITO_USER_POOL_CLIENT_ID=cognito-user-pool-client-id
   ```

5. Start the dev server:

   ```bash
   npm run dev
   ```
## Commits & Commitizen

This project uses Commitizen to ensure commit messages follow a structured format and versioning is consistent. Commit linting is enforced via a pre-commit Husky hook.

To make a commit in the correct format, run the following command. Commitizen will walk the user through the creation of a structured commit message and version bump:
```bash
npm run commit
```

## Linting & Formatting

This project uses ESLint and Prettier for code quality enforcement. Linting runs in every CI/CD pipeline to ensure consistent standards.
Run linting:

```bash
npm run lint
```

Format using Prettier:

```bash
npm run format
```

lint-staged is configured to run linting before each commit. The commit is blocked if linting fails, ensuring code quality at the commit level.
## Unit Tests & Code Coverage

This project uses Jest for testing. Code coverage is enforced in every CI/CD pipeline. The build will fail if any tests fail or coverage drops below 80%.
Run tests:

```bash
npm run test
```

Run tests in watch mode:

```bash
npm run test:watch
```

Fishery factory functions providing mock data are available for all of the models used by the 100 Letters Project. Factories are located at `src/factories`:
- Correspondences
- Letters
- Recipients
Coverage thresholds are enforced at 80% for all metrics. The build will fail if coverage drops below this threshold.
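In Jest, this kind of gate is expressed with the `coverageThreshold` option. A minimal sketch of the relevant `jest.config.js` fragment (the values simply mirror the 80% figure above; the project's actual config may differ):

```javascript
// Illustrative jest.config.js fragment: fail the run if any global coverage
// metric drops below 80%. In a real config this object is part of module.exports.
const coverageThreshold = {
  global: {
    branches: 80,
    functions: 80,
    lines: 80,
    statements: 80,
  },
};
```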
## Error & Performance Metrics

This project uses Sentry.io for client-side error and performance monitoring. Since the website is statically built with Next.js, only the browser runtime is instrumented.
Sentry is initialized in the browser with environment-specific settings. Errors and performance data are reported to Sentry’s test or production environments based on the active deployment stage.
Sourcemaps are uploaded during the build process to allow readable stack traces in the Sentry dashboard. Because the site is a Next.js static export, sourcemaps must be uploaded manually.
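As an illustration of the environment-specific settings described above, the browser-side options might look like the following (the `NEXT_PUBLIC_SENTRY_DSN` env-var name and the sample rate are assumptions, not the project's actual values):

```javascript
// Illustrative Sentry browser options; the DSN env-var name and sample rate
// below are assumptions, not the project's actual configuration.
const sentryOptions = {
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN, // assumed env-var name
  // Report to Sentry's test or production environment per deployment stage.
  environment: process.env.NODE_ENV === 'production' ? 'production' : 'test',
  tracesSampleRate: 0.1, // sample a fraction of transactions for performance data
};
// In the browser runtime this object would be passed to Sentry.init(sentryOptions).
```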
## Development Website Proxy

The development/test website environment is protected via signed cookies. The proxy server signs cookies and then proxies requests to the development domain.
The following environment variables must be set in a `.env.local` file in the root of the project to run the proxy:

```bash
CLOUDFRONT_KEY_PAIR_ID=cloudfront-key-pair-id
CLOUDFRONT_DOMAIN=test-domain
CLOUDFRONT_PRIVATE_KEY_PATH=path-to-private-key
```
To run the proxy and access the test website:
```bash
npm run proxy
```

## E2E Tests

This project uses Cypress for end-to-end testing. The build will fail on end-to-end test failure.
E2E tests for the development/test domain must be run through the proxy server with the appropriate environment variables set (see running the proxy). The proxy caches files and uses a temporary redirect (307) after setting signed cookies so it does not impact performance.
Start the dev server:

```bash
npm run dev
```

Run E2E tests against the local dev server:

```bash
npm run e2e
```

Start the proxy server:

```bash
npm run proxy
```

Run E2E tests against the test environment:

```bash
NODE_ENV=test npm run e2e
```

Run E2E tests against the production environment:

```bash
NODE_ENV=production npm run e2e
```

## Lighthouse

Lighthouse is used for performance, SEO and accessibility metrics. It is fully integrated into the CI/CD pipeline and runs on pull request, merge and deployment.
Lighthouse tests for the development/test domain must be run through the proxy server with the appropriate environment variables set (see running the proxy).
To have the proxy work as expected without impacting performance, throttling is set to `devtools` in the Lighthouse config, and the proxy caches files and uses a temporary redirect (307) after setting signed cookies.
Start the dev server:

```bash
npm run dev
```

Run Lighthouse against the local dev server:

```bash
npm run lighthouse
```

Start the proxy server:

```bash
npm run proxy
```

Run Lighthouse against the test environment:

```bash
NODE_ENV=test npm run lighthouse
```

Run Lighthouse against the production environment:

```bash
NODE_ENV=production npm run lighthouse
```

Lighthouse score thresholds are enforced at 80% for all metrics. The build will fail if any score drops below this threshold.
## Accessibility

Accessibility metrics are measured using Lighthouse. All components are unit tested using the jest-axe library.
## Deep Linking & Cognito Authentication

This project uses Lambda@Edge to enable deep linking and secure access control for `/admin` routes, optimizing performance and security.
Lambda@Edge allows direct access to routes like `/login`, enabling deep linking without redirects. The function runs at CloudFront edge locations to reduce latency.
For `/admin` routes, the Lambda function checks for a Cognito auth token in request cookies. If the token is valid, access is granted; otherwise it is denied, ensuring secure access without unnecessary overhead.
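A minimal sketch of such a viewer-request handler, assuming the token travels in an `accessToken` cookie. The cookie name, the `.html` rewrite rule, and the presence-only check (a real implementation would verify the JWT's signature and expiry) are illustrative, not the project's actual Lambda@Edge code:

```javascript
// Sketch of a viewer-request Lambda@Edge handler combining deep linking and
// /admin protection. In the real Lambda bundle this would be exported as
// exports.handler.
const handler = async (event) => {
  const request = event.Records[0].cf.request;
  const cookies = (request.headers.cookie || [])
    .flatMap((header) => header.value.split(';'))
    .map((cookie) => cookie.trim());

  // Deny /admin routes unless a Cognito token cookie is present. A production
  // handler would verify the JWT signature and expiry, not just presence.
  if (request.uri.startsWith('/admin')) {
    const hasToken = cookies.some((cookie) => cookie.startsWith('accessToken='));
    if (!hasToken) {
      return { status: '401', statusDescription: 'Unauthorized' };
    }
  }

  // Deep linking: map extensionless routes like /login to the exported
  // /login.html object in S3, so direct navigation works without redirects.
  if (!request.uri.includes('.') && request.uri !== '/') {
    request.uri = request.uri.replace(/\/$/, '') + '.html';
  }
  return request;
};
```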
## Build

This project uses a Next.js static export. Prior to the build step, a pre-build script fetches read data from the 100 Letters Project API. A post-build script uses next-sitemap to generate robots.txt and sitemaps. The build itself is run using Next.js internals.
The following environment variables must be set in `.env.test` and `.env.production` files in the root of the project:

```bash
API_URL=one-hundred-letters-api-url
NEXT_PUBLIC_COGNITO_USER_POOL_CLIENT_ID=cognito-user-pool-client-id
COGNITO_USER_POOL_PASSWORD=cognito-user-pool-password
COGNITO_USER_POOL_USERNAME=cognito-user-pool-username
```
The pre-build script generates a Cognito token using machine user credentials, fetches the read data, and writes it to a static file at `public/data/data.json`.
To run the pre-build script:

```bash
npm run prebuild
```

The post-build script uses the next-sitemap package to generate sitemaps and robots.txt for SEO purposes.

To run the post-build script:

```bash
npm run postbuild
```

To build the project using the test environment:

```bash
NODE_ENV=test npm run build
```

To build the project using the production environment:

```bash
NODE_ENV=production npm run build
```

The development server has a `predev` script that checks for static data loaded during the build at `public/data/data.json`. If the file is missing, the script fetches the data and writes it. The `predev` script uses machine user credentials to obtain a Cognito token.
The following environment variables must be set in `.env.test` and `.env.production` files in the root of the project:

```bash
NEXT_PUBLIC_COGNITO_USER_POOL_CLIENT_ID=cognito-user-pool-client-id
COGNITO_USER_POOL_PASSWORD=cognito-user-pool-password
COGNITO_USER_POOL_USERNAME=cognito-user-pool-username
```
To run the pre-dev script:

```bash
npm run predev
```

To run the development server:

```bash
npm run dev
```

## Deployment Pipelines

This project uses automated deployment pipelines to ensure a smooth and reliable deployment process utilizing AWS CloudFormation, GitHub Actions and S3.
The deployment process for this project ensures reliability and consistency through a combination of versioned artifacts, automated deployments, and rollback mechanisms. The strategy involves the following key components:
- **Versioned Artifacts**: Every deployment is tied to a unique timestamped backup stored in Amazon S3. These backups allow for easy restoration and rollback to a previous state if necessary.
- **GitHub Actions Pipelines**: Automated deployment workflows manage the build, test, and deployment processes. These workflows ensure that each change is properly validated, tested, and deployed to either the test or production environment based on user input. Manual deployment and rollback are also supported through GitHub Actions.
- **CloudFormation**: Infrastructure management is handled via AWS CloudFormation, which enables version-controlled deployments and updates. This ensures that infrastructure changes, such as bucket configurations or IAM roles, are consistently applied across environments.
- **Backup and Rollback**: Each deployment to S3 includes a backup of the previous state, enabling easy rollback if any issues arise. If Lighthouse performance tests fail after deployment, the previous version is automatically restored from the backup, and the CloudFront cache is invalidated to immediately reflect the changes.
- **Manual and Automated Triggers**: Deployments are typically triggered by pushes to the main branch, but manual triggers (via GitHub Actions) are also available for both deployments and rollbacks. This provides flexibility in controlling the deployment process based on the current needs of the team or project.
This strategy ensures that the deployment process is automated, reliable, and easy to manage, with robust rollback options to handle any issues that may arise during deployment.
- AWS CLI: Configures the AWS environment for deployments.
- GitHub Actions: Automates and schedules the deployment and rollback pipelines.
- CloudFormation: Orchestrates infrastructure changes, including deployments and rollbacks.
- S3: Stores function packages for deployment and rollback.
This pipeline automates the validation process for pull requests targeting the main branch. It ensures that new changes are properly built, linted, tested and evaluated for performance before merging.
The pipeline performs the following steps:
- Build Next.js – Checks out the code, installs dependencies, builds the Next.js application, and archives the output.
- Lint Code – Runs ESLint to check for syntax and style issues.
- Run Unit Tests – Executes Jest tests, uploads a coverage report, and ensures test coverage meets the required threshold.
- Run E2E & Lighthouse Tests – Starts a local server, runs Cypress end-to-end tests, and performs Lighthouse performance checks.
This pipeline is defined in the `.github/workflows/pull-request.yml` file.
This pipeline automates the deployment of the Next.js application to an S3 bucket using a static export. It supports deployment to either the test or production environment based on user input.
The pipeline performs the following steps:
- Build Next.js – Checks out the code, installs dependencies, builds the Next.js application, and archives the output.
- Run Unit Tests – Executes Jest tests, uploads a coverage report, and ensures test coverage meets the required threshold.
- Backup S3 – Creates a timestamped backup of the existing deployment in S3 before deploying new changes.
- Deploy to S3 – Downloads the built application and syncs it to the designated S3 bucket.
This workflow is triggered manually via GitHub Actions using `workflow_dispatch`, allowing users to specify the target environment. The pipeline is defined in the `.github/workflows/deploy.yml` file.
This workflow runs automatically when changes are pushed to the main branch. It builds, tests, deploys the Next.js application to an S3 bucket, and runs end-to-end (E2E) and Lighthouse tests. If Lighthouse tests fail, the deployment is rolled back.
The workflow performs the following steps:
- Build Next.js – Checks out the code, installs dependencies, builds the application, and archives the output.
- Run Unit Tests – Executes Jest tests, uploads a coverage report, and ensures test coverage meets the required threshold.
- Backup S3 – Creates a timestamped backup of the current deployment before uploading the new build.
- Deploy to S3 – Downloads the archived build output, syncs it to the S3 bucket, and invalidates the CloudFront cache.
- Run E2E Tests – Executes Cypress tests against the deployed site.
- Run Lighthouse Tests – Performs performance and accessibility audits using Lighthouse.
- Rollback Deployment – If Lighthouse tests fail, the workflow restores the previous deployment from the backup.
This workflow is defined in the `.github/workflows/merge.yml` file.
This workflow allows manual rollback of a deployed Next.js application by restoring a selected backup from S3. It is triggered via GitHub Actions' workflow dispatch and requires specifying a backup timestamp and environment.
The workflow performs the following steps:
- Restore Selected Backup – Synchronizes the selected backup from the S3 backup bucket to the main deployment bucket, deleting any newer files.
- Invalidate CloudFront Cache – Clears the CloudFront cache to ensure users receive the restored version immediately.
This workflow is defined in the `.github/workflows/rollback.yml` file.
## Cognito Access Token

All write routes are protected via Cognito User Pools. A valid access token is required to use these endpoints and access the UI.
To generate a valid access token:

```bash
npm run token
```

To use the API, add the token to the Authorization request header:

```bash
curl -X POST "https://bgv89ajo02.execute-api.us-west-2.amazonaws.com/<stage>/<route>" \
  -H "Authorization: Bearer <token>"
```

The following environment variables must be set in a `.env` file in the root of the project to generate a token:
```bash
COGNITO_USER_POOL_ID=cognito-user-pool-id
COGNITO_USER_POOL_USERNAME=cognito-user-pool-username
COGNITO_USER_POOL_PASSWORD=cognito-user-pool-password
COGNITO_USER_POOL_CLIENT_ID=cognito-user-pool-client-id
```
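For illustration, a token can be obtained through Cognito's InitiateAuth API using the USER_PASSWORD_AUTH flow. The helper below only builds the request; the function name and region parameter are assumptions, not the project's actual `npm run token` script:

```javascript
// Sketch: build an InitiateAuth request for Cognito's USER_PASSWORD_AUTH flow.
// The actual token script may use the AWS SDK instead of raw HTTP.
function buildInitiateAuthRequest({ clientId, username, password, region }) {
  return {
    url: `https://cognito-idp.${region}.amazonaws.com/`,
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/x-amz-json-1.1',
        'X-Amz-Target': 'AWSCognitoIdentityProviderService.InitiateAuth',
      },
      body: JSON.stringify({
        AuthFlow: 'USER_PASSWORD_AUTH',
        ClientId: clientId,
        AuthParameters: { USERNAME: username, PASSWORD: password },
      }),
    },
  };
}
```

The JSON response contains `AuthenticationResult.AccessToken`, which is the value passed as the Bearer token in the Authorization header.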
## Connecting to the Bastion Host

To connect to the AWS EC2 bastion host and access AWS resources, use the following command:

```bash
npm run bastion
```

The following environment variables must be set in a `.env.local` file in the root of the project:
```bash
SSH_PRIVATE_KEY_PATH=/path/to/your/private/key
SSH_USER=your-ssh-username
SSH_HOST=your-ec2-instance-hostname-or-ip
```
Ensure you have the appropriate permissions set on your SSH key for secure access.
## License

Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.