# Run Cache Action
- Overview
- S3-Compatible Storage
- Provider Configuration
- Problems with manual caching?
- Basic Usage
- Cache Key Access Patterns
- Inputs
- Outputs
- Include Stdout Feature
- Examples
- Basic Test Caching
- Build Process with Branch-Specific Caching
- Multi-step Setup with Shared Cache
- Conditional Execution Based on Cache
- Cross-Platform Build Caching
- Python Environment Setup
- Long-running Tests with Different Cache Granularity
- Test Results with stdout Caching
- Dependency Information Caching
## Overview

The Run Cache Action provides simple command execution caching using S3-compatible storage. When a cache marker exists for a given cache path, the command is skipped entirely. When no cache exists, the command executes normally and creates a cache marker on successful completion (exit code 0).
This action is designed to solve the complexity of manual caching in GitHub Actions workflows, particularly for:
- Long-running test suites that don't need to re-run if nothing changed
- Build processes that produce the same artifacts
- Setup steps that are expensive to repeat
- Any deterministic command that can be safely skipped
- State management and data persistence across workflow steps
Key Behavior:
- Cache hit → Skip command execution entirely, return immediately
- No cache → Execute command, create cache marker if successful (exit code 0)
- Command fails → No cache marker created, normal failure behavior
- Optional stdout caching → Store and retrieve command output for state management
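The behavior above can be sketched with a local file as a stand-in for the remote cache marker (illustrative only; the real action performs the same check against S3-compatible storage, and `run_cached` here is a hypothetical helper, not part of the action):

```shell
# Sketch of the action's cache semantics, assuming the marker check
# works like this local-file stand-in.
marker="/tmp/run-cache-demo-marker"
rm -f "$marker"

run_cached() {
  if [ -f "$marker" ]; then
    echo "cache-hit=true"        # marker exists: skip the command entirely
    return 0
  fi
  if "$@"; then                  # no marker: execute the command
    touch "$marker"              # exit code 0: create the marker
    echo "cache-hit=false"
  else
    return 1                     # failure: no marker is written
  fi
}

run_cached echo "expensive work" # first call: command runs, marker created
run_cached echo "expensive work" # second call: skipped via marker
```

Note that a failing command leaves no marker, so the next run retries it rather than caching the failure.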
## S3-Compatible Storage

This action uses S3-compatible storage APIs, which means it works with multiple cloud storage providers without code changes. It takes the following storage inputs:
| Input | Required | Description |
|---|---|---|
| `access-key` | Yes | S3-compatible access key |
| `secret-key` | Yes | S3-compatible secret key |
| `endpoint` | No | S3-compatible endpoint URL |
| `region` | No | S3 region (default: `auto`) |
## Provider Configuration

The following table shows how to configure the action for different S3-compatible storage providers:
| Input | AWS S3 | GCS (HMAC) | MinIO | SeaweedFS |
|---|---|---|---|---|
| `access-key` | AWS Access Key ID | HMAC Access ID | MinIO Access Key | SeaweedFS Access Key |
| `secret-key` | AWS Secret Access Key | HMAC Secret | MinIO Secret Key | SeaweedFS Secret Key |
| `endpoint` | (not required) | `https://storage.googleapis.com` | `https://minio.example.com` | `https://seaweedfs.example.com:8333` |
| `region` | `us-east-1` (or your region) | `auto` | `us-east-1` | `us-east-1` |
| `cache-path` | `s3://bucket/prefix` | `gs://bucket/prefix` | `s3://bucket/prefix` | `s3://bucket/prefix` |
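For example, pointing the action at a self-hosted MinIO deployment might look like the following (the endpoint, secret names, and bucket here are placeholders, not values the action prescribes):

```yaml
- name: Run tests with caching (MinIO)
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.MINIO_ACCESS_KEY }}
    secret-key: ${{ secrets.MINIO_SECRET_KEY }}
    endpoint: https://minio.example.com
    region: us-east-1
    run: npm test
    cache-path: "s3://ci-cache/tests-${{ hashFiles('src/**') }}"
```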
To use Google Cloud Storage with this action, you need to create HMAC keys:
- Go to Cloud Storage Settings in the Google Cloud Console
- Select the Interoperability tab
- Create a new HMAC key for a service account
- Use the Access Key as `access-key` and the Secret as `secret-key`
For more details, see Managing HMAC keys.
## Problems with manual caching?

Here's the typical manual approach to command caching in GitHub Actions:
```yaml
steps:
  - name: Check if work already done
    id: cache-check
    run: |
      if aws s3 ls "s3://my-bucket/cache/tests-${{ hashFiles('**/*.test.js') }}"; then
        echo "cache_hit=true" >> $GITHUB_OUTPUT
      else
        echo "cache_hit=false" >> $GITHUB_OUTPUT
      fi
  - name: Run expensive tests
    if: steps.cache-check.outputs.cache_hit != 'true'
    run: npm test
  - name: Mark work as done
    if: steps.cache-check.outputs.cache_hit != 'true' && success()
    run: |
      echo "completed at $(date)" | aws s3 cp - "s3://my-bucket/cache/tests-${{ hashFiles('**/*.test.js') }}"
```

This manual approach requires:
- Multiple steps with complex conditionals
- Manual cache key generation and management
- Error-prone success/failure handling
- Repetitive boilerplate for every cached operation
- Easy to forget the caching logic
## Basic Usage

```yaml
- name: Run tests with caching
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: |
      npm test
    cache-path: "gs://my-bucket/test-cache/${{ hashFiles('**/*.test.js', 'src/**/*.js') }}"
```

The `cache-path` should include every factor that affects the command's outcome:
```yaml
# Content-based cache key
cache-path: "gs://bucket/tests/${{ hashFiles('src/**', 'test/**') }}"

# Multi-factor cache key
cache-path: "gs://bucket/build/${{ runner.os }}-${{ hashFiles('package*.json') }}-${{ hashFiles('src/**') }}"

# Branch-specific cache
cache-path: "gs://bucket/lint/${{ github.ref_name }}-${{ hashFiles('**/*.js') }}"
```
## Cache Key Access Patterns

```yaml
# Shared cache across all branches
cache-path: "gs://bucket/shared/tests-${{ hashFiles('**/*.test.js') }}"

# Branch-specific cache
cache-path: "gs://bucket/branch/${{ github.ref_name }}/build-${{ hashFiles('src/**') }}"

# PR-specific but accessible to main
cache-path: "gs://bucket/pr/${{ github.event.number || github.ref_name }}/tests-${{ hashFiles('**') }}"
```

These patterns pair well with the `sha` output of ./affected.md or the `path` output of ./affected-cache.md, either of which can be used as the `cache-path`.
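When debugging cache keys locally, `hashFiles` is not available outside workflow expressions. A content hash in the same spirit can be approximated with standard tools (assumption: this mirrors the idea of `hashFiles`, not its exact algorithm, so keys computed this way will not match the keys CI produces):

```shell
# Approximate a hashFiles-style content hash over a file set.
# Set up a throwaway directory so the example is self-contained.
mkdir -p /tmp/hash-demo/src
printf 'console.log(1)\n' > /tmp/hash-demo/src/a.js
cd /tmp/hash-demo

# Hash each file, then hash the sorted list of per-file hashes.
key=$(find src -type f | sort | xargs sha256sum | sha256sum | cut -d' ' -f1)
echo "cache key candidate: tests-$key"
```

Because the file list is sorted before hashing, the key is stable across runs as long as the file contents are unchanged.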
## Inputs

| Input | Description | Required | Default |
|---|---|---|---|
| `access-key` | S3-compatible access key | Yes | - |
| `secret-key` | S3-compatible secret key | Yes | - |
| `endpoint` | S3-compatible endpoint URL | No | - |
| `region` | S3 region | No | `auto` |
| `run` | Command(s) to execute | Yes | - |
| `shell` | Shell to use for execution | No | `bash` |
| `working-directory` | Working directory for command | No | `.` |
| `cache-path` | S3-compatible path for cache marker | Yes | - |
| `include-stdout` | Include stdout in cache and return as output | No | `false` |
Supported values for the `shell` input:

- `bash` - uses `bash -c "command"`
- `sh` - uses `sh -c "command"`
- `pwsh` / `powershell` - uses `pwsh -Command "command"`
- `python` - uses `python -c "command"`
- `node` - uses `node -e "command"`
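The wrapping can be observed directly: with `shell: bash`, the whole multi-line `run` input is assumed to arrive as a single script string passed to `bash -c`, so variables set on one line are visible on the next:

```shell
# Simulate how a multi-line `run` input is handed to `bash -c`
# as one script string (assumption based on the table above).
script='x=41
x=$((x + 1))
echo "result=$x"'
out=$(bash -c "$script")
echo "$out"   # prints: result=42
```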
## Outputs

| Output | Description |
|---|---|
| `cache-hit` | `"true"` if cache found and command skipped, `"false"` if command executed |
| `stdout` | Command stdout (only when `include-stdout` is `true`) |
## Include Stdout Feature

When `include-stdout` is set to `true`, the action will:
- Cache the command's stdout along with the success marker
- Return the cached stdout on cache hits via the `stdout` output
- Allow you to use the action for state management and data persistence
This is particularly useful for:
- Commands that generate build information or metadata
- State that needs to be passed between workflow steps
- JSON output that other steps need to consume
```yaml
- name: Generate build metadata
  id: metadata
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: |
      echo '{
        "version": "'$(npm version --json | jq -r .version)'",
        "buildTime": "'$(date -u +%Y-%m-%dT%H:%M:%SZ)'",
        "commit": "'${GITHUB_SHA}'",
        "artifacts": ["dist/app.js", "dist/app.css"]
      }'
    include-stdout: 'true'
    cache-path: "gs://build-cache/metadata/${{ hashFiles('package*.json', 'src/**') }}"

- name: Use build metadata
  run: |
    metadata='${{ steps.metadata.outputs.stdout }}'
    version=$(echo "$metadata" | jq -r .version)
    echo "Building version: $version"
    # Parse artifacts list
    echo "$metadata" | jq -r '.artifacts[]' | while read artifact; do
      echo "Artifact: $artifact"
    done
```

## Examples

### Basic Test Caching

```yaml
- name: Run unit tests
  id: tests
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: 'npm test'
    cache-path: "gs://ci-cache/unit-tests/${{ hashFiles('src/**/*.js', 'test/**/*.js') }}"

- name: Check results
  run: |
    if [ "${{ steps.tests.outputs.cache-hit }}" = "true" ]; then
      echo "Tests were skipped due to cache hit"
    else
      echo "Tests were executed"
    fi
```

### Build Process with Branch-Specific Caching

```yaml
- name: Build application
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: |
      echo "Installing dependencies..."
      npm ci
      echo "Building application..."
      npm run build
    cache-path: "gs://build-cache/${{ github.ref_name }}/build-${{ hashFiles('package*.json', 'src/**') }}"
```

### Multi-step Setup with Shared Cache

```yaml
- name: Setup development environment
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: |
      # Install system dependencies
      sudo apt-get update
      sudo apt-get install -y build-essential python3-dev

      # Setup Python environment
      python -m venv .venv
      source .venv/bin/activate
      pip install -r requirements.txt

      # Run initial setup
      python setup.py develop
    cache-path: "gs://setup-cache/dev-env/${{ runner.os }}-${{ hashFiles('requirements.txt', 'setup.py') }}"
```

### Conditional Execution Based on Cache

```yaml
- name: Expensive operation
  id: operation
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: './scripts/expensive-task.sh'
    cache-path: 'gs://task-cache/expensive-${{ github.sha }}'

- name: Only run if work was actually done
  if: steps.operation.outputs.cache-hit == 'false'
  run: |
    echo "Expensive operation completed, doing follow-up work..."
    ./scripts/post-process.sh

- name: Always run regardless of cache
  run: |
    echo "This always runs, cache-hit: ${{ steps.operation.outputs.cache-hit }}"
```

### Cross-Platform Build Caching

```yaml
strategy:
  matrix:
    os: [ubuntu-latest, windows-latest, macos-latest]
steps:
  - name: Build for platform
    uses: ./apps/run-cache
    with:
      access-key: ${{ secrets.CACHE_ACCESS_KEY }}
      secret-key: ${{ secrets.CACHE_SECRET_KEY }}
      endpoint: https://storage.googleapis.com
      region: auto
      run: |
        npm ci
        npm run build:${{ runner.os }}
        npm run test:integration
      cache-path: "gs://build-cache/platform/${{ matrix.os }}-${{ hashFiles('**') }}"
```

### Python Environment Setup

```yaml
- name: Setup Python environment
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
      pip install -r requirements-dev.txt
    shell: 'bash'
    cache-path: "gs://python-cache/env-${{ hashFiles('requirements*.txt') }}"
```

### Long-running Tests with Different Cache Granularity

```yaml
# Fine-grained caching per test suite
- name: Unit tests
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: 'npm run test:unit'
    cache-path: "gs://test-cache/unit-${{ hashFiles('src/**', 'test/unit/**') }}"

- name: Integration tests
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: 'npm run test:integration'
    cache-path: "gs://test-cache/integration-${{ hashFiles('src/**', 'test/integration/**') }}"

- name: E2E tests
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: 'npm run test:e2e'
    cache-path: "gs://test-cache/e2e-${{ hashFiles('src/**', 'test/e2e/**') }}"
```

### Test Results with stdout Caching

```yaml
- name: Run tests with result caching
  id: tests
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: |
      npm test -- --reporter=json > test-results.json
      cat test-results.json
    include-stdout: 'true'
    cache-path: "gs://test-cache/results-${{ hashFiles('src/**', 'test/**') }}"

- name: Process test results
  run: |
    results='${{ steps.tests.outputs.stdout }}'
    passed=$(echo "$results" | jq '.stats.passes')
    failed=$(echo "$results" | jq '.stats.failures')
    echo "Tests passed: $passed"
    echo "Tests failed: $failed"
    if [ "$failed" -gt 0 ]; then
      echo "Some tests failed, checking details..."
      echo "$results" | jq '.failures[]'
    fi
```

### Dependency Information Caching

```yaml
- name: Get dependency info
  id: deps
  uses: ./apps/run-cache
  with:
    access-key: ${{ secrets.CACHE_ACCESS_KEY }}
    secret-key: ${{ secrets.CACHE_SECRET_KEY }}
    endpoint: https://storage.googleapis.com
    region: auto
    run: |
      echo '{
        "nodeVersion": "'$(node --version)'",
        "npmVersion": "'$(npm --version)'",
        "packageCount": '$(npm list --depth=0 --json | jq '.dependencies | length')',
        "devPackageCount": '$(npm list --depth=0 --json | jq '.devDependencies | length')'
      }'
    include-stdout: 'true'
    cache-path: "gs://build-cache/deps-${{ hashFiles('package*.json') }}"

- name: Report dependency info
  run: |
    deps='${{ steps.deps.outputs.stdout }}'
    echo "Node: $(echo "$deps" | jq -r .nodeVersion)"
    echo "NPM: $(echo "$deps" | jq -r .npmVersion)"
    echo "Dependencies: $(echo "$deps" | jq -r .packageCount)"
    echo "Dev Dependencies: $(echo "$deps" | jq -r .devPackageCount)"
```