
Commit 2e8c81e (parent: 3b92935)

hollow out github runner.

run on all branches, but do not push to S3. Use this for testing for now; when we have verified it works, we will update it to push again and run only on master.

File tree

1 file changed: +27 -5 lines changed


.github/workflows/mirror_data_archive.yml

Lines changed: 27 additions & 5 deletions
@@ -2,8 +2,8 @@ name: mirror-archive-on-merge-to-default-branch
 
 on:
   push:
-    branches:
-      - master
+    #branches:
+    #  - master
 
 jobs:
   mirror-archive:
@@ -12,6 +12,28 @@ jobs:
     BUCKET: attack-range-attack-data
     ATTACK_DATA_ARCHIVE_FILE: attack_data.tar.zstd
     steps:
+
+      # Use the following GitHub Action to free up disk space on a runner.
+      # This removes dependencies and packages that may be important to many
+      # GitHub users, such as the Android and dotnet build tools, but are
+      # not relevant to us.
+      # https://github.com/marketplace/actions/free-disk-space-ubuntu
+      - name: Free Disk Space (Ubuntu)
+        uses: jlumbroso/free-disk-space@v1.3.1
+        with:
+          # This might remove tools that are actually needed
+          # if set to "true", but frees about 6 GB.
+          tool-cache: false
+
+          # All of these default to true, but feel free to set to
+          # "false" if necessary for your workflow.
+          android: true
+          dotnet: true
+          haskell: true
+          large-packages: true
+          docker-images: true
+          #swap-storage: true
+
       - name: Checkout Repo
         uses: actions/checkout@v4
         # We must EXPLICITLY specify lfs: true. It defaults to false
@@ -40,6 +62,6 @@ jobs:
         # File size reductions are diminishing returns after this - determined experimentally.
         tar -c attack_data | zstd --compress -T0 -10 -o $ATTACK_DATA_ARCHIVE_FILE
 
-      - name: Upload Attack data archive file to S3 Bucket
-        run: |
-          aws s3 cp $ATTACK_DATA_ARCHIVE_FILE s3://$BUCKET/
+      # - name: Upload Attack data archive file to S3 Bucket
+      #   run: |
+      #     aws s3 cp $ATTACK_DATA_ARCHIVE_FILE s3://$BUCKET/
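The compression step that remains active in the workflow can be exercised locally. A minimal sketch, assuming `zstd` is installed on the machine; `attack_data/sample.txt` is a placeholder stand-in for the repository's real LFS data:

```shell
# Build placeholder data in place of the real attack_data directory.
mkdir -p attack_data
printf 'sample\n' > attack_data/sample.txt

ATTACK_DATA_ARCHIVE_FILE=attack_data.tar.zstd

# Same pipeline as the workflow: tar the directory, then compress with
# zstd using all cores (-T0) at level 10. (-f added here so reruns
# overwrite an existing archive; the workflow itself omits it.)
tar -c attack_data | zstd --compress -T0 -10 -f -o "$ATTACK_DATA_ARCHIVE_FILE"

# Round-trip check: decompress to stdout and list the archive contents.
zstd -d --stdout "$ATTACK_DATA_ARCHIVE_FILE" | tar -t
```

With the S3 upload step commented out, verifying the archive locally like this is the natural way to confirm the runner changes before re-enabling the `aws s3 cp` push.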
