diff --git a/attack_data_service/.dockerignore b/attack_data_service/.dockerignore
deleted file mode 100644
index 6858c00db..000000000
--- a/attack_data_service/.dockerignore
+++ /dev/null
@@ -1,6 +0,0 @@
-
-venv
-security-content
-attack_range
-attack_range.log
-Dockerfile
diff --git a/attack_data_service/Dockerfile b/attack_data_service/Dockerfile
deleted file mode 100644
index c9099d97a..000000000
--- a/attack_data_service/Dockerfile
+++ /dev/null
@@ -1,22 +0,0 @@
-FROM ubuntu:18.04
-MAINTAINER Patrick Bareiss
-
-RUN apt-get update
-RUN DEBIAN_FRONTEND="noninteractive" apt-get -y install tzdata
-RUN apt-get install -y python3-dev git python-dev unzip python3-pip awscli
-RUN apt-get install -y python-gitdb
-RUN apt-get install -y wget unzip curl
-
-RUN wget --quiet https://releases.hashicorp.com/terraform/0.13.1/terraform_0.13.1_linux_amd64.zip \
-    && unzip terraform_0.13.1_linux_amd64.zip \
-    && mv terraform /usr/bin \
-    && rm terraform_0.13.1_linux_amd64.zip
-
-ADD config /root/.aws/config
-ADD . /app
-
-WORKDIR /app
-RUN pip3 install -r requirements.txt
-
-ENTRYPOINT ["python3", "attack_data_service.py"]
-CMD ["-st", "T1003.002"]
diff --git a/attack_data_service/README.md b/attack_data_service/README.md
deleted file mode 100644
index 5ae3d096b..000000000
--- a/attack_data_service/README.md
+++ /dev/null
@@ -1,153 +0,0 @@
-# Attack Data Service
-The attack data service allows you to run attacks as a service using the [Attack Range](https://github.com/splunk/attack_range) and collect the resulting attack data.
-
-## Architecture
-![Architecture](static/architecture_attack_data_service.png)
-The attack data service uses AWS Batch as its execution engine. AWS Batch allows you to run batch computing jobs, in our case the Attack Range for attack data generation. attack_data_service.py is the executable which controls the Attack Range execution. It is deployed in a Docker container which is used by AWS Batch.
-
-
-## Usage
-```
-python attack_data_service.py
-usage: attack_data_service.py [-h] -st SIMULATION_TECHNIQUE
-                              [-sa SIMULATION_ATOMIC]
-                              [-arr ATTACK_RANGE_REPO]
-                              [-arb ATTACK_RANGE_BRANCH]
-                              [-adr ATTACK_DATA_REPO]
-                              [-adb ATTACK_DATA_BRANCH]
-                              [-artr ATOMIC_RED_TEAM_REPO]
-                              [-artb ATOMIC_RED_TEAM_BRANCH]
-                              [-gt GITHUB_TOKEN]
-                              [-smk SECRETS_MANAGER_KEY]
-attack_data_service.py: error: the following arguments are required: -st/--simulation_technique
-```
-
-attack_data_service.py has one mandatory parameter, --simulation_technique. It expects a technique ID from the MITRE ATT&CK matrix with corresponding tests in [Atomic Red Team](https://github.com/redcanaryco/atomic-red-team), e.g. T1003.002. The other parameters are optional and can be used to point to project forks or specific branches. attack_data_service.py creates a Pull Request after a successful test and therefore needs a GitHub OAuth token. The token can either be passed with the parameter --github_token or retrieved from the [AWS Secrets Manager](https://aws.amazon.com/secrets-manager/) through --secrets_manager_key.
-
-Let's have a look at how to use the attack data service after you have deployed it:
-
-### Using AWS CLI
-
-Example 1:
-```
-aws batch submit-job --job-name attack_data_T1003_001 --job-definition attack_data_service_job --job-queue attack_data_service_queue --container-overrides '{"command": ["-st", "T1003.002"]}'
-```
-
-Example 2:
-```
-aws batch submit-job --job-name attack_data_T1003_001 --job-definition attack_data_service_job --job-queue attack_data_service_queue --container-overrides '{"command": ["-st", "T1003.001", "-sa", "Dump LSASS.exe Memory using comsvcs.dll", "-adr", "P4T12ICK/attack-data", "-adb", "develop", "-smk", "github_token"]}'
-```
-
-### Using AWS Web Portal
-The Attack Data Generation Service can also be triggered via the AWS Web Portal. First, click on the service "Batch", then click on "Jobs" on the left side.
Then, click on "submit new job", fill in the variables according to the following screenshot, and click on "Submit".
-![AWS Batch Job](static/aws_batch_submit_job.png)
-
-## Deployment
-To deploy the Attack Data Generation Service to AWS Batch, follow this guideline. It assumes that you deploy the service to the region eu-central-1.
-
-### Prerequisites
-- AWS account
-- IAM user with administrative permissions
-- AWS CLI
-- Docker
-- Attack Data Project Fork
-
-### Create GitHub Token
-The GitHub token allows the Attack Data Service to create Pull Requests.
-- Create a personal GitHub access token according to the following [tutorial](https://docs.github.com/en/free-pro-team@latest/github/authenticating-to-github/creating-a-personal-access-token)
-
-### Upload GitHub Token to AWS Secrets Manager
-- Connect to AWS Web Portal
-- Go to the AWS Secrets Manager
-- Choose region eu-central-1
-- Click on "Store a new secret"
-- Click on "Other type of secrets"
-- Add "github_token" as key
-- Copy the github token as value
-- Click on "Next"
-- Use "github_token" as Secret name
-- Click on "Next"
-- Click on "Next"
-- Click on "Store"
-
-### Create AWS ECR Repository
-- Connect to AWS Web Portal
-- Go to service "Elastic Container Registry"
-- Click on "Repositories" under Amazon ECR on the left side.
-- Click on "Create repository"
-- Add "awsbatch/attack-data-service" as repository name
-- Click on "Create repository"
-
-### Build and Upload Docker File
-- Navigate to the attack_data_service folder:
-```
-cd attack_data_service
-```
-- Build the docker container
-```
-docker build --tag awsbatch/attack-data-service .
-```
-- Tag the docker container (the AWS account number can be found in the AWS ECR repository path)
-```
-docker tag awsbatch/attack-data-service:latest [aws_account_number].dkr.ecr.eu-central-1.amazonaws.com/awsbatch/attack-data-service:latest
-```
-- Login to AWS ECR
-```
-aws ecr get-login-password --region eu-central-1 | docker login --username AWS --password-stdin [aws_account_number].dkr.ecr.eu-central-1.amazonaws.com
-```
-- Upload Docker container
-```
-docker push [aws_account_number].dkr.ecr.eu-central-1.amazonaws.com/awsbatch/attack-data-service:latest
-```
-
-### Configure AWS Batch
-- Connect to AWS Web Portal
-- Go to service "AWS Batch"
-- Click on "Compute environments" on the left side
-- Click on "Create"
-- Use "attack_data_service_environment" as "Compute environment name"
-- Define the instance configuration according to your needs. Small instance types are sufficient, because the instance only runs Docker, and Docker only runs a Python script.
-- Define the VPC and subnets you want to use under Networking
-- Click on "create compute environment"
-
-- Click on "Job queues" on the left side
-- Click on "Create"
-- Use "attack_data_service_queue" as "Job queue name"
-- Select "attack_data_service_environment" as "compute environment"
-- Click on "Create"
-
-- Go to service "IAM"
-- Create a role named attack_data_service_role with the policies AmazonEC2FullAccess, SecretsManagerReadWrite and AmazonS3FullAccess
-
-- Go to service "AWS Batch"
-- Click on "Job definitions" on the left side
-- Click on "Create"
-- Use "attack_data_service" as Name
-- Use 3000 as "Execution timeout"
-- Container properties:
-- Use "[aws_account_number].dkr.ecr.eu-central-1.amazonaws.com/awsbatch/attack-data-service:latest" as Image
-- Remove the command from the "Command" field
-- Use 2 in vCPUs
-- Use 2048 in Memory
-- Click on "Additional configuration"
-- Use "attack_data_service_role" as Job Role
-- Use root as "User" under Security
-- Click on
"Create"
-
-## Local Execution
-The Attack Data Service can also be run locally.
-- Navigate to the attack_data_service folder:
-```
-cd attack_data_service
-```
-- Build the docker container
-```
-docker build --tag awsbatch/attack_data_service .
-```
-- Run the docker container
-```
-docker run -v ~/.aws/credentials:/root/.aws/credentials:ro --name attackrange awsbatch/attack_data_service:latest -st T1003.001 -adr P4T12ICK/attack_data
-```
-
-## Troubleshooting
-AWS Batch stores its logs in CloudWatch. Check the CloudWatch logs when troubleshooting.
diff --git a/attack_data_service/attack_data_service.py b/attack_data_service/attack_data_service.py
deleted file mode 100644
index 77835c9cd..000000000
--- a/attack_data_service/attack_data_service.py
+++ /dev/null
@@ -1,301 +0,0 @@
-import os
-from os import path
-import sys
-import argparse
-import git
-from shutil import copyfile
-from shutil import which
-import subprocess
-import boto3
-from random import randrange
-import yaml
-from github import Github
-from jinja2 import Environment, FileSystemLoader
-import base64
-from botocore.exceptions import ClientError
-import json
-from datetime import datetime
-import time
-import shutil
-from os import listdir
-from os.path import isfile, join
-
-
-
-def main(args):
-
-    parser = argparse.ArgumentParser(description="attack data service based on Attack Range.")
-    parser.add_argument("-st", "--simulation_technique", required=True,
-                        help="specify the simulation technique to execute")
-    parser.add_argument("-sa", "--simulation_atomics", required=False, default="none",
-                        help="specify a specific atomic test to simulate")
-    parser.add_argument("-arr", "--attack_range_repo", required=False, default="splunk/attack_range",
-                        help="specify the url of the attack range repository")
-    parser.add_argument("-arb", "--attack_range_branch", required=False, default="develop",
-                        help="specify the attack range branch")
-    parser.add_argument("-adr", "--attack_data_repo",
required=False, default="splunk/attack_data",
-                        help="specify the url of the attack data repository")
-    parser.add_argument("-adb", "--attack_data_branch", required=False, default="master",
-                        help="specify the attack data branch")
-    parser.add_argument("-artr", "--atomic_red_team_repo", required=False, default="splunk",
-                        help="specify the url of the atomic red team repository")
-    parser.add_argument("-artb", "--atomic_red_team_branch", required=False, default="local-master",
-                        help="specify the atomic red team branch")
-    parser.add_argument("-gt", "--github_token", required=False,
-                        help="specify the github token for the PR")
-    parser.add_argument("-smk", "--secrets_manager_key", required=False, default="github_token",
-                        help="specify the key in AWS secrets manager for your github token")
-    parser.add_argument("-sbu", "--s3_bucket_url", required=False, default="https://attack-range-attack-data.s3-us-west-2.amazonaws.com",
-                        help="specify the S3 bucket to store the Attack Data")
-
-
-    args = parser.parse_args()
-    simulation_technique = args.simulation_technique
-    simulation_atomics = args.simulation_atomics
-    attack_range_repo = args.attack_range_repo
-    attack_range_branch = args.attack_range_branch
-    attack_data_repo = args.attack_data_repo
-    attack_data_branch = args.attack_data_branch
-    atomic_red_team_repo = args.atomic_red_team_repo
-    atomic_red_team_branch = args.atomic_red_team_branch
-    github_token = args.github_token
-    secrets_manager_key = args.secrets_manager_key
-    s3_bucket_url = args.s3_bucket_url
-
-    # get github token
-    if github_token:
-        O_AUTH_TOKEN_GITHUB = github_token
-    else:
-        O_AUTH_TOKEN_GITHUB = get_secret(secrets_manager_key)
-
-    os.system('curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | bash')
-    os.system('apt-get install git-lfs')
-    os.system('git lfs install --skip-smudge')
-
-    # clone repositories
-    git.Repo.clone_from('https://github.com/' + attack_range_repo, "attack_range", branch=attack_range_branch)
-    os.system('git clone --single-branch --branch ' + attack_data_branch + ' https://' + O_AUTH_TOKEN_GITHUB + ':x-oauth-basic@github.com/' + attack_data_repo + '.git')
-    #attack_data_repo_obj = git.Repo.clone_from('https://' + O_AUTH_TOKEN_GITHUB + ':x-oauth-basic@github.com/' + attack_data_repo, "attack_data", branch=attack_data_branch)
-
-
-    sys.path.append(os.path.join(os.getcwd(), 'attack_range'))
-    copyfile('attack_range/attack_range.conf.template', 'attack_range/attack_range.conf')
-
-    epoch_time = str(int(time.time()))
-    ssh_key_name = 'ads-key-pair-' + epoch_time
-    # create ssh keys
-    ec2 = boto3.client('ec2')
-    response = ec2.create_key_pair(KeyName=ssh_key_name)
-    with open(ssh_key_name, "w") as ssh_key:
-        ssh_key.write(response['KeyMaterial'])
-    os.chmod(ssh_key_name, 0o600)
-
-
-    with open('attack_range/attack_range.conf', 'r') as file:
-        filedata = file.read()
-
-    filedata = filedata.replace('attack_range_password = Pl3ase-k1Ll-me:p', 'attack_range_password = I-l1ke-Attack-Range!')
-    filedata = filedata.replace('region = us-west-2', 'region = eu-central-1')
-    filedata = filedata.replace('art_repository = splunk', 'art_repository = ' + atomic_red_team_repo)
-    filedata = filedata.replace('art_branch = local-master', 'art_branch = ' + atomic_red_team_branch)
-    filedata = filedata.replace('key_name = attack-range-key-pair', 'key_name = ' + ssh_key_name)
-    filedata = filedata.replace('private_key_path = ~/.ssh/id_rsa', 'private_key_path = /app/' + ssh_key_name)
-
-    with open('attack_range/attack_range.conf', 'w') as file:
-        file.write(filedata)
-
-    # check if terraform is installed
-    if which('terraform') is None:
-        sys.exit(1)
-    else:
-        # init terraform
-        os.system('cd attack_range/terraform/aws && terraform init && cd ../../..')
-
-    module = __import__('attack_range')
-    module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'build']
-
-    execution_error = False
-
-    # build Attack Range
-    try:
-        results_build =
module.main(module.sys.argv)
-    except Exception as e:
-        print('Error: ' + str(e))
-        module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'destroy']
-        module.main(module.sys.argv)
-        execution_error = True
-
-    # simulate Technique
-    if simulation_atomics == 'none':
-        module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'simulate', '-st', simulation_technique, '-t', str('ar-win-dc-default-' + ssh_key_name)]
-    else:
-        module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'simulate', '-st', simulation_technique, '-t', str('ar-win-dc-default-' + ssh_key_name), '--simulation_atomics', simulation_atomics]
-
-    try:
-        results_simulate = module.main(module.sys.argv)
-    except Exception as e:
-        print('Error: ' + str(e))
-        module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'destroy']
-        module.main(module.sys.argv)
-        execution_error = True
-
-    # wait
-    print('Wait for 200 seconds')
-    time.sleep(200)
-
-    # dump attack data
-    module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'dump', '--dump_name', simulation_technique]
-    try:
-        results_dump = module.main(module.sys.argv)
-    except Exception as e:
-        print('Error: ' + str(e))
-        module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'destroy']
-        module.main(module.sys.argv)
-        execution_error = True
-
-    # destroy Attack Range
-    module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'destroy']
-    try:
-        results_destroy = module.main(module.sys.argv)
-    except Exception as e:
-        print('Error: ' + str(e))
-        module.sys.argv = ['attack_range', '--config', 'attack_range/attack_range.conf', 'destroy']
-        module.main(module.sys.argv)
-        execution_error = True
-
-    # delete ssh key
-    response = ec2.delete_key_pair(KeyName=ssh_key_name)
-
-    # check if the run was successful
-    if not execution_error:
-
-        random_number = epoch_time
-
-        # Create GitHub PR attack data
-        branch_name = "attack_data_service_" + random_number
-        os.system('cd attack_data && git checkout -b ' + branch_name + ' && cd ..')
-        #attack_data_repo_obj.git.checkout(attack_data_branch, b=branch_name)
-
-        dataset_obj = {}
-        dataset_obj['author'] = 'Automated Attack Data Service'
-        dataset_obj['date'] = str(datetime.today().strftime('%Y-%m-%d'))
-        descr_str = 'Atomic Test Results: '
-        for output in results_simulate:
-            descr_str += output + ' '
-
-        dataset_obj['description'] = descr_str
-        dataset_obj['environment'] = 'attack_range'
-        dataset_obj['technique'] = [simulation_technique]
-
-        #list files
-        mypath = 'attack_range/attack_data/' + simulation_technique
-        onlyfiles = [f for f in listdir(mypath) if isfile(join(mypath, f))]
-
-        #copy files from dump
-        parent_folder = 'attack_data/datasets/attack_techniques/' + simulation_technique
-        if not path.exists(parent_folder):
-            os.mkdir(parent_folder)
-
-        folder = 'attack_data/datasets/attack_techniques/' + simulation_technique + '/atomic_red_team'
-        if not path.exists(folder):
-            os.mkdir(folder)
-
-        for f in onlyfiles:
-            shutil.copy(mypath + '/' + f, folder + '/' + f)
-            #attack_data_repo_obj.index.add(['datasets/attack_techniques/' + simulation_technique + '/atomic_red_team/' + f])
-
-        dataset_urls = []
-        for file in onlyfiles:
-            dataset_urls.append('https://media.githubusercontent.com/media/splunk/attack_data/master/datasets/attack_techniques/' + simulation_technique + '/atomic_red_team/' + file)
-
-        dataset_obj['dataset'] = dataset_urls
-        dataset_obj['references'] = ['https://github.com/redcanaryco/atomic-red-team/blob/master/atomics/' + simulation_technique + '/' + simulation_technique + '.md']
-        dataset_obj['sourcetypes'] = ['XmlWinEventLog:Microsoft-Windows-Sysmon/Operational', 'WinEventLog:Microsoft-Windows-PowerShell/Operational', 'WinEventLog:System', 'WinEventLog:Security']
-
-
-        if simulation_atomics == 'none':
-            with open(folder + '/atomic_red_team.yml', 'w+') as outfile:
-                yaml.dump(dataset_obj, outfile,
default_flow_style=False, sort_keys=False)
-                #attack_data_repo_obj.index.add(['datasets/attack_techniques/' + simulation_technique + '/atomic_red_team/atomic_red_team.yml'])
-        else:
-            filename = simulation_atomics.replace(' ', '_').replace('-', '_').replace('.', '_').replace('/', '_').lower() + '.yml'
-            with open(folder + '/' + filename, 'w+') as outfile:
-                yaml.dump(dataset_obj, outfile, default_flow_style=False, sort_keys=False)
-                #attack_data_repo_obj.index.add(['datasets/attack_techniques/' + simulation_technique + '/atomic_red_team/' + filename])
-
-        #attack_data_repo_obj.index.commit('Added attack data')
-
-        j2_env = Environment(loader=FileSystemLoader('templates'), trim_blocks=True)
-        template = j2_env.get_template('PR_template_attack_data.j2')
-        body = template.render()
-
-        #attack_data_repo_obj.git.push('--set-upstream', 'origin', branch_name)
-        os.system('git config --global user.email "research@splunk.com"')
-        os.system('git config --global user.name "Attack Service"')
-        os.system('cd attack_data && git add --all && cd ..')
-        os.system('cd attack_data && git commit -m "Automated Attack Data Service" && cd ..')
-        os.system('cd attack_data && git push origin ' + branch_name + ' && cd ..')
-        g = Github(O_AUTH_TOKEN_GITHUB)
-        repo = g.get_repo("splunk/attack_data")
-        pr = repo.create_pull(title="Attack Data Service PR " + random_number, body=body, head=branch_name, base="master")
-
-
-
-def load_file(file_path):
-    with open(file_path, 'r') as stream:
-        try:
-            file = list(yaml.safe_load_all(stream))[0]
-        except yaml.YAMLError as exc:
-            print(exc)
-            sys.exit("ERROR: reading {0}".format(file_path))
-    return file
-
-
-def get_secret(secret_name):
-
-    region_name = "eu-central-1"
-
-    # Create a Secrets Manager client
-    session = boto3.session.Session()
-    client = session.client(
-        service_name='secretsmanager',
-        region_name=region_name
-    )
-
-    try:
-        get_secret_value_response = client.get_secret_value(
-            SecretId=secret_name
-        )
-    except ClientError as e:
-        if
e.response['Error']['Code'] == 'DecryptionFailureException':
-            # Secrets Manager can't decrypt the protected secret text using the provided KMS key.
-            # Deal with the exception here, and/or rethrow at your discretion.
-            raise e
-        elif e.response['Error']['Code'] == 'InternalServiceErrorException':
-            # An error occurred on the server side.
-            # Deal with the exception here, and/or rethrow at your discretion.
-            raise e
-        elif e.response['Error']['Code'] == 'InvalidParameterException':
-            # You provided an invalid value for a parameter.
-            # Deal with the exception here, and/or rethrow at your discretion.
-            raise e
-        elif e.response['Error']['Code'] == 'InvalidRequestException':
-            # You provided a parameter value that is not valid for the current state of the resource.
-            # Deal with the exception here, and/or rethrow at your discretion.
-            raise e
-        elif e.response['Error']['Code'] == 'ResourceNotFoundException':
-            # We can't find the resource that you asked for.
-            # Deal with the exception here, and/or rethrow at your discretion.
-            raise e
-    else:
-        # Decrypts secret using the associated KMS CMK.
-        # Depending on whether the secret is a string or binary, one of these fields will be populated.
-        if 'SecretString' in get_secret_value_response:
-            secret = get_secret_value_response['SecretString']
-            secret_obj = json.loads(secret)
-
-            return secret_obj['github_token']
-
-
-if __name__ == "__main__":
-    main(sys.argv[1:])
diff --git a/attack_data_service/config b/attack_data_service/config
deleted file mode 100644
index a5220e02d..000000000
--- a/attack_data_service/config
+++ /dev/null
@@ -1,2 +0,0 @@
-[default]
-region = eu-central-1
diff --git a/attack_data_service/requirements.txt b/attack_data_service/requirements.txt
deleted file mode 100644
index 46adb75c7..000000000
--- a/attack_data_service/requirements.txt
+++ /dev/null
@@ -1,80 +0,0 @@
-ansible==4.6.0
-ansible-runner==2.0.3
-apipkg==1.5
-aspy.yaml==1.3.0
-atomicwrites==1.4.0
-attrs==24.2.0
-azure-common==1.1.27
-azure-core==1.31.0
-azure-identity==1.7.0
-azure-mgmt-compute==18.2.0
-azure-mgmt-core==1.3.0
-azure-mgmt-network==25.1.0
-azure-mgmt-resource==19.0.0
-bcrypt==3.2.0
-boto3==1.20.17
-botocore==1.21.18
-certifi==2021.5.30
-cffi==1.15.0
-cfgv==3.3.1
-chardet==5.2.0
-configparser==5.1.0
-contextlib2==0.6.0.post1
-cryptography==41.0.1
-Deprecated==1.2.13
-dnspython==2.1.0
-docutils==0.21.2
-execnet==2.1.1
-gitdb==4.0.9
-GitPython==3.1.13
-identify==2.2.13
-idna==2.10
-importlib-metadata==8.6.1
-Jinja2==3.0.2
-jmespath==0.10.0
-lockfile==0.12.2
-MarkupSafe==2.1.3
-mock==5.1.0
-more-itertools==10.1.0
-mysql-connector-python==8.0.29
-nodeenv==1.6.0
-ntlm-auth==1.5.0
-packaging==21.2
-packer.py==0.3.0
-paramiko==2.10.1
-path==15.0.0
-path.py==12.5.0
-pexpect==4.8.0
-pluggy==0.13.1
-pre-commit==2.13.0
-protobuf==3.17.3
-psutil==5.8.0
-ptyprocess==0.7.0
-py==1.11.0
-pycparser==2.20
-PyGithub==2.1.1
-PyJWT==2.10.1
-PyNaCl==1.4.0
-pyparsing==3.2.1
-pytest==6.2.5
-python-daemon==2.3.0
-python-dateutil==2.8.2
-python-terraform==0.10.1
-pywinrm==0.4.2
-PyYAML==6.0
-requests==2.25.1
-requests-ntlm==1.1.0
-s3transfer==0.5.0
-smmap==5.0.1
-six==1.16.0
-splunk-sdk==2.0.2
-tabulate==0.8.9
-termcolor==1.1.0
-toml==0.10.2
-urllib3==2.3.0
-virtualenv==20.29.2
-wcwidth==0.2.5
-wget==3.2
-wrapt==1.13.3
-xmltodict==0.12.0
-zipp==3.6.0
diff --git a/attack_data_service/static/architecture_attack_data_service.png b/attack_data_service/static/architecture_attack_data_service.png
deleted file mode 100644
index 908159692..000000000
Binary files a/attack_data_service/static/architecture_attack_data_service.png and /dev/null differ
diff --git a/attack_data_service/static/aws_batch_submit_job.png b/attack_data_service/static/aws_batch_submit_job.png
deleted file mode 100644
index d00f02828..000000000
Binary files a/attack_data_service/static/aws_batch_submit_job.png and /dev/null differ
diff --git a/attack_data_service/templates/PR_template_attack_data.j2 b/attack_data_service/templates/PR_template_attack_data.j2
deleted file mode 100644
index 861ac58ad..000000000
--- a/attack_data_service/templates/PR_template_attack_data.j2
+++ /dev/null
@@ -1,3 +0,0 @@
-This PR was created by Attack Data Service :robot:
-
-Please review the dataset.yml and add your data.
diff --git a/bin/replay_all.py b/bin/replay.py
similarity index 100%
rename from bin/replay_all.py
rename to bin/replay.py