
Conversation

@mimischi (Contributor) commented on Nov 6, 2019

Changes made in this pull request:

  • Benchmarks can now be generated from a config file (TOML format).
  • Add new --config option to mdbenchmark generate.

Currently builds on #155 to ease development.

Example TOML

```toml
# Name of TPR file
input = "md.tpr"

# Optional name of the job
# [defaults to filename of input]
job_name = "bench"

# MD engine(s) to benchmark
modules = ["gromacs/2018.8"]

# Skip validation of MD engine
# [default: skip_validation=false]
skip_validation = false

# Number of nodes to scale benchmarks on
# [default: min_nodes=1, max_nodes=5]
min_nodes = 1
max_nodes = 5

# Run time of each benchmark
# [default: time=15]
time = 15

# Name of the job template
host = "draco"

# Create benchmarks for CPUs and/or GPUs?
# cpu=true & gpu=false: Generate benchmarks for CPU-only runs
# cpu=false & gpu=true: Generate benchmarks for mixed CPU-GPU runs
# cpu=true & gpu=true: Generate benchmarks for a) CPU-only and b) mixed CPU-GPU runs
# [default: cpu=true, gpu=false]
cpu = true
gpu = false

#########################################################
## Skip prompts and generate benchmarks without questions
## This WILL write to your file system.
#########################################################
skip_prompts = true
```
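To illustrate the shape of such a config, here is a minimal sketch of how a TOML file like the one above could be parsed and have the documented defaults applied. The key names and default values come from the example; the `load_config` helper itself is hypothetical and is not mdbenchmark's actual API.

```python
try:
    import tomllib  # stdlib TOML parser, Python 3.11+
    _loads = tomllib.loads
except ModuleNotFoundError:
    import toml  # third-party fallback for older Pythons
    _loads = toml.loads

# Defaults as documented in the example TOML's comments.
DEFAULTS = {
    "skip_validation": False,
    "min_nodes": 1,
    "max_nodes": 5,
    "time": 15,
    "cpu": True,
    "gpu": False,
    "skip_prompts": False,
}

def load_config(text: str) -> dict:
    """Parse TOML text and fill in the documented defaults (illustrative only)."""
    config = {**DEFAULTS, **_loads(text)}
    if "input" not in config:
        raise ValueError("'input' (the TPR file) is required")
    # job_name defaults to the input filename without its extension
    config.setdefault("job_name", config["input"].rsplit(".", 1)[0])
    if config["min_nodes"] > config["max_nodes"]:
        raise ValueError("min_nodes must not exceed max_nodes")
    return config

example = '''
input = "md.tpr"
modules = ["gromacs/2018.8"]
host = "draco"
'''
cfg = load_config(example)
print(cfg["job_name"], cfg["min_nodes"], cfg["max_nodes"])  # → md 1 5
```

A loader along these lines keeps the required key (`input`) explicit while every other setting falls back to the defaults listed in the comments.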

PR Checklist

  • Added changelog fragment in ./changelog/ (more information)?
  • Added unit tests.
  • Added example TOML to documentation and README.rst.
  • Rebase onto develop as soon as Add Poetry (#155) is merged.
  • Issue raised/referenced?

@kain88-de (Contributor) commented:

Maybe you can add the example configuration to the repository or the documentation.
