Feature Request: Validation Runs for Workflow #2

@lispandfound

Description

This is somewhat separate from the workflow code, but I think it's a good idea. With the upcoming changes to the workflow focusing on testing and validation, it would be good to run a continuous series of integration tests, for all changes to our repositories, that exercise the workflow and ensure the output matches a "known good" output. The idea would be as follows:

  1. Select or construct a limited set of realisations whose outputs we verify as known good. @sungeunbae suggests one realisation of each source type, which I think makes sense. Then somebody like Morteza could validate these outputs and confirm they look good.
  2. Add automation to run the entire workflow for this realisation set and compare the outputs with the known good outputs. This could be done perhaps on a weekly basis on Hypocentre at a low resolution (Sung suggests 200m), but it should be run infrequently on NeSI too (perhaps once a month) to ensure that the target environment also runs correctly.
  3. In the case that the output no longer matches, report this to us somehow (perhaps a Slack message or an email). Then we can review the differences. If the differences are positive, we can accept the new output as our known good reference output; otherwise we can investigate and fix the bugs before they force us to regenerate Cybershake outputs.
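To make step 2 concrete, here is a minimal sketch of the comparison step. Everything in it is a placeholder assumption: the directory layout, the file extension, the tolerance, and the function names are all hypothetical, not part of any existing repository, and a real comparison would likely need format-aware handling (e.g. HDF5, binary seismograms) rather than plain numeric text files.

```python
"""Hypothetical sketch: compare a candidate workflow run against a
known-good reference run, file by file, within a relative tolerance."""
import math
from pathlib import Path


def compare_outputs(reference: Path, candidate: Path, rel_tol: float = 1e-5):
    """Return a list of (filename, reason) mismatches between two output
    directories of whitespace-separated numeric text files (assumed layout)."""
    mismatches = []
    for ref_file in sorted(reference.rglob("*.txt")):
        cand_file = candidate / ref_file.relative_to(reference)
        if not cand_file.exists():
            mismatches.append((ref_file.name, "missing in candidate"))
            continue
        ref_vals = [float(x) for x in ref_file.read_text().split()]
        cand_vals = [float(x) for x in cand_file.read_text().split()]
        if len(ref_vals) != len(cand_vals) or any(
            not math.isclose(r, c, rel_tol=rel_tol)
            for r, c in zip(ref_vals, cand_vals)
        ):
            mismatches.append((ref_file.name, "values differ"))
    return mismatches


def format_report(mismatches) -> str:
    """Build the human-readable message body for step 3 (Slack/email);
    the actual delivery mechanism is left out of this sketch."""
    if not mismatches:
        return "Validation run passed: all outputs match the known-good set."
    lines = ["Validation run FAILED for the following outputs:"]
    lines += [f"  - {name}: {reason}" for name, reason in mismatches]
    return "\n".join(lines)
```

A scheduler (e.g. a weekly cron job on Hypocentre and a monthly one on NeSI) would run the workflow, call `compare_outputs`, and send the result of `format_report` to the chosen notification channel.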
