This is somewhat separate from the workflow code, but I think it's a good idea. With the upcoming changes to the workflow focusing on testing and validation, it would be good to run a continuous series of integration tests, for all changes to our repositories, that exercise the workflow and ensure the output matches a "known good" output. The idea would be as follows:
- Select or construct a limited set of realisations whose outputs we verify as known good. @sungeunbae suggests one realisation of each source type, which I think makes sense. Somebody like Morteza could then validate these outputs and confirm they look good.
- Add automation to run the entire workflow for this realisation set and compare the outputs with the known good outputs. This could be done weekly on Hypocentre at a low resolution (Sung suggests 200m), but it should also run less frequently on NeSI (perhaps once a month) to ensure the target environment runs correctly too.
- If the output no longer matches, report this to us somehow (perhaps a Slack message or an email). We can then review the differences. If the differences are improvements, we accept the new output as our known-good reference output; otherwise we investigate and fix the bugs before they force us to regenerate Cybershake outputs.
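To make the comparison step concrete, here is a minimal sketch of how the automation might diff a run against the known-good reference. It assumes each run can be reduced to a dict of named scalar summaries (the metric names below, like `"PGA_site_001"`, are hypothetical); the actual workflow outputs would likely need array-aware comparison (e.g. `numpy.testing.assert_allclose`) instead.

```python
import math


def compare_outputs(new, reference, rtol=1e-5, atol=1e-8):
    """Compare a run's summary values against known-good reference values.

    Both arguments are dicts mapping metric names to floats. Returns a
    list of human-readable mismatch descriptions; an empty list means
    the run matches the reference within tolerance.
    """
    mismatches = []
    # A metric present in one output but not the other is also a failure,
    # so iterate over the union of keys.
    for key in sorted(set(new) | set(reference)):
        if key not in new:
            mismatches.append(f"{key}: missing from new output")
        elif key not in reference:
            mismatches.append(f"{key}: not in known-good reference")
        elif not math.isclose(new[key], reference[key],
                              rel_tol=rtol, abs_tol=atol):
            mismatches.append(
                f"{key}: {new[key]} != reference {reference[key]}")
    return mismatches
```

On a non-empty result the scheduled job would send the mismatch list to us (Slack or email, as above) rather than failing silently; the tolerances would need tuning once we see how much run-to-run variation the workflow actually has.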