Is there an existing feature request for this?
- I have searched the existing issues.
Command
Test semantic tree via snapshots
Description
I would like to pitch the idea of adding support for semantics-based snapshot tests. Let's say we have a test that produces an image snapshot of a login screen. During the snapshotting phase, we could also scan the semantics tree and produce some kind of textual output. For example:
Login screen
Username. Text Field.
Password. Text Field.
Submit. Button.
During the comparison phase, we would then check both the text and the image data.
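As a rough sketch of what such a test could look like: the image golden below uses the existing matchesGoldenFile API, while matchesSemanticsGoldenFile is a hypothetical matcher (it does not exist in Flutter today), and LoginScreen and the file paths are made up for illustration.

```dart
import 'package:flutter_test/flutter_test.dart';

void main() {
  testWidgets('login screen looks and reads correctly', (tester) async {
    await tester.pumpWidget(const LoginScreen()); // widget under test (assumed)

    // Semantics are only built while a SemanticsHandle is alive.
    final handle = tester.ensureSemantics();

    // Existing image-based golden comparison.
    await expectLater(
      find.byType(LoginScreen),
      matchesGoldenFile('goldens/login_screen.png'),
    );

    // Proposed semantics-based golden comparison (hypothetical matcher):
    // it would serialize the semantics tree to text and diff it against
    // a checked-in golden file.
    await expectLater(
      find.byType(LoginScreen),
      matchesSemanticsGoldenFile('goldens/login_screen.semantics.txt'),
    );

    handle.dispose();
  });
}
```

With this shape, a failing image check plus a passing semantics check would immediately tell the developer the regression is purely visual.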
Reasoning
- Semantic tests are more robust than pixel tests. If the semantic check passes but the image check fails, we know it's probably a minor visual change, like a different color or padding.
- By testing semantics, we provide a better product for people who use screen readers.
- Because semantics are encoded as text, they are handled more gracefully by common tools such as git or LLMs.
- If the tested widget has some kind of styling/theming, we could have multiple images and require that they all have exactly the same semantics.
Additional context and comments
Obstacles
The biggest obstacle I have found is that matchesGoldenFile only supports images.
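In the meantime, the flat text form could be produced by hand with the existing semantics APIs (SemanticsNode, SemanticsData, SemanticsFlag are real Flutter types; describeSemantics itself is a hypothetical helper, and the role mapping here covers only the two flags from the example above):

```dart
import 'package:flutter/semantics.dart';

// Hypothetical helper: walk the semantics tree depth-first and emit
// one "Label. Role." line per labeled node, indented by depth.
String describeSemantics(SemanticsNode node, [int depth = 0]) {
  final buffer = StringBuffer();
  final SemanticsData data = node.getSemanticsData();
  if (data.label.isNotEmpty) {
    // Minimal role mapping; a real implementation would cover more flags.
    String role = '';
    if (data.hasFlag(SemanticsFlag.isTextField)) {
      role = ' Text Field.';
    } else if (data.hasFlag(SemanticsFlag.isButton)) {
      role = ' Button.';
    }
    buffer.writeln('${'  ' * depth}${data.label}.$role');
  }
  node.visitChildren((SemanticsNode child) {
    buffer.write(describeSemantics(child, depth + 1));
    return true; // keep visiting siblings
  });
  return buffer.toString();
}
```

In a widget test, the root node could be reached via tester.binding.pipelineOwner.semanticsOwner (while a SemanticsHandle is held) and the resulting string compared against a checked-in text file with a plain expect(..., equals(...)), side-stepping matchesGoldenFile entirely.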
svg/html output
Instead of printing a flat list of accessibility labels, we could perhaps produce an SVG or HTML file that shows them in a more visual form. But this would require encoding exact pixel positions into the text, which in turn would make those tests less resilient to change.
Image based tests
Semantic snapshot testing is not a new idea, but I have only seen it done via images.
A presentation about image-based testing in Flutter using alchemist:
https://www.youtube.com/live/H7mzY_E9MKs?t=20463s
A library for image-based accessibility snapshots for native iOS:
github.com/cashapp/AccessibilitySnapshot