Merged
This is for issue gren-lang#359. It's a bit fragile, as it directly builds up an AST instead of relying on the parser to create one. It checks that the `check` function returns an `Incomplete`. Today, this test fails, of course.
Also, use `A.zero` as a "zero" Region, since we don't care about the position of the token within the source code.
Test the original reported case, and the case which failed after an incorrect fix was attempted. This unit test now takes the `[Error]` returned by `checkPattern` and converts it into a `[[String]]`, making it very easy to test.
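The `[Error]`-to-`[[String]]` idea can be illustrated with a toy error type (the real checker's `Error` is richer; the names here are invented for the sketch):

```haskell
module Main where

-- Toy stand-in for the checker's error type (illustration only).
data Error = Incomplete { context :: String, unhandled :: [String] }
  deriving (Eq, Show)

-- Flatten an error into plain strings, so a test can compare against
-- a [[String]] literal instead of pattern-matching on AST values.
errorToStrings :: Error -> [String]
errorToStrings e = context e : unhandled e

main :: IO ()
main =
  print (map errorToStrings [Incomplete "case" ["{ a = False, b = False }"]])
```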
Kudos to ChatGPT, which explains:
The bug is in the “record” branch of `isExhaustive`. Right now,
when the first column contains (only) records (i.e. `numSeen == 0` and
`extractRecordPatterns` succeeds), the code does this:
1. Collect all field names seen anywhere in the matrix (`baseRecord`).
2. For **each field name independently**, check exhaustiveness of that one
field (`specializeRowByRecordField fieldName`), then wrap the resulting
missing patterns back into a single-field record.
That is logically wrong for records: it’s checking **each field in
isolation** (like an OR), but exhaustiveness requires the **cartesian
product of fields** (like an AND across fields). So your first example
slips through because:
- for field `a`, both `True` and `False` appear somewhere
- for field `b`, both `True` and `False` appear somewhere
…but not all **combinations** appear.
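The OR-versus-AND distinction can be seen in a tiny standalone model (plain Haskell lists, not the compiler's types; `seen` holds the two rows from the example above):

```haskell
module Main where

-- The two record patterns from the example, with fields [a, b]:
-- { a = False, b = True } and { a = True, b = False }.
seen :: [[Bool]]
seen = [[False, True], [True, False]]

-- The buggy, OR-style check: each field independently covers
-- both True and False somewhere in the matrix.
perFieldExhaustive :: [[Bool]] -> Bool
perFieldExhaustive rows = all coversField [0, 1]
  where
    coversField i =
      let vals = map (!! i) rows
      in elem True vals && elem False vals

-- The correct, AND-style check: every combination of field values
-- (the cartesian product) must appear as a row.
productExhaustive :: [[Bool]] -> Bool
productExhaustive rows =
  all (`elem` rows) [[a, b] | a <- [False, True], b <- [False, True]]

main :: IO ()
main = do
  print (perFieldExhaustive seen)  -- True: the per-field check is fooled
  print (productExhaustive seen)   -- False: combinations are missing
```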
This is already handled correctly by `simplify`: `Can.PVar _` becomes
`Anything`. So destructured names like `{ key = lKey }` do *not* introduce
additional constraints. That’s why the `Just { first = { key = lKey,
value = lValue }, rest }` pattern should not force enumeration of all
`key/value/rest` combinations—those are `Anything` patterns. The fix
below preserves that behavior.
---
Instead of iterating fields one-by-one with `specializeRowByRecordField`,
we must treat a record pattern as a product of its fields: build a
consistent “base” field set and specialize the matrix into a vector
of field-patterns (one per field) and recurse on that vector.
You already have the machinery for this in `isUseful`:
- it uses `collectRecordFieldsWithAnyPattern` to get a base map `{fieldName -> Anything}`
- then uses `specializeRowByRecord baseMap` to expand a record into `Map.elems specializedMap`
`isExhaustive` should do the same.
No other code changes are required.
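The whole-record specialization can be sketched with toy types (a miniature assuming only Bool-valued fields; `Field`, `exhaustiveMatrix`, and `exhaustiveRecords` are invented names, not the compiler's):

```haskell
module Main where

import qualified Data.Map as Map

-- Toy field pattern: a concrete Bool, or a wildcard (the real checker's
-- simplify turns PVar into Anything, which behaves like W here).
data Field = L Bool | W
  deriving (Eq, Show)

type Rec = Map.Map String Field

-- Does a field pattern cover a concrete Bool value?
covers :: Field -> Bool -> Bool
covers W _ = True
covers (L b) v = b == v

-- Exhaustiveness over a matrix of field-pattern rows (one column per
-- field): specialize the first column by each Bool value and recurse.
-- This is the product-of-fields recursion described above.
exhaustiveMatrix :: [[Field]] -> Bool
exhaustiveMatrix [] = False
exhaustiveMatrix rows
  | any null rows = True  -- a row with no columns left matches everything
  | otherwise = all specialized [False, True]
  where
    specialized v = exhaustiveMatrix [tl | (hd : tl) <- rows, covers hd v]

-- Expand each record against a shared base field set (every field name
-- seen anywhere, defaulting to a wildcard), then run the matrix check.
-- This mirrors collectRecordFieldsWithAnyPattern + specializeRowByRecord.
exhaustiveRecords :: [Rec] -> Bool
exhaustiveRecords recs = exhaustiveMatrix rows
  where
    base = Map.map (const W) (Map.unions recs)
    rows = [Map.elems (Map.union r base) | r <- recs]

main :: IO ()
main = do
  -- The reported case: not exhaustive (Incomplete).
  print (exhaustiveRecords
           [ Map.fromList [("a", L False), ("b", L True)]
           , Map.fromList [("a", L True), ("b", L False)]
           ])
  -- A destructuring row (all PVar, i.e. wildcards) makes it exhaustive.
  print (exhaustiveRecords
           [ Map.fromList [("a", L False), ("b", L True)]
           , Map.fromList [("a", W), ("b", W)]
           ])
```

`Map.elems` returns values in key order, so expanding every record against the same base map yields consistently ordered columns across rows.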
---
Patterns:
- `{ a = False, b = True }`
- `{ a = True, b = False }`
After specialization by the base record `{a = _, b = _}`, the algorithm
checks exhaustiveness on the **2-column** matrix `[aPattern, bPattern]`
and will find missing rows such as:
- `{ a = False, b = False }`
- `{ a = True, b = True }`
So it correctly reports `Incomplete`.
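The missing rows can be enumerated mechanically; a minimal sketch over the two literal-only rows (plain lists standing in for the record fields `[a, b]`):

```haskell
module Main where

-- Rows seen in the match: { a = False, b = True }, { a = True, b = False }.
seen :: [[Bool]]
seen = [[False, True], [True, False]]

-- Every possible combination of the two Bool fields.
allRows :: [[Bool]]
allRows = [[a, b] | a <- [False, True], b <- [False, True]]

-- Combinations no row covers; these become the reported missing patterns:
-- { a = False, b = False } and { a = True, b = True }.
missingRows :: [[Bool]]
missingRows = filter (`notElem` seen) allRows

main :: IO ()
main = print missingRows  -- [[False,False],[True,True]]
```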
Inside `Just { first = { key = lKey, value = lValue }, rest }` all
those `lKey/lValue/rest` are `PVar`, so `simplify` turns them into
`Anything`. That makes the record subpatterns maximally general, so
`Ctor Just [...]` covers all `Just _` cases. Together with `Nothing`,
the `Maybe` match is exhaustive, and the record-product recursion won’t
introduce spurious missing combinations.
robinheghan reviewed on Jan 25, 2026. tests/Nitpick/PatternMatchesSpec.hs (outdated), comment on lines 19 to 33:
```haskell
-- Create a Pkg.Name
packageName :: String -> String -> Pkg.Name
packageName pkgName authorName =
  Pkg.Name
    { Pkg._author = Utf8.fromChars authorName
    , Pkg._project = Utf8.fromChars pkgName
    }

-- Create a ModuleName.Canonical
moduleNameCanonical :: String -> String -> String -> ModuleName.Canonical
moduleNameCanonical pkgName authorName modName =
  ModuleName.Canonical
    { ModuleName._package = packageName pkgName authorName
    , ModuleName._module = N.fromChars modName
    }
```
Member
Don't we already have helper functions for this? Or maybe you can just treat Pkg.Name and ModuleName.Canonical as constructor functions directly?
Either way, I believe these helper functions to be unnecessary.
Contributor
Author
Oh, ok! I'll explore that.
Member
Cool! This was the sort of approach I was thinking of exploring myself (it mirrors the way custom types are checked). Thank you for exploring this :)
Member
You need to run the formatter. Either run
Use Pkg.Name and ModuleName.Canonical as constructors, directly.
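Inlined, that suggestion looks like this (same record fields as the helpers quoted above; this fragment uses the compiler's own modules, so it is not standalone):

```haskell
-- Hypothetical inline construction; "author", "pkg", and "Main"
-- are placeholder values for illustration.
canonicalMain :: ModuleName.Canonical
canonicalMain =
  ModuleName.Canonical
    { ModuleName._package =
        Pkg.Name
          { Pkg._author = Utf8.fromChars "author"
          , Pkg._project = Utf8.fromChars "pkg"
          }
    , ModuleName._module = N.fromChars "Main"
    }
```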
Contributor
Author
Both of the PR comments were addressed in this latest push.
Member
Thank you!
Fixes #359
This fix comes from having ChatGPT look at the problem. Its complete report is in the commit message. Summary:
Fix: specialize whole records, not one field at a time
Instead of iterating fields one-by-one with `specializeRowByRecordField`, we must treat a record pattern as a product of its fields: build a consistent "base" field set, specialize the matrix into a vector of field-patterns (one per field), and recurse on that vector.

You already have the machinery for this in `isUseful`:

- `collectRecordFieldsWithAnyPattern` to get a base map `{fieldName -> Anything}`
- `specializeRowByRecord baseMap` to expand a record into `Map.elems specializedMap`

`isExhaustive` should do the same.

All unit tests pass. I have used a compiler built from this source to build gren-lang/core, gren-lang/node, and also my own application.