Removed inputs from Pauli presimulation #392
Conversation
Codecov Report ✅ All modified and coverable lines are covered by tests.

@@            Coverage Diff             @@
##           master     #392      +/-   ##
==========================================
+ Coverage   84.87%   85.02%   +0.14%
==========================================
  Files          46       46
  Lines        6639     6649      +10
==========================================
+ Hits         5635     5653      +18
+ Misses       1004      996       -8

☔ View full report in Codecov by Sentry.
Thanks @emlynsg for this PR; it addresses a crucial issue!
I made some small code remarks and a comment about the API below. We can discuss it.
We can think of a strongly deterministic pattern with a given set of input and output nodes as implementing a linear map from its input space to its output space. With the current API, Pattern.perform_pauli_measurements can be understood as a transformation of such a pattern into one whose set of inputs may be smaller. This implies that the resulting pattern acts on a lower-dimensional input space than the original.
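As a sketch of the dimension counting behind this concern (under standard MBQC conventions; the notation $V_P$, $I$, $O$ is illustrative, not the original formulas):

```latex
% Illustrative dimension-counting sketch; assumes the standard MBQC convention
% that a strongly deterministic pattern implements an isometry.
A strongly deterministic pattern $P$ with input nodes $I$ and output nodes $O$
implements an isometry
\[
  V_P \colon (\mathbb{C}^2)^{\otimes |I|} \longrightarrow (\mathbb{C}^2)^{\otimes |O|} .
\]
If \texttt{perform\_pauli\_measurements} maps $P$ to a pattern $P'$ with input
nodes $I' \subsetneq I$, then $P'$ implements a map
\[
  V_{P'} \colon (\mathbb{C}^2)^{\otimes |I'|} \longrightarrow (\mathbb{C}^2)^{\otimes |O|} ,
\]
defined on a strictly smaller input space, so it cannot coincide with $V_P$.
```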
I understand that this is a limitation of the Pauli presimulator, and it's a long-term goal to improve it, but my concern is that this dimensionality reduction occurs inadvertently. In particular, the example in #168 (comment) still fails. At the same time, I think it's natural to expect that the Pauli presimulation preserves the unitary the pattern represents.
I propose two alternative solutions:
- Defensive approach: Pattern.perform_pauli_measurements raises an exception if called on a pattern with a finite number of inputs.
- Flexible approach: Pattern.perform_pauli_measurements raises a warning when the number of inputs is reduced.
In either case, this behavior should be clearly explained in the docstring.
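A minimal sketch of what the two proposals could look like; the wrapper names and the `input_nodes` attribute are assumptions for illustration, not the actual graphix API:

```python
import warnings


def presimulate_defensive(presimulate, pattern):
    """Defensive approach (sketch): refuse to presimulate a pattern that still
    has input nodes, since presimulation could silently remove them.
    `presimulate` stands in for Pattern.perform_pauli_measurements."""
    if pattern.input_nodes:
        raise ValueError(
            "Pauli presimulation may remove input nodes; "
            "refusing to run on a pattern with inputs."
        )
    return presimulate(pattern)


def presimulate_flexible(presimulate, pattern):
    """Flexible approach (sketch): run the presimulation, but warn if the
    number of input nodes was reduced."""
    n_before = len(pattern.input_nodes)
    result = presimulate(pattern)
    if len(result.input_nodes) < n_before:
        warnings.warn(
            f"Pauli presimulation reduced the inputs from {n_before} to "
            f"{len(result.input_nodes)}; the new pattern no longer represents "
            "the same map as the original.",
            stacklevel=2,
        )
    return result
```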
Thanks Mateo. I agree, and was thinking along the same lines (but forgot to mention anything in the commit text for discussion). It seems in general that we have taken the defensive approach for other similar parts of the API. One idea is to add some sort of …
Could you also update the tutorial before merging? Thanks.
graphix/pattern.py (outdated)

    if pat._pauli_preprocessed is True:
        return pat
This test is fragile:

- If copy is True, pat._pauli_preprocessed will always be False, even if pattern has been Pauli-processed.
- Even if copy is False, if we Pauli-preprocess a pattern and then add an element to it, _pauli_preprocessed will be reset, but results still contains measurements that will be discarded.
I think properly handling the former contents of results is the way to go.
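A toy illustration of the two failure modes above; the class is a simplified stand-in, not the actual graphix Pattern implementation:

```python
class ToyPattern:
    """Simplified stand-in for graphix Pattern, only to illustrate the
    fragility of the `_pauli_preprocessed` guard."""

    def __init__(self):
        self._pauli_preprocessed = False
        self.results = {}   # outcomes recorded during Pauli presimulation
        self.commands = []

    def copy(self):
        # Failure mode 1: a fresh copy starts with _pauli_preprocessed = False,
        # so the `if pat._pauli_preprocessed: return pat` guard never fires when
        # copy=True, even if the source was already Pauli-preprocessed.
        new = ToyPattern()
        new.results = dict(self.results)
        new.commands = list(self.commands)
        return new

    def add(self, command):
        # Failure mode 2: adding a command resets the flag, but `results` still
        # holds measurement outcomes from the earlier presimulation, which a
        # later presimulation run would then discard.
        self.commands.append(command)
        self._pauli_preprocessed = False
```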
Co-authored-by: thierry-martinez <thierry.martinez@inria.fr>
matulni
left a comment
Thanks! In the future we can reconsider removing self._pauli_preprocessed altogether, but I think that's outside the scope of this PR.
LGTM!
Regarding e9eeafd (disabling the QASM parser): we can keep the QASM parser enabled with PR TeamGraphix/graphix-qasm-parser#7, which makes graphix-qasm-parser use the graphix master branch until the next version is released. This will fix the graphix CI without disabling any tests. Currently, the graphix-qasm-parser main branch fails to pass the CI, and this issue will be fixed as well.
This commit updates the baseline until TeamGraphix#392 is merged.
thierry-martinez
left a comment
LGTM!
PR correcting issues identified in the discussion of issue #363. Opening the PR with a couple of remaining todos to allow for discussion about the merits of removing the keep_inputs field across the Graphix codebase.

This PR removes the keep_inputs field and related code from the Pauli presimulation components of patterns and tests. It also results in the removal of any inputs passed through the Pauli presimulator, with new inputs being generated for the output patterns.
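A hedged usage sketch of the behavioural change; the keep_inputs keyword shown in the comment is taken from this description only, and the transpile()/input_nodes access is an assumption that may differ between graphix versions:

```python
from graphix import Circuit

# Build a small example pattern (assumed graphix API; may differ by version).
circuit = Circuit(2)
circuit.h(1)
circuit.cnot(0, 1)

# Depending on the graphix version, transpile() returns either a Pattern or a
# result object exposing `.pattern`; handle both for this sketch.
transpiled = circuit.transpile()
pattern = getattr(transpiled, "pattern", transpiled)

pattern.standardize()

# Previously (illustrative only): pattern.perform_pauli_measurements(keep_inputs=True)
# After this PR there is no such flag: Pauli-measured inputs may be consumed by
# the presimulation, and the resulting pattern gets a fresh set of input nodes.
pattern.perform_pauli_measurements()
print(pattern.input_nodes)
```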
Todo: