Conversation
…d has been deprecated; functionality tested, with no changes needed to the tests. 2) Slight wording change to a comment in the EnumSymbs class. 3) Added __getstate__() and __setstate__() methods to the EnumSymbs class, since pickling was using the default __getstate__() provided by Basic, which just returns None. This allows pickling to account for and store slotted variables, and fixes basic_wick_test.py::test_ancr_character_has_basic_properties.
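As context for that fix, here is a minimal sketch of the failure mode (the class, slot, and attribute names here are illustrative stand-ins, not drudge's actual code): when a base class supplies a __getstate__ that returns None, pickle saves no state at all, so a subclass's slotted attributes are silently dropped. Overriding __getstate__/__setstate__ to capture the slots restores a correct round trip:

```python
import pickle


class Basic:
    """Stand-in for a base class whose __getstate__ returns nothing useful."""
    __slots__ = ()

    def __getstate__(self):
        return None  # pickle interprets this as "no state to save"


class EnumSymbs(Basic):
    """Hypothetical sketch of a slotted subclass, not the real drudge class."""
    __slots__ = ('_vals',)

    def __init__(self, vals):
        self._vals = vals

    # Explicitly capture the slotted attributes so pickle stores them.
    def __getstate__(self):
        return {name: getattr(self, name) for name in self.__slots__}

    # Restore the slotted attributes on unpickling (__init__ is not re-run).
    def __setstate__(self, state):
        for name, value in state.items():
            setattr(self, name, value)


obj = EnumSymbs(('UP', 'DOWN'))
clone = pickle.loads(pickle.dumps(obj))
print(clone._vals)  # ('UP', 'DOWN')
```

Without the override, the unpickled object would come back with `_vals` missing entirely, which is the kind of breakage the test above caught.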
…nal did not run in parallel and so did not show the issue. Running the tests in VSCode runs them in parallel, so more than half the tests failed when batch-run (which some VSCode users would do) but passed perfectly well when run individually. Removing scope='module' from the free-algebra fixture stops tests from sharing state and thus prevents the collisions. I also did not notice any slowdown from forcing a fresh environment to be created for each test.
…ly resolve a Clebsch-Gordan coefficient, but the test was written to check a symbolic simplification, so the Clebsch-Gordan function failed while attempting to resolve a number against these symbols. Changing simplify -> simplify_am passes the test.
…tempting to resolve a symbolic Clebsch-Gordan coefficient. Also added some comments to the drs function-retrieval function.
…omplexity of dealing with where it's been run from is simply too much
install.sh
Outdated
# Copy the cpp files where needed
echo "Moving cpp files"
cp build/lib.linux-x86_64-cpython-39/drudge/wickcore.cpython-39-x86_64-linux-gnu.so drudge/
cp build/lib.linux-x86_64-cpython-39/drudge/canonpy.cpython-39-x86_64-linux-gnu.so drudge/
echo "Moving Dummy Spark"
cp -r ../dummyRDD/dummy_spark .

# Remove unneeded folder
rm -rf ../dummyRDD/
Why do we want to avoid installation of dummyRDD?
I agree that this can simplify things by reducing the number of repos/packages to be installed, but it introduces another manual copy-paste. Either way is fine with me.
install.sh
Outdated
rm -rf ../dummyRDD/

echo "Installation Complete!"
echo 'Assuming you ran this with '\''source'\'' you'\''re now inside the drudge folder. To get started with development simply type '\''code .'\'' and you'\''ll open a vscode window at this location'
This assumes the user has VSCode. That is okay, but we should say so explicitly. For instance, people who use terminals to log in to remote machines do not always have VSCode.
gauravharsha
left a comment
Minor comments, but otherwise ready to merge after the tests pass.
Looks like the tests and checks fail because Python 3.7 (enforced in the GitHub Actions) is no longer supported. We might as well fast-forward to 3.9.
chenpeizhi
left a comment
It would be better to figure out what is causing the failing test instead of just modifying the test to let it pass.
I assume merge_j in deep_simplify is the culprit. According to your description, it seems that the expression didn't simplify to 0 because the amplitude types of the two terms do not match. We should probably make sure that the Clebsch-Gordan coefficients are computed as SymPy Integer, Rational, or Pow instead of Python float.
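The type mismatch matters because floating-point coefficients generally do not cancel to an exact zero, whereas exact rational types do. A small stdlib sketch of the same failure mode, using fractions.Fraction as a stand-in for SymPy's Rational (the actual drudge amplitudes are SymPy expressions, not shown here):

```python
from fractions import Fraction

# Exact rational coefficients: the terms cancel identically to zero,
# so a simplifier comparing amplitudes can recognise the cancellation.
exact = Fraction(1, 10) + Fraction(2, 10) - Fraction(3, 10)
print(exact == 0)  # True

# The same combination as Python floats leaves a tiny residue, so an
# exact symbolic comparison against 0 never succeeds.
approx = 0.1 + 0.2 - 0.3
print(approx == 0)  # False
print(approx)       # 5.551115123125783e-17
```

Computing the Clebsch-Gordan coefficients as exact SymPy types would keep the two terms' amplitudes in the same (exact) domain, letting them cancel.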
I was trying to refer to this particular commit, where the tests were modified to let one particular test pass. The other commits seem to be fine.
Yes, I misunderstood it initially, so I removed my comment.
No worries. I didn't figure out how to quote a commit in my initial comment, so it might seem a little out of context. |
… install from the environment file. I think this is some kind of memory overrun or something; separating every install out into its own line works on a fresh conda install. We'll see if it works generally.
I have cherry-picked the fixes to the new PR #22, which supersedes this one.
This is the catch-all PR for the updates needed to function on Python 3.9.
There are small code changes to resolve deprecations and errors, plus updates to the documentation. At the time of this comment, full Apache Spark is still nonfunctional due to pickling errors with some of our more complicated object types.