Improve kernel type annotations #242
alexfikl wants to merge 11 commits into inducer:dataclass-kernel from …
Conversation
```diff
 # {{{ DifferentiatedExprDerivativeTaker

-DerivativeCoeffDict = dict[tuple[int, ...], Any]
+DerivativeCoeffDict = dict[tuple[int, ...], int | float | complex | sym.Expr]
```
Not sure this is correct, but it seems to match most usage?
```diff
 # Before adding a function here, make sure it's present in both modules.
+Add = sym.Add
+Basic = sym.Basic
+Expr = sym.Expr
```
I mostly added these for typing. Not sure it's a good idea?
Force-pushed from 91f34c1 to 983fff0.
```diff
-    def __repr__(self):
+    @override
+    def __str__(self) -> str:
         return f"ExprKnl{self.dim}D"
```
This seemed more like a `__str__` than a `__repr__`. Is that OK? I think some of the derivative wrappers already define `__str__` like this.
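A minimal sketch of the `__str__`-vs-`__repr__` distinction behind this question. `ExpressionKernel` here is a stand-in for the class in the diff, not sumpy's actual implementation:

```python
# Stand-in class illustrating the two dunder methods; only the __str__
# body matches the diff above, the __repr__ body is invented for contrast.
class ExpressionKernel:
    def __init__(self, dim: int) -> None:
        self.dim = dim

    def __str__(self) -> str:
        # compact human-readable form, as in the diff above
        return f"ExprKnl{self.dim}D"

    def __repr__(self) -> str:
        # unambiguous form for debugging; str() no longer falls back to this
        # once __str__ is defined
        return f"ExpressionKernel(dim={self.dim})"


knl = ExpressionKernel(3)
assert str(knl) == "ExprKnl3D"
assert repr(knl) == "ExpressionKernel(dim=3)"
```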
```python
helmholtz_k_name: str
"""The argument name to use for the Helmholtz parameter when generating
functions to evaluate this kernel.
"""
```
Is `allow_evanescent` actually used anywhere? I couldn't find what it's for.
```python
def get_derivative_taker(
        self,
        dvec: sp.Matrix,
        rscale: ArithmeticExpr,
```
Is this a good type for `rscale`? Most places I could find just set it to 1 (or some level-dependent float?).
Force-pushed from 9c8fe87 to 111794c.
This should mostly have the same failures as #240, minus the …
Force-pushed from 111794c to e267a48.
Ah, nvm, you just answered that. :)

Yeah 😁 I looked at those …
Force-pushed from 52769d7 to eae30ab.
Force-pushed from eae30ab to 812e1d0.
Force-pushed from 548105d to d9fc1eb.
Argh. It looks like merging #240 autoclosed this. That was not the intent. GitHub, are you OK?
This is work on top of #240 that adds more types and improves some docs. I mostly wanted to type the derivative removers for some stuff in pytential, but started adding more… and then realized you're already working on it in #240. Hopefully this isn't interfering too much 😟