[WIP] Interpolation overhaul #1203

Open
alanlujan91 wants to merge 97 commits into econ-ark:main from alanlujan91:map_coordinates

Conversation

@alanlujan91 (Member) commented Jan 9, 2023

Please ensure your pull request adheres to the following guidelines:

  • Tests for new functionality/models, or tests that reproduce the bug fix in code.
  • Updated documentation for features that add new functionality.
  • Updated CHANGELOG.md with major/minor changes.

@sbenthall (Contributor)

Is #1242 about adding the EconForge interpolator as an additional backend for multivariate interpolation?

@llorracc (Collaborator) left a comment

Small comment: my preference is to be more systematic about naming, putting the general term before the specific one. For example: interp_bilinear, interp_decay, interp_linear. Also, is LinearFast an interpolator? If so, it should be interp_linear_fast.

codecov bot commented May 16, 2023

Codecov Report

Patch coverage: 53.90%; project coverage change: -0.54% ⚠️

Comparison: base (2209109) 72.55% vs. head (18f4329) 72.01%.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1203      +/-   ##
==========================================
- Coverage   72.55%   72.01%   -0.54%     
==========================================
  Files          78       85       +7     
  Lines       13009    13391     +382     
==========================================
+ Hits         9439     9644     +205     
- Misses       3570     3747     +177     
Impacted Files Coverage Δ
HARK/interpolation/_econforge.py 90.36% <ø> (ø)
HARK/interpolation/_hark.py 43.31% <ø> (ø)
HARK/interpolation/tests/test_hark.py 100.00% <ø> (ø)
HARK/interpolation/_sklearn.py 39.82% <39.82%> (ø)
HARK/interpolation/_multi.py 41.98% <41.98%> (ø)
HARK/interpolation/_scipy.py 85.18% <85.18%> (ø)
HARK/interpolation/__init__.py 100.00% <100.00%> (ø)
HARK/interpolation/tests/test_econforge.py 100.00% <100.00%> (ø)
HARK/interpolation/tests/test_multi.py 100.00% <100.00%> (ø)
HARK/interpolation/tests/test_scipy.py 100.00% <100.00%> (ø)
... and 1 more


Copilot AI (Contributor) left a comment

Pull request overview

This pull request performs a major refactoring of the interpolation functionality in HARK, reorganizing code from a single module (HARK.econforgeinterp) into a structured package (HARK.interpolation) with multiple submodules. The refactoring introduces new interpolation backends using scipy and scikit-learn, alongside the existing econforge and HARK-native implementations.

Changes:

  • Reorganizes interpolation code into a package structure with submodules: _econforge, _hark, _multi, _scipy, and _sklearn
  • Adds new dependencies: scikit-learn, recommonmark>=0.7, and version constraints on quantecon>=0.6
  • Updates import paths in examples and documentation from HARK.econforgeinterp to HARK.interpolation
  • Creates comprehensive test suite for each interpolation backend

Reviewed changes

Copilot reviewed 15 out of 17 changed files in this pull request and generated 9 comments.

Show a summary per file
File | Description
requirements/base.txt | Adds scikit-learn, recommonmark, and a version constraint for quantecon
environment.yml | New conda environment file with package dependencies
examples/Interpolation/README.md | New README linking to external examples
examples/Interpolation/DecayInterp.py | Updates import from econforgeinterp to interpolation
examples/Interpolation/DecayInterp.ipynb | Updates import path in notebook
HARK/interpolation/__init__.py | New package init with wildcard imports from submodules
HARK/interpolation/_econforge.py | Refactored econforge interpolation classes
HARK/interpolation/_hark.py | Refactored HARK-native interpolation classes (5192 lines)
HARK/interpolation/_multi.py | New multivariate interpolation implementations
HARK/interpolation/_scipy.py | New scipy-based interpolation wrappers
HARK/interpolation/_sklearn.py | New scikit-learn based interpolation classes
HARK/interpolation/tests/test_*.py | Comprehensive test suite for all backends
Documentation/reference/tools/econforgeinterp.rst | Updates documentation module path


pandas>=1.5
quantecon
quantecon>=0.6
recommonmark>=0.7
Copilot AI commented Jan 28, 2026

The dependency recommonmark>=0.7 appears to be a documentation-related package. It should likely be in a separate documentation requirements file rather than in base.txt, as it's not needed for runtime functionality.

Suggested change
recommonmark>=0.7

quantecon
quantecon>=0.6
recommonmark>=0.7
scikit-learn
Copilot AI commented Jan 28, 2026

The scikit-learn dependency is added without a version constraint. This could lead to compatibility issues. Consider specifying a minimum version (e.g., scikit-learn>=1.0).

Suggested change
scikit-learn
scikit-learn>=1.0

- numba>=0.56
- numpy>=1.23
- pandas>=1.5
- quantecon
Copilot AI commented Jan 28, 2026

In environment.yml, quantecon does not have a version constraint while base.txt specifies quantecon>=0.6. For consistency and reproducibility, consider adding the same version constraint here.

Suggested change
- quantecon
- quantecon>=0.6

- pandas>=1.5
- quantecon
- scipy>=1.10
- scikit-learn
Copilot AI commented Jan 28, 2026

In environment.yml, scikit-learn is listed without a version constraint, but it's a critical dependency for the sklearn interpolation module. Consider adding a minimum version constraint for consistency with other dependencies.

Suggested change
- scikit-learn
- scikit-learn>=1.2

---------------

.. automodule:: HARK.econforgeinterp
.. automodule:: HARK.interpolation.econforgeinterp
Copilot AI commented Jan 28, 2026

The documentation reference path is updated from HARK.econforgeinterp to HARK.interpolation.econforgeinterp, but based on the new module structure, it should be HARK.interpolation._econforge since the submodules use underscore prefixes. Verify this path is correct or if the automodule directive should reference the public API through HARK.interpolation.

Suggested change
.. automodule:: HARK.interpolation.econforgeinterp
.. automodule:: HARK.interpolation._econforge

Comment on lines +1 to +5
from HARK.interpolation._econforge import *
from HARK.interpolation._hark import *
from HARK.interpolation._multi import *
from HARK.interpolation._scipy import *
from HARK.interpolation._sklearn import *
Copilot AI commented Jan 28, 2026

Using wildcard imports (from module import *) in init.py can lead to namespace pollution and make it unclear which symbols are being exported. Consider either: 1) explicitly listing the imports, or 2) defining __all__ in each submodule to control what gets exported. This is especially important when combining multiple modules that might have overlapping names.

Suggested change
from HARK.interpolation._econforge import *
from HARK.interpolation._hark import *
from HARK.interpolation._multi import *
from HARK.interpolation._scipy import *
from HARK.interpolation._sklearn import *
import importlib

_submodules = [
    "HARK.interpolation._econforge",
    "HARK.interpolation._hark",
    "HARK.interpolation._multi",
    "HARK.interpolation._scipy",
    "HARK.interpolation._sklearn",
]

# Dynamically re-export public names from the submodules.
# This mimics "from <module> import *" behavior:
# - If a submodule defines __all__, use that.
# - Otherwise, export all attributes that do not start with "_".
for _mod_name in _submodules:
    _mod = importlib.import_module(_mod_name)
    _names = getattr(_mod, "__all__", None)
    if _names is None:
        _names = [n for n in dir(_mod) if not n.startswith("_")]
    for _name in _names:
        globals()[_name] = getattr(_mod, _name)

# Optionally, define __all__ for this package to list exported names.
__all__ = [n for n in globals() if not n.startswith("_")]

Comment on lines +119 to +120
else:
raise AttributeError(f"Feature {feature} not recognized.")
Copilot AI commented Jan 28, 2026

The logic at lines 108-120 has a problematic flow. When feature is None or not a string (line 108), the code jumps to the else block at line 119 which raises an AttributeError. However, when feature is None, this error is misleading since None is a valid case according to the docstring. The else block at line 119 should be removed, and only the inner else at line 117 should raise the error for unrecognized string features.

Suggested change
else:
raise AttributeError(f"Feature {feature} not recognized.")

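The corrected control flow can be sketched in isolation (a minimal, hypothetical reconstruction: the function name and the set of recognized features here are assumptions, not taken from the actual _sklearn.py source):

```python
# Hypothetical sketch of the flow described above: only an unrecognized
# *string* feature raises; None falls through as a valid case.
def resolve_feature(feature):
    known = ("polynomial", "spline")  # assumed set of recognized features
    if isinstance(feature, str):
        if feature in known:
            return feature
        # Inner else: an unrecognized string feature is an error.
        raise AttributeError(f"Feature {feature} not recognized.")
    # feature is None (or another non-string value): valid per the
    # docstring, so there is no outer else raising AttributeError.
    return None
```

With this structure, `resolve_feature(None)` returns `None` instead of raising a misleading error, while `resolve_feature("bogus")` still raises.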
numpy>=1.23
pandas>=1.5
quantecon
quantecon>=0.6
Copilot AI commented Jan 28, 2026

The version constraint quantecon>=0.6 may be too loose. Consider specifying a more precise version range (e.g., quantecon>=0.6,<1.0) to avoid potential breaking changes in future major versions.

Suggested change
quantecon>=0.6
quantecon>=0.6,<1.0

self.pipeline = pipeline

X_train = np.reshape(self.grids, (self.ndim, -1))
y_train = np.mgrid[[slice(0, dim) for dim in self.shape]]
Copilot AI commented Jan 28, 2026

This list is unhashable: indexing np.mgrid with a list of slices fails on recent NumPy versions. Use a tuple of slices instead.

Suggested change
y_train = np.mgrid[[slice(0, dim) for dim in self.shape]]
y_train = np.mgrid[tuple(slice(0, dim) for dim in self.shape)]

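The tuple form can be demonstrated on its own (a small sketch; the shape used here is illustrative and not taken from the HARK source):

```python
import numpy as np

# Indexing np.mgrid programmatically requires a tuple of slices;
# passing a list raises a TypeError on recent NumPy versions.
shape = (2, 3)
y_train = np.mgrid[tuple(slice(0, dim) for dim in shape)]

# Result: one integer index grid per dimension, each with the grid's shape.
print(y_train.shape)  # (2, 2, 3)
print(y_train[0])     # row indices: [[0, 0, 0], [1, 1, 1]]
```

This produces the same index grids the original line intends, but without relying on the deprecated list-indexing behavior.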

Projects

Status: Stale


5 participants