
Approximate share in Portfolio Model #1122

Open

alanlujan91 wants to merge 7 commits into econ-ark:main from alanlujan91:ApproxShare

Conversation

@alanlujan91
Member

Linear approximation of risky share given next period's cFunc.

@alanlujan91 alanlujan91 requested a review from llorracc March 8, 2022 01:28
@alanlujan91
Member Author

[image]

Blue line is HARK's numeric solver. Orange line is approximation.


return r_diff, r_diff ** 2

prem_mean, prem_var = calc_expectation(self.RiskyDstn, premium)
Collaborator

@alanlujan91 not sure if this is a bug given your notation, but I think that what you are calling prem_var is not the variance of the premium, but rather E[premium^2].
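To illustrate the distinction with made-up numbers (a hypothetical three-point equiprobable premium distribution, not HARK's RiskyDstn):

```python
import numpy as np

# Hypothetical equiprobable draws of the premium (excess return).
draws = np.array([0.02, 0.05, 0.11])

e_prem = draws.mean()               # E[premium]
e_prem_sq = (draws ** 2).mean()     # E[premium^2] -- what the code computes
var_prem = e_prem_sq - e_prem ** 2  # Var(premium) = E[x^2] - (E[x])^2

# The two coincide only when E[premium] = 0; here they differ.
```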

Collaborator

It would be interesting to see what happens when you make the maximum amount of assets something much larger, like 2000 or 10000 or something. Out that far, the consumption function is essentially linear, so the error coming from the approximation's assumption that $c^{''}$ is zero should be inconsequential. If the orange does converge to the blue, that would suggest that maybe using the 2nd order Taylor approximation would make the results considerably closer for small values of wealth. If orange does not converge to blue, I'd guess that there's a bug somewhere.
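For reference, the approximation being discussed is a Taylor expansion of next period's consumption function; a second-order version would be

```latex
c_{t+1}(m+\Delta) \approx c_{t+1}(m) + c'_{t+1}(m)\,\Delta + \tfrac{1}{2}\,c''_{t+1}(m)\,\Delta^{2},
```

and the first-order approximation drops the $c''$ term, which is why it should become exact as the consumption function approaches linearity at large wealth.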

Member Author

@Mv77 you're right, I have renamed this to prem_sqrd instead.

@alanlujan91
Member Author

  1. Try a large value of a_max
  2. Second-order approximation of (1+\epsilon)^\nu in Chris's notes (MathFacts TaylorTwo)
  3. Compare the second-order approximation to the true expectation with a large value of a_max; are they close?
  4. Implement backsolving using only approximations
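For item 2, the standard second-order expansion (the "TaylorTwo" fact) is

```latex
(1+\epsilon)^{\nu} \approx 1 + \nu\,\epsilon + \frac{\nu(\nu-1)}{2}\,\epsilon^{2}.
```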

@alanlujan91
Member Author

alanlujan91 commented Mar 23, 2022

Did a little bit of work on this. It seems the second-order approximation with the TaylorTwo rule is not particularly better than the first-order approximation: where the first-order approximation is below the true function, the second-order approximation is above it.

[image]

@llorracc
Collaborator

I'm pretty sure there's a more basic error, because as assets go to infinity the consumption function becomes linear and the approximation should become very good.

Have you done the version where you increase the maximum value of a and the number of a gridpoints?

PS. You should be able to look at just the last period of a 2-period problem; Samuelson showed that the portfolio share is the same no matter how many periods back you are from a terminal period in his model. So, that should let you go to town with lots of points and a very large max point like 2000 or 10000.

PPS. If there's still a discrepancy, my guess is that it will be because the approximation to the shock that uses the variance is not very good.

@llorracc
Collaborator

@alanlujan91

Do you know why the checks are failing?

@alanlujan91
Member Author

@llorracc this is with aMax = 1000 (instead of 100).

[image]

@alanlujan91
Member Author

> @alanlujan91
>
> Do you know why the checks are failing?

Yes, the tests for the Discrete Solver and Joint Distribution solver are failing, because the approximation is not supposed to be a feature of those models. I need to reorganize where the approximations happen.

@llorracc
Collaborator

To debug, I suggest you evaluate the last expression in (15) in the appendix to

https://github.com/ccarrollATjhuecon/EquityPremiumPuzzle/blob/master/EquityPremiumPuzzle.pdf

at $a=1000$ and the first expression in equation (16) at the same point.

If the first expression in (16) is not close to zero

  • You can break both (15) and (16) down using the $E[xy] = E[x]E[y]+cov(x,y)$ formula and see whether it is the $E[x]E[y]$ term, the $cov(x,y)$ term, or both that are bad.
  • You could also evaluate (15) using a linear consumption function identical to the one used in (16); if the discrepancy remains largely the same, that would say the problem is coming not from the nonlinearity of the consumption function but from the approximation of the shock.
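A quick numerical check of the $E[xy] = E[x]E[y]+cov(x,y)$ identity on a hypothetical equiprobable discrete distribution (the values below are placeholders, not the model's shocks):

```python
import numpy as np

# Hypothetical equiprobable (x, y) pairs, e.g. x = return shock,
# y = the marginal value term it multiplies in (15)/(16).
x = np.array([0.9, 1.0, 1.2])
y = np.array([1.5, 1.0, 0.7])

e_xy = (x * y).mean()
# Population covariance (bias=True divides by N, matching equiprobable draws).
cov_xy = np.cov(x, y, bias=True)[0, 1]
decomposed = x.mean() * y.mean() + cov_xy

# E[xy] and E[x]E[y] + cov(x, y) agree exactly, so each term can be
# inspected separately to see which one the approximation gets wrong.
```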

If the first expression in (16) IS close to zero

  • That would seem to indicate that the problem is the approximation between the penultimate and the last line in (16). You could see whether a 2nd (or even a 3rd) order approximation at that point fixes things.

PS. Whether or not you manage to track down the problem as above, it would be useful if you could edit the GitHub document to incorporate any higher order approximations that you are using so we can see exactly what you are doing.

@codecov-commenter

Codecov Report

Merging #1122 (91fb9ae) into master (e701558) will decrease coverage by 0.23%.
The diff coverage is 24.00%.

@@            Coverage Diff             @@
##           master    #1122      +/-   ##
==========================================
- Coverage   73.96%   73.72%   -0.24%     
==========================================
  Files          70       70              
  Lines       10761    10809      +48     
==========================================
+ Hits         7959     7969      +10     
- Misses       2802     2840      +38     
Impacted Files Coverage Δ
HARK/ConsumptionSaving/ConsPortfolioModel.py 82.59% <24.00%> (-6.33%) ⬇️


@alanlujan91 alanlujan91 self-assigned this Aug 9, 2022
@sbenthall sbenthall added this to the 1.0.0 milestone Jan 4, 2023

Contributor

Copilot AI left a comment


Pull request overview

This PR adds an option to construct approximate risky-share policies based on a linear/quadratic approximation using the next-period consumption function, and propagates these approximations into the portfolio model’s per-period solution.

Changes:

  • Introduces the ApproxShareBool flag (with default in init_portfolio) and threads it through PortfolioConsumerType and ConsPortfolioSolver as a time-invariant parameter.
  • Adds make_share_func_approx, which builds first- and second-order approximate share functions over the asset grid using expectations over the joint shock distribution and cFuncAdj.eval_with_derivative, and stores them (along with a new ShareEndOfPrdFunc) in PortfolioSolution when enabled.
  • Adjusts the terminal-period solution to use a linear interpolation for cFuncAdj_terminal and to define ShareEndOfPrdFunc in the terminal solution, ensuring compatibility with the new approximation logic.
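From the quadratic coefficients quoted later in this review (b = -CRRA * MPC * prem_sqrd, c = prem_mean), the first-order approximate share comes from setting the linear part to zero. A sketch with placeholder parameter values (CRRA, MPC, prem_mean, and prem_sqrd below are made up for illustration):

```python
# First-order share: solve b * share + c = 0 with
# b = -CRRA * MPC * prem_sqrd and c = prem_mean, giving
#   share = prem_mean / (CRRA * MPC * prem_sqrd).
CRRA = 5.0        # assumed relative risk aversion
MPC = 0.1         # assumed marginal propensity to consume at a gridpoint
prem_mean = 0.03  # assumed E[premium]
prem_sqrd = 0.04  # assumed E[premium^2]

first_order_share = prem_mean / (CRRA * MPC * prem_sqrd)
# Values outside [0, 1] would need to be clipped by the solver.
```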


AdjPrb=self.AdjustPrb,
)

if self.ApproxShareBool:
Copilot AI Jan 28, 2026

When ApproxShareBool is True, make_porfolio_solution unconditionally assigns self.solution.ShareEndOfPrdFunc, self.solution.ApproxFirstOrderShareFunc, and self.solution.ApproxSecondOrderShareFunc, but these attributes are only created by make_share_func_approx, which is never called in ConsPortfolioDiscreteSolver.solve or ConsPortfolioJointDistSolver.solve. As a result, using ApproxShareBool=True with either the discrete-share solver or the joint-distribution solver will raise an AttributeError at this point. Consider either guarding this block to run only for solver types that define the approximation functions, or ensuring that the discrete and joint-dist solvers also construct ShareEndOfPrdFunc and the approximation functions when ApproxShareBool is enabled.

Suggested change:

-    if self.ApproxShareBool:
+    if self.ApproxShareBool and all(
+        hasattr(self, attr_name)
+        for attr_name in (
+            "ShareEndOfPrdFunc",
+            "ApproxFirstOrderShareFunc",
+            "ApproxSecondOrderShareFunc",
+        )
+    ):

Comment on lines +920 to +922
temp = np.sqrt(b ** 2 - 4 * a * c)

roots = np.array([(-b + temp) / (2 * a), (-b - temp) / (2 * a)])
Copilot AI Jan 28, 2026

The quadratic used for the second-order share approximation (a = -self.CRRA * prem_cube * MPC ** 2 * (-self.CRRA - 1) / 2, b = -self.CRRA * MPC * prem_sqrd, c = prem_mean) does not guarantee a non-negative discriminant b ** 2 - 4 * a * c. For admissible parameter combinations where 4 * a * c > b ** 2, temp = np.sqrt(b ** 2 - 4 * a * c) will be nan, and the resulting roots (and thus ApproxSecondOrderShareFunc) will contain nan values. It would be safer to guard against a negative discriminant (e.g., by falling back to the first-order approximation or clipping the discriminant at zero) so that enabling the second-order approximation cannot silently produce nan policies.

Suggested change:

-    temp = np.sqrt(b ** 2 - 4 * a * c)
-    roots = np.array([(-b + temp) / (2 * a), (-b - temp) / (2 * a)])
+    # Compute discriminant and guard against invalid quadratic cases.
+    disc = b ** 2 - 4 * a * c
+    valid = np.logical_and(disc >= 0.0, a != 0.0)
+    temp = np.zeros_like(disc)
+    temp[valid] = np.sqrt(disc[valid])
+    # Initialize roots with first-order approximation as a safe fallback.
+    roots = np.vstack((approx_share.copy(), approx_share.copy()))
+    # For valid entries, overwrite with second-order quadratic roots.
+    roots[0, valid] = (-b[valid] + temp[valid]) / (2 * a[valid])
+    roots[1, valid] = (-b[valid] - temp[valid]) / (2 * a[valid])

Comment on lines +1103 to +1105
if self.ApproxShareBool:
self.make_share_func_approx()

Copilot AI Jan 28, 2026

The new approximation path controlled by ApproxShareBool (i.e., make_share_func_approx and the extra solution fields it populates) is not covered by the existing tests in tests/test_ConsPortfolioModel.py, which only exercise the default (ApproxShareBool=False) behavior. Given that this branch changes both the solver’s expectations (uses solution_next.ShareEndOfPrdFunc and eval_with_derivative) and the contents of the per-period PortfolioSolution, it would be valuable to add at least a smoke test that solves the model with ApproxShareBool=True (for the continuous-share, independent-shocks case) and verifies that the approximate share functions are well-defined over the asset grid and free of nan values.
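A generic version of such a nan-check (model-independent; `toy_share` and the grid below are placeholders, not HARK API):

```python
import numpy as np

def share_func_is_well_defined(share_func, a_grid):
    """Return True if the share policy is finite and within [0, 1] on the grid."""
    shares = np.asarray(share_func(a_grid))
    return bool(np.all(np.isfinite(shares)) and np.all((shares >= 0) & (shares <= 1)))

# Stand-in for an approximate share function, for illustration only.
a_grid = np.linspace(0.1, 100.0, 50)
toy_share = lambda a: np.clip(1.0 / np.sqrt(a), 0.0, 1.0)
```

In a real smoke test, `share_func` would be the solved model's approximate share function evaluated over its asset grid.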

init_portfolio["AdjustPrb"] = 1.0
# Flag for whether to optimize risky share on a discrete grid only
init_portfolio["DiscreteShareBool"] = False
# Flat for wether to approximate risky share
Copilot AI Jan 28, 2026

Comment typo: "Flat for wether to approximate risky share" should read "Flag for whether to approximate risky share" to match the surrounding parameter comments.

Suggested change:

-    # Flat for wether to approximate risky share
+    # Flag for whether to approximate risky share


Labels: none yet
Projects: Stale
6 participants