Currently we have this as our VaR definition:
```lean
def VaR (P : Findist n) (X : FinRV n ℚ) (α : ℚ) : ℚ :=
  let S : Finset ℚ := (rangeOfRV X).filter (fun t => cdf P X t ≥ α)
  if h : S.Nonempty then
    S.min' h
  else
    0 -- this is illegal, I know -- Keith is going to fix it :)
```
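(For the empty-filter branch, one possible direction, just a hedged sketch and not a proposal Keith has signed off on: fall back to the maximum of the range rather than the junk value `0`. This assumes a nonemptiness fact for `rangeOfRV X`, which I've left as `sorry` since I don't know what the repo provides.)

```lean
-- Sketch only: avoids the arbitrary 0 by returning the largest value of X
-- when no point of the range meets the α threshold.
def VaR' (P : Findist n) (X : FinRV n ℚ) (α : ℚ) : ℚ :=
  let S : Finset ℚ := (rangeOfRV X).filter (fun t => cdf P X t ≥ α)
  if h : S.Nonempty then
    S.min' h
  else
    (rangeOfRV X).max' (by sorry) -- needs: (rangeOfRV X).Nonempty
```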
But I'm not sure if we want to keep it like this, or if we want to do this instead:
```lean
def VaR (P : Findist n) (X : FinRV n ℝ) (α : ℝ) : ℝ :=
  sInf { t : ℝ | cdf P X t ≥ α }
```
The advantage of the first one is that the random variables stay in ℚ, but it seems more difficult to prove theorems about.
For example, I'm working on the monotonicity proof for VaR, and I'm running into trouble because the minimization ranges over `rangeOfRV X`: even when X ≤ Y, `VaR P Y α` need not lie in `rangeOfRV X`. I do think I can work around this and finish the proof, but it would be a lot easier with the inf definition. On the other hand, that would mean redefining some things in terms of ℝ, which creates other issues.
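To illustrate why the `sInf` version makes monotonicity easier: if X ≤ Y pointwise, then `cdf P Y t ≤ cdf P X t` for every t, so Y's superlevel set is contained in X's, and `sInf` is antitone under set inclusion. A rough sketch (hypothetical lemma name `cdf_anti_of_le`, and I'm assuming an order instance on `FinRV n ℝ`; none of this is checked against the repo):

```lean
-- Sketch: monotonicity via csInf_le_csInf. The side conditions (BddBelow,
-- Nonempty) are left as sorries; the real content is the set inclusion.
theorem VaR_mono (P : Findist n) (X Y : FinRV n ℝ) (α : ℝ) (hXY : X ≤ Y) :
    sInf { t : ℝ | cdf P X t ≥ α } ≤ sInf { t : ℝ | cdf P Y t ≥ α } := by
  apply csInf_le_csInf
  · sorry -- BddBelow { t | cdf P X t ≥ α }
  · sorry -- { t | cdf P Y t ≥ α }.Nonempty
  · intro t ht
    -- cdf_anti_of_le (hypothetical): X ≤ Y → cdf P Y t ≤ cdf P X t
    exact le_trans ht (cdf_anti_of_le hXY t)
```

With the `Finset.min'` definition, by contrast, the two minimizations run over different finite sets, so there's no single inclusion argument to lean on.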
Before we get too far into the VaR proofs, I wanted to know if anyone had some intuition about which definition might be better for us in the long run!