This repository was archived by the owner on Nov 17, 2025. It is now read-only.

Add Student's t RandomVariable#1211

Merged
brandonwillard merged 1 commit into aesara-devs:main from rlouf:add-studentt on Nov 3, 2022

Conversation

@rlouf (Member) commented Sep 24, 2022

NumPy only defines standard_t (its naming is inconsistent with the rest of the location-scale family), so I defined it as a ScipyRandomVariable instead. I followed what was done for the other members of the location-scale family.

Here are a few important guidelines and requirements to check before your PR can be merged:

  • There is an informative high-level description of the changes.
  • The description and/or commit message(s) references the relevant GitHub issue(s).
  • pre-commit is installed and set up.
  • The commit messages follow these guidelines.
  • The commits correspond to relevant logical changes, and there are no commits that fix changes introduced by other commits in the same branch/PR.
  • There are tests covering the changes introduced in the PR.

Ticks one off of #1093

codecov bot commented Sep 24, 2022

Codecov Report

Merging #1211 (4be8bc0) into main (1390cc3) will increase coverage by 0.00%.
The diff coverage is 100.00%.

Additional details and impacted files


@@           Coverage Diff           @@
##             main    #1211   +/-   ##
=======================================
  Coverage   74.10%   74.11%           
=======================================
  Files         174      174           
  Lines       48624    48636   +12     
  Branches    10351    10351           
=======================================
+ Hits        36035    36047   +12     
  Misses      10301    10301           
  Partials     2288     2288           
Impacted Files Coverage Δ
aesara/tensor/random/basic.py 99.03% <100.00%> (+0.02%) ⬆️

@rlouf rlouf changed the title Add the Student's t RandomVariable Add Student's t RandomVariable Sep 25, 2022
gamma = GammaRV()


class StandardGammaRV(GammaRV):
Member

If we only intend to add standard_gamma for NumPy interface compatibility, then the GammaRV Op alone should suffice, no? For instance, standard_gamma could be an Op constructor function that effectively removes the shape and rate arguments from GammaRV.

As always, we want to avoid adding new Ops whenever we reasonably can.

N.B. This is an example of the concern stated in aesara-devs/aemcmc#67 (comment).
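The Op-constructor idea above can be sketched in plain Python. This is only an illustration of the pattern: the GammaRV class and standard_gamma helper here are stand-ins, not aesara's actual API.

```python
# Illustrative sketch of the "Op constructor function" pattern: rather than
# defining a new StandardGammaRV Op, a plain function reuses the existing Op
# and fixes the extra distribution parameter at call time.
# GammaRV here is a stand-in, not aesara's real class.

class GammaRV:
    def __call__(self, shape, rate, size=None):
        # A real Op would build a graph node; we just record the inputs.
        return ("gamma", shape, rate, size)

gamma = GammaRV()

def standard_gamma(shape, size=None):
    # No new Op type: the standard variant is the same Op with rate fixed to 1.
    return gamma(shape, 1.0, size=size)

print(standard_gamma(2.0))
# ('gamma', 2.0, 1.0, None)
```

The point is that only one Op type ever appears in the graph, so downstream code (rewrites, samplers) has one class to recognize instead of two.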

@ricardoV94 (Contributor) commented Oct 1, 2022

I think the issue is that you can't use RandomStream if you don't have a specific Op. That's why standard_normal was made a subclass as well, IIRC.

@rlouf (Member, Author) commented Oct 1, 2022

> If we only intend to add standard_gamma for NumPy interface compatibility, then the GammaRV Op alone should suffice, no? For instance, standard_gamma could be an Op constructor function that effectively removes the shape and rate arguments from GammaRV.
>
> As always, we want to avoid adding new Ops whenever we reasonably can.
>
> N.B. This is an example of the concern stated in aesara-devs/aemcmc#67 (comment).

I agree with all this.

@ricardoV94 we can monkey-patch the base classes. I'm not a big fan of this approach, but it should work:

from aesara.tensor.random.basic import NormalRV, normal

def create_standard_normal():

    # Wrap the original __call__ so loc and scale are fixed to 0 and 1.
    def standard_call(self, size=None, **kwargs):
        return self.general_call(0.0, 1.0, size, **kwargs)

    RV = NormalRV
    RV.general_call = RV.__call__
    RV.__call__ = standard_call
    return RV()


standard_normal = create_standard_normal()

print(type(normal))
# <class 'aesara.tensor.random.basic.NormalRV'>
print(type(standard_normal))
# <class 'aesara.tensor.random.basic.NormalRV'>

Contributor

Does it work with RandomStream like that?

Member Author

I think it will; RandomStream checks both that the attribute is defined in aesara.tensor.random.basic and that it is an instance of RandomVariable. We'll soon know for sure.

Member Author

This quickly gets complicated: monkey-patching affects the class globally, so I would need to replace the method on the instance directly.

By the way, I've noticed that NumPy's Generator has a method for every RV.

Contributor

Maybe we can add a canonical rewrite that replaces the standard versions immediately?

Or can we tweak RandomStream perhaps?

@rlouf (Member, Author) commented Oct 1, 2022

> Maybe we can add a canonical rewrite that replaces the standard versions immediately?

That would certainly work. But I'd rather not rely on rewrites to "fix" a problem in the representation of objects in the IR, which is also my concern in #1213. It would be better if they were represented by the same type in the original graph.

Member Author

Adding or replacing most methods on class instances works fine, for instance:

from aesara.tensor.random.basic import StudentTRV

def create_standard_t():

    C = StudentTRV()

    # Bind a wrapper to this instance only, fixing loc=0 and scale=1.
    def new_call(self, df, size=None, **kwargs):
        return self.__call__(df, 0, 1, size=size, **kwargs)

    C.call = new_call.__get__(C)
    return C

standard_t = create_standard_t()
print(standard_t.call(2.0).owner.inputs)
# [RandomGeneratorSharedVariable(<Generator(PCG64) at 0x7F618591A420>), TensorConstant{[]}, TensorConstant{11}, TensorConstant{2.0}, TensorConstant{0}, TensorConstant{1}]

However, special methods like __call__ are looked up on the class of the object, not on the instance. So if we monkey-patch __call__ it ends up affecting all instances of the class, and thus instances of the non-standard RandomVariable too. I think that's a dead end.
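The class-versus-instance lookup behavior described above can be demonstrated with a small self-contained example (plain Python, no aesara involved):

```python
# Special methods such as __call__ are looked up on the type, not the
# instance: an instance-level patch is ignored by the call syntax, while
# a class-level patch changes the behavior of every instance.

class RV:
    def __call__(self):
        return "general"

a = RV()
b = RV()

# Instance-level patch: the call syntax `a()` still uses type(a).__call__.
a.__call__ = lambda: "standard"
assert a() == "general"

# Class-level patch: now *all* instances are affected, including `b`.
RV.__call__ = lambda self: "standard"
assert a() == "standard"
assert b() == "standard"
```

This is exactly why patching `__call__` on a single StudentTRV instance cannot work: the standard and non-standard variants would have to share one class-level `__call__`.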

Member

Yeah, I think we should be able to update RandomStream to handle these cases without too much difficulty and/or compromise.

@rlouf rlouf force-pushed the add-studentt branch 2 times, most recently from ad83911 to 252a4b5 on October 1, 2022 08:11
@rlouf rlouf marked this pull request as draft October 17, 2022 20:12
@rlouf (Member, Author) commented Nov 3, 2022

Removed the commits related to StandardXRVs, I will open an issue for that and reference the relevant comments. This PR is only about StudentTRV.

@rlouf rlouf marked this pull request as ready for review November 3, 2022 14:43
@brandonwillard brandonwillard added the random variables Involves random variables and/or sampling label Nov 3, 2022
@brandonwillard brandonwillard merged commit eadc6e3 into aesara-devs:main Nov 3, 2022
@rlouf rlouf deleted the add-studentt branch November 3, 2022 16:46
