I have two questions about this line of code:
https://github.com/csmfindling/behavior_models/blob/master/models/expSmoothing_prevAction.py#L52
- If I'm understanding correctly, the goal is to compute a likelihood to combine with the prior formed from the exponentially smoothed history of actions. The likelihood that is needed is specifically p(observation | stimulus side = right), so that we end up with a posterior p(stimulus side = right | observations). However, this code seems to compute p(observation < 0 | signed stimulus contrast strength). Are the two distributions equivalent? I would think not, but if they aren't interchangeable, why is p(observation < 0 | signed stimulus contrast strength) the correct likelihood?
I would think that if the model assumes the mouse knows the true signed stimulus contrast strengths and their variances, then the mouse should compute \sum_c p(o | c) p(c | stimulus side = right), where c ranges over the signed stimulus contrast strengths.
- Why don't the minimum and maximum truncations introduce truncation errors?
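For concreteness, here is a minimal numerical sketch of the marginalization proposed in the first question, contrasted with the p(o < 0 | c) quantity. All names and values (the contrast levels, the uniform prior over them, the noise standard deviation) are hypothetical and only illustrative; they are not taken from the repository:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Hypothetical signed contrast strengths on the "right" side, with a uniform
# prior over them given stimulus side = right (illustrative values only)
contrasts = [0.0625, 0.125, 0.25, 1.0]
p_c_given_right = [1 / len(contrasts)] * len(contrasts)

sigma = 0.2  # assumed sensory-noise standard deviation
o = 0.1      # one noisy observation of the signed contrast

# Marginalized likelihood proposed in the question:
# p(o | side = right) = sum_c p(o | c) p(c | side = right)
p_o_given_right = sum(normal_pdf(o, c, sigma) * p
                      for c, p in zip(contrasts, p_c_given_right))

# The quantity the questioned line appears to compute instead:
# p(o < 0 | c), one value per signed contrast strength
p_o_neg_given_c = [normal_cdf(0.0, c, sigma) for c in contrasts]

print(p_o_given_right)
print(p_o_neg_given_c)
```

The sketch makes the distinction explicit: the marginalized likelihood is a single scalar per stimulus side (summing over the contrast levels), whereas p(o < 0 | c) is a vector with one entry per signed contrast strength.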