This repository was archived by the owner on Mar 31, 2025. It is now read-only.

objax.Jacobian and objax.Hessian similar to objax.Grad #234

Description

@gkaissis

It would be amazing to have a direct op to compute Jacobians and Hessians w.r.t. model parameters, like we have with objax.Grad. I suppose these would require an unreduced (per-sample) loss value, i.e. the op would raise an exception if the loss is already a scalar. A plain-JAX sketch of what I mean follows.
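For context, per-sample gradients are already expressible in plain JAX by vmapping a per-example gradient; this is only a minimal sketch of what such a Jacobian op would compute (the linear model and all names are illustrative, not a proposal for the Objax API):

```python
import jax
import jax.numpy as jnp

def per_example_loss(params, x, y):
    # hypothetical linear model; stands in for an arbitrary network
    pred = x @ params["w"] + params["b"]
    return (pred - y) ** 2

# grad w.r.t. params for one example, then vmap over the batch axis of
# (x, y) while broadcasting params -> one gradient pytree per sample
per_sample_grads = jax.vmap(jax.grad(per_example_loss), in_axes=(None, 0, 0))

params = {"w": jnp.ones(3), "b": jnp.zeros(())}
x = jnp.ones((8, 3))
y = jnp.zeros(8)
grads = per_sample_grads(params, x, y)  # grads["w"].shape == (8, 3)
```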
The Jacobian would then essentially be a stand-in for "per-sample" gradients. Understandably, the full Hessian is probably not tractable memory-wise for NNs. However, it would still make sense to add it if the op can be jitted into a function that, e.g., computes the condition number of the Hessian without ever materialising the matrix itself; a matrix-free sketch follows.
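To illustrate the matrix-free case, here is a rough plain-JAX sketch (loss_fn, a flat parameter vector, and the iteration count are all assumptions for illustration) that estimates the Hessian's largest eigenvalue via Hessian-vector products under jit, never building the Hessian:

```python
import jax
import jax.numpy as jnp
from functools import partial

def hvp(loss_fn, params, v):
    # forward-over-reverse: differentiate grad(loss) along direction v,
    # yielding H @ v without ever forming H
    return jax.jvp(jax.grad(loss_fn), (params,), (v,))[1]

@partial(jax.jit, static_argnums=(0, 3))
def top_eigenvalue(loss_fn, params, key, iters=100):
    # power iteration on the implicit Hessian
    v = jax.random.normal(key, params.shape)
    v = v / jnp.linalg.norm(v)
    for _ in range(iters):
        hv = hvp(loss_fn, params, v)
        v = hv / jnp.linalg.norm(hv)
    return v @ hvp(loss_fn, params, v)  # Rayleigh quotient
```

The smallest eigenvalue can be estimated the same way on a shifted operator, which together with the largest gives a condition-number estimate, all without O(P²) memory.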
What's the developers' opinion on this?

Thank you and congratulations on the amazing package. Objax is making the transition from PyTorch very easy!
