
Self-contained preprocessing code #15

Merged

brentyi merged 10 commits into main from brent/self_contained_preprocessing on May 6, 2025


Conversation

@brentyi (Owner) commented May 6, 2025

Preprocessing for training is a recurring issue: #11, #12, #14.

The cause is that our method was originally trained on data processed for another project, which carried some historical complexity, including:

  • A fork of the HuMoR preprocessing script integrated in a much larger system (which had a Hydra-based YAML config and other complexities)
  • A custom version of the SMPL-H model (gender-neutral, with hand PCA information concatenated)
  • Logic for converting male/female SMPL-H parameters to gender-neutral ones
  • Possibly other changes or forks that I'm not aware of, or that have been lost to the sands of time

I haven't been thorough enough in tracking down all of these details. This PR aims to correct that.

To-dos:

  • Self-contained version of the HuMoR preprocessing script
    • Updated dependencies
    • Monkeypatch the smplx package to prevent shape + PCA errors
    • Tested
    • Gender conversion utilities in repository
  • Document in README
  • Check whole pipeline end-to-end
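For context on the "monkeypatch the smplx package" item: the idea is to override a problematic check at runtime instead of maintaining a fork of the library. Below is a minimal sketch of the monkeypatching pattern using a stand-in module, since smplx's internals (and the exact shape/PCA check being relaxed) depend on the installed version; `load_model` and the 10-beta check here are hypothetical, not real smplx API.

```python
import types

# Stand-in for the `smplx` package; its real internals vary by version,
# so this function and its beta-count check are hypothetical.
fake_smplx = types.SimpleNamespace()

def _strict_load_model(num_betas: int) -> dict:
    # Original behavior: reject shape vectors that aren't exactly 10-D,
    # which would break a checkpoint carrying extra shape/PCA coefficients.
    if num_betas != 10:
        raise ValueError(f"expected 10 betas, got {num_betas}")
    return {"num_betas": num_betas}

fake_smplx.load_model = _strict_load_model

def _patched_load_model(num_betas: int) -> dict:
    # Monkeypatch: relax the check so a wider shape vector is accepted,
    # leaving the rest of the (stand-in) library untouched.
    return {"num_betas": num_betas}

# Swap the attribute at runtime; any caller that reaches the function
# through `fake_smplx.load_model(...)` now gets the relaxed behavior.
fake_smplx.load_model = _patched_load_model
```

The advantage over a fork is that the patch lives next to the preprocessing script, so updating the upstream dependency only requires re-checking one small override.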

brentyi merged commit 83257f6 into main on May 6, 2025
1 check passed
