
Conversation

@lessw2020

add buf_init_callback.
This is an extension of the modeling_llama.py PR here:
https://github.com/huggingface/transformers/pull/32428/files

What does this PR do?

This adds the same class-level buf_init_callbacks dict, which lets you regenerate the RoPE embeddings that are destroyed during meta-device initialization and subsequent fake tracing. (Meta-device init can be handled by intercepting buffers, but running fake tensors to trace the model for pipeline parallelism then destroys the RoPE embeddings, since they become fake tensors.)

The buf_init_callbacks dict gives you a single entry point for regenerating all of the non-persistent embeddings after tracing, ensuring you have a ready-to-run model graph.
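As a minimal sketch of the idea (the names, signatures, and helper function below are hypothetical and may differ from the actual code in this PR and #32428), a class-level dict maps each non-persistent buffer name to a callback that can rebuild it, and a small helper re-runs those callbacks after tracing:

```python
import torch
import torch.nn as nn


def init_inv_freq(module, device, dtype):
    # Recompute the rotary inv_freq buffer the same way __init__ originally did.
    dim, base = 64, 10000.0
    inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, device=device, dtype=torch.float32) / dim))
    return inv_freq.to(dtype)


class TinyRotaryEmbedding(nn.Module):
    # Class-level dict: buffer name -> callback that can rebuild it on demand.
    buf_init_callbacks = {"inv_freq": init_inv_freq}

    def __init__(self):
        super().__init__()
        self.register_buffer(
            "inv_freq", init_inv_freq(self, "cpu", torch.float32), persistent=False
        )


def reinit_nonpersistent_buffers(model, device="cpu", dtype=torch.float32):
    # After meta-device init / fake-tensor tracing, re-run each module's
    # callbacks so the non-persistent buffers hold real values again.
    for mod in model.modules():
        for name, callback in getattr(mod, "buf_init_callbacks", {}).items():
            mod.register_buffer(name, callback(mod, device, dtype), persistent=False)
```

Under these assumptions, calling reinit_nonpersistent_buffers on each traced pipeline stage would restore real inv_freq values before the stage is executed.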

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

add buf_init_callback
prefix with "pattern." to trigger layer pattern matching rather than exact match.
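As a rough illustration of that prefix (the actual matching logic in the PR may differ), a "pattern."-prefixed key could be treated as a substring match against the buffer's fully qualified name, so one entry covers the same buffer in every decoder layer:

```python
def key_matches(key, qualified_name):
    # Hypothetical matching rule: "pattern." keys match any buffer whose
    # qualified name contains the remainder; other keys require an exact match.
    if key.startswith("pattern."):
        return key[len("pattern."):] in qualified_name
    return key == qualified_name

# e.g. "pattern.rotary_emb.inv_freq" would match
# "model.layers.0.self_attn.rotary_emb.inv_freq",
# "model.layers.1.self_attn.rotary_emb.inv_freq", and so on.
```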
@amyeroberts
Contributor

cc @ArthurZucker @gante

@ArthurZucker (Collaborator) left a comment


Answered on #32428
