
Conversation


@victorlee0505 victorlee0505 commented Oct 4, 2023

This PR is for #154.

I believe `PreTrainedTokenizer`'s `super().__init__(**kwargs)` attempts to call `get_vocab`, which reads from the LLM, but at that point `self._llm = llm` has not been set yet, so the tokenizer cannot access the LLM.

So I moved the `super().__init__(**kwargs)` call below the `self._llm = llm` assignment, and it works now.
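Below is a minimal, self-contained sketch of the ordering problem and the fix. The class and helper names (`LlamaTokenizerWrapper`, `FakeLLM`, `TokenizerBase`) are hypothetical stand-ins, and `TokenizerBase` only mimics the relevant behavior described above (a base `__init__` that touches `get_vocab()`); it is not the real `PreTrainedTokenizer`.

```python
class FakeLLM:
    """Stand-in for the wrapped LLM that can report its vocabulary."""

    def get_vocab(self):
        return {"<unk>": 0, "hello": 1, "world": 2}


class TokenizerBase:
    """Stand-in for PreTrainedTokenizer: __init__ touches get_vocab()."""

    def __init__(self, **kwargs):
        # The base class inspects the vocabulary while initializing,
        # so get_vocab() must already work when this runs.
        self.vocab_size = len(self.get_vocab())

    def get_vocab(self):
        raise NotImplementedError


class LlamaTokenizerWrapper(TokenizerBase):
    def __init__(self, llm, **kwargs):
        # Buggy order (what this PR fixes):
        #   super().__init__(**kwargs)  # calls get_vocab() -> AttributeError,
        #   self._llm = llm             # because self._llm does not exist yet
        #
        # Fixed order: set self._llm first, then let the base class initialize.
        self._llm = llm
        super().__init__(**kwargs)

    def get_vocab(self):
        # Delegates to the wrapped LLM; requires self._llm to be set.
        return self._llm.get_vocab()


if __name__ == "__main__":
    tok = LlamaTokenizerWrapper(FakeLLM())
    print(tok.vocab_size)  # 3
```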

*Force-pushed to change the commit author.

Signed-off-by: Victor Lee <victorlee0505@gmail.com>