Problem with TrainingHelper's sequence_length parameter  #3

Description

@yangshao

Hi, I'm trying to implement a seq2seq model. When I pass the actual sequence lengths as TrainingHelper's sequence_length parameter, I get an error in the loss calculation:
InvalidArgumentError (see above for traceback): logits and labels must have the same first dimension, got logits shape [960,15038] and labels shape [1280]
But if I set all the sequence lengths to the same (maximum) value, the code works fine. What is the effect of using the actual sequence lengths instead of the max sequence length?
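
For what it's worth, here is a minimal TF 1.x sketch of what I suspect is happening (this is not my actual code; all names and sizes are illustrative). With the actual lengths, dynamic_decode only unrolls the decoder up to the longest length in the batch, so the time dimension of the logits can be shorter than the padded time dimension of the labels. For example, if the batch size were 64, the flattened shapes [960,15038] and [1280] would correspond to 15 decoder steps versus 20 label steps. Slicing the targets to the decoder's output length and masking the padding seems to make the shapes agree:

```python
import tensorflow as tf

# Hypothetical placeholders -- names and sizes are illustrative only.
batch_size, vocab_size, embed_dim, num_units = 64, 15038, 128, 256
decoder_inputs = tf.placeholder(tf.float32, [batch_size, None, embed_dim])
decoder_lengths = tf.placeholder(tf.int32, [batch_size])  # actual lengths
labels = tf.placeholder(tf.int32, [batch_size, None])     # padded targets

cell = tf.nn.rnn_cell.LSTMCell(num_units)
helper = tf.contrib.seq2seq.TrainingHelper(decoder_inputs, decoder_lengths)
decoder = tf.contrib.seq2seq.BasicDecoder(
    cell, helper,
    initial_state=cell.zero_state(batch_size, tf.float32),
    output_layer=tf.layers.Dense(vocab_size))
outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)

# dynamic_decode only unrolls to max(decoder_lengths), so the time axis of
# logits can be shorter than the padded time axis of labels -- which would
# explain the [960,15038] vs [1280] mismatch once both are flattened.
logits = outputs.rnn_output                        # [batch, max(lengths), vocab]

# Align the targets with the decoder output and mask the padding.
max_dec_len = tf.reduce_max(decoder_lengths)
targets = labels[:, :max_dec_len]                  # [batch, max(lengths)]
weights = tf.sequence_mask(decoder_lengths, max_dec_len, dtype=tf.float32)
loss = tf.contrib.seq2seq.sequence_loss(logits, targets, weights)
```

With the mask in place, padded positions no longer contribute to the loss, which I believe is the point of passing the actual lengths in the first place.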
