
Is fine-tuning feasible on the DTITR model? #2


Description

@ChenaoB

Hi,

In the training task, after creating the model:

dtitr_model = build_dtitr_model(FLAGS, FLAGS.prot_transformer_depth[0], FLAGS.smiles_transformer_depth[0], FLAGS.cross_block_depth[0], ..., FLAGS.out_mlp_depth[0], FLAGS.out_mlp_hdim[0], optimizer_fun)

is it possible to load the weights of a previously trained model using the load_weights function and then train a new model starting from those parameters (i.e., fine-tune it)?

dtitr_model.load_weights('Path')

where Path is the path to the previously saved model weights.
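For clarity, here is roughly what I have in mind in Keras; build_dtitr_model, FLAGS, and optimizer_fun are the ones from the DTITR training script above, while the checkpoint path and the data variables are just placeholders:

# Rebuild the model with the same architecture flags as the original run,
# so the saved weights match the new model's layer shapes.
dtitr_model = build_dtitr_model(FLAGS,
                                FLAGS.prot_transformer_depth[0],
                                FLAGS.smiles_transformer_depth[0],
                                FLAGS.cross_block_depth[0],
                                # ... remaining architecture flags, as above ...
                                FLAGS.out_mlp_depth[0],
                                FLAGS.out_mlp_hdim[0],
                                optimizer_fun)

# Initialize from the previously trained checkpoint instead of random weights.
dtitr_model.load_weights('path/to/previous_weights')  # placeholder path

# Continue training on the new data (standard Keras fit), i.e. fine-tune.
dtitr_model.fit(x=[protein_data, smiles_data],  # placeholder inputs
                y=affinity_labels,              # placeholder labels
                batch_size=32,
                epochs=10)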

Regards!
