
[FEA] Support feeding pre-trained embeddings to TF4Rec model with high-level api #475

@rnyak

Description

🚀 Feature request

Currently we do not have out-of-the-box support for feeding pre-trained embeddings into the embedding layer, freezing them, and then training a TF4Rec model. We have embedding_initializer, but we have never tested whether it works accurately and as expected. Maybe we can create a PyTorch class similar to TensorInitializer (TF), as we did in Merlin Models, and expose the embedding initializer and trainable args to the user.

We need to:

  • Expose the definition of the embeddings module in the input blocks: TabularFeatures and TabularSequenceFeatures
  • Support feeding pre-trained embeddings to a TF4Rec model with the high-level API (users should be able to add them to the embedding layer and freeze them, i.e., set trainable=False in the TF API or requires_grad=False in the PyTorch API); see the sketch after this list
  • Create an example notebook showcasing this functionality
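A minimal PyTorch sketch of what such an initializer could look like. The class name PreTrainedEmbeddingsInitializer and the way it would plug into TabularFeatures/TabularSequenceFeatures are assumptions for illustration only; the torch.nn.Embedding.from_pretrained call and requires_grad flag are standard PyTorch API.

```python
import torch
import torch.nn as nn


# Hypothetical sketch: a PyTorch analogue of Merlin Models' TensorInitializer.
# The class name and its integration with the input blocks are assumptions;
# only the torch calls below are standard PyTorch API.
class PreTrainedEmbeddingsInitializer:
    def __init__(self, weight_matrix: torch.Tensor, trainable: bool = False):
        self.weight_matrix = weight_matrix
        self.trainable = trainable

    def __call__(self) -> nn.Embedding:
        # from_pretrained copies the weights; freeze=True sets requires_grad=False
        return nn.Embedding.from_pretrained(
            self.weight_matrix, freeze=not self.trainable
        )


# Example usage with a random "pre-trained" matrix of shape (vocab_size, dim)
pretrained = torch.randn(1000, 64)
initializer = PreTrainedEmbeddingsInitializer(pretrained, trainable=False)
item_embedding = initializer()
assert item_embedding.weight.requires_grad is False  # embeddings are frozen
```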

Motivation

This is a feature request coming from our customers and users.
