Guess a good encoded layer size #48
Open
Seems like 200 worked fine for the MNIST digits (784 features), but it doesn't work as well for microarray data (~20,000 features). Is there any theory on how big an encoding layer should be relative to the input and output sizes?
If not, it may be good to default to something like math.ceil(input_size**0.5). Or I could run confounded on multiple datasets with multiple input sizes, do a grid search for the optimal encoding layer size, and then develop a heuristic from those results.
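To make the square-root idea concrete, here is a minimal sketch of what such a default could look like. The function name `suggested_encoding_size` is hypothetical (not part of confounded); it just scales the bottleneck with the square root of the feature count instead of a fixed 200:

```python
import math

def suggested_encoding_size(input_size):
    """Hypothetical square-root heuristic for the encoding layer width."""
    return math.ceil(input_size ** 0.5)

# MNIST digits: 784 features
print(suggested_encoding_size(784))    # 28

# Microarray data: ~20,000 features
print(suggested_encoding_size(20000))  # 142
```

Under this rule MNIST would get a much smaller bottleneck than the current 200, while microarray data would get a somewhat smaller one; whether either is actually better would still need the grid-search comparison described above.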