Most of the Params used in our Transformers and Estimators will be shared among them. Spark ML solves this by defining small interfaces (traits), each exposing one shared Param.
See apache/spark/mllib/.../sharedParams.scala
The proposal is to build our own set of shared params, e.g.:
import org.apache.spark.ml.param.Param;
import org.apache.spark.ml.param.Params;

public interface HasStudyId extends Params {

    // Shared Param definition. Spark ML identifies a Param by (parent, name),
    // so returning a fresh instance here still resolves to the same parameter.
    default Param<String> studyIdParam() {
        return new Param<>(this, "studyId", "Id of the study to be used.");
    }

    default String getStudyId() {
        return getOrDefault(studyIdParam());
    }

    default HasStudyId setStudyId(String study) {
        set(studyIdParam(), study);
        return this;
    }
}
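
As a usage sketch (the class name StudyFilterTransformer, the "studyId" column, and the filtering logic are hypothetical, not part of the proposal), a Transformer would pick up the shared getter/setter simply by implementing the interface:

import java.util.UUID;

import org.apache.spark.ml.Transformer;
import org.apache.spark.ml.param.ParamMap;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.types.StructType;

public class StudyFilterTransformer extends Transformer implements HasStudyId {

    private final String uid = "studyFilterTransformer_" + UUID.randomUUID();

    @Override
    public Dataset<Row> transform(Dataset<?> dataset) {
        // The studyId param, getter and setter all come from HasStudyId;
        // no per-class boilerplate is needed here.
        return dataset.filter(dataset.col("studyId").equalTo(getStudyId())).toDF();
    }

    @Override
    public StructType transformSchema(StructType schema) {
        return schema;
    }

    @Override
    public Transformer copy(ParamMap extra) {
        return defaultCopy(extra);
    }

    @Override
    public String uid() {
        return uid;
    }
}

One design note: as declared above, setStudyId returns HasStudyId, so fluent chaining on a concrete class would need a cast or a covariant override. Spark's own sharedParams.scala sidesteps this by keeping only the param and its getter in the trait and declaring each setter on the concrete class with a this.type return.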