Package and run Spark jobs on cluster #11

@DonDebonair

Description

It should be possible to package Spark jobs, submit them to the cluster, and capture their output somehow(?). Open questions: do we also compile Scala/Java Spark projects, or only package them (and what does packaging entail, exactly?), ship them to the cluster, and run them?
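For context, one common flow (a sketch only, assuming an sbt-built Scala project and a standalone Spark cluster; the class name, master URL, and JAR path are illustrative placeholders, not values from this project) would be:

```shell
# Package the job into a JAR. 'sbt package' builds the project classes only;
# a fat JAR that bundles dependencies needs the sbt-assembly plugin
# ('sbt assembly') instead.
sbt package

# Submit the JAR to the cluster. All values below are hypothetical.
spark-submit \
  --class com.example.MyJob \
  --master spark://cluster-host:7077 \
  --deploy-mode cluster \
  target/scala-2.12/myjob_2.12-0.1.jar

# In cluster deploy mode the driver output lands in the worker logs (and the
# Spark UI); in client deploy mode it is printed to the submitting terminal,
# which may be the easier path for capturing output programmatically.
```

So "package" here would mean building a JAR (compiling included, when the project is Scala/Java), and capturing output likely hinges on the chosen deploy mode.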
